Girls' Fund - Plan International USA

by Plan International USA
Dec 20, 2022

Cracking the code for equality in AI

Vidya speaks at GirlCon

Vidya, age 19, is the director of education at Encode Justice, a coalition of youth activists and change-makers fighting for human rights, accountability and justice under artificial intelligence. On International Day of the Girl, she led a discussion at a Plan-led virtual forum called Girls vs. the Machine: The Algorithms are Sexist, which explored how the internet’s algorithms and artificial intelligence perpetuate gendered cultural stereotypes and misrepresent girls in society.

Here, Vidya dives into her work with Encode Justice, her experience with Plan on International Day of the Girl and why youth activism in the tech space is so vital for our future. 

Q: Thank you for taking the time to chat with us about your work! To start, can you please share a bit about who you are and how you got interested in youth activism around technology?

A: My name is Vidya. I’m a freshman at the University of Illinois Urbana-Champaign. I’m studying computer science here. Throughout high school, I was pretty interested in technology. I knew I always wanted to be involved in the tech space … both my parents are software engineers, so I was pretty exposed to it growing up. But I think it was really my junior year when I heard more about the ethical side of this space, which is when Encode Justice had slowly started … The founder, Sneha Revanur, reached out to me because I do a lot of work with equity and diversity in tech.

Q: What do you do as part of Encode Justice?

A: I sort of cultivated a whole education sector [at Encode Justice], where we created a workshop curriculum for high school students. We used to go around to different high schools in the country, educating students about the implications of artificial intelligence, talking about what AI is, how it has a potential bias and how it’s impacting each of us in our everyday lives.

Especially during the pandemic, we saw technology in basically every field. So, having students talk to other students definitely helped them understand it a lot better, and they valued the information that we talked about.

I learned a lot in this process of creating this curriculum and talking to other students … we’re not really exposed to the biases [in AI] and how we can be impacted by them … they talked about facial recognition technology — a lot of them had been profiled themselves. And so, listening to their stories, hearing how they’ve felt discriminated against or seen their social media algorithms acting a certain way, and actually hearing their testimonials was really impactful.

All of our workshop curriculums center on different topics. I try to focus on AI and its intersections, such as healthcare and policy … if we’re talking about policy, I usually focus on facial recognition technology. So then I would talk about how, in criminal court cases or in our criminal history, there’s a lot of bias in how people have been targeted in the past.

Q: Can you give an example of how technology can discriminate based on human bias?

A: Minority groups, people of color and women have been discriminated against in our criminal data. And so, when we put that into facial recognition technologies and they’re trying to profile people for criminal activity, it’s obviously more biased because of that … the topic becomes a lot more serious and more relevant when we put it in that context.

Q: How did you get involved with Plan?

A: Plan reached out to Encode Justice right before International Day of the Girl. I was really interested because I actually run GirlCon, an international conference focused on tech and women’s empowerment … machine algorithm bias and the diversification of the tech field are things I’m really interested in. So, I thought it would be cool to work with Plan for their event.

Vidya speaks at GirlCon, a conference that strives to fix the lack of diversity in STEM.

I really liked the tone of the whole [Plan virtual forum] and sort of the message that we brought out at the end. This topic can be really serious and really scary … I thought it was really interesting how empowered we felt afterward to take charge and make a change.

We can’t stop artificial intelligence, or the development of it. We always talk about how it can sound like we’re trying to stop the development of technology — we’re really not … We just want people to be more conscious of what they’re doing. You know, if it’s harming people, then there’s no use in that. So what I wanted people to take away from that panel is [to know] it’s up to us to be conscious of that.

Q: Have you experienced gender inequality in your own life? How? 

A: Especially as a [computer science] student right now, I can see that throughout my progression through high school and now in college there is that gender gap … I definitely feel outnumbered, and I think there’s an unspoken double standard. Like, I go into a meeting or any of these tech clubs and feel like there’s something to prove or something to show as a girl — you feel like you have to prove something extra. And I think forums like [Plan’s International Day of the Girl event] are definitely helpful in talking about that, and helping girls know that they’re not the only ones facing that kind of thing.

Q: Why do you think it’s important for young people to get involved with activism and technology?

A: We are the people who will eventually take over these companies and eventually start running these things. So, I think it’s really, really important that initiatives like this exist. I think we feel like we’re qualified [to take the lead] because we are facing these problems every day — we’re surrounded by the algorithms we grew up with, which makes our voices really important to listen to.


