Meta Tanks Diverse Hiring and Reveals the Dark Side of Its AI Profiles - DINT 136
Meta's year is off to a rough start. For one, the Financial Times reminded the world of Meta's biased, stereotypical AI profiles, and the company very publicly denounced its diverse hiring efforts.
This Week in Tech, Race and Gender
We start the year off with yet another discovery of a test project at a tech company that features caricatures of Black people.
This time, the culprit is Meta, the parent company of some of the largest, most popular apps in the world: Facebook, Instagram, WhatsApp, and Threads. Meta also delves into AI with its Llama large language model, which many believe to be superior to other technology in this space.
With all of this success, Meta has the bandwidth to carefully consider its testing structures and exercise due diligence when creating public-facing technology. That isn't the case. In one of the most egregious instances of tech racism yet, Meta created stereotype-laden fake profiles of Black people.
(Also of note: Meta eliminated programs dedicated to creating a diverse workforce effective immediately, according to reporting by TechCrunch.)
–
2 recent examples of Google’s challenges in launching inclusive products
Google’s Photos Search Renders Black People as Gorillas
Google’s AI Overview Leaves Black Users Behind
Excerpt from our coverage of this issue in 2024: [Dr. Joy] Buolamwini speaks about being ex-coded: being denied opportunities by algorithms designed, often without one's knowledge, to exclude them. She also mentions the coded gaze, which Buolamwini defines as the evidence of encoded discrimination and exclusion in tech products.
Timeline of other companies and their erasure of Black and Brown people in product development
2023 - Report: Self-Driving Cars Can’t See Darker-Skinned People
The authors of a recently released study on bias in AI systems are calling on government regulators and carmakers to share more data about the development and testing of the systems autonomous vehicles use to identify people.
2023 - OpenAI's ChatGPT creates biased responses to research questions, minimizing the impact of Black people and women in the tech industry.
2022 - Apple Watch Doesn’t Give Accurate Readings to Dark-Skinned Customers
The Apple Watch had a feature to measure blood oxygen levels that didn't work on people with darker skin. Apple is now a defendant in a class-action lawsuit brought by Apple Watch users who received false readings because of their skin tone. (USA Today)
This timeline gives a high-level view of a problem Black and Brown people face every day: we are not considered. When we are considered, it's only because it would be too publicly embarrassing not to. And as in Google's case, when we raise issues we get half-hearted attempts not to fix problems but to avoid future litigation.
2019 - Google Cuts Corners to Trick People of Color into Giving Up Biometric Data
As Google prepped for the release of the Pixel 4 and its new face unlock feature, the firm hired contractors to collect photos of Black and Brown people, and those contractors targeted homeless people and students with darker skin. They did so because those subjects could be quickly and easily convinced to agree to the facial recognition scans needed to fill the database with enough images to build a pattern for the phone's unlock system. Contractors were instructed to lie to or mislead facial recognition subjects to get them to provide their faces for Google's system. (The New York Daily News)
2015 - Google’s photo algorithm tags Black people as gorillas.
Google responded by removing the ability to search for any primates at all and has yet to calibrate its AI system to differentiate between a Black person and a gorilla. (The New York Times)
2009 - HP Blames Faulty Facial Recognition on Standard Algorithm
HP’s webcam facial tracking tech didn’t identify Black people at all.
—
What's your take on the start of 2025 in tech? Are you optimistic things will become more inclusive, or doubtful things will change?
Let’s not forget this coverage as well:
DINT #56 – Diversity Director Drops Dime on Meta, Gets Fired and Followed by Employer
Andrea Dorch thought she was simply doing her job when she reported inconsistencies in a development’s awards to subcontractors. Little did she know that only months later she would be out of work and under surveillance by her former employer.
Palate Cleanser
My favorite teacher returns with a peek inside his classroom and the dreaded name-pronunciation challenges teachers face.