DINT
DINT #72 - Big Tech Refuses to Be Inclusive, 4 Recent Examples

Plus: YouTube wins a dismissal of racism lawsuit brought by Black and Latinx content creators.

Aug 18, 2023

News Briefs

Computing and Its Growing Effect on the Environment (The Economist)

Worldcoin Refused Kenya’s Order to Stop Iris Scans (TechCrunch)

Amazon’s US$25-Per-Video Offer Amuses TikTok Influencers (Amazon)

YouTube’s Racism Lawsuit Dismissed by Federal Judge (Reuters)

Facial Rec in Schools Threatens Student Safety: NY State Report (People of Color in Tech)



Friday Feature

In 2015, Google introduced its Photos app. One of its features was the ability to search your library for images without tagging them yourself: type in ‘water’ and you see every photo in your Google account that contains water. In a perfect world, this tech would simply work for all people. Sadly, the Big Tech industry has again exposed its approach to inclusivity and equality in product development. Here’s an example:

A Black photographer used the app to search for images of gorillas. The algorithm returned photos of him and his friends (who are also Black). In response, Google pulled any mention of gorillas from the Photos image-detection system, then extended that ban to all primates, promising to fix the issue at a later date.

To this day, if you have images of primates in your Google account, searching for them returns no results at all.

Google Lens, the company’s visual search tool, exhibits the same primate blindness when searching the web.

This removal of functionality from a system designed for everyone to use amounts to an erasure of the very people affected by the error. But tech has a quite different mindset. A quote from a former researcher at the firm offers a keen look into Google’s thinking:

“These systems are never foolproof,” Dr. Margaret Mitchell told the New York Times in May. “Because billions of people use Google’s services, even rare glitches that happen to only one person out of a billion users will surface.”

*Emphasis added

“It only takes one mistake to have massive social ramifications,” she said, referring to it as “the poisoned needle in a haystack.”

What Dr. Mitchell didn’t mention is that far more than one person is affected by categorizing Black people as gorillas, and by the many other missteps Google has made when launching AI-powered products.

As those affected by it know, racism is systemic, not episodic. But a single episode makes visible how sinister these systems are when they ignore racial tropes in how they categorize people.

“You have to think about how often someone needs to label a gorilla versus perpetuating harmful stereotypes,” Dr. Mitchell went on to tell The Times. “The benefits don’t outweigh the potential harms of doing it wrong.”

Dr. Mitchell gives voice to the mindset of Google and the other Big Tech giants that refuse to make inclusion a priority before releasing AI-powered tools to the public.

This isn’t the first time Google has misidentified a Black person as an animal. The Times also reports that Google’s Nest camera identified Black people as animals. At least that feature was fixed before launch.

Are Black People Being Erased?

These misidentifications aren’t isolated; rather, they reveal an ongoing pattern of non-inclusion, giving the impression that Black and Brown people don’t factor into Big Tech product development at all. Here’s why:

2009 - HP Blames Faulty Facial Recognition on Standard Algorithm

HP’s webcam face-tracking tech failed to detect Black faces at all. This video shows the incident:

HP responded with the explanation that the system was built on a standard algorithm and that it was researching the problem.

© 2025 The Panoply Group