Meta launches AI-first sunglasses with Oakley - This Week in Tech, Race, and Gender - DINT 157
Meta will release its newest product with Oakley next month, and it’s set to replace smartphones, according to Yahoo Finance tech editor Dan Howley.
Called the Oakley Meta HSTN, the sunglasses produce sound for your ears only, take photos, record video, and respond to voice commands through Meta AI. Meta invested in a marketing campaign featuring soccer star Kylian Mbappé and Super Bowl champion Patrick Mahomes.
The promotional materials all show physical activities: surfing, skateboarding, basketball. Users can also post photos and videos of their experiences directly to Instagram.
This development aligns with Sinead Bovell’s latest article predicting smartphones will go the way of the fax machine, to be replaced by an AI-first device.
OpenAI is said to be developing an AI-first product of its own, lighting a fire under industry peers to ship their latest AI-driven innovations.
Related Coverage:
While this announcement marks a turning point for Meta, and possibly for the tech industry as a whole, there’s still no public mention of the ways these glasses could be misused. We covered a story about researchers who turned AI glasses into surveillance devices that pulled in the personal data of the people the user was looking at.
Will these devices infringe on privacy? Or will they aid accountability in government, law enforcement, legal matters, and more?
The Meta Oakley HSTN performance glasses start at $499 and will be available July 11.
More News at the Intersection of Tech, Race, and Gender
Who gets hired when AI makes the first decision?
Many people have no idea their resume is being evaluated by an algorithm. Yet AI is present at nearly every step of the hiring funnel: screening, personality assessments, video interviews, even background checks that evaluate social media activity.
Source: Digital Journal
Understanding Subgroup Fairness in Machine Learning (ML)
Subgroup-level performance analysis helps reveal unintended biases that may be embedded in the data or model design. Understanding this requires careful interpretation because fairness isn’t just about statistical parity—it’s also about ensuring that predictions lead to equitable outcomes when deployed in real-world systems.
Source: Market Post
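To make the idea of subgroup-level performance analysis concrete, here is a minimal sketch in Python using made-up labels, predictions, and group annotations (all hypothetical, not drawn from the article): instead of reporting one overall accuracy, it breaks accuracy out per subgroup so a disparity becomes visible.

```python
# Minimal sketch of subgroup-level performance analysis.
# The labels, predictions, and group tags below are invented for
# illustration; in practice they would come from a trained model
# and real demographic annotations.
from collections import defaultdict

def accuracy_by_group(y_true, y_pred, groups):
    """Return accuracy computed separately for each subgroup."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for truth, pred, group in zip(y_true, y_pred, groups):
        total[group] += 1
        if truth == pred:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Hypothetical example: aggregate accuracy hides the fact that
# group "b" fares noticeably worse than group "a".
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 0, 0, 1]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

print(accuracy_by_group(y_true, y_pred, groups))  # → {'a': 0.75, 'b': 0.5}
```

As the blurb above notes, a gap like this is only a starting point: equal error rates are not the same as equitable outcomes once the model is deployed.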
How AI Perpetuates Gender Bias
Representation matters in AI because machine learning processes are often encoded with programmers’ own biases, even if inadvertently. Something as simple as training on a skewed dataset in which female speakers are underrepresented can make a machine learning model less accurate when working with women.
Source: Women’s Media Center