If you haven’t heard, AI now has eyes, and Meta has unveiled some enhancements to its AI-equipped Ray-Ban Meta glasses. Wearers of the smart specs can now customize Meta AI to give detailed responses based on what’s in the surrounding environment, Meta said in a blog post for Global Accessibility Awareness Day.
Artificial intelligence is opening new doors for accessibility, with a steady stream of features arriving. Tech giants like Google, Apple and Meta are all working to make it easier for people with disabilities, such as low or no vision, to interact with the world around them.
Live AI for the Meta glasses isn't new, but the additional enhancements will undoubtedly be welcomed by low vision users.
Below are some of the other highlights from Meta's accessibility-focused blog post. For more, check out the glimpse of brain-controlled accessibility features headed to Apple devices.
'Call a Volunteer' feature expanding to 18 countries
Though it isn't AI-focused, Call a Volunteer, a feature from Meta and Be My Eyes, will soon expand to all 18 countries where Meta AI is available. Call a Volunteer launched in November 2024 in the US, Canada, the UK, Ireland and Australia, and the expansion promises to be a handy (and hands-free) tool for low vision users.
Once it's set up, a Meta glasses wearer can simply ask Meta AI to have the specs "be my eyes." That person is then connected to one of more than 8 million volunteers, who can view the live camera stream from the wearer's glasses and provide real-time assistance. The feature is scheduled to roll out in all supported countries later this month.
Additional Meta accessibility features and research
Meta also detailed some of the existing features and research geared toward expanding accessibility for its products, especially in the extended reality space.
- Features like live captions and live speech are currently available on Quest headsets, in the Meta Horizon app and in Horizon Worlds.
- Also highlighted was a WhatsApp chatbot from Sign-Speaks that uses the company's API and Meta's Llama AI models. The chatbot provides live translation of American Sign Language to text, and vice versa, making communication easier for deaf and hard of hearing individuals.
For more, don’t miss the handful of new accessibility features announced by Apple.