Latest Ray-Ban Meta Glasses Update Brings Live AI and Smarter Descriptive Responses


The Ray-Ban Meta smart glasses are already a hit with users who enjoy snapping photos or getting instant help from Meta AI at the press of a button. Recently, Meta expanded support for Meta AI to more countries, including India, coinciding with the official launch of the glasses there. One particularly meaningful use case, assisting people with visual impairments, just received a major upgrade. Meta has introduced new features, including enhanced descriptive responses, making the glasses even more helpful as a wearable assistant.

In a recent blog post, Meta announced that users can now customize Meta AI on Ray-Ban smart glasses to deliver more detailed and descriptive responses. For those unfamiliar, one of the standout features of these glasses is their ability to describe the surroundings, an invaluable tool for users with low vision or blindness. With the new update, the AI can offer richer, more nuanced descriptions, potentially making it easier for visually impaired users to navigate and interact with their environment more confidently.


The feature is currently rolling out to users in the US and Canada, with plans to expand to additional countries soon. To enable it, users can navigate to Settings > Accessibility within the Meta AI app. The Be My Eyes feature is also expanding to more regions, enhancing accessibility support globally.

Meta recently released version 15 of the Ray-Ban Meta smart glasses software, introducing Live AI functionality in the US and Canada. This update also brings Live Translation to all countries where Meta AI is available, enabling users to translate conversations in real time. The update further enhances personalization options and extends Meta AI's availability to more European countries, including Austria, Ireland, and Sweden.


In the post, Meta also revealed a new wristband device designed to improve human-computer interaction, along with a Live Speech feature that converts spoken words into real-time text, similar to Google's Live Caption. Additionally, Meta introduced Sign-Speak, a new WhatsApp chatbot that translates American Sign Language.

What do you think about Meta AI’s new descriptive responses for the Ray-Ban smart glasses? Share your thoughts with us in the comments below.
