Meta’s Smart Glasses Introduce Parking Assistance Feature
In a recent update, Meta rolled out new features for its Ray-Ban smart glasses, expanding their functionality for users in the US and Canada. The update improves natural language recognition, letting users speak to the AI assistant without the rigid phrasing earlier commands required. It also adds voice messages, timers, and reminders, so users can, for example, ask the glasses to remember where they parked their car.
Meta’s CTO, Andrew Bosworth, announced the update on Threads, noting that users can also use voice commands to call phone numbers or scan QR codes directly through the glasses. Notably absent, however, is the anticipated live translation feature, which has not yet been released and has no confirmed timeline.
The glasses also recently drew controversy when two Harvard students paired the glasses’ Instagram livestream with facial recognition and language models to surface private information about strangers in real time.
This latest upgrade is part of Meta’s ongoing push to advance its smart wearable technology, promising a blend of convenience and functionality for users. As the world increasingly leans toward augmented reality, these advancements could pave the way for broader adoption of smart glasses and a host of new applications.