MENLO PARK — Meta unveiled its Horizon Lens augmented reality glasses at a developer conference on Thursday, a sleek device that packs cameras, microphones, speakers, and an on-device AI processor into a frame that the company says is indistinguishable from ordinary eyewear at a distance.

The glasses can identify objects in a user's field of view, translate foreign-language text in real time and overlay the translation on the lens, surface contextual information about people the user is speaking with (via opt-in contact recognition), and answer spoken questions without requiring a paired phone.

On-Device AI

The device runs a compressed version of Meta's Llama 4 model directly on a custom neuromorphic chip embedded in the arm of the glasses. On-device processing means most interactions do not require a network connection, addressing a major privacy concern that plagued earlier smart-glass products.

"We made a deliberate decision: if it can run on the device, it should run on the device," said Dr. Sunita Kapoor, Meta's VP of Reality Labs. "The cloud is a fallback, not the default."
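The device-first policy Kapoor describes can be sketched as a simple routing rule: attempt local inference, and only reach for the network when the on-device model declines to answer. The sketch below is purely illustrative; the function names (`run_local`, `run_cloud`) are hypothetical and do not correspond to any published Meta API.

```python
# Hypothetical sketch of a device-first inference policy: prefer the local
# model, fall back to the cloud only when it cannot produce an answer.
# All names here are illustrative, not Meta's actual interfaces.

def answer(query, run_local, run_cloud):
    """Return (result, source): on-device result if available, else cloud."""
    result = run_local(query)
    if result is not None:
        return result, "device"
    return run_cloud(query), "cloud"

# Toy example: a local model that only handles short queries.
local = lambda q: q.upper() if len(q) < 20 else None
cloud = lambda q: f"cloud:{q}"

print(answer("what time is it", local, cloud))   # handled on device
print(answer("translate this long sentence for me", local, cloud))  # falls back
```

The key property is that the cloud path is reached only on an explicit local miss, which is what makes "fallback, not default" a verifiable design claim rather than a slogan.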

Privacy Debate

Critics and privacy researchers were quick to raise concerns. The glasses carry cameras capable of facial recognition, and while Meta says the feature is disabled by default and requires explicit opt-in, civil liberties groups argue that covert activation remains a risk. Senator Alicia Greene called for congressional hearings on wearable AI before the device's commercial launch.