Meta’s flagship event of the year is focusing on AI-first wearable products and the software to support them. Expect marquee smart glasses news, including the rumored Hypernova and a third-gen Ray-Ban line, along with deeper Meta AI integration and a push to bring Horizon OS to hardware beyond Meta’s own. Here is what matters, what has changed since last year, and the specs and strategies we’ll be watching most closely.
Hypernova smart glasses: what will be delivered
Various reports, including detailed coverage from Bloomberg’s Mark Gurman, describe a pair of high-end smart glasses from Meta reportedly codenamed Hypernova. The marquee feature: a monocular display that puts a graphical interface in the lower-right corner of the wearer’s field of view, emphasizing glanceable utility over full-scene augmented reality. Think notifications, maps, and media controls you glance down at — not holograms tethered to the world.
Price whispers range from $800 to north of $1,000, which tracks with the cost of compact optics, sensors, and a beefier compute stack. Early reports point to preloaded apps (camera, gallery, maps) and tight phone tethering for texts and alerts. If Meta hits the sweet spot on weight, thermal comfort, and battery life, Hypernova could define a new class of eyewear that sits between camera-first glasses and full AR headsets.
Ray-Ban 3 and heads-up display frames: what to expect
A separate leak points to a “Ray-Ban Display” model that would add a small heads-up display for navigation, translations, messages, and Meta AI responses. The design is chunkier than the current Ray-Ban lineup — presumably to house the projector and optics — but still much closer to normal eyewear than visor-style AR. Importantly, the UI you see on that lens is static; it isn’t spatially anchored, so it needs far less power than rendering world-locked 3D objects.
There are also murmurs of a parallel Oakley track, with a model that reportedly puts a camera in the center of the frame. That setup would favor action-sports capture, while the Ray-Ban line leans more toward lifestyle use and hands-free assistance. The split would mirror the way camera glasses are starting to segment by use case rather than raw specs.
Meta AI across devices: on-device speed and privacy gains
Last year’s software leap brought multimodal video understanding to the glasses, along with live translation and more natural conversation. The next logical step is faster, more private on-device AI. Expect Meta, whatever hardware it ships, to spotlight Llama-first designs that handle low-latency tasks locally on your face — instant visual Q&A, scene descriptions, or “what am I looking at?” moments — and lean on the cloud only when necessary.
This matters for two reasons. First, latency: assistants that respond in under roughly a second feel quasi-magical, and anything slower breaks the flow. Second, privacy is not just policy — it’s architecture. On-device inference reduces the need to ship video frames to the cloud, which regulators and consumer surveys from groups like the Electronic Frontier Foundation and Pew Research flag as a top barrier to smart glasses.
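To make that local-first idea concrete, here is a minimal sketch of how an assistant might route a request. Everything in it is illustrative rather than anything Meta has described: the function names, the 1,000 ms budget, and the short-prompt heuristic are all placeholder assumptions.

```python
import time

LATENCY_BUDGET_MS = 1000  # roughly where a response stops feeling instant

def run_local_model(prompt: str) -> str | None:
    """Stand-in for an on-device model call (e.g., a small Llama variant).
    Returns None when the request is too heavy to answer locally."""
    if len(prompt.split()) < 20:  # pretend short prompts are easy enough for local inference
        return f"[on-device] answer to {prompt!r}"
    return None

def run_cloud_model(prompt: str) -> str:
    """Stand-in for the cloud fallback: higher quality, higher latency,
    and the privacy cost of shipping data off the device."""
    return f"[cloud] answer to {prompt!r}"

def answer(prompt: str) -> str:
    """Local-first routing: try the on-device model, escalate only if it
    declines the task or blows the latency budget."""
    start = time.monotonic()
    local = run_local_model(prompt)
    elapsed_ms = (time.monotonic() - start) * 1000
    if local is not None and elapsed_ms <= LATENCY_BUDGET_MS:
        return local
    return run_cloud_model(prompt)

print(answer("what am I looking at?"))  # short enough to stay on-device
```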
Ceres neural wristband for subtle, private gesture control
Meta is also expected to pair Hypernova with a neural wristband, speculated to be codenamed Ceres. Building on the electromyography research of CTRL‑Labs (which Meta acquired), the band reads small electrical signals from the muscles in your wrist and turns them into input. Twist to scroll, pinch to select, tap for shortcuts — all without waving your hands around in midair.

The promise here is light-touch, private control that sits closer to thought than to touch. The band’s haptics can confirm inputs so users aren’t reliant on audio. More importantly, EMG works with your hands in your pockets, on the subway, or resting on a desk — which could make this the most practical XR input since the scroll wheel.
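As a rough mental model, not Meta’s actual stack, you can picture the pipeline as a classifier that turns wrist-muscle signals into gesture labels, a map from labels to UI actions, and a haptic tick as confirmation. The sketch below uses invented names and a made-up confidence gate purely for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class GestureEvent:
    """A label an EMG classifier might emit from wrist-muscle signals."""
    label: str         # e.g. "pinch", "twist_cw", "twist_ccw", "tap"
    confidence: float  # classifier confidence, 0.0 to 1.0

def scroll(delta: int) -> None:
    print(f"scroll {delta:+d}")

def select() -> None:
    print("select current item")

def open_shortcuts() -> None:
    print("open shortcut menu")

def haptic_tick() -> None:
    print("haptic tick")  # stand-in for a silent confirmation buzz on the band

# Gesture-to-action map; unrecognized labels are simply ignored.
ACTIONS: Dict[str, Callable[[], None]] = {
    "pinch": select,
    "twist_cw": lambda: scroll(+1),
    "twist_ccw": lambda: scroll(-1),
    "tap": open_shortcuts,
}

def handle(event: GestureEvent, threshold: float = 0.8) -> None:
    """Fire the mapped action only above a confidence gate, so ambiguous
    muscle activity is dropped rather than misfired."""
    action = ACTIONS.get(event.label)
    if action is None or event.confidence < threshold:
        return
    action()
    haptic_tick()  # confirm without audio

handle(GestureEvent("twist_cw", 0.93))  # scrolls and ticks
handle(GestureEvent("pinch", 0.55))     # below threshold: silently dropped
```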
Horizon OS opening to third-party headsets and OEMs
Meta has opened Horizon OS to partners, and the first results may arrive in an ASUS ROG headset apparently called Tarius. Early leaks hint at gaming-first features such as eye and face tracking and a high-contrast display (micro‑OLED or QD‑LED with local dimming has been floated). The strategic piece is not just the hardware — it is the platform. A multi-OEM Horizon OS would mirror the Android playbook and widen the funnel for developers and content.
If Meta brings even one non-Meta device to market that can stand toe-to-toe with the best hardware available at launch, its pitch to potential rivals to build on its platform becomes stronger, as does the argument for a unified store and social graph across an eventual combined metaverse.
That, in turn, lowers development costs and could help pull XR content out of its walled gardens — a frequent criticism from analysts at IDC and CCS Insight.
Why these Meta Connect announcements could matter most
Smart glasses are moving from camera gadgets to assistant-first devices. The Ray-Ban line demonstrated real demand for hands-free capture and translation; a glanceable HUD plus more on-device AI could drive higher daily active use than any visor-style AR hardware has yet produced. Meanwhile, Hypernova sets a new benchmark for advanced eyewear computing, and Ceres answers the input question without strain or spectacle.
The bigger picture: XR is moving from moonshot AR to practical, wearable AI. That reframing shifts expectations toward utility, battery life, and comfort — the metrics that actually determine whether people wear any of this outside the house.
Key specs to watch and open questions for Meta’s lineup
- Display: field of view in degrees, peak brightness in nits, and legibility at a glance vs. full-task reading.
- Comfort: total weight, temple pressure, and heat with assistant use.
- AI: what runs on-device, how visual data is treated, and granular privacy controls.
- Ceres: fail-safe behavior if all controls are disabled, accuracy across skin tones and conditions, and accessibility fallback controls without the band.
- Horizon OS partner lineup, app store terms, and SDK access for glasses-only UX.
If Meta really does deliver on these fronts, Hypernova and Ray-Ban 3 could be the point at which smart glasses cease to be a novel piece of face clutter and start to feel more like the next personal computer — something you wear, rather than carry.