Amazon is working on two pairs of augmented reality smart glasses, one for everyday consumers and the other for the company’s delivery drivers, according to a report from The Information. If successful, the two-pronged approach could weave AR directly into how people shop and how packages make it to their doors.
Two devices, two missions
The consumer version, said to be in development under the codename “Jayhawk,” is meant to be a small, sleek pair of glasses with a full-color augmented reality display on curved lenses, microphones and speakers for voice assistants and calls, and a camera for snapshots and video calls. Think glanceable overlays for navigation, product info, or media controls, a step well beyond Amazon’s current-generation Echo Frames, which offer audio but no visual display.

The driver-focused edition, referred to internally as Amelia, emphasizes function over design. Instead of lush graphics, it is said to focus on hands-free guidance: routing, package validation, entry instructions, and quicker doorstep confirmation. The underlying technology is reportedly shared with Jayhawk, but the consumer version is expected to be much slimmer, while Amelia favors a rugged shell and long battery life over looks.
A strategic partner and a familiar playbook
The hardware is reportedly being developed with Meta Bounds, a China-based AR technology firm. The partnership suggests Amazon wants a mature optics stack rather than reinventing waveguides and projection modules in-house, in line with the pragmatic decision-making that has historically led it to marry its software and services to hardware in products like Echo devices and Fire TV.
It also fits Amazon’s larger strategy: invest in hardware platforms, then if they gain traction, apply Alexa, computer vision and shopping layers.
The significance goes beyond the devices themselves. Jayhawk could anchor a consumer AR experience linked to visual search, hands-free Alexa, and home services, while Amelia could be positioned as a productivity tool for Amazon Logistics.
Timelines, testing and what to expect
The driver glasses could hit the road well before their consumer cousin, which is on a longer runway, The Information reported. Jayhawk’s pricing has not been reported. A staged rollout would mirror how other companies have tested AR first in controlled settings where there is a clear return on investment.
If Amelia works out, there is real operational upside. DHL says smart glasses delivered double-digit productivity increases in its “vision picking” pilots, with gains of around 15% in some of its warehouses. Even small amounts of time saved per stop add up for Amazon, which delivers billions of packages in the U.S. each year.

Success hinges on safety and privacy
For drivers, any AR system would have to reduce distraction, not add to it. Research from the Virginia Tech Transportation Institute has linked even momentary eyes-off-road glances to elevated crash risk. That sets the bar for Amelia’s interface design: spare glances, simple prompts, and audio-first guidance. Expect heavy testing of glance time and cognitive load before any broad release.
On the consumer side, a face-worn camera raises privacy concerns for bystanders. Amazon will probably need rigorous controls around bystander recording, visible capture indicators, and clear data policies. Previous industry missteps, from early head-mounted cameras to always-listening devices, show that social acceptability depends on transparency and clear signals about when a device is recording.
How it fits into the rest of the AR landscape
AR eyewear is quietly moving from gimmick to useful. Meta’s Ray-Ban glasses squeeze camera and AI features into a stylish frame; Xreal has cultivated fans with lightweight displays for work and entertainment. Google Glass stumbled in the consumer market but found a home in enterprise workflows. Amazon, armed with Alexa and retail integrations, could straddle both worlds if it nails comfort, battery life, and clear use cases.
The most promising consumer use cases are simple and commonplace: turn-by-turn navigation, quick translations, music and calling, and shopping tasks like scanning barcodes or pulling up product context. For drivers, the payoff is more direct: faster handoffs, quicker confirmations, and easier problem-solving at the doorstep.
Key unknowns to watch
Key details remain a mystery: display brightness in full sun, field of view, weight, battery life between charges, how much processing happens on the glasses versus in the cloud, and how deeply the glasses will connect with Alexa and retail services. Enterprise buyers will also insist on management tools and ruggedness; consumers will expect comfort and a price that feels more like a phone accessory than a premium gadget.
If Amazon can show that AR genuinely saves drivers time and creates value for customers without social friction, it could do what few before it have managed: turn smart glasses from a niche experiment into an everyday utility. The gadgets are only half the battle; trust, security, and usefulness must follow.
