Amazon is developing two pairs of augmented reality smart glasses tailored to different users, one for everyday shoppers and another for its vast network of delivery drivers, according to reporting from The Information. If successful, the dual-track approach could knit AR directly into how people shop and how packages get to their doors.
Two devices, two missions
The consumer model, reportedly codenamed Jayhawk, is designed as a sleeker pair of glasses with a full-color display in one lens, plus microphones, speakers, and a camera. Think glanceable overlays for navigation, product info, or media controls—features that go well beyond Amazon’s existing Echo Frames, which are audio-only and lack any AR display.

The driver-focused version, internally known as Amelia, prioritizes utility over style. Rather than rich visuals, it's expected to emphasize hands-free guidance: routing, package verification, entry instructions, and faster doorstep confirmation. The core technology reportedly overlaps with Jayhawk's, but where the consumer variant is expected to be slimmer, Amelia trades aesthetics for ruggedness and battery endurance.
A strategic partner and a familiar playbook
Amazon is said to be working with Meta-Bounds, a China-based AR technology company, on the hardware. That partnership suggests Amazon wants a mature optics stack rather than reinventing waveguides and projection modules in-house—a pragmatic move similar to how it has historically paired its software and services with third-party components in Echo devices and Fire TV hardware.
The plan fits Amazon’s broader strategy: seed a hardware platform, then layer Alexa, computer vision, and shopping services on top. Jayhawk could anchor a consumer AR experience that ties into visual search, hands-free Alexa, and home services, while Amelia becomes a productivity tool tuned to Amazon Logistics.
Timelines, testing, and what to expect
According to The Information, the driver glasses could enter service significantly earlier than the consumer pair, which is on a longer timeline. Pricing for Jayhawk is reportedly undecided. A staged rollout would mirror how other companies pilot AR in controlled environments first, where the return on investment is easier to measure.
There is real operational upside if Amelia delivers. DHL’s “vision picking” pilots with smart glasses have reported double-digit productivity gains, with the company citing around 15% improvements in some warehouses. Even small time savings per stop scale quickly for Amazon, which delivers billions of packages annually in the U.S., as reported by major business outlets.
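To put that scale in perspective, here is a quick back-of-envelope sketch; the package volume and per-stop savings below are illustrative assumptions for the math, not figures from Amazon or The Information.

```python
# Back-of-envelope illustration only; both inputs are assumed, not reported figures.
packages_per_year = 5_000_000_000    # assumed ~5 billion U.S. deliveries per year
seconds_saved_per_stop = 10          # assumed time saved per doorstep interaction

driver_hours_saved = packages_per_year * seconds_saved_per_stop / 3600
print(f"~{driver_hours_saved:,.0f} driver-hours saved per year")  # roughly 13.9 million hours
```

Even if the real numbers turn out smaller, the arithmetic shows why per-stop efficiency is worth chasing at Amazon's delivery volume.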
Safety and privacy will make or break it
For drivers, any AR system must reduce distraction, not add to it. Research from the Virginia Tech Transportation Institute has linked even brief eyes-off-road intervals with elevated crash risk. That raises the bar for Amelia’s interface design: minimal glances, clear prompts, and audio-first guidance. Expect robust testing around glance time and cognitive load before wide deployment.
On the consumer side, a camera on your face invites privacy scrutiny. Amazon will likely need strict controls for bystander recording, visible capture indicators, and clear data policies. Past missteps across the industry—from early head-mounted cameras to always-listening devices—show that social acceptance hinges on transparency and obvious safety cues.
How it fits into the AR landscape
AR eyewear is quietly moving from novelty to utility. Meta’s Ray-Ban line pushed camera and AI features into a fashionable frame; Xreal has built a following around lightweight displays for work and entertainment. Google Glass stumbled in consumer markets but found a home in enterprise workflows. Amazon, with Alexa and retail integrations, could thread both worlds if it nails comfort, battery life, and clear use cases.
The most compelling consumer scenarios are simple and frequent: turn-by-turn navigation, quick translations, music and calling, and shopping-related tasks like scanning barcodes or getting context on products. For drivers, success looks like fewer device handoffs, faster confirmations, and simplified problem-solving at the doorstep.
Key unknowns to watch
Critical details remain under wraps: display brightness outdoors, field of view, weight, battery life, on-device processing versus the cloud, and how tightly the glasses will integrate with Alexa and retail services. Enterprise buyers will want management tools and durability; consumers will demand comfort and a price that feels more like a phone accessory than a luxury gadget.
If Amazon can show that AR genuinely saves time for drivers and adds value without social friction for shoppers, it could do what few have managed: turn smart glasses from a niche experiment into an everyday tool. The hardware is only half the story; trust, safety, and utility must carry the rest.