Apple’s new iPhone 17 lineup and the latest AirPods bring two AI-driven enhancements that subtly reinvent everyday use: the front camera is now a smarter framing device, and live translation flows seamlessly through your earbuds.
Neither feature screams “AI,” but both feel instantly practical: the sort of intelligence you actually notice.

Smarter selfies with Auto Rotate and Auto Zoom
The iPhone 17’s new front camera uses a square sensor that takes in 24MP of data and outputs 18MP photos, letting the phone reframe without sacrificing sharpness. With Auto Rotate and Auto Zoom enabled, the camera applies on-device machine learning to locate faces, decide whether the shot works better vertical or horizontal, and dynamically zoom in or out to fit everyone in crisp detail.
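Apple hasn’t published how this framing logic works internally, but the idea can be approximated with the public Vision framework. The Swift sketch below is a hypothetical illustration, not Apple’s implementation: it detects faces, unions their bounding boxes, and derives an orientation and crop. The padding factor and the landscape heuristic are my assumptions.

```swift
import Vision
import CoreGraphics

/// Illustrative sketch only: approximates an Auto Rotate / Auto Zoom-style
/// decision with the public Vision API. Not Apple's actual pipeline.
struct FramingDecision {
    let isLandscape: Bool
    let cropRect: CGRect   // normalized [0, 1] coordinates
}

func suggestFraming(for image: CGImage) throws -> FramingDecision? {
    // Detect all faces in the frame.
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    guard let faces = request.results, !faces.isEmpty else { return nil }

    // Union of all face bounding boxes (Vision reports normalized rects).
    let union = faces.dropFirst().reduce(faces[0].boundingBox) {
        $0.union($1.boundingBox)
    }

    // A wide spread of faces suggests landscape; a tall one suggests portrait.
    let isLandscape = union.width > union.height

    // Pad the union (assumed 25%) so heads and shoulders stay inside the crop,
    // then clamp to the image bounds.
    let padded = union
        .insetBy(dx: -union.width * 0.25, dy: -union.height * 0.25)
        .intersection(CGRect(x: 0, y: 0, width: 1, height: 1))

    return FramingDecision(isLandscape: isLandscape, cropRect: padded)
}
```

Working in normalized coordinates is a deliberate choice in the sketch: the same decision logic applies whether the output ends up as the full capture or a tighter crop.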
In practice, this eliminates the awkward shuffle when a friend jumps into the frame or you switch from a solo shot to a group shot. You can even leave the phone in whatever orientation you like and just tap to rotate, no wrist gymnastics required. Apple files the behavior under its familiar “Center Stage” moniker, but on iPhone 17 it applies to photos as well as video calls.
The benefit isn’t theoretical. Apple claims iPhone users took about 500 billion selfies last year (a number I believe, given all those Snapchat filter swaps), and this is the first selfie system that feels designed for how people actually shoot. For creators, that means fewer missed frames; for families, wider framing without the blur; for everyone else, less fiddling and more of life preserved.
Live translation in your ear with AirPods
Paired with an iPhone 17, AirPods can now provide live translation built for real conversation. Switch it on and Active Noise Cancellation kicks in automatically, damping background noise; with Apple Intelligence, the iPhone handles the processing on-device, and you hear a clear translation whatever language the other person speaks. You can keep looking at the person you’re talking to instead of passing a phone back and forth.
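Apple hasn’t documented the pipeline behind Live Translation, but the flow it describes (hear speech, recognize it on-device, translate, speak the result) can be sketched with public frameworks. The Swift sketch below is a hypothetical illustration: the translate(_:) helper is a stand-in for the actual translation model, and the locale choices are arbitrary.

```swift
import Speech
import AVFoundation

/// Hypothetical sketch only: Apple has not published how Live Translation is
/// wired internally. This approximates the flow with public APIs: recognize
/// speech on-device, translate it, then speak the result.
/// Real apps must first call SFSpeechRecognizer.requestAuthorization(_:).
final class TranslationLoop {
    // Locale choices are arbitrary examples (Spanish in, English out).
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "es-ES"))
    private let synthesizer = AVSpeechSynthesizer()
    private var task: SFSpeechRecognitionTask?

    func start(audioURL: URL) {
        guard let recognizer, recognizer.isAvailable else { return }

        let request = SFSpeechURLRecognitionRequest(url: audioURL)
        // Keep recognition on-device, mirroring Apple's privacy/latency pitch.
        request.requiresOnDeviceRecognition = true

        task = recognizer.recognitionTask(with: request) { [weak self] result, _ in
            guard let self, let result, result.isFinal else { return }
            let source = result.bestTranscription.formattedString
            self.speak(self.translate(source))
        }
    }

    /// Placeholder: swap in a real on-device translation model here.
    private func translate(_ text: String) -> String { text }

    private func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }
}
```

The on-device recognition flag is the load-bearing detail here: keeping the audio local is what makes the low-latency, offline-friendly behavior Apple is pitching plausible.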

For now, Apple is emphasizing quality and utility over a huge language catalog. The feature currently supports a core set of languages (English, Spanish, French, German, and Portuguese) and focuses on low latency and reliable turn-taking. Google’s translation system now spans more than 240 languages, thanks in part to research-driven expansion, but even Google’s live demonstrations have shown how hard real-time conversation remains. Apple’s approach trades flash for consistency.
The result is better flow: you hear translated speech with minimal delay, the other speaker’s voice is damped so your brain doesn’t have to juggle two audio streams at once, and on-device processing reduces reliance on spotty connections. The industry-wide push toward efficient, on-device models is well documented (the latest edition of Stanford’s AI Index includes data on the trend), and this is a textbook case for why it matters: privacy, latency, and battery life all improve when processing stays close to the metal.
Compatibility, setup, and quick tips for iPhone 17 and AirPods
- Camera AI: Open the front camera, turn on Auto Rotate and Auto Zoom in the options, and let the phone handle the framing. If you want manual control, tap to rotate the orientation on the fly. The feature is available across the entire iPhone 17 family, not just the Pro models.
- Live translation: Update to the latest iOS, pair your AirPods, and select your languages in the Translate app on iPhone. Once you start a session, ANC turns on automatically in your AirPods. It works best when you face your conversation partner, the iPhone’s microphone is unobstructed, and you’ve downloaded language packs for offline use while traveling. Apple says the feature ships with AirPods Pro 3 and is also compatible with AirPods Pro 2 and AirPods 4 when paired with a device that supports Apple Intelligence.
Why these two features matter for everyday use
AI capabilities are frequently solutions in search of a problem. These do the opposite: they remove friction you already have. Framing no longer comes at the expense of image quality, and multilingual conversations no longer feel like a game of charades. It’s that combination, less effort and better output, that wins people over.
There’s also a wider industry context. Audio wearables are becoming a primary interface; GSMA Intelligence has tracked their rise, and live translation creates a feedback loop that should accelerate it. Computational photography, meanwhile, increasingly substitutes algorithms for manual skill. Apple’s play is to keep the intelligence invisible: nothing to learn, no new UI to memorize, only better results.
Of all the AI tricks on offer, these two are the ones worth trying on day one. Your selfies will come out the way you intended, and talking with people abroad will no longer feel like you’re just visiting.