Ring, Amazon’s popular camera-equipped doorbell brand, launched a feature called “Familiar Faces” this week that lets owners of the higher-end $250 model have the system recognize repeat visitors from recorded images. The company describes the update as an effort to reduce duplicate alerts and provide useful context, though the move into biometric scanning is already drawing scrutiny from privacy experts and digital rights organizations.
What Familiar Faces Is Supposed to Do for Users
Familiar Faces lets Ring owners tag regular visitors — family, roommates, neighbors — in the app so it can identify who it believes to be in a clip and adjust notifications as needed. Ring frames this as a quality-of-life enhancement, converting nebulous motion pings into specific, searchable events attached to people you know.
The feature joins a suite of new automation tools. Product reviewers at Wirecutter note that Ring can now deliver Alexa-powered greetings tailored to different visitors (or would-be burglars), while a separate AI-driven “Search Party” feature helps locate lost pets (and, as outlets like The Verge reported, is enabled by default). Together, these tools push Ring past passive recording into active, personalized engagement driven by who, or what, is in the frame.
Consent and the Privacy Catch for Bystanders and Guests
While Familiar Faces is optional for device owners, the issue, privacy advocates say, is that the people being scanned on your doorstep never opted in. As The Washington Post reported, some legal and civil liberties groups argue that capturing face templates of delivery drivers, children, or pedestrians conscripts an entire group into a database of biometric information without those individuals’ consent.
The concern isn’t theoretical. The Federal Trade Commission penalized Ring in 2023 for earlier privacy and security lapses, ordering it to strengthen its safeguards and minimize the data it keeps. Though that case did not involve facial recognition, it underscored a broader issue: how long recordings are kept, who has access to them, and what protections exist against abuse or breach.
Then there’s Ring’s relationship with law enforcement, lingering in the background. The company has scaled back the ways police can request footage through its neighborhood app, though authorities can still obtain video from users who choose to share it, or through subpoenas and warrants. Layering facial recognition onto an already widely deployed camera network raises the stakes for how easily identities can be inferred from ordinary doorbell clips.
Accuracy Limits and Real-World Performance Concerns
Facial recognition is fallible even under ideal conditions, and doorbell footage is far from ideal. Doorbells and outdoor cameras shoot at odd angles, from varying distances, through heavy backlighting and grainy nighttime low light. These are exactly the environments in which false matches can soar.
Independent testing backs that caution. The National Institute of Standards and Technology’s evaluations of facial recognition have found significant disparities in the accuracy of various algorithms, with some having higher false positive rates for certain demographic groups. Performance has gotten better over the years, but experts stress that error rates are not uniform — and that outdoor, consumer-grade scenarios can widen them.
For users, that means Familiar Faces could mistakenly label a stranger as a known guest, or fail to recognize an actual household member. For non-users, it means being tagged in ways they cannot confirm or correct. Those failure modes matter when labels become the de facto filter for what owners end up watching, saving, or sharing.
The Legal and Regulatory Terrain Is Rapidly Changing
Biometric privacy laws are proliferating, and not all treat facial recognition the same. Illinois’s Biometric Information Privacy Act is one of the most stringent in the country, and lawsuits there have taken aim at companies that generate face templates without permission. Other states have their own biometric statutes, including Texas and Washington, while comprehensive privacy laws in locales like California impose disclosure and opt-out requirements that might apply to the derived biometric data.
Outside the U.S., Europe’s General Data Protection Regulation treats biometric information as especially sensitive and requires explicit consent for most uses. That patchwork could mean a Familiar Faces rollout is subject to varying rules, and potentially varying liabilities, depending on where a doorbell is installed or who happens to appear in the footage.
What Owners Can Check and Adjust in Settings Now
If you want to allow Familiar Faces, start with the basics: check your video storage settings, decide who in your household will be allowed to view the “face library,” and limit sharing permissions among other users. You might also put up a little sign that says recording is in effect — something some jurisdictions encourage or require — and use privacy zones or motion zones to limit what your camera sees beyond your actual property.
While you’re at it, audit other AI features. If you don’t need Search Party or personalized greetings, turn them off or limit their triggers. Be sure to check these settings occasionally, because smart home platforms can change rapidly and defaults may be updated in new software.
The Bottom Line on Ring’s New Facial Recognition Push
Ring’s facial recognition makes doorbell footage more informative, and thus more consequential. For homeowners, that means fewer nuisance alerts and easier video review. For everyone else who crosses the camera’s field of view, it introduces an involuntary biometric check. Until Amazon offers fuller transparency into how face data is generated, stored, and shared, and until regulators set clear boundaries, Familiar Faces will be defined as much by its risks as by its conveniences.