Amazon’s Ring has started to release a feature that uses AI-driven facial recognition to help users identify strangers who appear on video from the smart doorbells installed in millions of homes. The feature, called Familiar Faces, recognizes people who approach a user’s door and sends notifications naming that person instead of generic motion alerts. Supporters argue that it reduces alert fatigue and adds convenience; critics counter that it normalizes biometric tracking on neighborhood streets.
How Ring’s Familiar Faces feature works on doorbells
Ring says the feature is opt-in and can store up to 50 people—such as family members, friends, delivery drivers, or frequent guests—whom users designate in the app. Once set up, the doorbell can send custom notifications like “Mom at Front Door.” Users can edit or delete entries, merge duplicates, and tailor alerts for each person.
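The mechanics described above—a capped library of labeled faces that turns a detection into a named alert or falls back to a generic one—can be sketched in Python. This is purely illustrative: the function names, the similarity threshold, and the use of cosine similarity over embeddings are assumptions, not Ring’s actual implementation or API.

```python
import math

MAX_ENTRIES = 50  # Ring's stated cap on saved people


def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def alert_for(embedding, library, threshold=0.8):
    """Return a named notification if the face matches a saved entry,
    otherwise fall back to a generic motion alert."""
    if len(library) > MAX_ENTRIES:
        raise ValueError("library exceeds the 50-entry cap")
    best_name, best_score = None, threshold
    for name, saved in library.items():
        score = cosine_similarity(embedding, saved)
        if score > best_score:
            best_name, best_score = name, score
    return f"{best_name} at Front Door" if best_name else "Motion detected"


# Toy 3-dimensional embeddings stand in for real face vectors.
library = {"Mom": [0.9, 0.1, 0.2], "Courier": [0.1, 0.95, 0.3]}
print(alert_for([0.88, 0.12, 0.22], library))  # → Mom at Front Door
print(alert_for([0.0, 0.0, 1.0], library))     # → Motion detected
```

The fallback branch matters: anyone below the threshold stays an anonymous motion event, which is also where the article’s 30-day deletion window for unknown faces would apply.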
The company says that face data is encrypted and that unknown faces are automatically deleted after 30 days. It also states that biometric processing takes place in the cloud and that information from Familiar Faces is not used to train larger AI models elsewhere. Still, these assurances will likely be taken with a grain of salt, given the sensitivity of biometric data and Ring’s track record on data sharing.
Privacy concerns and legal limits around face recognition
The Electronic Frontier Foundation has warned that this kind of ubiquitous monitoring ensnares passersby who never agreed to be biometrically analyzed simply for walking past a door. An EFF lawyer urged state regulators to test the strength of biometric privacy laws by investigating the rollout. According to the group, the feature is not rolling out in places with strict local restrictions, including Illinois and Texas, as well as the city of Portland, Ore., which sharply limits private use of facial recognition.
Skeptics also cite the track record of enforcement actions and security lapses around consumer surveillance tools. The United States Federal Trade Commission fined Ring $5.8 million in 2023 after finding that employees and contractors had “broad and unrestricted” access to customer videos in previous years. Other research revealed that Ring’s Neighbors app had, at one time, exposed precise locations associated with users’ posts, showing that metadata can be privacy-invasive even without video content.
Law enforcement ties and third-party access to Ring data
Ring’s longstanding partnerships with police and public safety agencies have been a sore point. The company has traditionally built tools that help departments request footage from residents, though it has modified those programs after civil liberties scrutiny. Amazon has also more recently teamed up with companies including Flock Safety, which sells networks of license plate readers to local police departments and federal immigration enforcement agencies. Privacy advocates warn that as these systems begin to interconnect, the boundaries between their data silos could blur in practice.
Amazon says it is technically not able to run a search person by person across all of its customers’ devices and pledges it won’t do so with Familiar Faces data. Skeptics argue that features built to scan home cameras for lost pets show cross-device search is possible in other contexts. The real issue is not so much about today’s settings as the path dependency that comes with having a biometric layer in place.
Accuracy, bias, and the risk we face in the real world
Facial recognition accuracy varies widely with lighting, camera angle, and the demographics of the people being scanned. A sweeping study by the National Institute of Standards and Technology found vast disparities in how accurate algorithms were, with higher false match rates for some populations. While Ring’s system is not designed for law enforcement purposes, misidentifications at the door can have real-world effects: tripping alerts that shouldn’t fire, provoking confrontations, or feeding skewed assumptions in neighborhood watch scenarios.
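A quick base-rate calculation shows why even small false match rates add up at the door. Every number below is an illustrative assumption—the daily face count, the 50-entry library, and the 1-in-10,000 rate are not measurements of Ring’s system or NIST’s published figures.

```python
def expected_false_matches(false_match_rate, comparisons):
    """Expected number of strangers wrongly matched to a saved face."""
    return false_match_rate * comparisons


# Suppose a busy doorbell sees 40 unknown faces a day, each compared
# against a full 50-entry library: 2,000 comparisons per day.
daily_comparisons = 40 * 50

# At a hypothetical 1-in-10,000 false match rate, that is about
# 0.2 false matches a day, or roughly 6 a month.
per_month = expected_false_matches(1e-4, daily_comparisons) * 30
print(round(per_month, 1))  # → 6.0
```

Six wrongly named visitors a month is the kind of quiet error rate that rarely surfaces in marketing but shapes how much trust a named alert deserves.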
There are cautionary tales. Several wrongful arrests in American cities have been attributed to police use of faulty face-matching systems, showing how even low error rates can have high-stakes consequences when downstream decisions are built on top of automated matches. And home devices that normalize biometric sensing—especially in places with dense camera coverage—may weave an ambient surveillance fabric that is difficult to pull apart.
How Ring’s approach compares to Google Nest and Apple
Ring is not the first to offer such a feature. Google’s Nest includes Familiar Face Detection in a paid subscription, and Apple’s HomeKit Secure Video can identify faces with on-device processing through a home hub. The technical and policy distinctions matter: on-device processing limits exposure if a cloud system is breached, while cloud computation centralizes risk but allows models to improve quickly. Ring’s decision to process in the cloud is sure to draw scrutiny from regulators and privacy advocates.
What Ring owners can do today to manage privacy
For homes that do opt in, experts advise the following steps to reduce risk and unnecessary capture:
- Use nonspecific labels like “Neighbor 1” instead of real names.
- Keep any face library to only essential entries.
- Check the retention settings often.
- Turn off alerts for habitual household comings and goings.
- Audit sharing preferences (such as whether clips end up on community feeds or are available to other members of the household).
The larger question is societal: What should be the limit for biometric tracking technologies on private property that scan, or aim to scan, public space in perpetuity? With state attorneys general and city councils weighing in, and companies scrambling to clarify their policies under scrutiny, the path Ring blazes with this rollout will likely shape the next wave of consumer AI at the edge: what it is capable of, who benefits from it, and how much privacy people end up surrendering by default.