Amazon’s Ring is attracting scrutiny once again after a coalition of privacy and civil rights organizations drew attention to a new facial recognition feature from the consumer home surveillance company, one with implications for free speech, discrimination, and government surveillance. The Electronic Frontier Foundation (EFF) says the feature can scan the faces of passersby and visitors without their knowledge—a legal flashpoint in U.S. states that require clear consent before biometric identifiers can be collected.
What Familiar Faces Does — and Why It’s Different
Familiar Faces lets Ring users build profiles so the system can recognize known people and cut down on repetitive alerts. The feature relies on facial recognition: cameras analyze face geometry each time someone comes into view, whether that’s a family member, a guest, a delivery driver, or just a neighbor who happens to walk past the lens. Owners must opt in to use the feature, but the people being scanned typically cannot opt out, an asymmetric relationship that sits at the heart of the backlash.
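Ring has not published implementation details, so as a loose illustration of why bystanders get scanned, here is a minimal sketch of how embedding-based recognition generally works. Every detected face is converted to a vector and compared against enrolled profiles, so an unenrolled passerby is still measured even though they never match. All names, the threshold, and the structure below are invented for this sketch.

```python
from __future__ import annotations

import numpy as np

# Hypothetical sketch of embedding-based face recognition. Ring's actual
# pipeline is not public; names and the threshold are illustrative only.

MATCH_THRESHOLD = 0.6  # assumed cosine-similarity cutoff


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def label_face(face_embedding: np.ndarray,
               enrolled_profiles: dict[str, np.ndarray]) -> str | None:
    """Compare one detected face against every enrolled profile.

    Note the asymmetry the article describes: the embedding is computed
    for *every* face the camera detects, enrolled or not, before the
    system knows whether the person matches anyone. The biometric
    measurement happens regardless of the outcome.
    """
    best_name, best_score = None, MATCH_THRESHOLD
    for name, profile in enrolled_profiles.items():
        score = cosine_similarity(face_embedding, profile)
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # None means "unfamiliar", but the face was still scanned
```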
Where State Law Draws the Line on Biometric Face Scans
Biometric-privacy laws in several states impose strict consent requirements. Illinois’ Biometric Information Privacy Act (BIPA) requires informed, written consent before a person’s biometric identifiers can be collected and lets individuals sue for violations, which is partly why BIPA has spawned major litigation, including Facebook’s $650 million face-templates settlement. Texas’ Capture or Use of Biometric Identifier Act likewise prohibits facial scanning without consent, and Washington State’s biometric statute mandates notice and consent for the commercial use of biometric identifiers.
Beyond the biometric-specific statutes, comprehensive state privacy laws in Colorado, Connecticut, and Virginia all classify biometric data as “sensitive,” generally requiring opt-in consent before it is processed. California’s privacy law imposes additional requirements on the handling and retention of sensitive information. In this climate, running facial recognition on people who never agreed to it, particularly when the processing happens in the cloud, can create compliance obligations for the company itself, not just the owner of the device.
Amazon has told the EFF that Familiar Faces will not be available in Illinois or Texas, which critics read as an implicit admission that it might not pass legal muster there. Potential civil liability is especially high in Illinois because of BIPA’s private right of action and per-violation statutory damages, which can add up fast in class actions.
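Amazon has not disclosed how it enforces this geographic restriction. One plausible mechanism is a simple jurisdiction gate on the feature flag, sketched below with invented names; only the two excluded state codes come from what Amazon told the EFF.

```python
# Hypothetical jurisdiction gate; Amazon has not disclosed how it
# restricts availability. The two state codes reflect what Amazon
# told the EFF; everything else is invented for illustration.

RESTRICTED_STATES = {"IL", "TX"}  # BIPA and CUBI jurisdictions


def familiar_faces_available(device_state: str, owner_opted_in: bool) -> bool:
    """Feature stays off unless the owner opted in and the device sits
    outside the states Amazon has chosen to exclude."""
    if device_state.upper() in RESTRICTED_STATES:
        return False
    return owner_opted_in
```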
Amazon’s Position and How the Tech Works
Amazon says its customers are responsible for using Ring products consistent with local law and that the app will remind users of consent obligations. The company also notes that Familiar Faces processing happens in the cloud, where Ring says it uses encryption, access controls, and database isolation to protect the data. Users can delete face profiles at any time, which Ring says also removes the corresponding biometric data.
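Ring has not published its deletion mechanics, but its claim implies a cascading delete: removing a profile must also purge every stored biometric artifact derived from it. The class and fields below are hypothetical, a sketch of what that would require.

```python
# Hypothetical cascading delete, illustrating what "deleting a face
# profile also removes the corresponding biometric data" would require.
# The store and its fields are invented; Ring's schema is not public.

class FaceProfileStore:
    def __init__(self) -> None:
        self.profiles: dict[str, dict] = {}    # profile_id -> metadata
        self.embeddings: dict[str, list] = {}  # profile_id -> stored vectors

    def delete_profile(self, profile_id: str) -> None:
        """Remove the profile and every biometric artifact tied to it.

        If embeddings or cached match results survived elsewhere, the
        biometric data would outlive the user-visible profile, which is
        exactly the retention gap state statutes target.
        """
        self.profiles.pop(profile_id, None)
        self.embeddings.pop(profile_id, None)
```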

Privacy advocates counter that cloud processing strengthens the argument that Amazon itself is collecting and analyzing biometric data, making the company directly subject to state consent rules and retention limits. They also argue that in-app warnings do nothing for bystanders, who rarely get notice, let alone an opportunity to give or withhold consent, before their faces are scanned on private property and in shared spaces.
Law Enforcement Ties Add Another Layer of Concern
Ring’s longtime partnerships with local police departments have long raised civil liberties concerns. The company has changed how agencies request video, but critics say wider deployment of facial recognition at front doors could normalize surveillance well beyond a single homeowner’s property line. For renters in multi-unit buildings or workers such as couriers, that can mean repeated scans throughout a neighborhood with no practical way to opt out.
Legal Risk Isn’t Theoretical for Ring’s Familiar Faces Feature
Some of the biggest privacy payouts in United States history have come through BIPA enforcement, and courts have made clear that improper collection alone can violate the statute. Federal regulators have also punished Ring for earlier privacy lapses: the Federal Trade Commission ordered refunds and stricter oversight after alleging that the company’s employees and contractors improperly accessed customers’ footage. Against that backdrop, rolling out new face-scanning features without robust, provable consent processes risks attracting attorney general investigations, class actions, or both.
What Ring Owners Need to Know Before Using Familiar Faces
Homeowners who turn on Familiar Faces should check their state laws, post clear notice, and get written consent from anyone who is regularly captured: family members, housemates, and workers who visit often. Limiting the number of face profiles, shortening retention periods, and keeping camera angles trained on their own property can further reduce risk. For cameras covering shared entrances or public streets, disabling the feature is the most legally conservative choice.
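For owners who do enable the feature, a defensible posture is to treat documented written consent as a precondition for enrolling anyone. The sketch below, with invented names, shows that record-keeping discipline; nothing like it exists in the Ring app today.

```python
from datetime import date

# Hypothetical consent ledger for a household enabling Familiar Faces.
# Nothing like this exists in the Ring app; it illustrates the kind of
# record-keeping that statutes such as BIPA contemplate.

consent_records: dict[str, date] = {}  # person -> date written consent signed


def record_written_consent(person: str, signed_on: date) -> None:
    consent_records[person] = signed_on


def may_enroll(person: str) -> bool:
    """Only enroll a face profile for someone with documented consent."""
    return person in consent_records


record_written_consent("weekday dog walker", date(2025, 1, 15))
assert may_enroll("weekday dog walker")
assert not may_enroll("delivery driver")  # no signed consent, no profile
```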
The bigger question is whether consumer-grade facial recognition can operate without tripping notice-and-consent rules for everyone it touches. With millions of Ring devices already in use, the answer will determine how far AI-powered monitoring can go in the name of protection before it becomes an unlawful invasion of privacy.