OpenAI is quietly turning ChatGPT into something more social, and your phone number may be along for the ride—whether you asked for it or not. A new “find friends” feature uses contact syncing to match people by phone number, meaning your friends could hand over your digits simply by uploading their address books. The move spotlights a familiar privacy dilemma: convenience for some, exposure for others.
What Changed and Why It Matters for Your Privacy
OpenAI updated its privacy policy to note that it may process your phone number if someone who has it stored chooses to sync their contacts. The company says the feature is optional, is designed to help users discover friends across OpenAI products, and stores numbers only in hashed (one-way encoded) form. Still, the practical effect is clear: your information can enter OpenAI’s systems via other people’s choices.

If a match is found, ChatGPT may suggest connections and, if you follow someone, notify them with an option to follow back. The company also says it will routinely recheck contacts to catch new signups, a growth mechanic common in social apps that raises ongoing data exposure questions for non-users and holdouts alike.
How the Matching Actually Works Behind the Scenes
Contact discovery typically relies on one-way hashing, where a phone number is transformed into a fixed-length string so platforms can compare values without storing the raw digits. Security experts note this is not the same as anonymization: because phone numbers are drawn from a small, enumerable space, anyone who knows the hashing scheme can hash every plausible number and check for matches. Regulators in Europe have long treated hashed identifiers as personal data when they can be linked back, a stance echoed in guidance from data protection authorities and research cited by NIST.
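Here is a minimal sketch of how this kind of matching could work, assuming unsalted SHA-256 over normalized digits. OpenAI has not published its exact scheme, so the `hash_number` helper and the phone numbers below are purely illustrative:

```python
import hashlib

def hash_number(phone: str) -> str:
    """Strip formatting and return the SHA-256 hex digest of the digits.

    Illustrative only: real deployments may salt, truncate, or use a
    different construction entirely.
    """
    digits = "".join(ch for ch in phone if ch.isdigit())
    return hashlib.sha256(digits.encode()).hexdigest()

# The platform compares hashes from an uploaded address book against
# hashes of registered users' numbers; the raw digits are never compared.
registered = {hash_number("15550104477"): "user_42"}
uploaded = [hash_number("+1 (555) 010-4477")]
print([registered[h] for h in uploaded if h in registered])  # ['user_42']

# Why hashing is not anonymization: the input space is small enough to
# enumerate. Here the "unknown" number behind a hash is recovered by
# brute force over a tiny candidate range.
target = hash_number("15550104477")
for candidate in range(15_550_100_000, 15_550_110_000):
    if hash_number(str(candidate)) == target:
        print("recovered:", candidate)  # recovered: 15550104477
        break
```

That brute-force loop is the crux of the regulatory view: hashed numbers are pseudonymous, not anonymous. Salting raises the cost of enumeration, but any party holding the salt can still run the same check.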
OpenAI says it will not store names or email addresses from your address book, only numbers, and that you can revoke permissions in device settings. That’s good hygiene, but it doesn’t change the core tradeoff: people in your contacts—and people who have you in theirs—help OpenAI map who knows whom.
A Familiar Playbook with New Stakes for ChatGPT
WhatsApp, Signal, and Telegram use similar contact uploads for discovery, while major social networks have long encouraged address book matching to accelerate growth. The difference here is context. ChatGPT began life as a chatbot, not a network, and OpenAI is layering on social features like follows and group chats (now supporting up to 20 participants) that become far stickier once a friend graph exists.

The Electronic Frontier Foundation has warned for years that contact uploads can create “shadow profiles” for people who never opted in. The Federal Trade Commission has also scrutinized apps that quietly ingest address books; an early case against the Path app more than a decade ago underscored how sensitive regulators consider contact lists. OpenAI will need clear consent flows, retention limits, and deletion controls to avoid repeating old mistakes in a new AI era.
Possible Risks from Quiet Connections and Misidentification
The immediate worry is misidentification or unwanted contact. Recycled phone numbers and shared family lines can lead to mistaken matches. People escaping harassment or keeping separate identities—for work, communities, or safety—may not want their number attached to a public profile or follow graph, even in limited form.
There’s also the downstream question of targeting. OpenAI this week began rolling out ads inside ChatGPT for free users, a shift that raises inevitable questions about how social signals might shape recommendations or promotions. Rival Anthropic publicly criticized the plan in a Super Bowl spot, and privacy advocates will be watching closely for any blending of contact-derived insights with ad delivery.
Your Options to Limit Exposure and Contact Sharing
- Decline contact syncing: If prompted, do not grant ChatGPT access to your address book. On iOS and Android, you can also revoke contacts permissions in system settings at any time.
- Minimize what’s stored: Consider removing your phone number from your OpenAI account if it’s not required for the way you use the service, and disable discoverability features if offered.
- Exercise data rights: Under GDPR and similar laws, you can submit access or deletion requests to understand whether your number is being processed and ask to have it removed from discovery. Consumer groups and privacy regulators recommend documenting these requests.
- Talk to your circle: The awkward but effective step is asking close contacts not to upload address books. Shared norms curb unintended sharing more than settings alone.
The Bigger Picture for ChatGPT’s Social Features
Contact syncing hints at a broader strategy: turn a blockbuster AI tool into a networked platform where discovery, following, and group experiences keep people engaged. Traffic metrics from firms like Similarweb have already placed ChatGPT among the most-visited sites worldwide; tying that scale to a social graph could reshape how AI tools spread—along with how personal data circulates.
For now, the feature feels incremental but consequential. It’s optional, it’s framed as helpful, and it’s built on a technique many apps use. The difference is trust and transparency: users will want crisp explanations about what’s stored, how long it’s kept, and how to opt out entirely. In an AI gold rush, address books are currency—and yours might be more valuable than you think.