Neon, a fast-growing call-recording app recently embroiled in controversy after unsuspecting users’ phone numbers and conversations were exposed online, has shut down amid the fallout from its lax security practices.
The shutdown followed reports that private data could be accessed by parties other than the app itself, a claim independent researchers had recently substantiated.

What Went Wrong With Neon’s Security and Data Leaks
Security findings shared with journalists revealed that Neon’s mobile app transmitted data in a way that let any onlooker with rudimentary network-analysis tools see links to call recordings and text transcripts. In some cases, those links worked without any authentication at all, a textbook failure of access control. Experts would recognize the pattern: exposed object storage combined with missing authorization checks, what OWASP calls “Broken Object Level Authorization.”
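Neon’s source code has not been published, so the following is only an illustration of the pattern researchers described, with hypothetical endpoint and host names: an API that hands out permanent, unauthenticated links keyed by an ID anyone can observe or guess.

```python
# Illustrative only: a hypothetical endpoint reproducing the reported failure
# pattern, not Neon's actual code. All names here are made up.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/recordings/<recording_id>")
def get_recording(recording_id: str):
    # No authentication and no ownership check: anyone who observes or
    # guesses a recording ID can fetch the links below.
    return jsonify({
        # Permanent public object URLs: once leaked, they work indefinitely.
        "audio_url": f"https://storage.example.com/recordings/{recording_id}.m4a",
        "transcript_url": f"https://storage.example.com/transcripts/{recording_id}.txt",
    })
```

Two things make this pattern dangerous at once: the endpoint requires no credentials, and the URLs it returns are themselves unauthenticated.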
The danger wasn’t limited to individual files, however. Investigators also noticed that server responses leaked information about other users’ recent calls, including phone numbers dialed, call durations in seconds, timestamps, and per-call earnings. Even without audio, that kind of metadata is extremely sensitive: a map of who spoke to whom, and when, is exactly the sort of raw material fraudsters and data brokers value most.
In a message to users, the company’s founder, Alex Kiam, said the app was going offline for “extra layers of security” and stressed that privacy was a priority. Reports noted that the notice did not say exactly what had gone wrong. The app’s listing remained on the iOS App Store, but the service itself was switched off, so sign-ups and logins failed.
How Big Was the Exposure and Who Was Affected
Because Neon’s growth had been so swift, the blast radius was potentially broad. The app analytics company Appfigures estimated the app was downloaded about 75,000 times in a single day at the peak of its surge, a measure of how fast it shot up the charts on its promise to pay people for recording phone calls. Scaling that fast without mature security controls is an old story: architectures built for growth tend to treat authorization, key management, and logging as things to bolt on later, once the growth push allows.
The mechanics of the vulnerability illustrate a common failure mode in young apps: serving direct file links, or otherwise predictable URLs, to media stored in the cloud without short-lived, signed access tokens. If those links, or indexes of them, surface in verbose API responses, anyone able to observe the traffic can pivot to downloading other users’ content. Well-architected systems isolate tenants, limit metadata exposure, and apply server-side authorization before each object is retrieved.
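By contrast, here is a minimal sketch of the well-architected flow that paragraph describes, assuming an S3-style object store accessed via boto3. The bucket name and the two helper functions are placeholders, not Neon’s real infrastructure or its actual fix.

```python
# A minimal sketch of the safer pattern: authorize every request server-side,
# then return a short-lived signed URL instead of a permanent public link.
import boto3
from flask import Flask, abort, jsonify

app = Flask(__name__)
s3 = boto3.client("s3")
BUCKET = "example-recordings-bucket"  # hypothetical name, not a real bucket

def caller_user_id() -> str:
    """Placeholder for real session/token validation (e.g. a verified JWT)."""
    raise NotImplementedError

def recording_owner(recording_id: str):
    """Placeholder for a database lookup of the recording's owner."""
    raise NotImplementedError

@app.route("/recordings/<recording_id>")
def get_recording(recording_id: str):
    user_id = caller_user_id()
    owner = recording_owner(recording_id)
    if owner is None or owner != user_id:
        # Object-level authorization: unauthorized callers get a 404, so the
        # response doesn't even confirm that the recording exists.
        abort(404)

    # Short-lived signed URL: even if this link leaks, it expires in 5 minutes.
    url = s3.generate_presigned_url(
        ClientMethod="get_object",
        Params={"Bucket": BUCKET, "Key": f"recordings/{recording_id}.m4a"},
        ExpiresIn=300,
    )
    return jsonify({"audio_url": url})
```

The key design choice is that the client never learns a durable address for the file; every download passes through an ownership check first.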
User Impact and the Potential Legal Implications
Recordings and transcripts of calls are among the most sensitive forms of personal data, not least because they can inadvertently capture third parties who never consented to being part of a data-for-pay program. Privacy advocates, including the Electronic Frontier Foundation, have long cautioned that call-recording tools can run afoul of consent laws. Some U.S. states are so-called all-party-consent jurisdictions, where everyone on a call must consent to its being recorded; breaking those rules can carry civil and criminal penalties.

Beyond consent, regulators increasingly treat shoddy security as an unfair trade practice. The Federal Trade Commission has brought cases under Section 5 over lax security of voice and messaging data, and state attorneys general wield broad powers under consumer-protection statutes. For a service handling phone numbers, transcripts, and behavioral metadata, the baseline expectation is encryption in transit and at rest, strong access controls, and minimal data retained for limited periods.
What Neon Needs to Do Now to Regain User Trust
Neon will need more than a patch to come back credibly. Security engineers cite a short list of nonnegotiables:
- Apply per-request authorization
- Serve media with short-lived, signed URLs
- Partition PII from content stores (see the sketch after this list)
- Establish least-privilege service roles paired with audited key rotation
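On the third item, here is a minimal sketch of what partitioning PII from the content store can look like, using standard-library SQLite and made-up table and field names: object keys in the media bucket are random and meaningless, while the mapping to people and phone numbers lives in a separate, access-controlled database.

```python
# A sketch under assumed names: recordings live in a bucket under opaque keys,
# while PII (who owns the recording, who called whom) stays in a separate DB.
import sqlite3
import uuid

db = sqlite3.connect(":memory:")
db.execute(
    """CREATE TABLE recordings (
           object_key TEXT PRIMARY KEY,   -- opaque key used in the bucket
           owner_id   TEXT NOT NULL,      -- PII lives here, not in the bucket
           caller     TEXT NOT NULL,
           callee     TEXT NOT NULL
       )"""
)

def store_recording(owner_id: str, caller: str, callee: str) -> str:
    # UUID4 keys are not guessable and embed no phone numbers or user IDs,
    # so even a leaked bucket listing reveals nothing about who called whom.
    object_key = f"recordings/{uuid.uuid4()}.m4a"
    db.execute(
        "INSERT INTO recordings VALUES (?, ?, ?, ?)",
        (object_key, owner_id, caller, callee),
    )
    return object_key  # upload the audio bytes to the bucket under this key
```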
Independent penetration testing, a public postmortem, and a bug bounty program run through a well-regarded platform would be further evidence of seriousness.
Transparency will be critical. Users must be told what was exposed, how long the data was accessible, and whether access logs show signs of malicious use. Clear procedures for revoking tokens, deleting recordings, and notifying affected contacts can help limit downstream damage. For a product that quite literally monetizes talk, granular consent prompts and in-call notice of recording should be requirements, not options.
A Warning for Data-for-Pay Apps and Startups
Neon’s trajectory is a cautionary tale for startups that turn human conversations into data sets. The more sensitive the data, the more valuable a target it becomes for attackers, and the higher the compliance bar rises. App stores have been tightening rules around call recording and background audio capture for years, and enforcement only ramps up in the aftermath of high-profile breaches.
Virality is both a blessing and an integrity test; in Neon’s case, it exposed flawed security assumptions. Whether Neon comes back will depend on whether it can rebuild its stack, and its credibility, around the idea that every recording and every phone number is sensitive by default.
