A new app called Neon is offering real money for something most of us do without thinking: talking on the phone. The pitch is straightforward: make calls through its dialer, let the app record the conversation, and collect up to $30 per day as the recordings are sold, first to technology companies that need audio to improve their AI models, and possibly, down the line, to other buyers such as anti-money-laundering systems. The question is whether the payout justifies the privacy risk.
How Neon Works and What You Can Earn Using It
Neon works on iOS and Android and records only calls made within its app. You earn 30 cents a minute when both sides are using Neon (the app can then capture both sides of the conversation) and 15 cents a minute when only your side is on Neon (only your end is recorded). There’s a daily cap of $30, plus a $30 referral bonus for each new user who signs up. You can redeem rewards starting at just $0.10, and payouts generally arrive within three business days.

The math matters. Reaching the cap takes 100 minutes of Neon-to-Neon calls or 200 minutes of calls to people who aren’t on Neon, so hitting it daily means roughly 1.7 to 3.3 hours on the phone every day. That works out to an effective rate of between $9 and $18 an hour, depending on your call mix. Even for dedicated users, several hours of voice calls a day is a lot, so most people won’t earn the headline numbers.
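For the arithmetic-minded, here is a quick back-of-the-envelope sketch of those numbers in Python, using the per-minute rates and daily cap quoted above (the rates come from Neon’s pitch; the helper function and its name are purely illustrative):

```python
# Back-of-the-envelope math for Neon's quoted rates.
NEON_TO_NEON_RATE = 0.30  # dollars/minute when both parties are on Neon
ONE_SIDED_RATE = 0.15     # dollars/minute when only you are on Neon
DAILY_CAP = 30.00         # maximum payout per day, in dollars

def daily_summary(neon_to_neon_share: float) -> tuple[float, float]:
    """Return (minutes needed to hit the daily cap, effective $/hour)
    for a given fraction of talk time spent on Neon-to-Neon calls."""
    per_minute = (neon_to_neon_share * NEON_TO_NEON_RATE
                  + (1 - neon_to_neon_share) * ONE_SIDED_RATE)
    minutes_to_cap = DAILY_CAP / per_minute
    return minutes_to_cap, per_minute * 60

for share in (1.0, 0.5, 0.0):
    minutes, hourly = daily_summary(share)
    print(f"{share:.0%} Neon-to-Neon: {minutes:.0f} min/day to cap, ${hourly:.2f}/hr")
```

A 50/50 mix, for example, works out to about 133 minutes a day at an effective $13.50 an hour, which is why the call mix dominates what you actually earn.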
Why AI Wants Your Calls to Train Speech Algorithms
AI researchers need vast amounts of real-world conversation to hone speech recognition, speaker diarization, and dialogue models. Polished studio clips and scripted prompts do not capture the hesitations, interruptions, background noise, and accents that make everyday speech difficult for machines. That is why conversational corpora like Switchboard and Fisher have been staples for decades, and why fresh, diverse audio is so valuable today.
Annotating natural speech is expensive, and data access is consistently cited by industry researchers and program managers as a key bottleneck to performance gains, a point surveys like the Stanford AI Index have noted. By paying consumers for raw calls and reselling the bundled audio to vetted buyers, Neon functions as a data broker for voice: buying in bulk at cents per minute and likely selling at a multiple of that, particularly once transcripts or metadata are added downstream.
Privacy Promises and the Fine Print You Should Know
Neon says it anonymizes each recording, stripping out names, numbers, and addresses, encrypts it, and sells it only to vetted AI companies. That is a start, but it is not a guarantee. Voice is a biometric: the National Institute of Standards and Technology treats speaker recognition as a biometric task, and most people can identify a familiar speaker by voice alone. Research has also shown that “anonymous” datasets can often be reidentified from only a few metadata points, particularly when combined with other information.
There is also the model problem: once a voice has been used to train a system, a deletion request may remove the raw file but not the model weights learned from it. Regulators have pointed to this tension between data rights and machine learning, and the Federal Trade Commission has cautioned that anonymization claims are frequently overstated. Neon’s controls help, but no vendor can prevent downstream misuse if the data is leaked, resold, or combined with other sources.

Legal and Ethical Considerations for Recording Calls
Rules for recording calls are uneven. About a dozen U.S. states require all-party consent, meaning the other person has to agree, not just you. Elsewhere, laws such as Europe’s GDPR require that personal data be processed for a specific, legitimate purpose, typically with informed consent. If the person on the line is a child, a patient, or discussing their finances or employment, the stakes rise further.
Think about context, too. Employers can ban recording work calls. Conversations about health, legal, or financial matters may reveal protected information. And as voice-cloning scams rise, the FTC and consumer advocates have warned against sharing high-fidelity samples of your voice online. Even if Neon’s buyers are vetted, the recordings still exist in a wider risk landscape.
If You Still Want to Try It, Minimize Your Exposure
Use a separate number and save Neon for low-stakes, non-sensitive conversations. Get verbal permission at the start of each call, especially in all-party consent states, and never share account numbers, addresses, or medical details. Favor Neon-to-Neon calls with friends who have explicitly signed up, both because they have consented and because those calls pay the higher rate.
Read the privacy policy carefully: retention periods, who can purchase datasets, whether transcripts are generated, and how deletion or withdrawal of consent is handled. Set a personal earnings limit and revisit the decision after a few weeks; the trade-off may no longer seem worth it. If you do stop, request deletion of your data and verify what can (and cannot) be erased once models have already been trained on it.
The Bottom Line on Trading Voice Data for Cash
For most people, the time-and-privacy tradeoff will not pencil out. Earning up to $30 a day requires hours of talking and a tolerance for your voice circulating through AI training pipelines you do not control. If you understand all that, share the bare minimum, and stick to tame conversations that everyone on the line has agreed to record, Neon can be a small side income. Just remember what is really for sale: not minutes of your time but a piece of your identity, your voice, and once it is out there you cannot take it back.
