A new privacy analysis of top fitness apps suggests your workout data may be pulling more weight than you think. Security firm Surfshark reviewed Apple App Store privacy labels for 16 leading fitness apps and found that Fitbit collects the widest range of data types, while Strava leads in using data for purposes beyond core app functionality.
What the study says about fitness app data collection
Surfshark examined the 35 data types and 16 categories defined in Apple's App Store privacy disclosures, tallying what each app collects, links, tracks, and potentially shares. According to the report, Fitbit gathers 24 of the 35 possible data types, nearly double the average across the fitness apps studied, though some of that intake supports features users explicitly want, like activity tracking, crash analytics, and account security.
Strava stands out for a different reason: it reportedly "potentially exploits" 21 data types beyond basic functionality, meaning more of its information can be used for analytics, personalization, or marketing. That does not imply unlawful behavior, but it does signal a broader footprint for profiling, ad targeting, and product tuning.
The report also flags location practices. Four apps collect precise location linked to the user, including Runna and Strava. Six collect coarse location; Surfshark says Nike Training Club and Peloton share some of this information with third parties. In the most sensitive category, four apps appear to collect data such as race, pregnancy details, or political opinions, with Nike Training Club cited among those that share sensitive categories with third parties.
Not all apps are equally invasive. Centr was identified as the least data-hungry, collecting only user ID, product interactions, and crash data—though Surfshark notes those are still tracked. PUSH is singled out as the most privacy-friendly, collecting data without linking it to a person and limiting use to essential app functions.
One important caveat: Apple’s privacy labels are developer-reported. They provide a comparable framework, but they are not audits. The findings reflect what developers disclose and how Surfshark categorized those disclosures.
Why the findings matter for privacy and fitness app users
Fitness apps blend health-adjacent signals—workout times, recovery metrics, and locations—with identifiers like device IDs and ad IDs. When combined, these can build remarkably detailed profiles of routines and habits. Past controversies show the stakes: Strava’s 2018 global heat map unintentionally revealed sensitive activity patterns near military sites, underscoring how aggregated fitness traces can have real-world consequences.
Regulators have been paying attention. The Federal Trade Commission has taken action against digital health and wellness services for sharing sensitive user data with advertisers without proper consent, including high-profile cases involving GoodRx and the Flo app. In Europe, GDPR sets strict rules for processing health-related and sensitive data, and enforcement actions have ramped up for transparency and purpose limitation violations.
For consumers, the takeaway is not that fitness apps are off-limits, but that data collection differences meaningfully affect privacy risk. Apps that link and track more data types across more purposes create larger exposure if data is misused, breached, or repurposed.
Context for Fitbit and Strava data collection and use
Fitbit, now owned by Google, discloses extensive collection to support features spanning sleep analysis, heart rate zones, and social challenges. Google has made public commitments in some jurisdictions to silo certain Fitbit health data from Google advertising systems, but the breadth of collection still raises important questions about long-term retention and cross-product use.
Strava’s platform is inherently social and location-centric, with segments, clubs, and public leaderboards. The company offers tools like privacy zones, activity visibility controls, and options to exclude activities from community heat maps, yet Surfshark’s findings suggest the app’s data may be more widely used for analytics and personalization than many casual athletes realize.
How to reduce your digital footprint when using fitness apps
Check app privacy labels before installing and revisit them periodically. On both iOS and Android, disable precise location unless a feature truly needs it, and set location access to "While Using the App" or off. If you run or ride from home, set privacy zones or manual start points to mask your exact address.
Review your settings for ad personalization, analytics sharing, and third-party integrations, and opt out where possible. Reset your device's advertising ID, and on iOS choose "Ask App Not to Track" when apps request permission, or disable tracking requests entirely. Under GDPR and CCPA, you can request access to or deletion of your data and opt out of its sale or sharing; use those rights. Consider privacy-forward apps, such as those highlighted by Surfshark, when their features meet your needs.
Bottom line on fitness app data collection and privacy
The Surfshark report suggests Fitbit casts the widest collection net, while Strava more often channels data into non-essential uses. For anyone lacing up in the new year, it’s worth remembering that your route, routines, and metadata may be going farther than your miles—and a few privacy settings can help you regain control.