FindArticles

New York will require companies to report personalized pricing

By Gregory Zuckerman
Business | Last updated: November 29, 2025 7:06 pm

New York is stepping into the ring with algorithmic price discrimination: businesses will have to warn their customers when they are using personal data to charge them more.

The directive, buried in the state’s recent budget, requires companies to disclose when a price is tailored to a customer based on their information, with the notice: “This price was set by an algorithm based on your personal data.”

Table of Contents
  • What the new law requires from businesses in New York
  • How companies are responding to New York’s pricing rule
  • Personalized pricing versus dynamic pricing explained
  • Why transparency in algorithmic pricing practices matters
  • Legal and policy ripples to watch after New York’s law

What the new law requires from businesses in New York

Businesses that use personal data (including obscure signals such as device identifiers, purchase history, or browsing behavior) to affect the price a given shopper sees will have to begin prominently alerting people, so they know when prices are based on their details.

If a company does use such methods, it must display the explicit disclosure so customers know those numbers didn’t just come out of thin air.

Details like enforcement and penalties will be defined as regulators issue guidance, but the message from Albany is unambiguous: if algorithms are setting personalized prices, consumers should get plain-English transparency before they part with their money. The law stops short of banning dynamic pricing more broadly, opting instead to target the line where personal data comes into play.

How companies are responding to New York’s pricing rule

It’s not clear how often retailers actually customize their prices. Some companies that have started posting the disclosure in response to the law say they do not use personal data to set prices, according to reports by The New York Times. Uber, for instance, has told New Yorkers seeing the notice that its pricing reflects riders’ locations and demand, not their personal details.

Industry groups say the law is unclear and could stifle ordinary pricing practices. The National Retail Federation has sued to stop the rule, but a federal judge declined to block it for now, leaving the disclosure requirement in place while the case continues.

Personalized pricing versus dynamic pricing explained

It’s important to distinguish between two practices that are often confused. Dynamic pricing responds to supply and demand — think surge pricing for rides or higher hotel rates on holiday weekends. Personalized pricing, on the other hand, operates at the individual level, using a specific shopper’s data to model inferred willingness to pay or likelihood of conversion and tailor the price accordingly.
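The distinction can be sketched in code. In this hypothetical toy model (not any company’s actual logic), dynamic pricing keys only off market state that is the same for every buyer, while personalized pricing keys off attributes of one individual:

```python
# Toy illustration of the distinction; entirely hypothetical logic.

def dynamic_price(base: float, demand_ratio: float) -> float:
    """Same inputs for every shopper: price moves with market-wide demand."""
    surge = max(1.0, demand_ratio)  # e.g. 1.8 on a holiday weekend
    return round(base * surge, 2)

def personalized_price(base: float, user: dict) -> float:
    """Inputs specific to one shopper: device type, purchase history, etc."""
    bump = 1.0
    if user.get("device") == "new_flagship_phone":
        bump += 0.05  # device type used as a proxy for willingness to pay
    if user.get("past_purchases", 0) > 20:
        bump += 0.05  # loyalty read as price tolerance
    return round(base * bump, 2)
```

Every rider caught in the same surge sees the same `dynamic_price`; only the individual shopper sees their `personalized_price`, which is the line New York’s disclosure rule targets.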


Both price steering and personalization have been documented in academic studies of e-commerce. Researchers at Northeastern University, for example, found evidence that some sites changed prices or ranked results differently depending on the user. Even A/B tests can border on personalization when experiments key on attributes of a single shopper rather than general segments.

None of this sits well with consumers. About 79% of Americans say they are concerned about how companies use their data, according to the Pew Research Center. And when that data is used to adjust the price at checkout, those concerns are heightened, especially if shoppers suspect they are paying more than everybody else for identical goods.

Why transparency in algorithmic pricing practices matters

Personalized pricing can bring more affordable offers to some buyers, but it also has the potential to deepen inequities when signals like income, location, or device type act as proxies for protected attributes. The Federal Trade Commission, for example, has cautioned that decisions made by an opaque algorithm can lead to unfair or discriminatory outcomes, particularly when a company cannot explain how a model produced its result.

Disclosure is also a light-touch intervention. It does not ban experimentation or optimization; it pushes firms toward accountability and reminds consumers that a price isn’t always a neutral number. Clear notices also leave a paper trail that regulators can scrutinize for evidence of abuse and researchers can analyze for prevalence.

Legal and policy ripples to watch after New York’s law

Look for the courts to rule on the extent to which states can require algorithmic transparency. Regulators could also follow New York’s model. Europe’s GDPR already grants individuals rights of access to information about automated decision-making, and U.S. states including California and Colorado are pressing ahead with broader rules around automated profiling as well as consumer data rights.

Former Federal Trade Commission chair Lina Khan has cited New York’s disclosure requirement as important while also recognizing how much more needs to be done to regulate algorithmic pricing. Whether other states follow the model — or simply skip to constraints on what personal data can be used in pricing — will depend on how New York’s experiment fares in the real world.

One shift, though, is immediate: a business that charges you a price derived from your data will have to tell you. That single sentence could change how businesses weigh the reputational cost of personalization against the revenue they expect it to generate.

By Gregory Zuckerman
Gregory Zuckerman is a veteran investigative journalist and financial writer with decades of experience covering global markets, investment strategies, and the business personalities shaping them. His writing blends deep reporting with narrative storytelling to uncover the hidden forces behind financial trends and innovations. Over the years, Gregory’s work has earned industry recognition for bringing clarity to complex financial topics, and he continues to focus on long-form journalism that explores hedge funds, private equity, and high-stakes investing.
FindArticles © 2025. All Rights Reserved.