
FTC Guidelines on Using Consumer Data for New AI Projects: Retroactive Privacy Policy Updates Require Consent

On February 13, 2024, the FTC published a blog post reminding companies that it may be unfair or deceptive for a company to give itself permission to expand its use of consumer data (for example, sharing consumers' data with third parties or using data for AI training) without getting consent. Quiet updates (i.e., not providing prominent notice or getting consent) will not suffice. The FTC clarified that a business that collects user data based on one set of privacy commitments cannot then unilaterally renege on those commitments. The FTC gave two examples of its enforcement of this point:

  • In 2004, the FTC settled its first case against Gateway Learning Corporation challenging deceptive and unfair practices in connection with a company's material change to its privacy policy. The FTC alleged that Gateway Learning violated federal law when it rented consumers' personal information to target marketers despite explicit promises made in its privacy policy. The FTC also alleged that, after collecting consumers' information, Gateway Learning changed its privacy policy to allow it to share the information with third parties without notifying consumers or getting their consent.
  • In the summer of 2023, the FTC reached a settlement with 1Health.io that included claims that the company had retroactively changed its privacy policy without adequately notifying, and obtaining consent from, consumers whose data it had already collected.

The technology and use cases have changed, but the FTC rules have not. Retroactive material changes to a privacy policy require affirmative consent. Failing to get consent will be considered "unfair," and if you told consumers you would let them know about material changes, and you don't, that would be "deceptive." With the FTC leaning into data deletion as a remedy, it's worth making sure you get this one right.

It may be unfair or deceptive for a company to adopt more permissive data practices—for example, to start sharing consumers’ data with third parties or using that data for AI training—and to only inform consumers of this change through a surreptitious, retroactive amendment to its terms of service or privacy policy.

Tags

privacy security & data innovations, artificial intelligence