
What the FTC’s Crackdown on AI-Generated Reviews Means for Influencers and Brands

In May 2024, the Federal Trade Commission (FTC) issued a sharp warning to marketers, creators, and tech companies: the use of AI-generated product reviews or fake endorsements is not only deceptive—it could lead to serious legal consequences.

As brands increasingly experiment with artificial intelligence in their marketing, many are overlooking key compliance risks. This post breaks down what the FTC's latest stance on AI-generated content means for influencers, companies, and the agencies that support them.

What Did the FTC Say?

The FTC’s warning emphasized that AI is not a shield against liability. Whether a review is written by a human, a chatbot, or a language model, fake product reviews and undisclosed endorsements violate federal law.

Specifically, the FTC cautioned that:

  • AI-generated endorsements must be truthful and not misleading
  • Any paid or sponsored content must be clearly disclosed
  • Outsourcing review creation to AI tools does not remove responsibility from the advertiser or influencer
  • Marketers using AI are expected to monitor for deceptive or fabricated claims

For more, see the FTC’s official press release on AI-generated reviews.

Who’s at Risk?

If you’re an influencer, brand manager, agency, or content strategist using AI to create marketing materials—this applies to you.

Scenarios that may trigger FTC scrutiny include:

  • A brand using ChatGPT to write glowing customer reviews for its own website
  • An influencer publishing an AI-generated product endorsement without proper disclosure
  • A marketer using generative tools to create testimonials from fake identities
  • A brand or influencer failing to review or vet AI content before posting it publicly

Even if the content “feels authentic,” it must meet the same standards as human-authored advertising under the FTC influencer guidelines.

Best Practices for Influencers and Brands Using AI

  • Be Transparent About AI Use – If an endorsement is generated by AI or based on synthetic content, the audience must be clearly informed.
  • Use Clear Disclosure Language – Tag sponsored posts with terms like #Ad or #Sponsored, and avoid vague disclaimers. For endorsements, clarify if the person is a real user—or a generated persona.
  • Update Your Influencer Agreements – Contracts should include clauses about the use of AI-generated endorsements, requiring transparency and accuracy.

A social media risk assessment can help identify potential legal gaps in your current AI-based marketing strategy.

How to Strengthen Legal Protections

To stay ahead of regulatory action, brands and influencers should:

  • Review all AI-generated content before publication
  • Avoid fake reviews altogether, whether generated or ghostwritten
  • Consult an attorney before launching influencer campaigns involving synthetic content
  • Monitor evolving FTC guidance on advertising and AI technologies

Working with an influencer lawyer helps ensure your content and contracts meet both current rules and emerging standards.

Don’t Let AI Create Legal Headaches

AI can be a powerful tool—but it’s not exempt from truth-in-advertising laws. With the FTC now actively monitoring how brands and influencers use AI, it’s more important than ever to stay informed, transparent, and legally prepared.

Need help navigating influencer compliance or updating your content strategy? Contact The Social Media Law Firm today to protect your business and brand.


For more legal tips, give us a follow on Instagram, TikTok, LinkedIn, or check out our YouTube Channel.

Subscribe to The Social Media Lawcast on Spotify Podcasts.


