The news: New York Governor Kathy Hochul signed two laws Thursday aimed at bringing transparency and consent to AI use across film, television, and advertising, placing New York among the first US jurisdictions to directly address synthetic performers and posthumous-likeness rights.
- One law requires clear disclosure when ads feature AI-generated or digitally altered performers, ensuring audiences understand when a person on screen is not real.
- A companion law requires consent from heirs or executors before a deceased individual’s name, image, or likeness can be used commercially, extending protections beyond a performer’s lifetime.
- The legislation followed sustained lobbying from SAG-AFTRA and other entertainment groups, Deadline noted, reflecting organized labor’s growing role in shaping AI policy at the state level.
Clashing with DC: At the federal level, President Donald Trump moved in the opposite direction.
- Trump signed an executive order Friday threatening to withhold broadband funding from states whose AI rules the administration deems obstacles to US AI leadership.
- The order criticizes state-by-state regulation, calling for a single federal framework and warning that local rules could slow innovation and weaken US competitiveness with China.
- Major AI companies and investors, including OpenAI, Google, Meta, and Andreessen Horowitz, have largely backed federal oversight, aligning with the push for centralized regulation rather than state-led rules.
Why it matters: The New York laws arrive as AI-generated actors, cloned voices, and synthetic influencers move closer to mainstream commercial use, raising concerns about deception, consent, and labor displacement.
- The backlash against AI actress “Tilly Norwood,” condemned by SAG-AFTRA and Hollywood talent, shows why regulators and unions are pushing for AI disclosure and consent: Audiences still distrust synthetic performers.
- Public sentiment supports guardrails. Surveys consistently show that most US adults still prefer human-made movies and videos, while UK audiences accept AI mainly for technical tasks like visual effects or restoration, not for replacing writers or actors.
- Trump’s executive order escalates the tension between state experimentation and federal control and could set up a lengthy legal battle over AI regulation. The administration argues that varied state rules create friction for startups and investors, while critics warn that federal preemption could weaken consumer protections.
Key takeaways for media professionals:
- Media companies now face a split regulatory reality. In states like New York, disclosure and consent around AI-generated performers are becoming mandatory, setting new expectations for advertising and entertainment production.
- At the same time, federal pressure may limit how far other states can go, increasing uncertainty for studios, agencies, and platforms operating nationally.
- Given limited public comfort with fully synthetic talent, transparency is likely to function as both a compliance requirement and a trust signal.
- Production leaders should prepare for AI use that is visible, permissioned, and carefully governed, while tracking whether New York’s approach becomes a template or a flashpoint in the broader federal debate.