
2025 Year-End Privacy and AI Governance Update

As 2025 comes to a close, privacy and AI governance have moved from a fast-developing regulatory theme to a core operational reality for healthcare marketing. The year was defined less by a single sweeping federal law and more by three trends: continued expansion of state privacy laws, meaningful enforcement action, and early but consequential AI regulations, many of which overlap with privacy concepts like profiling, automated decision-making, and transparency. 

For healthcare marketers, 2025 underscored that compliance is about provable controls, disciplined vendor ecosystems, and privacy-by-design product decisions that can withstand scrutiny across jurisdictions.

State privacy laws continued to evolve

Several new state privacy laws took effect in 2025, and three more are scheduled to go live in January 2026. While the pace of comprehensive privacy laws is slowing, legislatures are increasingly focused on amending existing laws. Those amendments refine opt-out rights, sensitive data definitions, and profiling standards in ways that diverge across states. In our industry, this calls for ongoing privacy management with dedicated resources rather than a one-time compliance lift.

Sensitive data drew more scrutiny than ever

States continued broadening sensitive data categories, with especially strong focus on reproductive health, children and teens, and precise geolocation. Regulators are also starting to focus on inferred or derived signals as potentially sensitive, especially when they could reveal health conditions, visits to health-related sites, or other sensitive information. 

Profiling and automated decision-making obligations sharpened

More states refined consumer rights to opt out of profiling and automated decisions. For example, California rulemaking tightened requirements around preference signals, disclosures, and consumer controls. AI regulation is currently following a path similar to privacy regulation, with each state developing its own laws governing the use of AI for profiling and automated decision-making. We expect these laws to focus on consumer protections, including consent, notification, and opt-out requirements, as well as compliance with existing regulations around privacy and sensitive data.

Enforcement accelerated

California, Connecticut, Texas, and Florida all pursued meaningful enforcement actions in 2025 and are showing strong signals that enforcement will continue to ramp up in 2026. In April 2025, eight state regulators announced a bipartisan Consortium of Privacy Regulators that will jointly work on investigations and enforcement of privacy regulations across state lines. Companies are expected to validate data sources, consent mechanics, and downstream uses of their partners rather than relying solely on contracts.

2025 lessons learned

  1. Consumer rights come first. 2025 reinforced that privacy compliance starts with honoring consumer rights, especially consumers' opt-out rights and control over their own data. That requires strong consent and notification practices and reliable opt-out signal handling. DeepIntent expanded its structured due diligence across upstream data partners and downstream recipients to validate collection notices, consent mechanics where required, and reliable opt-out flow through the advertising chain. We treat this as a continuous program rather than a one-time review.

  2. Sensitive data is heavily scrutinized. Regulators continue to expand what counts as sensitive data and to treat these categories as inherently high-risk, with enforcement increasingly tied to whether companies identified potential harms and put safeguards in place. DeepIntent continues to track how sensitive data is defined across jurisdictions and adjust our practices accordingly.

  3. Companies are held accountable for automated decision-making. The theme across jurisdictions in 2025 was accountable automated decision-making and AI processing. States increasingly signaled that if AI is used to make decisions or process sensitive information, companies should be able to explain what the system is doing, show they’ve evaluated and reduced foreseeable harms, and keep humans meaningfully in the loop.

  4. Due diligence is mandatory across the advertising ecosystem. 2025 enforcement showed that regulators expect companies to validate partner practices, not merely pass privacy obligations along via contracts. This applies to both upstream and downstream partners.

  5. Compliance is an ongoing process. Static policies and catch-all contractual protections are no longer enough. Regulators seek operational proof and ongoing refinement.

What to expect in 2026

2026 will be a pivotal year for ad tech privacy and AI compliance. The regulatory direction is clear, and enforcement is accelerating. DeepIntent is planning for 2026 with the assumption that privacy and AI governance will continue to evolve, and that scalable, auditable compliance programs will be a necessity.

  • More state privacy laws will go live, and existing ones will evolve. Additional comprehensive state privacy laws will take effect in 2026, and we expect a continued wave of amendments. States are also moving beyond general privacy frameworks into sector-focused regimes, including comprehensive consumer health privacy laws such as New York’s HIPA, which will raise expectations for how health-adjacent data is handled across the advertising ecosystem.

  • Enforcement will intensify, and courts will keep fine-tuning the rules. DeepIntent views both enforcement trends and litigation outcomes as a practical roadmap and an opportunity to continuously strengthen our controls and help customers stay ahead of risk.

  • AI, automated decision-making, and profiling regulation will accelerate. While states are working to expand privacy rights tied to profiling, automated decision-making, and AI-specific governance, the federal government is pushing back against AI regulation at the state level. The recent “Ensuring a National Policy Framework For Artificial Intelligence” Executive Order seeks to preempt state-level regulation in favor of a national standard. This federal-state tension will be a central focus for DeepIntent in 2026, and we are closely tracking developments and their potential impact on our platform and partners. In the meantime, we are continuing to mature our privacy and AI programs, focusing on transparency, consumer choice, and accountable oversight of AI usage.

For healthcare marketers, privacy and AI governance aren’t background compliance functions but true differentiators. In a sector defined by sensitive data and heightened expectations, strong, transparent privacy practices will increasingly separate responsible partners from the rest of the market.

Want to learn more about the promise of AI in health marketing? Click here.