The Privacy Patchwork in 2026: What Compliance Teams in Legal, Healthcare, and Finance Actually Need to Watch
A few weeks ago, a colleague on our legal team sent me a text message at 11 p.m. (she does that), asking whether a new vendor intake form our marketing team had just launched triggered Maryland’s new privacy rules. I told her I’d look in the morning. I did not sleep well.
That is the unglamorous truth about U.S. data privacy in 2026. It is not one law. It is a mosaic. And the mosaic keeps adding tiles, sometimes faster than compliance teams can read the press releases.
The Federal Vacuum, Still with Us
For years, industry watchers kept insisting a comprehensive federal privacy law would land any moment. It still hasn’t. In the meantime, the Federal Trade Commission continues to use its authority under the Federal Trade Commission Act to chase what it calls “unfair or deceptive trade practices.” That phrase is flexible enough to sweep in weak data security, privacy policies that contradict actual practice, and transfers of personal information that were never disclosed to the consumer.
The federal sectoral laws still occupy their familiar lanes. HIPAA for health information, GLBA for financial institutions, FCRA for credit data, COPPA for minors, and FERPA for student records. They govern well, but narrowly — a little like well-maintained highways that only reach certain cities. Between those cities lies a lot of open road.
The States Stepped In (Quietly, and Then Not)
California started the comprehensive wave. The California Privacy Rights Act (CPRA) amended the original CCPA and took effect on January 1, 2023. It did two important things. First, it created an actual regulator, the California Privacy Protection Agency, instead of leaving enforcement solely to the Attorney General. Second, it introduced the concept of Sensitive Personal Information (SPI) — Social Security numbers, driver’s licenses, precise geolocation, genetic data, racial or ethnic origin, health information, the contents of a consumer’s email or text messages. Californians gained the right to limit the use of any of it.
If you operate in healthcare or finance and you assume HIPAA or GLBA exempts you fully, read the exemption language twice. Several statutes, California's included, exempt only the data those federal laws actually cover, not your organization as a whole. You are probably wrong about the non-covered data.
Then came Virginia. The Virginia Consumer Data Protection Act (VCDPA) set the template most states have borrowed from since: threshold triggers (typically 100,000 residents, or 25,000 residents where a substantial share of gross revenue comes from selling personal data), an opt-out model for ordinary processing, and opt-in consent for sensitive data. Colorado, Utah, Connecticut, Montana, Tennessee, Iowa, Nebraska, New Hampshire, New Jersey, and Minnesota have all followed in various flavors. Rhode Island, too. And Kentucky. And Indiana.
The list keeps growing, which is part of the problem.
Just since the turn of this year, the Indiana Consumer Data Protection Act and the Kentucky Consumer Data Protection Act came online (both January 1, 2026), and Rhode Island’s Data Transparency and Privacy Protection Act took effect the same day. Maryland’s processing obligations under MODPA kicked in on April 1, 2026 (just over three weeks ago at the time of writing), the same date Montana’s 60-day cure period sunset. If your organization has not refreshed its data inventory since last autumn, you may have slipped out of compliance without noticing. Silence, in privacy, is rarely golden.
Maryland Is the One to Really Watch
The Maryland Online Data Privacy Act (MODPA) is not a friendly law, and it does not aspire to be. Unlike most U.S. state privacy laws (which allow businesses to collect sensitive data so long as they obtain opt-in consent), Maryland restricts the collection, processing, and sharing of sensitive data to situations where it is strictly necessary to provide a product or service the consumer actually requested. It also outright bans the sale of sensitive data. Full stop. Penalties climb to $10,000 per violation and $25,000 for each repetition of the same violation, enforced by the Consumer Protection Division of the Attorney General’s office.
For a regional hospital system, a community bank, or a fintech serving the Mid-Atlantic, that is a material operational change. The blue plastic clipboards at the intake desk, the backend analytics profiling user financial behavior, the “quick demographic survey” you send customers in January. All of it now warrants a second look.
Europe Is Simpler, in a Sense
Across the Atlantic, the GDPR remains the gravitational center. You already know this. Explicit consent for special category data (health, racial origin, political opinions, trade union membership, genetic, biometric, sex life or orientation). Fines of up to €20 million or 4% of total annual worldwide turnover, whichever is higher. A 72-hour breach notification window.
What American compliance teams still occasionally underestimate is how much has been layered on top of GDPR in the past few years.
- The Digital Services Act (DSA) entered into force in November 2022 and restricts certain kinds of targeted advertising, notably advertising to minors and advertising that uses GDPR special category data.
- The Digital Markets Act (DMA) reshapes how the very large platforms — the so-called “gatekeepers,” a small list that includes companies like Apple, Google, Meta, Microsoft, and Amazon — can combine cross-context personal data.
- The EU Artificial Intelligence Act, politically agreed at the end of 2023, formally adopted in 2024, and phasing into application through 2027, places real constraints on high-risk AI systems used in employment, credit, healthcare, and law enforcement. It prohibits social scoring by public authorities, forbids the untargeted scraping of facial images to build recognition databases, and bans certain forms of workplace emotion recognition.
If you do business in both jurisdictions, you are not complying with “GDPR plus.” You are complying with a rapidly thickening stack.
Beyond Europe
The United Nations Conference on Trade and Development tracks more than 130 countries with data privacy laws on their books. A few deserve special attention in any multinational risk review.
Brazil’s Lei Geral de Proteção de Dados Pessoais (LGPD), which came into force in 2020, tracks closely with GDPR. A well-run EU compliance program usually ports over with modest changes.
China’s Personal Information Protection Law (PIPL), effective November 2021, broadly resembles GDPR but demands stricter consent and imposes harsher penalties. Individual rights are slightly narrower, but the cross-border transfer rules are materially tighter, which is what trips most multinationals up.
India’s Digital Personal Data Protection Act (DPDPA) covers roughly 1.4 billion people, a scale that demands its own engineering approach. Notable government exemptions have drawn criticism from privacy advocates, but the law is now reality for anyone doing business on the subcontinent.
Now the Uncomfortable Part (About AI)
Here is where things get personal, and where the compliance officer’s 11 p.m. texts start piling up faster.
We are now several years past ChatGPT’s public debut, and the muscle memory of pasting documents into AI chat windows is deeply embedded in how lawyers, analysts, underwriters, and clinicians do their work. I have watched a brilliant tax attorney paste an entire client engagement letter — full legal name, Social Security number, three years of income data — into a generative AI tool, asking it to “summarize the tone.” I have watched a nurse dictate a discharge summary into a free transcription app that included, verbatim, the patient’s name and admission date. None of these people are careless. They are busy, the tools feel frictionless, and the deadline is always yesterday.
The trouble is that under nearly every regulation surveyed above, uploading a document containing personal information to a third-party AI service is a data transfer. It is a disclosure. In many cases it is a “sale” (Colorado’s statutory definition is famously broad) or a “sharing” (California’s term of art) unless you have papered the relationship otherwise. If the AI provider uses the submission to improve its models — read the terms carefully, because many still do by default — you may have contributed protected data to a training corpus you cannot recall.
For HIPAA covered entities, an AI vendor processing PHI is almost certainly a business associate, which means a signed Business Associate Agreement is not optional. For GLBA-covered institutions, sharing a customer’s account number with an unvetted AI tool can trigger Safeguards Rule obligations the institution never intended to take on. Under MODPA, uploading a Maryland consumer’s health history to a generative tool to “draft a response” may be prohibited processing, full stop, even with consent, because the law restricts sensitive data collection to the strictly necessary.
The solution is not to avoid AI. That horse has already left the barn, eaten lunch, and moved into a condominium in the city. The solution is to redact, de-identify, or process locally before anything sensitive touches a third-party model. Tools that detect and mask personally identifiable information at the document layer — whether in a PDF, a Word file, a spreadsheet, or a scanned image — are no longer a luxury for legal, healthcare, and financial teams. They are minimum requirements.
In practical terms, a defensible workflow in 2026 looks something like this:
- Identify the document type.
- Run an automated PII detection pass.
- Review the flagged entities with a human in the loop.
- Redact or replace.
- Only then hand the sanitized version to whatever downstream tool — AI or otherwise — is being used.
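To make the middle steps concrete, here is a deliberately minimal sketch of the detect-review-redact loop in Python. The three regex patterns are illustrative assumptions, not a real detection engine; a production workflow would use a dedicated PII detection tool, precisely because regexes miss things (note that the client's name sails straight through below — which is exactly why the human-review step exists).

```python
import re

# Hypothetical patterns for illustration only; real PII detection
# needs far broader coverage (names, addresses, MRNs, account numbers).
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b\(?\d{3}\)?[-. ]\d{3}[-. ]\d{4}\b"),
}

def detect_pii(text):
    """Step 2: automated pass. Returns (entity, start, end, match) tuples."""
    findings = []
    for entity, pattern in PII_PATTERNS.items():
        for m in pattern.finditer(text):
            findings.append((entity, m.start(), m.end(), m.group()))
    return sorted(findings, key=lambda f: f[1])

def redact(text, findings):
    """Step 4: replace approved findings with typed placeholders.
    Works right-to-left so earlier character offsets stay valid."""
    for entity, start, end, _ in sorted(findings, key=lambda f: f[1], reverse=True):
        text = text[:start] + f"[{entity.upper()} REDACTED]" + text[end:]
    return text

sample = "Client John Doe, SSN 123-45-6789, reachable at jdoe@example.com."
findings = detect_pii(sample)
# Step 3 happens here: a human reviews `findings` (and catches "John Doe",
# which the regexes missed) before approving the redaction.
print(redact(sample, findings))
# → Client John Doe, SSN [SSN REDACTED], reachable at [EMAIL REDACTED].
```

Only the sanitized output of step 4 ever reaches the downstream AI tool; the original document never leaves your environment.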
A Closing Thought
Somewhere between the patchwork of state laws and the unstoppable adoption of AI sits a thin, fragile discipline: treat personal data the way you would a hot cup of coffee on a plane. You don’t need to refuse to drink it. You just need to pay attention to what you’re doing with it, at every step, because the turbulence is coming whether you see it or not.
Compliance in 2026 is less about memorizing rules (there are too many) and more about building systems that assume the rules will keep changing. Your data inventory, your vendor register, your redaction workflow, and your willingness to say “no, don’t upload that” are the load-bearing walls now.
The laws will keep moving. Your documents don’t have to.
Need to redact PII before it ever reaches a third-party AI tool? Download PII Anomalyzer — it runs entirely on your desktop, detects 55+ PII entity types across PDF, Word, Excel, and scanned images, and never sends your documents to the cloud. Start a 7-day free trial.
Sources
- Osano, “Data Privacy Laws: What’s New in 2025 and Beyond” — osano.com/articles/data-privacy-laws and associated statute-specific articles on osano.com.
- Federal Trade Commission Act, 15 U.S.C. § 45.
- HIPAA, Public Law 104-191.
- Fair Credit Reporting Act, 15 U.S.C. § 1681.
- Gramm-Leach-Bliley Act, 15 U.S.C. § 6801 et seq.
- UNCTAD, Data Protection and Privacy Legislation Worldwide.
Robert Bergman is CEO of Southwest Management Technology and Next Level Mediation.