State Health Exchanges Leaked Race and Citizenship Data to Meta and TikTok
Bloomberg's May 4 investigation found pixel trackers on nearly all 20 U.S. state health insurance exchanges sending sensitive application data, including race, citizenship status, sex, ZIP codes, phone numbers, and questions about incarcerated family members, to Google, LinkedIn, Meta, Snap, and TikTok. More than seven million Americans bought 2026 coverage through these state exchanges. Washington D.C. paused its TikTok rollout; Virginia removed the Meta tracker after Bloomberg flagged it.
The data that actually left the page
The TechCrunch summary of Bloomberg's investigation lays out fields that nobody should ever see leaving a government insurance enrollment flow. New York's exchange shared application details, including whether the applicant has incarcerated family members. The Washington D.C. exchange told Bloomberg that residents' email addresses, phone numbers, and country identifiers were sent to TikTok. TikTok's pixel tried to redact race fields, but some values came through redacted and some didn't, which is the specific kind of "we tried" engineering finding that pixel-tracking class actions tend to use as exhibit A.
Virginia's exchange was sending ZIP codes to Meta until Bloomberg asked about it.
This is not a one-vendor story. The list of receiving platforms (Google, LinkedIn, Meta, Snap, TikTok) is the standard ad-tech pixel stack on most U.S. web properties. The only difference is that the page collecting the data is a state-run health exchange, which is exactly the context that has produced the largest pixel settlements of the past three years.
Why this is already the next HIPAA-pixel class action
Pixel-tracking enforcement has been quietly compounding. Feroot's consolidated analysis of 19 cases from 2023 to 2025 totaled more than $100M in healthcare pixel settlements and OCR penalties. The trend line: 2023 closed at $37.15M across eight cases, 2024-2025 jumped to $50.61M across six cases, with HealthPartners ($6M) and University of Rochester Medical Center ($2.85M) bookending the early 2025 wave. Pomona Valley Hospital Medical Center settled for $600,000 in November 2025 over Meta Pixel and similar trackers.
The legal posture against Meta and TikTok is also worse than it was a year ago. Bloomberg Law reported that Meta and TikTok took losses in a federal pixel lawsuit over health information sharing, and the HIPAA Journal flagged a federal judge tentatively advancing a Meta Pixel medical-privacy class action past the motion to dismiss. That matters because it lowers the standing bar for the next round of plaintiffs.
State health exchanges are the most plaintiff-friendly fact pattern I have seen in this category. There's a regulated population (insurance applicants), explicit sensitive fields (race, citizenship), a clearly enumerated data recipient list, and on-the-record admissions from at least one state government spokesperson. The seven million enrollee number is the class-size headline a plaintiff firm would put on the first page of a complaint.
I think the question isn't whether suits get filed. It's which firm files first, and which states get added to the caption.
The pixel-on-PHI mechanic most marketers still get wrong
The reason this keeps happening, even after the HIPAA Journal documented one-third of healthcare websites still running Meta Pixel into 2024, is that pixel data leakage isn't usually a deliberate decision. It's a default.
Meta Pixel, TikTok Pixel, LinkedIn Insight Tag, and Google Tag Manager will, by default, hoover up form-field values, URL query strings, and DOM text content unless you explicitly tell them not to. If your marketing team installs the pixel via a site-wide tag, and the site happens to include an enrollment form that asks for race or citizenship, the pixel collects what it sees. The vendor side argues this is a deployment problem; the regulator side argues the platform should know its pixel is on a healthcare property and behave accordingly. Both can be true. Neither saves you in discovery.
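To make the default-versus-blocklist gap concrete, here is a minimal sketch of the mechanic. This is not any vendor's actual pixel code, and the field names are hypothetical; it only illustrates why "install site-wide, configure nothing" means regulated fields leave the page.

```python
# Illustrative sketch only: not real Meta/TikTok pixel code.
# Models a pixel that serializes every form field it can see,
# minus whatever the deployer explicitly blocked.

SENSITIVE_FIELDS = {"race", "citizenship_status", "incarcerated_family"}

def collect_form_payload(form_fields: dict, blocklist: set = frozenset()) -> dict:
    """Mimic a pixel's default behavior: grab everything visible,
    excluding only fields on an explicit blocklist."""
    return {k: v for k, v in form_fields.items() if k not in blocklist}

# A hypothetical enrollment form's state.
application = {
    "zip": "23220",
    "race": "unspecified",          # regulated
    "citizenship_status": "citizen",# regulated
    "plan_tier": "silver",
}

# Default deployment: everything leaves the page.
leaked = collect_form_payload(application)

# Explicit exclusions configured: only non-regulated fields leave.
scrubbed = collect_form_payload(application, blocklist=SENSITIVE_FIELDS)

print(sorted(leaked))    # all four fields
print(sorted(scrubbed))  # ['plan_tier', 'zip']
```

The point of the sketch is the asymmetry: the safe behavior requires someone to write the blocklist, and nobody at the pixel vendor or the tag-managing agency is contractually on the hook to do it.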
Disconnect's research team published a separate look at what TikTok's pixel can capture from a health context (cancer diagnosis fields, fertility tools, crisis counselor pages) that's worth reading if you're trying to convince anyone internal that this is real, not theoretical.
The data-layer audit that probably saves you a deposition
If you're running pixels on any regulated vertical (health, finance, government services, education, anything age-gated), do this in the next ten business days. None of it is hard. Most of it is uncomfortable.
- Pull the list of every third-party tag firing on your site. Use GTM container preview mode plus the browser dev tools network tab. If the marketing team isn't running this audit monthly, you don't actually know what's on your pages.
- Identify which pages contain regulated data input. Forms, URL paths with patient or applicant identifiers, query strings carrying eligibility data. Anything that touches PHI, race, citizenship, immigration status, or financial qualification.
- Configure pixel exclusions explicitly. Meta Pixel supports automatic-event blocklists and form-field redaction. TikTok Pixel supports event-level blocklists. Google Tag Manager supports trigger exclusions for entire URL patterns. None of these are on by default. Turn them on.
- Block third-party scripts on regulated pages entirely if you can. The least-bad pattern I've seen is "marketing pixels on the public marketing site only, zero third-party scripts on the actual application flow."
- Document what you turned off and when. You may need this on a CIPP letterhead later.
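The first two steps above can be partially automated. A minimal sketch, assuming you've exported the request URLs from the dev tools network tab (e.g., out of a HAR file); the tracker-domain and parameter lists here are illustrative, not exhaustive, so tune them to your own tag inventory:

```python
from urllib.parse import urlparse, parse_qs

# Illustrative, non-exhaustive lists of ad-tech endpoints and
# regulated-looking field names; extend for your own audit.
TRACKER_DOMAINS = {
    "facebook.com", "connect.facebook.net",   # Meta Pixel
    "analytics.tiktok.com",                   # TikTok Pixel
    "px.ads.linkedin.com",                    # LinkedIn Insight Tag
    "google-analytics.com", "googletagmanager.com",
}
REGULATED_PARAMS = {"race", "citizenship", "dob", "ssn", "phone", "zip"}

def flag_requests(urls):
    """Return (url, matched_params) for any request that hits a known
    tracker domain AND carries a regulated-looking query parameter."""
    findings = []
    for url in urls:
        parts = urlparse(url)
        host = parts.netloc.lower().removeprefix("www.")
        if not any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS):
            continue
        hit = REGULATED_PARAMS & set(parse_qs(parts.query))
        if hit:
            findings.append((url, sorted(hit)))
    return findings

# Example: one pixel call carrying eligibility data, one first-party page load.
har_urls = [
    "https://analytics.tiktok.com/api/v2/pixel?race=x&zip=23220",
    "https://example.gov/apply?step=2",
]
print(flag_requests(har_urls))
```

This only catches leakage via query strings; pixels also ship data in POST bodies and auto-captured form events, so the script is a tripwire, not a substitute for the manual audit.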
Freshpaint's HIPAA tracking timeline is a useful one-pager to send to your compliance team if they are still treating this as a "marketing problem."
The pattern this fits into
The browsergate stories of the last six months are all the same shape. LinkedIn's hidden scan catalog of 6,278 extensions showed how far enterprise platforms will go to map third-party data flow on their own users. The Chrome extension data brokerage piece on 82 extensions selling 6.5M users' browsing data showed the data leak vector consumers can't even see.
This story is the same shape with worse plaintiffs. The data is more sensitive, the operator is a state government, and the receiving platforms have already lost similar suits. From what I've seen, this is the kind of fact pattern that pulls in attorneys general before it pulls in private plaintiffs, because state AGs read Bloomberg too.
The marketers who get burned worst here probably won't be the ones whose names appear in the original Bloomberg piece. They'll be the ones running pixels on the next regulated vertical, who read this story, didn't audit, and got named in a derivative complaint two quarters later.
Run the audit this week. Or have a really good answer ready about why you didn't.
Notice Me Senpai Editorial