AudienceProject Counted Heineken's Reach and Found Meta Beat TV 19.4x

AudienceProject's deduplicated reach work surfaced a 19.4x efficiency gap that a platform-by-platform read missed entirely.

AudienceProject's April 2026 white paper documented a Heineken cross-media test where Meta delivered 19.4x more efficient incremental reach than linear television. The brewery moved digital from 30% to 42% of media spend in response. The wider data point sitting underneath the case: only 55% of UK TV ad spend in 2026 will run inside JIC-measured environments, down from 91% in 2014.

The Heineken number that broke the budget plan

The Strongbow Ultra launch ran across linear TV, YouTube, and Meta. AudienceProject measured reach across all three with deduplicated counts, not platform-reported reach. The output, published in their case study and covered by PPC Land, was 19.4x more efficient incremental reach from Meta than from television. Meta alone added 15.4% incremental reach to the campaign. The combined TV plus Meta exposure produced a 22 percentage point lift in brand awareness.
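AudienceProject has not published the exact formula behind the 19.4x figure, but dividing deduplicated incremental reach by channel spend is one common way such an efficiency multiple gets derived. A minimal sketch, using made-up spend and reach inputs (not the Heineken study's actual numbers):

```python
def incremental_reach_per_spend(incremental_reach_pct: float, spend: float) -> float:
    """Incremental reach points delivered per unit of spend.

    Hypothetical definition: AudienceProject's methodology is not public,
    so treat this as one plausible way an efficiency multiple between two
    channels could be computed, not the study's actual formula.
    """
    return incremental_reach_pct / spend

# Illustrative inputs only, NOT figures from the Strongbow Ultra campaign:
meta_eff = incremental_reach_per_spend(15.4, 100_000)  # 15.4 pts per 100k spend
tv_eff = incremental_reach_per_spend(2.0, 250_000)     # 2.0 pts per 250k spend

print(round(meta_eff / tv_eff, 2))  # 19.25x on these made-up numbers
```

The point of the ratio is that it only means something when both numerators come from the same deduplicated count; plugging in each platform's self-reported reach would inflate both sides.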

Heineken raised digital allocation from 30% to 42%. Twelve points of media budget, moved on the strength of one measurement study.

I think the part most planners will skip is the prerequisite: this number does not exist if you let each platform self-report. Meta's own measurement told Heineken one thing about Meta. JIC panels told them another about TV. There was no shared definition of who got reached, when, or whether the same person was counted twice. The 19.4x efficiency only became visible after AudienceProject deduplicated across the three environments. Without that step, Heineken's media plan looked roughly fine.

Why "scale" keeps getting confused with "reach"

AudienceProject's myth-debunking post makes the cleanest version of this point. Scale is impressions delivered. Reach is unique people exposed. Cross-media campaigns rarely have a problem with scale. They almost always have a problem with overlap.

When you run YouTube, Meta, and CTV against the same audience definition, a noticeable share of impressions hits the same people. Without independent measurement, that overlap reads as "reach growth" inside each platform's dashboard. It is mostly frequency growth. The headline reach numbers each platform hands you are not designed to be added together, but most planning decks add them anyway.
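The non-additivity above is just set arithmetic. A toy sketch with hypothetical exposure logs keyed by a shared person ID (in practice, getting those IDs to line up across platforms via an identity graph or clean room is the hard part of the deduplication work):

```python
# Hypothetical per-platform exposure logs: each set holds the person IDs
# a platform exposed at least once. Values are illustrative only.
tv = {1, 2, 3, 4, 5, 6, 7, 8}
youtube = {5, 6, 7, 8, 9, 10}
meta = {7, 8, 9, 10, 11, 12}

stacked = len(tv) + len(youtube) + len(meta)  # what a planning deck sums: 20
deduplicated = len(tv | youtube | meta)       # unique people reached: 12
overlap_exposures = stacked - deduplicated    # frequency dressed up as reach: 8

print(stacked, deduplicated, overlap_exposures)
```

On these numbers, stacking the three dashboards overstates reach by two thirds; the gap is all frequency on people already counted.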

This part surprised me a little. The Strongbow result implied Meta was exposing people television had not reached at all, not just topping up frequency on heavy TV viewers. That kind of incremental gain is exactly what a properly deduplicated reach study is supposed to surface, and exactly what platform-by-platform reporting tends to hide because each platform claims credit for the people it touches.

The slow death of JIC-measured TV

The white paper cites an ITV/WARC analysis on the share of UK TV ad spend that runs inside JIC (joint industry committee) measurement. The line is steep. 91% in 2014. 85% in 2018. 73% in 2022. Roughly 55% projected for 2026. ITV's own linear TV revenue decline lines up with the broader trend.

The implication is uncomfortable for anyone still using JIC panels as the truth layer for reach planning. Almost half of UK TV money in 2026 will be running through environments those panels do not see. BVOD, CTV, social video, video-sharing platforms. Ofcom 2025 data cited in the AudienceProject paper shows UK adults watching 270 minutes of in-home video per day, of which only 102 minutes is live TV. The rest splits across recorded playback, BVOD, SVOD/AVOD, video-sharing platforms, and other TV-set usage.

If your reach model assumes the JIC panel is the denominator, you are increasingly modeling against a shrinking slice of where attention actually sits.

The myth most planners still cite without checking

The paper takes direct aim at the "TV only reaches older audiences" claim that gets recycled in every quarterly media review. AudienceProject's February 2026 survey found the youth reach gap is largely closed once BVOD and social clips of broadcaster content are included. The content does not stay locked inside live linear. It travels.

I think the more useful frame is this: broadcaster content is no longer a channel, it is an IP that gets distributed across channels. ITV's content shows up on ITVX, on TikTok in clip form, on YouTube as highlight reels. If a planner only counts the live linear watch, they undercount what the broadcaster IP actually delivered.

Kanishka Das, P&G's Senior Director of Global Media Analytics, framed it inside the paper this way: "We need complete, open, transparent and future-proofed cross-media measurement to enable consumers to have a better viewing experience with less annoying repetition." The repetition framing is worth flagging. Consumers experience over-frequency as a brand problem. Brands experience it as wasted reach budget. Both sides want the same fix, which is why this measurement work has more pressure behind it now than it did three years ago.

The audit that takes a quarter, not a week

I am not going to pretend a Heineken-grade cross-media measurement study is something a 10-person team runs casually. AudienceProject's methodology, which they expanded again with The Trade Desk partnership earlier this year, pulls together panel data, server-to-server connections into Meta, YouTube, and Disney+, clean rooms, and an identity graph for CTV. That stack takes a quarter to stand up if you start from zero.

There is a smaller version any team can run this month, and it starts by being honest about what the existing reports do not tell you.

Pick one campaign that ran across at least two video environments in the last 60 days. List, in writing, what each platform reported as reach. Then ask the simple question: is there any deduplication between these numbers, or are we adding two reach figures together that almost certainly overlap? In most cases I have looked at, the answer is the second one, and the team has been quietly summing duplicate audiences for months without flagging it as a measurement gap.
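Even without any matched data, the two platform-reported figures bound the problem. Inclusion-exclusion gives the feasible range of overlap, and therefore the range the true deduplicated reach could sit in. A sketch with hypothetical campaign numbers:

```python
def overlap_bounds(reach_a: int, reach_b: int, audience: int) -> tuple[int, int]:
    """Feasible overlap range between two platform-reported reach figures,
    by inclusion-exclusion. Without a deduplicated study, the true
    cross-channel reach sits somewhere inside the implied range."""
    lo = max(0, reach_a + reach_b - audience)  # overlap forced by a finite audience
    hi = min(reach_a, reach_b)                 # overlap capped by the smaller reach
    return lo, hi

# Hypothetical campaign: 12M reached on TV, 9M on Meta, 20M addressable adults.
lo, hi = overlap_bounds(12_000_000, 9_000_000, 20_000_000)
dedup_max = 12_000_000 + 9_000_000 - lo  # overlap at its minimum -> 20M
dedup_min = 12_000_000 + 9_000_000 - hi  # overlap at its maximum -> 12M
print(lo, hi, dedup_min, dedup_max)
```

On these made-up inputs, "TV plus Meta reach" could be anywhere from 12M to 20M people, which is exactly the uncertainty a deduplicated study exists to collapse.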

If the answer is "we have no deduplicated number," that is the gap to close, not the budget to defend. Even a directional overlap study from a panel provider beats stacking platform-reported reach figures.

For context on how badly this can go, the MiQ household study earlier this year found only 43% of marketers trust their own measurement. The AudienceProject paper is, in part, a sales pitch for solving exactly that. But the underlying point holds whether or not you buy their product. The default assumption that reach numbers from different platforms are additive is wrong in the cases where it actually matters.

Why this measurement number stuck when most do not

Most of the budget reallocations driven by measurement studies in the last decade got reversed within 12 months because the methodology did not hold up to scrutiny. Heineken's 12-point shift has stuck because the deduplication work was independent of every platform it measured, and because the brand awareness lift number (22 percentage points combined) was big enough to survive the next planning cycle's pushback. From what I have seen, that combination, independent methodology plus a brand metric that actually moved, is roughly what separates a real measurement insight from a temporary slide in a deck.

If there is one practical thing to walk away with this week, it is to write down, for your last cross-media campaign, whether anyone in the room can name the deduplicated reach number. Not the per-platform reach numbers. The cross-channel one. If nobody can, that is the missing piece.

Notice Me Senpai Editorial