Google Open-Sourced Meridian GeoX. The Holdout Now Lives in Google's Stack.
Google announced three measurement tools at Google Marketing Live on May 5, 2026: Meridian GeoX (open-source geographic incrementality), Meridian Studio (a Google Cloud MMM platform), and a Data Manager map view that visualizes data flow across BigQuery, HubSpot, Shopify, and Drive. GeoX testing begins later this year. Studio runs on Google Cloud with no announced access details. Every new tool routes a piece of your measurement stack through Google.
The pitch is that incrementality testing finally got cheap. The catch is who owns the holdout.
What Meridian GeoX actually does
GeoX runs geo-based incrementality tests. You split your markets into a treatment group and a control group, withhold or change spend in the control, and measure the lift. That is the same setup you would build in-house or buy from a vendor like Haus, Recast, or Cassandra, except Google has packaged it as open-source code that pipes its output into Meridian, Google's open-source MMM.
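The core arithmetic of that setup can be sketched in a few lines. This is a plain difference-in-differences illustration, not GeoX's actual code (the published methodology uses more robust estimators like trimmed match), and every market name and number below is made up.

```python
# Minimal difference-in-differences sketch of a geo holdout test.
# Illustrative only; GeoX's published methods are more robust.
# Market names and conversion counts are invented.

pre = {  # avg daily conversions before the spend change
    "treatment": {"DMA_A": 120.0, "DMA_B": 95.0},
    "control":   {"DMA_C": 110.0, "DMA_D": 100.0},
}
post = {  # avg daily conversions during the test
    "treatment": {"DMA_A": 138.0, "DMA_B": 104.0},
    "control":   {"DMA_C": 112.0, "DMA_D": 101.0},
}

def group_mean(markets):
    return sum(markets.values()) / len(markets)

# Change in each group, then the difference between the changes.
treat_delta = group_mean(post["treatment"]) - group_mean(pre["treatment"])
ctrl_delta = group_mean(post["control"]) - group_mean(pre["control"])
lift = treat_delta - ctrl_delta  # what the control "would have done" is netted out

print(f"treatment delta: {treat_delta:+.1f} conversions/day")
print(f"control delta:   {ctrl_delta:+.1f} conversions/day")
print(f"estimated incremental lift: {lift:+.1f} conversions/day")
```

The control group's drift (here +1.5 conversions/day) is what separates this from a naive before/after comparison, and it is the whole reason the holdout matters.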
The methodology is not entirely new. Google's trimmed_match and matched_markets repos have been on GitHub for years. What is new is the productization. GeoX is now a named tool with documentation, an obvious upgrade path into Meridian, and a place in the Google Marketing Live keynote.
For most paid-search teams, this is good news. Geo-experiments used to require an in-house data scientist, a vendor charging six figures, or both. Free, well-documented, open-source code lowers that bar in a meaningful way. That is the pitch, and it is not wrong.
The catch sits in two places
First catch: GeoX needs scale. Google did not publish a minimum, but geo-experiments mathematically need enough markets and conversions per market to detect lift above the noise floor. Based on what Haus and Recast have published on their own minimums, a fair rule of thumb is around $250K+ per month in measured media to run a clean US DMA-level test. SMBs spending $5K per month on Google Search can run this tool, but they probably should not trust the result.
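The noise-floor point can be made concrete with a back-of-envelope minimum detectable effect (MDE) calculation. This is a standard two-sample power formula, not anything GeoX publishes as a default; the sigma and market counts are assumptions chosen to show the shape of the tradeoff.

```python
# Back-of-envelope minimum detectable effect for a two-group geo test,
# assuming roughly normal daily conversion counts per market.
# All inputs are illustrative assumptions, not GeoX defaults.
import math

def mde(sigma, n_per_group, alpha_z=1.96, power_z=0.84):
    """Smallest lift detectable at ~95% confidence and ~80% power."""
    return (alpha_z + power_z) * math.sqrt(2 * sigma**2 / n_per_group)

# A small advertiser: 10 markets per group, noisy daily conversions.
small = mde(sigma=40, n_per_group=10)
# A larger advertiser: 50 markets per group, same per-market noise.
large = mde(sigma=40, n_per_group=50)

print(f"MDE with 10 markets/group: ~{small:.0f} conversions/day")
print(f"MDE with 50 markets/group: ~{large:.0f} conversions/day")
```

With ten markets per group, only a lift of roughly 50 conversions per day clears the noise floor; with fifty markets it drops to around 22. That is the mechanism behind the scale requirement: small spenders can run the code, but the detectable effect is bigger than any lift they could plausibly generate.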
Second catch, and the bigger one: Google built the incrementality tool that is most likely to grade Google's own lift. That is not inherently disqualifying. Open-source code is auditable, and the math behind GeoX is grounded in published statistical research from Google Research. But the routing matters. If you run GeoX, Meridian becomes the natural place to combine the geo-test signal with your other channels. And if you run Meridian, Meridian Studio is where Google wants you to run it at scale.
Auditable code does not change the fact that you have centralized measurement and cloud infrastructure with the same vendor whose lift you are trying to validate.
Meridian Studio is the Google Cloud version
Meridian Studio was announced as the enterprise tier. It is a managed platform on Google Cloud, designed for teams that run high-volume MMMs and do not want to maintain the open-source codebase themselves. Google did not publish pricing or general-access timing, which Search Engine Journal flagged in its preview coverage.
For Fortune 500 brands already on Google Cloud, this is a frictionless add-on. For teams on AWS or Snowflake, it becomes a procurement decision that probably gets pushed to a CFO. The named partner ecosystem (Adswerve, Choreograph, Brainlabs, Epsilon, Fifty-Five, Jellyfish, Making Science, Merkle) reads like a who's-who of agencies that already have GCP practices.
There is also the comparison nobody at GML mentioned. Amazon shipped its own MMM API to general availability earlier this year and explicitly called out incremental measurement of non-Amazon channels (we covered that in Amazon's MMM API Hit GA and Repriced Every Channel That Isn't Amazon). Two of the largest ad sellers now ship MMM tooling that grades themselves and their competitors in the same model. If you trust both equally, you might land at the same answer. If you don't, the spread between the two MMM outputs becomes the most interesting number in your stack.
Data Manager Map View is the connector flex
Map View is the least controversial of the three announcements. It is a visual interface inside Data Manager showing how your first-party signal flows from BigQuery, Google Drive, HubSpot, and Shopify into Google Ads. The use case is real. Most marketers have seen a dashboard go quiet because a connector silently broke and nobody noticed for three weeks. A map that shows broken edges in red is genuinely useful.
The flex sits underneath the map. BigQuery is Google. Drive is Google. HubSpot and Shopify are partners. Snowflake is conspicuously absent from the announcement. The map does not just visualize your data flow. It quietly defines the supported partner ecosystem.
The 14% number and what it doesn't tell you
Google cited a stat in the keynote: advertisers using the Google Tag Gateway (the upgraded server-side tagging path) saw an average 14% conversion lift. The comparison window was July to December 2024 against January to June 2025, the data is internal to Google, and the sample is finance-sector advertisers.
A few things worth flagging on that number.
The comparison period crosses two iOS releases and a holiday-vs-non-holiday window. Some of the 14% is seasonal mix.
"Conversion lift" here likely means more conversions reported in Google Ads, not more conversions that actually happened. Server-side tagging recovers conversions that browser-side tagging loses to ITP, ad blockers, and consent gates. It is recovering signal, not creating sales. The two are easy to conflate, and most reps will conflate them on a sales call.
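The recovered-vs-created distinction is easiest to see with toy numbers. Every count below is invented; the point is only that reporting more of the same conversions produces a "lift" without a single extra sale.

```python
# Toy illustration of "recovered signal, not created sales":
# server-side tagging reports more of the same conversions.
# All counts are made up.

true_conversions = 1000   # what actually happened
browser_reported = 850    # lost to ITP, ad blockers, consent gates
server_reported = 969     # server-side tagging recovers most of the loss

reported_lift = server_reported / browser_reported - 1
actual_sales_lift = 0.0   # real-world sales did not change at all

print(f"reported conversion lift: {reported_lift:.0%}")
print(f"actual incremental sales: {actual_sales_lift:.0%}")
```

Here the reported lift is 14% while actual sales are flat, which is exactly the conflation to listen for on a sales call.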
Finance-sector advertisers tend to have long, complex consent flows where server-side recovery is highest. The 14% number probably will not generalize to ecommerce, where the recovery delta is usually closer to 3 to 7%.
From what I have seen, treat 14% as the upper bound of what is plausible, not the median. If your post-migration lift is below 5%, you didn't break the implementation. You're probably just in a less consent-heavy vertical.
Three operational calls for this quarter
If you run paid search at $250K+ per month and have been outsourcing geo-incrementality to a vendor, GeoX is worth a serious evaluation. The methodology is solid, the code is open-source, and you can validate it against your existing vendor's output for one quarter before deciding whether to migrate. If the answers diverge by more than 15%, escalate to your data team before deciding which tool to trust.
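The divergence check described above is mechanically simple. A sketch, with illustrative lift numbers and the 15% threshold taken from the recommendation rather than from any Google or vendor documentation:

```python
# Sketch of the GeoX-vs-vendor divergence check described above.
# Lift values and the 15% threshold are illustrative.

def divergence(geox_lift, vendor_lift):
    """Relative gap between two lift estimates, scaled by their average."""
    avg = (geox_lift + vendor_lift) / 2
    return abs(geox_lift - vendor_lift) / avg

gap = divergence(geox_lift=0.12, vendor_lift=0.09)

if gap > 0.15:
    print(f"gap {gap:.0%}: escalate to the data team before trusting either tool")
else:
    print(f"gap {gap:.0%}: estimates roughly agree")
```

A 12% lift from GeoX against a 9% lift from the vendor is a gap of roughly 29% on this definition, well past the escalation threshold, even though both numbers look superficially close.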
If you are running an MMM, hold off on Meridian Studio until Google publishes pricing and SLA details. The open-source Meridian framework already runs on whatever cloud you have. Studio is the convenience tier, and convenience tiers tend to have step-function pricing that does not show up in the keynote.
If your Data Manager has been broken for a while and you didn't know, the new map view will probably tell you that within five minutes of rollout. Audit your tag setup before it ships, so you walk in knowing what you expect to see. There is a thread on r/PPC where an agency described losing six weeks of conversion signal to a broken HubSpot connector and not realizing it until a client asked why CPA had doubled. Map View would have caught that on day two.
The honest read
Google has been pushed for years to give advertisers a real incrementality answer instead of last-click attribution. GeoX is a real answer. It is also Google's answer, run inside Google's framework, fed into Google's MMM, hosted (if you choose Studio) on Google's cloud. That does not make the math wrong. It makes GeoX one signal in a portfolio of measurement, not the final word.
From what I have seen, the teams that will get the most out of this are the ones who run GeoX alongside one independent vendor like Haus, Recast, or an in-house data team, and treat the gap between the two outputs as the actual research question. That is not the path the keynote walked you through. It's probably the one most ops teams should walk anyway.
Notice Me Senpai Editorial