Google Put GA4's Conversion Performance Report on the Data API in Alpha

GA4's Data API now exposes the conversion performance report directly, with attribution model and conversion action filters baked into a single call.

Google added cross-channel conversion reporting to the Google Analytics Data API on May 4, 2026. The new conversionSpec field in RunReportRequest accepts conversion action IDs and one of two attribution models, DATA_DRIVEN or LAST_CLICK. That means data that previously lived only inside the GA4 Advertising > Conversion performance UI is now pullable through a single API call, in alpha.

The release is small in surface area and big in plumbing implications. If you have ever maintained an attribution dashboard for a paid media team, you already know the shape of the workaround it deletes.

What actually shipped on May 4

PPC Land's writeup pinned three additions to v1 alpha of the Data API: a ConversionSpec field on RunReportRequest, a Section field on ResponseMetaData that flags whether a row came from standard reporting or conversion reporting, and an updated getMetadata response that surfaces a conversions array of ConversionMetadata objects.

The conversionSpec body is the meaningful one. According to Google's Data API conversion reporting docs, it carries a list of conversionActions (resource names of the conversion actions you want included) and an attributionModel field, which accepts DATA_DRIVEN (the default) or LAST_CLICK. If conversionActions is empty, every conversion in the property is in scope.
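The shape described above can be sketched as a request-body builder. This is a sketch, not a definitive client: the conversionSpec field names follow the description in the docs, but the feature is in alpha, and the dimension and metric names used here (sessionDefaultChannelGroup, allConversionsByConversionDate) are illustrative choices, so verify everything against the current v1alpha reference before shipping it.

```python
def build_conversion_report_request(property_id, start_date, end_date,
                                    conversion_actions=None,
                                    attribution_model="DATA_DRIVEN"):
    """Assemble a JSON body for the alpha runReport conversion path.

    Field names follow the Data API conversion reporting docs as described
    above; the alpha schema may change before GA.
    """
    return {
        "property": f"properties/{property_id}",
        "dateRanges": [{"startDate": start_date, "endDate": end_date}],
        "dimensions": [{"name": "sessionDefaultChannelGroup"}],
        "metrics": [{"name": "allConversionsByConversionDate"}],
        "conversionSpec": {
            # An empty list scopes the report to every conversion action
            # in the property.
            "conversionActions": conversion_actions or [],
            # DATA_DRIVEN is the documented default; pass LAST_CLICK
            # explicitly when you want it.
            "attributionModel": attribution_model,
        },
    }

body = build_conversion_report_request("123456", "2026-04-01", "2026-04-30",
                                       attribution_model="LAST_CLICK")
```

Keeping the model a required-looking parameter in your own wrapper, even though the API defaults it, is the cheap way to make the choice visible in code review.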

The metric set is roughly what you would expect: allConversionsByInteractionDate, allConversionsByConversionDate, returnOnAdSpendByInteractionDate, totalRevenueByInteractionDate, plus advertiserAdCost, advertiserAdImpressions, and advertiserAdClicks. The Section field on ResponseMetaData matters for hybrid reports, the kind that mix a few standard metrics with a few conversion metrics, because it tells you which section produced each row. So you can finally stop guessing why your aggregates do not reconcile.
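In practice that means aggregation code should partition rows by section before summing anything. The row shape below is hypothetical (a flat dict with a "section" marker); adapt the accessor to however the alpha response actually surfaces the Section value.

```python
def split_by_section(rows):
    """Partition report rows so aggregates never mix reporting sections.

    Assumes each row dict carries a 'section' marker derived from the
    response's Section metadata; this shape is illustrative, not the
    literal alpha wire format.
    """
    standard, conversion = [], []
    for row in rows:
        bucket = conversion if row.get("section") == "CONVERSION" else standard
        bucket.append(row)
    return standard, conversion

rows = [
    {"section": "STANDARD", "sessions": 120},
    {"section": "CONVERSION", "allConversionsByConversionDate": 7},
]
standard_rows, conversion_rows = split_by_section(rows)
```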

The fine print is real, though. The release is in alpha, which is documented on the official Data API changelog. Google's own note, in plain English, says "this feature may not be available to your Google Analytics property," and the team is rolling it out gradually. So before you tear out a pipeline you have running today, check property eligibility first.

The BigQuery export just lost its monopoly on attribution data

Here is what most teams do today, or did until last week.

To get cross-channel conversion data out of GA4 into a Looker Studio dashboard, a Snowflake warehouse, or anywhere a CMO actually looks, you run the BigQuery export. The export itself is free up to a point. The dashboard layer is where the bill shows up. As Paolo Bietolini's writeup on GA4 BigQuery export costs lays out, a careless SELECT * across a year of event-level data scans somewhere between 50 and 200 GB, which costs $0.31 to $1.25 per query at on-demand pricing. Now multiply that by every Looker Studio refresh, every team member opening the report, and every alert that pings hourly.
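The arithmetic behind those figures checks out at BigQuery's on-demand rate of $6.25 per TB scanned, which is the rate the quoted range implies:

```python
# Back-of-envelope check on the scan-cost range quoted above.
ON_DEMAND_USD_PER_TB = 6.25  # BigQuery on-demand analysis pricing

def scan_cost_usd(scanned_gb):
    """Cost of one query that scans `scanned_gb` gigabytes, on demand."""
    return scanned_gb / 1000 * ON_DEMAND_USD_PER_TB

low = scan_cost_usd(50)    # ~$0.31 for a 50 GB scan
high = scan_cost_usd(200)  # $1.25 for a 200 GB scan
```

And the multiplier is the real story: at a dollar a refresh, an hourly dashboard alone is roughly $700 a month before anyone opens it manually.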

The fix has always been the same: build flattened summary tables in BigQuery, never let the dashboard hit raw event tables, schedule the rollups overnight. That works. It also costs engineering hours that nobody on a paid media team really wants to own.

The Data API call collapses the path.

If your only need from BigQuery was pulling daily or hourly conversion deltas to feed an attribution dashboard, you can probably retire that pipeline. Hit conversionSpec with the model you want, page through the rows, write to your destination. The export still earns its keep for event-level forensics, custom audiences, and any modeling that needs raw user paths. For a clean attribution report, you no longer need it.
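The "page through the rows, write to your destination" loop is the whole pipeline now. A minimal sketch, assuming a run_report callable that stands in for your Data API client and honors the offset/limit paging that runReport uses (the fake below just exercises the loop):

```python
def pull_all_rows(run_report, page_size=10_000):
    """Drain a paged report via offset/limit until the row count is reached.

    `run_report` is a stand-in for your Data API client call; it is assumed
    to return (rows_for_this_page, total_row_count).
    """
    rows, offset = [], 0
    while True:
        page, total = run_report(offset=offset, limit=page_size)
        rows.extend(page)
        offset += len(page)
        if not page or offset >= total:
            break
    return rows

# Fake three-page report to exercise the paging logic.
data = list(range(25))
def fake_run_report(offset, limit):
    return data[offset:offset + limit], len(data)

all_rows = pull_all_rows(fake_run_report, page_size=10)
```

From there, "write to your destination" is a warehouse insert, which is the 20% of the old pipeline that was real transformation anyway.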

I think most analytics teams are about to discover their pipeline was 80% schema-ferrying and 20% real transformation. The 80% was always replaceable. It just took Google five years to expose the right endpoint.

DATA_DRIVEN being the default is the part to question

This is the spot where I would slow down.

DATA_DRIVEN attribution is Google's machine-learned multi-touch model. It works best when the property has high conversion volume and varied path lengths. From what I have seen, accounts under roughly 600 conversions in a 30-day window get sparse, sometimes nonsensical credit splits when DDA is the lens. Google does not publish a hard floor; smaller accounts just get model output that is closer to noise than signal.

The Data API conversionSpec defaults to DATA_DRIVEN. If your dashboard renders attribution numbers via this endpoint without specifying attributionModel, every report inherits that default. Which is fine for a major retailer. For a B2B SaaS account doing 80 conversions a month, it is going to look stable while quietly reshuffling credit between channels in ways nobody can audit.

If you wire this into reporting for accounts with low conversion volume, set attributionModel = LAST_CLICK explicitly. Make the model choice an intentional decision, not a default you inherited.
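One way to make that an intentional decision in code. The 600-conversions-per-30-days floor is the rough heuristic from above, not a documented Google limit, so treat it as a tunable guardrail:

```python
# Heuristic guardrail: pick the attribution model from observed volume
# instead of inheriting the API default. The 600 floor is a rule of thumb
# from field experience, not a published Google threshold.
LOW_VOLUME_FLOOR = 600

def choose_attribution_model(conversions_last_30d):
    """Return LAST_CLICK for sparse accounts, DATA_DRIVEN otherwise."""
    if conversions_last_30d < LOW_VOLUME_FLOOR:
        return "LAST_CLICK"
    return "DATA_DRIVEN"

# The 80-conversions-a-month B2B SaaS account gets LAST_CLICK.
model = choose_attribution_model(80)
```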

That is mostly an analytics governance point, not an API one. But the API is the thing that is going to push DDA into a lot of dashboards that probably should not be running it. The same kind of silent reconciliation problem turned up earlier this year when GA4's Task Assistant missed misconfigurations on roughly 73% of properties. Defaults that look reasonable but quietly distort the numbers underneath are a recurring theme in this product.

The integrations that need a schema diff before June

The angle worth flagging is what happens to vendors and internal tools that already read conversion-adjacent data from GA4 via aggregate endpoints. Anything that pulled conversion counts via the standard runReport flow without conversionSpec still works. It will not, however, give you the same numbers as a conversionSpec-aware report, because the new path adds attribution model choice, conversion action filtering, and a separate Section in the response.

Two reconciliation traps I would expect to see this quarter.

One: dashboards comparing conversion totals across the old non-spec path and the new spec path will diverge. Same property, same window, different numbers, and not because of a bug. The two reports use different attribution mechanics. Anyone who explains it as "the new API is broken" is going to waste a week.

Two: vendors building connectors will need to decide whether to expose attributionModel as a user-set parameter or hardcode one. Any connector that hardcodes DATA_DRIVEN ships the same default-quality problem to every customer that touches it. Any connector that exposes the choice will need a doc page explaining what to pick. The lazy path is hardcoding. The honest path is the second one, and from what I have seen, vendors rarely take it on the first release.

If you maintain anything that consumes GA4 conversion data programmatically, the move this week is to add a field-level diff: pull the same window through the old call and a conversionSpec call, log both, and watch for divergence over the next two weeks. The schema reconciliation work itself is small. The downstream alignment work, where someone has to explain to a CMO why the attribution dashboard now shows different numbers than the spreadsheet, is the part that takes time.
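The diff itself is a small function. A sketch under the assumption that both pulls have been reduced to {channel: conversion_total} dicts; the 5% tolerance is an arbitrary starting point, not a recommendation:

```python
def diff_conversion_totals(old_rows, new_rows, tolerance_pct=5.0):
    """Flag channels whose totals diverge between the old non-spec pull
    and the conversionSpec pull by more than tolerance_pct."""
    drift = {}
    for channel in set(old_rows) | set(new_rows):
        old = old_rows.get(channel, 0.0)
        new = new_rows.get(channel, 0.0)
        baseline = max(abs(old), 1e-9)  # avoid dividing by zero
        pct = abs(new - old) / baseline * 100
        if pct > tolerance_pct:
            drift[channel] = {"old": old, "new": new, "pct": round(pct, 1)}
    return drift

# Same property, same window, two code paths.
old_pull = {"Paid Search": 100, "Email": 40}
new_pull = {"Paid Search": 131, "Email": 41}
divergent = diff_conversion_totals(old_pull, new_pull)
```

Log the output daily for the two-week watch window; a channel that keeps appearing is your attribution-mechanics divergence, not a bug.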

What this changes for the next attribution dashboard you build

If you are starting a new attribution dashboard tomorrow, the BigQuery export is no longer the obvious first move. Hit the Data API with conversionSpec, set attributionModel by hand, cache results in your warehouse if you need history, and skip the export for this use case. You'll save the BigQuery scan cost, and you'll skip writing rollup logic that the API now does for you.

If you have an existing pipeline, do not rip it out yet. Alpha means schema can change before GA. Run the new path in parallel for at least a month, log discrepancies, and only retire the export-backed flow once the API stabilizes and your eligibility is confirmed.

And keep one eye on whether DATA_DRIVEN as the default holds up to scrutiny in the wild. I suspect Google is going to get pressure from advertisers running smaller accounts to either expose more attribution model choices, or document the conversion-volume threshold below which DDA is not advised. Until then, the responsibility falls on whoever wires the API into the dashboard.

The Data API just made attribution reporting a one-call problem. The default model choice is the trap. Inherit DATA_DRIVEN on a low-volume account because you didn't read the docs, and the dashboard will look correct while the credit splits underneath quietly shift, sometimes for weeks, before anyone catches it.

Notice Me Senpai Editorial