Google Ads Cut Granular Data to 37 Months (17 Months After Promising 11 Years)
Google Ads will cut granular reporting retention from 11 years to 37 months on June 1, 2026, roughly 17 months after announcing the longer window. Daily, hourly, and weekly data older than 37 months will be unreachable through the Google Ads API and scripts; queries that ignore the cap return a DateRangeError. The export window for any agency running an annual MMM refresh or YoY benchmark pull is roughly four weeks.
The reversal nobody actually asked for
Google quietly updated its retention policy on May 1, with the change taking effect June 1, 2026. PPC Land's reporting on the change quotes the policy directly: reporting data collected by Google Ads will be available for 37 months, after which it will not be accessible via the Google Ads interface or APIs.
The same company told advertisers in October 2024 that all reporting data would be retained for 11 years. That policy lasted roughly 17 months. From a planning standpoint, that is not a retention policy; that is a calendar suggestion.
The cut is selective, which matters more than the headline. The 37-month cap applies only to granular segments: daily, hourly, weekly. Monthly, quarterly, and yearly aggregates keep the 11-year ceiling. Reach and frequency metrics get a tighter limit of three years. DV360 and CM360 stay at 24 months, which is the ceiling those products always had.
If your reports run on monthly aggregates, you mostly do not feel this. If you run anything that needs daily granularity beyond three years (incrementality tests against historical baselines, MMM with weekly inputs, cohort analysis on long-cycle B2B accounts), you feel it on June 2.
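The split by granularity is mechanical enough to capture in a small lookup. This is an illustrative sketch using the retention figures above, not an official API surface; the function names and constants are assumptions:

```python
from datetime import date

# Retention ceilings per reporting granularity, in months, per the policy
# change: granular segments drop to 37 months, aggregates keep 11 years
# (132 months). Illustrative constants only.
RETENTION_MONTHS = {
    "hourly": 37,
    "daily": 37,
    "weekly": 37,
    "monthly": 132,
    "quarterly": 132,
    "yearly": 132,
}

def earliest_available(granularity: str, as_of: date) -> date:
    """First day of the earliest month still queryable at this granularity."""
    total = as_of.year * 12 + (as_of.month - 1) - RETENTION_MONTHS[granularity]
    return date(total // 12, total % 12 + 1, 1)
```

As of June 1, 2026, daily data reaches back to May 2023; monthly aggregates still reach June 2015.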
What actually breaks on June 1
The four affected systems do not break the same way, which is the part most teams will miss until something silently goes wrong.
Google Ads API and scripts throw a hard error. Queries requesting granular segments older than 37 months return DateRangeError.INVALID_DATE. This is the well-behaved failure mode. You see it, you fix the query window, you move on.
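The defensive move on the client side is to clamp the requested range before the query ever leaves your code. A minimal sketch, assuming standard GAQL syntax for `segments.date`; the field list and function names here are hypothetical examples, not a prescribed query:

```python
from datetime import date

def granular_floor(as_of: date) -> date:
    """First day of the earliest month inside the 37-month window."""
    total = as_of.year * 12 + (as_of.month - 1) - 37
    return date(total // 12, total % 12 + 1, 1)

def clamped_query(start: date, end: date, as_of: date) -> str:
    """Build a GAQL query whose date range never crosses the retention floor.

    Clamping client-side avoids DateRangeError.INVALID_DATE entirely,
    at the cost of silently shortening the window yourself -- so log the
    clamp when it happens rather than hiding it.
    """
    floor = granular_floor(as_of)
    start = max(start, floor)
    return (
        "SELECT segments.date, metrics.cost_micros "
        "FROM campaign "
        f"WHERE segments.date BETWEEN '{start:%Y-%m-%d}' AND '{end:%Y-%m-%d}'"
    )
```

A four-year pull requested on June 1, 2026 comes back clamped to May 2023.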
The Google Analytics Data API does not throw an error. It silently truncates. Affected metrics (Advertiser Ad Cost, Clicks, Impressions) get clipped to "the latest 36-month window," per the same PPC Land breakdown. If you have a Looker Studio dashboard or a custom GA Data API connector pulling four years of cost data, it does not warn you. It just returns 36 months and you have to notice the missing year yourself.
The silent truncation is the dangerous one. The hard error tells you something changed. The truncated dashboard just looks slightly wrong if you weren't already looking.
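The defense against silent truncation is a sanity check on every scheduled pull: compare the earliest date you got back against the earliest date you asked for. A sketch, assuming rows are parsed into dicts with a `date` key (a stand-in for whatever shape your connector actually returns):

```python
from datetime import date

def detect_truncation(requested_start: date, rows: list[dict]) -> bool:
    """Flag a response whose window was silently clipped.

    If the earliest returned date lands well after the start you asked
    for, the API truncated the range. The 7-day slack absorbs weekends
    and no-spend days at the edge of the window.
    """
    if not rows:
        return True
    earliest = min(row["date"] for row in rows)
    return (earliest - requested_start).days > 7
```

Wire this into the pipeline so a clipped window raises an alert instead of quietly rendering a dashboard with a missing year.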
BigQuery Data Transfer Service stops backfilling beyond 37 months for Google Ads and Search Ads 360 connectors. Anything you have already populated stays in the table. Anything you want to pull in retroactively after June 1 is gone. GA4 backfills carry an additional risk per Google's own documentation: re-triggering them after the deadline can overwrite existing rows with empty values. Re-running a backfill on the wrong day could nuke your saved history.
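For the backfill itself, month-sized windows are easier to retry than one giant multi-year run. A sketch of the chunking arithmetic only; feed each (start, end) pair to your transfer tooling of choice, and per the overwrite warning above, do not re-trigger GA4 windows after the deadline. The 11-year default mirrors the maximum window the API still allows:

```python
from datetime import date

def backfill_chunks(as_of: date, years_back: int = 11) -> list[tuple[date, date]]:
    """Split a long backfill into month-sized [start, end) windows.

    Returns one (first-of-month, first-of-next-month) pair per month,
    oldest first, covering `years_back` years ending at `as_of`.
    """
    start_m = as_of.year * 12 + (as_of.month - 1) - years_back * 12
    end_m = as_of.year * 12 + (as_of.month - 1)
    chunks = []
    for m in range(start_m, end_m):
        chunks.append((
            date(m // 12, m % 12 + 1, 1),
            date((m + 1) // 12, (m + 1) % 12 + 1, 1),
        ))
    return chunks
```

Running each window as its own transfer run means a failed month is a one-month retry, not a restarted multi-day job.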
DV360 and CM360 APIs are untouched. The 24-month retention there has been baked in for years.
There is a related but separate change worth knowing about. Google's API will also start inflating PMax conversion numbers on June 15 with no backfill. That is two retroactivity holes in the same fortnight.
Why MMM teams feel this first
The teams that hit the cap fastest are the ones doing marketing mix modeling and any kind of YoY trend analysis on platform data.
Most MMM practitioners agree the minimum viable input is 18 to 24 months of weekly data across at least five channels, but the model genuinely improves with more. Two full seasonal cycles is the difference between the model separating marketing effects from seasonality and the model just printing whatever the trendline already was. From what I have seen, mature MMM setups typically run on three to four years of weekly input.
Three to four years of weekly input is exactly what the 37-month cap takes away. A fresh MMM kicked off in June 2026 can pull about three years of granular Google Ads data. By June 2027, it can still only pull about three years; the floor moves with the calendar. The window simply stops growing, which is a real shift from how this data has worked since late 2024.
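The moving floor is easy to see in two lines of date arithmetic. A sketch (function name hypothetical):

```python
from datetime import date

def granular_floor(as_of: date) -> date:
    """First month still queryable at daily/weekly granularity."""
    total = as_of.year * 12 + (as_of.month - 1) - 37
    return date(total // 12, total % 12 + 1, 1)

# Waiting a year buys no extra history; the floor just slides forward.
print(granular_floor(date(2026, 6, 1)))   # 2023-05-01
print(granular_floor(date(2027, 6, 1)))   # 2024-05-01
```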
The same applies to anyone who pulls "this November vs last November vs the November before that" performance reads at daily granularity. The third year disappears in stages, depending on when you run the query.
The four-week export checklist
If you have an account or a portfolio that might ever need granular history beyond June 2026, the practical move is to run a BigQuery backfill before the cap takes effect. Google's own Data Transfer Service docs for Google Ads flag this directly: if you want to keep historical data beyond 37 months in BigQuery, start backfill runs early so they can complete before June 1, 2026.
A reasonable order of operations for the next four weeks, working backward from June 1:
- Identify every Google Ads MCC and direct account whose granular history you actually use. Most agencies have more accounts than active modeling needs; this list is shorter than you think.
- Set up (or check) the BigQuery Data Transfer Service for those accounts. The connector itself takes about an hour per MCC; the bottleneck is permissions and billing on the GCP project, not the configuration.
- Trigger a backfill for the maximum window the API still allows (currently up to 11 years). Backfill jobs for large accounts can run for days, which is the practical reason to start now and not the last week of May.
- Verify the table actually has the rows you expected before June 1. Spot-check a date 38+ months back and confirm row counts. If the backfill silently capped, you want to know in May, not the day after.
- Snapshot anything else that is not in BigQuery: custom Looker Studio extracts, Sheets pulls, agency report exports. The interface gives you monthly aggregates after the cut. Granular daily breakdowns past 37 months are simply gone.
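The spot-check in the verification step can be mechanized as a per-month gap scan. A sketch, assuming `dates` is the list of distinct segment dates pulled back from your BigQuery table (e.g. via a `SELECT DISTINCT` on the date column); the function name is hypothetical:

```python
from collections import Counter
from datetime import date

def monthly_gaps(dates: list[date], start: date, end: date) -> list[str]:
    """Return months in [start, end) that have zero rows.

    A backfill that silently capped shows up as a contiguous run of
    empty months at the old end of the range.
    """
    counts = Counter(f"{d:%Y-%m}" for d in dates)
    gaps = []
    m = start.year * 12 + (start.month - 1)
    end_m = end.year * 12 + (end.month - 1)
    while m < end_m:
        key = f"{date(m // 12, m % 12 + 1, 1):%Y-%m}"
        if counts[key] == 0:
            gaps.append(key)
        m += 1
    return gaps
```

An empty result over your full expected range in May is the green light; a run of gaps 38+ months back is the signal to re-run the backfill while you still can.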
For teams doing legal-sensitive work, this is also the moment to pull a clean billing export. The Keller Postman $218B Google Ads arbitration is being built on advertiser billing records, and the relevance window stretches further back than 37 months. Anyone considering joining, or anyone who might need to defend their own spend, should download the full historical billing CSV while the interface still serves it.
A note on what this signals
The optimistic read is that storage costs scaled badly and Google decided granular long-tail queries weren't worth the infrastructure. The less generous read is that fewer years of granular advertiser-side data also makes it harder to reconstruct exactly what Google did to your account three years ago, which is roughly the timeframe under arbitration scrutiny right now. I do not know which read is closer to the truth, and honestly the difference does not change what you should do this week.
What I do think is that Google is unlikely to extend retention back to 11 years. They reset the floor, and they will keep it where it is until someone makes them move it. From what I have seen, ad platforms shrink data access; they almost never grow it back.
If your historical Google Ads data ever shows up in a board deck, an MMM, an audit, or a lawsuit, the safest place for it after June 1 is somewhere you control. The export window for that is closing in roughly four weeks, and the API will not be polite about it.
Notice Me Senpai Editorial