OpenAI's New OAI-AdsBot Skips Your GPTBot Block and Has No IP Range File

OpenAI quietly added OAI-AdsBot to its public crawler docs on April 21, 2026. The user-agent is separate from GPTBot, so existing blocks miss it.

OpenAI added a second ad-related crawler, OAI-AdsBot, to its public bot documentation on April 21, 2026. The new agent is separate from GPTBot, so existing GPTBot blocks in robots.txt do nothing to stop it, and OpenAI has not published an IP range file for it. Advertisers who block too aggressively risk failing ChatGPT ad validation.

What changed in OpenAI's bot docs

The user-agent that appeared in OpenAI's developer documentation reads "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; OAI-AdsBot/1.0; +https://openai.com/adsbot". SEO consultant Glenn Gabe posted a screenshot of the updated OpenAI bot reference on April 21. Search Engine Journal confirmed the listing the same day, and Search Engine Roundtable followed up with a short note.

OpenAI's stated purpose for the bot is short: "validate the safety of web pages submitted as ads on ChatGPT." It checks policy compliance and assesses relevance for targeting. According to OpenAI's own documentation, the data OAI-AdsBot collects "is not used to train generative AI foundation models."

That last line is the part most coverage stops at. The mechanics are where it gets interesting.

Why your existing GPTBot block does nothing

If you blocked GPTBot last year, you wrote something like this in robots.txt:

User-agent: GPTBot
Disallow: /

That entry does not match OAI-AdsBot. They are different user agents. Robots.txt rules apply per agent string, and OpenAI deliberately gave its ad crawler a different identifier so site owners can treat them independently. From OpenAI's perspective that is the right design choice. From a site owner's perspective it means a quiet new bot started hitting your landing pages this past week, and your robots.txt didn't notice.
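
You can confirm the matching behavior with Python's standard-library robots.txt parser. A minimal sketch (the example.com URL is illustrative):

from urllib.robotparser import RobotFileParser

# The GPTBot-only rule from above.
rules = """User-agent: GPTBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# GPTBot matches the group and is blocked; OAI-AdsBot matches nothing
# and falls through to the default, which is "allowed".
print(parser.can_fetch("GPTBot", "https://example.com/landing"))      # False
print(parser.can_fetch("OAI-AdsBot", "https://example.com/landing"))  # True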

The deeper issue: this is the second time in eight months OpenAI has launched a crawler without telling anyone proactively. We covered the same pattern when OpenAI's crawl tripled after GPT-5 and the bandwidth showed up in server logs before the docs caught up. OAI-AdsBot is the smaller, more targeted version of the same playbook. Ship the bot, list it in the docs later, let practitioners sort it out.

I think most teams won't notice for weeks. The hits will be low volume, only on pages you've submitted as ads, and they'll get bucketed as "other" in your analytics.

The missing JSON file is the bigger problem

OpenAI publishes verifiable IP ranges for its other three crawlers:

  • openai.com/gptbot.json
  • openai.com/searchbot.json
  • openai.com/chatgpt-user.json

There is no openai.com/adsbot.json. PPC Land flagged the gap in its writeup, and the absence is what makes this awkward: user-agent strings can be spoofed in seconds. Without a published IP file, you can't verify that a request claiming to be OAI-AdsBot is actually OpenAI rather than a competitor scraping your ad pages, a content harvester posing as a polite crawler, or a security researcher running a fingerprint test.
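
When a range file does exist, verification is mechanical. Here is a minimal Python sketch, assuming an eventual adsbot.json would reuse the schema of the existing files (a top-level "prefixes" array of ipv4Prefix/ipv6Prefix entries); until then the only thing you can point it at is the three files above:

import ipaddress
import json
from urllib.request import urlopen

def ip_in_published_ranges(ip: str, range_url: str) -> bool:
    # Schema assumption: {"prefixes": [{"ipv4Prefix": "..."} or
    # {"ipv6Prefix": "..."}]}, matching the published gptbot.json.
    with urlopen(range_url) as resp:
        prefixes = json.load(resp).get("prefixes", [])
    addr = ipaddress.ip_address(ip)
    for entry in prefixes:
        cidr = entry.get("ipv4Prefix") or entry.get("ipv6Prefix")
        if cidr and addr in ipaddress.ip_network(cidr):
            return True
    return False

# 203.0.113.7 is a reserved documentation address, so this prints False.
print(ip_in_published_ranges("203.0.113.7", "https://openai.com/gptbot.json"))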

For comparison, Google publishes Googlebot's IPs and Bing publishes Bingbot's. It's the standard. OpenAI does the right thing for its three older bots but skipped it for the new one.

The omission could be a documentation gap. It could also be that OAI-AdsBot runs from shared OpenAI infrastructure that overlaps with GPTBot or ChatGPT-User and OpenAI hasn't decided how to slice the IP file yet. Either way, you're operating on trust until they publish one.

The validation paradox nobody is talking about

Here is the trap. If your WAF or bot manager (Cloudflare, Akamai, DataDome, the usual suspects) treats unknown crawlers as suspicious by default, OAI-AdsBot will get challenged or blocked. If it gets blocked, OpenAI's ad system can't validate your landing page. If validation fails, your ChatGPT ad doesn't run.

This matters more this week than last week because OpenAI expanded ChatGPT ad inventory to logged-out users on April 23. More inventory means more advertisers trying to push budget through the system. Complaints about budgets not pacing have been circulating on aggregator forums for months. Adding a hidden validation chokepoint right now, when everyone is trying to onboard, is bad timing. From what I have seen on the ad ops side, "campaign won't approve" tickets rarely get traced back to a robots.txt or WAF rule. They get blamed on creative.

If you are running paid ChatGPT ads or planning to, this is a debugging dead-end you do not want.

The robots.txt entry to add right now

Two scenarios. Pick the one that matches your setup.

Scenario 1: You don't run ChatGPT ads and don't want OpenAI's ad crawler on your pages.

Add this to your robots.txt:

User-agent: OAI-AdsBot
Disallow: /

That tells OpenAI to skip your site for ad validation. If a competitor submits one of your URLs as their ad (which would be a separate policy violation on OpenAI's end), this also blocks the validation crawl.
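
If you already block GPTBot, the two entries sit side by side as separate user-agent groups:

User-agent: GPTBot
Disallow: /

User-agent: OAI-AdsBot
Disallow: /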

Scenario 2: You run or plan to run ChatGPT ads.

Make sure your robots.txt and WAF explicitly allow the OAI-AdsBot user-agent string. In Cloudflare, that means a "Skip" rule that fires when the User-Agent contains "OAI-AdsBot", placed ahead of your generic bot challenge (matching the bare token rather than the versioned "OAI-AdsBot/1.0" string survives a version bump). Audit it the same week you submit your first ad. The 5-minute change here is cheaper than three days debugging why an ad won't approve.
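
As a sketch, the matching expression is one line of Cloudflare's Rules language; the Skip action and the rule's position relative to your challenge rules are configured in the dashboard:

(http.user_agent contains "OAI-AdsBot")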

For both scenarios, log the user-agent hits for the next 30 days. Until OpenAI publishes the IP file, your logs are the only proof you have that the requests are real. And honestly, even with the docs published, the lack of an adsbot.json means you should keep watching them anyway.
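
A minimal audit sketch in Python, assuming a combined-format access log (the path is illustrative): it pulls every request claiming to be OAI-AdsBot and tallies the source IPs so you have something to compare the moment an IP file appears.

from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # adjust to your server

ips = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "OAI-AdsBot" in line:
            # The first field of a combined-format log line is the client IP.
            ips[line.split(" ", 1)[0]] += 1

for ip, hits in ips.most_common():
    print(f"{ip}\t{hits} hits claiming OAI-AdsBot")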

What this signals about ChatGPT Ads going forward

The bigger takeaway is not really about robots.txt. It is about the rate at which OpenAI's ads stack is being built in public. New crawler in April. Logged-out inventory in April. The discrepancy between GPT-5.5 on consumer tiers and GPT-5 on advertiser tiers landed last month. The product is being assembled live, and practitioner-facing documentation is consistently lagging the ship date.

Six months out, expect 2-3 more under-documented changes per quarter. Building a 15-minute weekly habit of reading the OpenAI bot reference page and the ChatGPT Ads policy doc seems more useful than waiting for trade press to catch up. The lead time on these changes appears to be measured in days, not months.
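
If you would rather have a script enforce the habit than a calendar reminder, a crude change detector does the job. A sketch, with the URL as an assumption (point it at whatever page OpenAI currently uses for crawler documentation); raw-HTML hashing is noisy on dynamic pages, so treat a flagged change as a prompt to go look, not proof of one:

import hashlib
import pathlib
from urllib.request import Request, urlopen

URL = "https://platform.openai.com/docs/bots"  # assumed location; verify
CACHE = pathlib.Path("openai_bots_page.sha256")

req = Request(URL, headers={"User-Agent": "docs-watch/0.1"})
with urlopen(req) as resp:
    digest = hashlib.sha256(resp.read()).hexdigest()

if not CACHE.exists() or CACHE.read_text().strip() != digest:
    print("Bot reference page changed since last check.")
    CACHE.write_text(digest)
else:
    print("No change since last check.")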

The 2-line robots.txt fix is the cheap part. Building the muscle to spot the next one before it breaks an ad campaign is the part most teams won't bother with until something actually breaks. Personally, I'd rather be in the first group.

By Notice Me Senpai Editorial