Google Ads Counts Every Trial Signup as a Win (Even the 75% That Never Pay)

The dashboard says conversions are up. The bank account has a different opinion.

A thread on r/PPC this week described a problem I suspect is far more common than the size of that conversation suggests. A SaaS advertiser was staring at two dashboards: Google Ads showing healthy trial conversion numbers, steady CPA, everything green. Stripe showing revenue that barely moved. The campaigns looked like they were working. The business said otherwise.

This is not a tracking bug. It is a structural problem with how most SaaS teams set up conversion tracking in Google Ads, and Smart Bidding is more than happy to exploit it.

Smart Bidding Treats a $0 Trial and a $10,000 Customer the Same Way

When you set "trial signup" as your primary conversion action in Google Ads, the algorithm goes hunting for the cheapest possible path to that event. It finds users who are likely to fill out a form, click "start free trial," and walk away. That is what you optimized for. The algorithm did its job.

The median B2B SaaS trial-to-paid conversion rate sits around 18-25% for opt-in trials where no credit card is required. That means roughly 75-82% of the signups Google Ads is proudly counting as conversions will never generate a dollar of revenue. And the algorithm has no idea, because you never told it.
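The arithmetic behind that gap is worth making explicit. A minimal sketch, with illustrative numbers at the midpoint of that benchmark range:

```python
# Illustrative numbers: a healthy-looking trial CPA hides the real cost
# per paying customer when only ~20% of trials ever pay.
trial_cpa = 40.00            # what the Google Ads dashboard reports
trial_to_paid_rate = 0.20    # midpoint of the 18-25% opt-in trial benchmark

# On average, every paying customer requires 1 / rate trial signups.
cost_per_paying_customer = trial_cpa / trial_to_paid_rate

print(f"Reported trial CPA:       ${trial_cpa:.2f}")
print(f"Actual cost per customer: ${cost_per_paying_customer:.2f}")  # $200.00
```

A $40 trial CPA is really a $200 customer acquisition cost, and the dashboard never shows you the second number.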

Google's own product managers have acknowledged this dynamic. In a recent breakdown of common Smart Bidding mistakes, they noted that advertisers routinely "pass values that do not actually reflect the underlying business goals." The result: reported metrics look healthy while the actual business outcome degrades. The CPA dashboard and the Stripe dashboard diverge until someone in finance asks a question nobody wants to answer.

I think this is one of those problems that is obvious in hindsight but almost invisible while it is happening. Your CPA is stable, maybe even improving. Campaign health looks fine. The algorithm is learning and optimizing. It is just optimizing for the wrong thing.

The Gap Gets Wider, Not Narrower

This is the part that tends to surprise people. You would think Smart Bidding would eventually figure out that some trials are better than others. It will not, because it cannot see past the conversion event you defined. If trial signup is the primary conversion action, that is the entire universe the algorithm operates in. It does not know about Stripe. It does not know about monthly recurring revenue. It does not know that the person who signed up from a "free project management tool" query has a 6% trial-to-paid rate while someone who searched "project management software pricing" converts at 31%.

A deep analysis of SaaS Google Ads strategy put it bluntly: "CPA treats all conversions as equal, and in SaaS, they are absolutely not equal." Different keyword categories produce trial signups with dramatically different revenue potential, but a flat trial-signup conversion action makes them all look identical in the data.

We wrote about a similar dynamic recently when covering how adding more conversion signals to Google Ads often backfires. The algorithm simply gravitates toward the easiest, cheapest signals you give it. Trials are easy. Revenue is hard.

And it compounds. As Smart Bidding gets more data on what a "good" trial signup looks like (from its perspective, meaning "cheap"), it doubles down. Your CPA stays flat or improves. Your revenue-per-trial keeps dropping. The spreadsheet looks great until you compare it to the bank account.

The 90-Minute Fix Most Teams Keep Postponing

The solution has existed for years, and most SaaS teams still have not implemented it. Offline conversion imports. The concept is straightforward: capture the Google Click ID (GCLID) when someone signs up for a trial, store it in your CRM or billing system, and when that person eventually becomes a paying customer, import that conversion back into Google Ads with the actual revenue value attached.

This is what Google's official offline conversion import API is built for. When Smart Bidding receives revenue data tied to specific clicks, it stops optimizing for the cheapest path to a form fill and starts optimizing for clicks that actually produce customers.

The improvement is not subtle. SaaS companies that implement GCLID-based offline conversion imports typically see a 15 to 30 percent improvement in pipeline quality within 60 to 90 days. Some teams report results that are more dramatic than that.

So why do most teams skip it? It usually comes down to one of two reasons. Either the marketing team does not own the billing system (Stripe, Chargebee, whatever) and the engineering request keeps getting deprioritized, or the trial-to-paid cycle is long enough (14-30 days) that the delayed feedback feels like it will not help. Both reasons are wrong, but they are persistent.

What This Actually Looks Like in Practice

If I were setting this up tomorrow, here is the sequence.

Step one: Make sure your trial signup form captures the GCLID from the URL parameter and stores it alongside the user record. Most form tools and CRM integrations handle this natively now. If you are using HubSpot, Salesforce, or even a basic Postgres database, there is a field for it. Check that it is actually populating. It is surprisingly common for the GCLID field to exist in the CRM but never actually get wired up.

Step two: Set up a Stripe webhook (or equivalent in your billing system) that fires when a trial converts to paid. Have it pull the GCLID from the user record and send it to Google Ads via the offline conversion import API. The enhanced conversions for leads method is preferred now because it also works with email address matching when GCLID is not available.

Step three: Create a new conversion action in Google Ads specifically for "paid subscription" or "first payment." Set this as your primary conversion action and demote trial signup to secondary. Secondary conversions still show in reporting but do not influence bidding.

Step four: Give it 60 days. The algorithm needs time to recalibrate. Your CPA will probably look worse initially because you have just redefined what "conversion" means. That is the point. You are telling the system to find people who pay, not people who click "start trial."

The benchmark to watch: your cost per paying customer should drop 15-30% within two to three months, even as your cost per trial signup rises. That is the system working correctly, finally.

The Uncomfortable Math on Existing Campaigns

Here is a rough exercise worth running this afternoon. Pull your last 90 days of trial signups from Google Ads. Match them against Stripe or your billing system. Calculate the actual trial-to-paid rate by campaign, and then by ad group or keyword theme if you can.

Most SaaS teams running opt-in trials will probably find their paid conversion rate ranges from about 8% on broad keywords to 30%+ on high-intent terms. That spread means Smart Bidding is blending everything into a single optimization target and spending heavily on the 8% keywords because the trial CPA looks identical.
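The matching exercise itself is a join and a division. A toy sketch of the mechanics, assuming you can export trial signups (with campaign attribution) from one side and paying-customer emails from the other; in practice the join key is whatever identifier both systems share:

```python
from collections import defaultdict

# Hypothetical exports: trial signups from Google Ads reporting,
# paying-customer emails from the billing system.
trials = [
    {"email": "a@x.com", "campaign": "broad-pm-tools"},
    {"email": "b@x.com", "campaign": "broad-pm-tools"},
    {"email": "e@x.com", "campaign": "broad-pm-tools"},
    {"email": "c@x.com", "campaign": "pricing-intent"},
    {"email": "d@x.com", "campaign": "pricing-intent"},
]
paying_emails = {"e@x.com", "c@x.com", "d@x.com"}

counts = defaultdict(lambda: {"trials": 0, "paid": 0})
for t in trials:
    counts[t["campaign"]]["trials"] += 1
    if t["email"] in paying_emails:
        counts[t["campaign"]]["paid"] += 1

# Trial-to-paid rate per campaign: the number Google Ads cannot see.
rates = {c: v["paid"] / v["trials"] for c, v in counts.items()}
```

Run this at the ad-group or keyword-theme level and the spread between campaigns usually makes the argument for you.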

The standard LTV:CAC benchmark for healthy SaaS is 3:1. If you are calculating CAC based on trial signups, you are probably seeing a ratio that looks great. Recalculate using cost per actual paying customer and the ratio might be closer to 1.5:1 or worse on your non-brand campaigns. And that gap almost always turns out to be a measurement problem masquerading as a media buying one.
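The recalculation is one denominator swap. Illustrative numbers only, but the shape of the distortion is the point:

```python
# The same spend produces very different CAC depending on whether you
# divide by trial signups or by paying customers.
spend = 10_000.00
trial_signups = 250
paying_customers = 50      # 20% trial-to-paid
ltv = 300.00               # assumed average customer lifetime value

cac_by_trial = spend / trial_signups          # $40 -- looks great
cac_by_customer = spend / paying_customers    # $200 -- the real number

ratio_naive = ltv / cac_by_trial              # 7.5:1 on paper
ratio_real = ltv / cac_by_customer            # 1.5:1 in reality
```

Same spend, same customers, and the ratio moves from comfortably above the 3:1 benchmark to half of it.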

When Revenue Is the Conversion, the Algorithm Gets Honest

I am not particularly optimistic that Google will fix this on their end. The current setup works in their favor, honestly. More "conversions" means advertisers feel good about performance, which means they keep spending. The fix has to come from the advertiser side.

The 15-30% pipeline improvement from offline conversion imports is not really about the import itself. It is about telling the algorithm the truth for the first time. Most accounts have been lying to Smart Bidding for months or years, feeding it a signal that says "this click was valuable" when it was not. The algorithm believed you. It just cannot read Stripe.