Sales & Conversion

Why I Stopped Trusting Ad Tracking Data (And What I Use Instead)

Personas
SaaS & Startup

Last year, I watched a client celebrate their Facebook ads hitting an "8.5 ROAS" while their bank account told a completely different story. They were burning through cash faster than ever, despite Facebook claiming their ads were printing money.

This wasn't an isolated incident. After years of managing campaigns for SaaS startups and e-commerce stores, I've learned something uncomfortable: ad tracking data lies more often than we'd like to admit. Not because the platforms are evil, but because attribution is fundamentally broken in 2025.

I've seen Facebook claim credit for organic SEO wins, Google Ads take credit for direct traffic, and attribution models so confused they'd make a drunk person look coherent. The worst part? Most businesses make critical budget decisions based on this fantasy data.

In this playbook, you'll discover:

  • Why attribution models are more fiction than fact in today's privacy-first world

  • The real story behind my client's "miraculous" 8.5 ROAS (spoiler: it wasn't the ads)

  • My alternative framework for measuring ad performance that actually correlates with revenue

  • The three questions I ask before trusting any ad metric

  • How to build a measurement system that survives iOS updates and privacy changes

Ready to stop living in an attribution fantasy? Let's dig into what the industry won't tell you about ad tracking data.

Reality Check
What every marketer has been told about attribution

If you've spent any time in marketing circles, you've heard the gospel of "data-driven decisions." The industry has built an entire religion around attribution models, conversion tracking, and ROAS optimization.

Here's what every marketing guru preaches:

  1. Last-click attribution is king - Whatever channel gets the final touch gets all the credit

  2. Facebook's attribution window is gospel - If Facebook says it drove the conversion within 7 days, it must be true

  3. ROAS above 3x means you're winning - Just scale the budget and watch the money roll in

  4. Cross-device tracking is reliable - Platforms can accurately track users across phones, tablets, and desktops

  5. UTM parameters solve everything - Just tag your links properly and you'll have perfect attribution

This conventional wisdom exists because it's convenient. Ad platforms need to justify their existence, agencies need to show ROI to clients, and everyone wants simple answers to complex problems. The attribution model industry is worth billions, built on the promise that we can track every customer touchpoint with mathematical precision.

But here's where this falls apart in practice: the customer journey is messier than any attribution model can capture. People don't browse the internet in neat, trackable lines. They see your Facebook ad on mobile, Google your brand name later on desktop, ask friends about you on WhatsApp, read reviews on Reddit, and then buy three weeks later through a direct search.

Yet your attribution model confidently assigns 100% credit to whichever touchpoint happened to be last. It's like giving full credit for a soccer goal to whoever touched the ball last, ignoring the entire play that made it possible.

The uncomfortable truth? Most attribution is educated guessing dressed up as scientific measurement.

Who am I

Consider me
your business accomplice.

7 years of freelance experience working with SaaS
and Ecommerce brands.

How do I know all this

Here's a story that changed how I think about ad attribution forever. I was working with an e-commerce client who was heavily dependent on Facebook Ads. They had a decent 2.5 ROAS, but their CEO was nervous about putting all their eggs in Zuckerberg's basket.

I suggested we build out their SEO strategy as a backup channel. Within three months of implementing our SEO overhaul - complete website restructuring, content optimization, and a solid content creation strategy - something interesting happened.

Facebook's reported ROAS suddenly jumped from 2.5 to 8-9. The client was ecstatic. "Our ads are finally working!" they said. "Whatever you did to the website must have improved our conversion rate!"

But I knew better. Facebook was lying.

Here's what was actually happening: our SEO strategy was driving significant organic traffic and conversions. But here's the kicker - people were seeing our Facebook ads, not clicking them, then later Googling the brand name and converting organically.

Facebook's attribution model gave the ads full credit for these "view-through conversions," even though the actual conversion path was: Facebook impression → Google search → organic conversion. The SEO was doing the heavy lifting, but Facebook was taking the credit.

This taught me that attribution isn't just inaccurate - it's actively misleading. The client was ready to double their Facebook ad spend based on fake ROAS numbers. If we'd followed the data blindly, we would have killed the SEO strategy that was actually driving growth.

That's when I realized that trusting ad platform attribution is like trusting a used car salesman to appraise your trade-in. The incentives are completely misaligned.

My experiments

Here's my playbook

What I ended up doing and the results.

After that eye-opening experience, I developed what I call the "Attribution Reality Check" framework. Instead of trusting what ad platforms tell me, I use a multi-layered approach to understand what's actually driving revenue.

Layer 1: The Bank Account Test

This is the most important layer. I track total revenue and compare it to total ad spend across all channels. If Facebook claims a 6x ROAS but overall business revenue is flat while ad spend increased, something's wrong. The math should add up at the business level, not just the platform level.

For the e-commerce client I mentioned, when Facebook reported 8x ROAS, I looked at their overall revenue growth. Revenue was up 40%, but Facebook ad spend had only increased 15%. If the claimed ROAS had been real, revenue should have roughly tripled. The numbers didn't match because Facebook was stealing credit from SEO.
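Here's a minimal sketch of that sanity check in Python. Every figure in it is a made-up placeholder, not the client's real numbers, so swap in your own spend, claimed ROAS, and bank-account revenue.

```python
# A minimal sketch of the Bank Account Test.
# All figures below are illustrative placeholders.

baseline_spend = 10_000      # monthly ad spend before the change
baseline_roas = 2.5          # ROAS the platform reported before
current_spend = 11_500       # spend after a 15% increase
claimed_roas = 8.0           # ROAS the platform now claims

baseline_actual_revenue = 50_000   # total monthly revenue before (hypothetical)
current_actual_revenue = 70_000    # total monthly revenue now (+40%)

# Revenue the platform says it drove, before and after
attributed_before = baseline_spend * baseline_roas
attributed_now = current_spend * claimed_roas

# If the claim were real, total revenue should have grown by at least the
# jump in attributed revenue (assuming other channels held steady).
implied_minimum_revenue = baseline_actual_revenue + (attributed_now - attributed_before)

print(f"Attributed revenue jump: {attributed_now - attributed_before:,.0f}")
print(f"Implied minimum revenue: {implied_minimum_revenue:,.0f}")
print(f"Actual revenue:          {current_actual_revenue:,.0f}")

if current_actual_revenue < implied_minimum_revenue:
    print("Mismatch: the platform is claiming credit for revenue that isn't there.")
```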

Layer 2: The Channel Isolation Test

I regularly pause channels for short periods to see the real impact. When you pause Facebook ads for a week, does overall revenue drop by the amount Facebook claims to drive? Usually not even close.

I did this with another SaaS client. Facebook claimed to drive 60% of their trial signups. When we paused ads for two weeks, trial signups only dropped 20%. The rest were coming from organic search, direct traffic, and word-of-mouth that Facebook was claiming credit for.
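If you want to run the same math on your own pause test, here's a rough sketch; the daily signup counts and the 60% claimed share are illustrative placeholders.

```python
# A rough sketch of the Channel Isolation Test, using daily signup counts.
# The numbers below are made up for illustration.

baseline_daily_signups = [52, 48, 55, 50, 47, 53, 49]   # week before the pause
paused_daily_signups   = [41, 39, 42, 40, 38, 43, 40]   # week with the ads paused

claimed_share = 0.60   # share of signups the platform attributes to itself

baseline_avg = sum(baseline_daily_signups) / len(baseline_daily_signups)
paused_avg = sum(paused_daily_signups) / len(paused_daily_signups)

observed_drop = 1 - paused_avg / baseline_avg
expected_drop = claimed_share   # if the claim were true, pausing should cost roughly this much

print(f"Observed drop while paused: {observed_drop:.0%}")
print(f"Drop implied by the platform's claim: {expected_drop:.0%}")
print(f"Overclaim factor: {expected_drop / observed_drop:.1f}x")
```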

Layer 3: The Customer Survey Method

This is the most revealing layer. I survey customers asking "How did you first hear about us?" and "What convinced you to buy?" The answers rarely match the attribution data.

For a B2B SaaS client, Facebook attributed 40% of conversions to their ads. But when we surveyed customers, only 8% mentioned Facebook ads as their primary discovery method. Most found them through Google searches, LinkedIn content, or referrals.
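Putting survey answers next to attributed shares doesn't need anything fancier than a tally. A minimal sketch, with invented answers and an assumed 40% attributed share:

```python
from collections import Counter

# A minimal sketch of the Customer Survey comparison.
# Survey answers and the attributed share are invented for illustration.

survey_answers = [
    "google search", "referral", "linkedin content", "google search",
    "google search", "referral", "linkedin content", "google search",
    "referral", "google search", "facebook ad", "linkedin content",
]

# What the ad platform claims; sources it doesn't track default to 0%.
platform_attributed_share = {"facebook ad": 0.40}

counts = Counter(survey_answers)
total = sum(counts.values())

print(f"{'Source':<18}{'Survey share':>14}{'Attributed share':>18}")
for source, n in counts.most_common():
    survey_share = n / total
    attributed = platform_attributed_share.get(source, 0.0)
    print(f"{source:<18}{survey_share:>13.0%}{attributed:>17.0%}")
```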

Layer 4: The Brand Search Spike Test

When ads are working, they should drive brand awareness, which shows up as increased branded search volume. I monitor brand search terms in Google Search Console and Google Trends. If Facebook claims massive success but brand searches aren't increasing, something's off.
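Here's a simple sketch of that comparison; the weekly impression counts are placeholders, and in practice I'd pull them from a Google Search Console export.

```python
# A simple sketch of the Brand Search Spike Test: compare branded-search
# impressions before and during a campaign. Figures are placeholders.

weeks_before_campaign = [1200, 1150, 1300, 1250]   # branded impressions per week
weeks_during_campaign = [1280, 1220, 1350, 1310]

avg_before = sum(weeks_before_campaign) / len(weeks_before_campaign)
avg_during = sum(weeks_during_campaign) / len(weeks_during_campaign)
lift = avg_during / avg_before - 1

print(f"Branded search lift during campaign: {lift:.0%}")

# If the platform claims big results but branded searches barely move,
# treat the claimed ROAS with suspicion.
if lift < 0.10:
    print("Weak brand lift: the claimed performance probably isn't real demand.")
```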

Layer 5: The Multi-Touch Reality Map

Instead of fighting attribution, I embrace the messy reality. I track all touchpoints I can measure and assume significant "dark funnel" activity between them. If someone converts and the attribution shows "Google Ads > Direct > Purchase," I know there were probably 5-10 unmeasured touchpoints in between.

The goal isn't perfect attribution - it's directional accuracy. I want to know if a channel is contributing positively, not the exact percentage it deserves credit for.
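Here's a rough sketch of what that reality map looks like for a single customer; the touchpoints and the seven-day gap threshold are illustrative assumptions, not a rule.

```python
from datetime import datetime

# A sketch of the Multi-Touch Reality Map: list the touchpoints you can see
# and flag big gaps as probable dark-funnel activity. All events are made up.

touchpoints = [
    ("2025-03-01", "facebook impression"),
    ("2025-03-18", "branded google search"),
    ("2025-03-21", "direct visit"),
    ("2025-03-21", "purchase"),
]

DARK_FUNNEL_GAP_DAYS = 7   # gaps longer than this probably hide untracked activity

events = [(datetime.fromisoformat(d), label) for d, label in touchpoints]
for (prev_date, prev_label), (date, label) in zip(events, events[1:]):
    gap = (date - prev_date).days
    marker = "  <-- likely dark-funnel activity" if gap > DARK_FUNNEL_GAP_DAYS else ""
    print(f"{prev_label} -> {label}: {gap} days{marker}")
```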

  • Channel Isolation - Testing what really drives conversions by pausing channels and measuring actual revenue impact

  • Customer Surveys - Direct feedback reveals attribution gaps that platforms miss; customers know their real journey

  • Brand Search Monitoring - Rising brand searches indicate genuine ad effectiveness beyond claimed direct conversions

  • Dark Funnel Acceptance - Embracing unmeasured touchpoints instead of pretending attribution models capture everything

Using this framework, I've discovered some eye-opening truths. That e-commerce client I mentioned? When we properly analyzed their data using my framework, we found that SEO was driving 60% more revenue than Facebook reported, while Facebook was overclaiming by about 300%.

For the B2B SaaS client, we discovered their best "channel" wasn't a channel at all - it was word-of-mouth referrals triggered by LinkedIn content that never got tracked. Facebook was claiming credit for 40% of conversions, but the real driver was organic social proof.

The most shocking discovery was timing. Ad platforms typically use 1-7 day attribution windows, but our customer surveys revealed the average consideration period was 3-4 weeks for B2B and 2-3 weeks for e-commerce. Attribution windows were missing most of the actual customer journey.

The result? We completely restructured budget allocation. Instead of doubling Facebook spend based on inflated ROAS, we invested more in SEO, content creation, and customer success programs that actually drove the results Facebook was claiming credit for.

Revenue increased 40% while total ad spend decreased 20%. The "worse" our attributed ROAS got, the better our actual business performance became.

Learnings

What I've learned and
the mistakes I've made.

Sharing so you don't make them.

Here are the top lessons I've learned from questioning ad attribution data:

  1. Platform attribution is marketing, not measurement - Every platform has an incentive to overstate its impact

  2. The customer journey is a dark funnel - Most touchpoints happen where you can't track them

  3. Attribution windows are too short - B2B buying cycles often exceed platform tracking capabilities

  4. Customer surveys trump tracking pixels - People know their own journey better than algorithms do

  5. Business-level metrics matter most - If overall revenue doesn't match attributed revenue, trust the bank account

  6. Channel isolation reveals truth - Pause a channel to see its real impact, not its claimed impact

  7. Brand building is unmeasurable but essential - The best marketing creates unmeasurable word-of-mouth and brand equity

What I'd do differently: I'd implement this framework from day one instead of trusting platform data for months. I'd also set up customer survey systems earlier - the insights are invaluable and often contradict everything you think you know about your customer acquisition.

This approach works best for businesses with longer sales cycles, multiple touchpoints, and significant brand components. It's less critical for simple, transactional purchases with single-touch conversions.

How you can adapt this to your business

My playbook, condensed for your use case.

For your SaaS / Startup

For SaaS startups implementing this approach:

  • Survey trial signups about discovery sources

  • Track branded search volume as leading indicator

  • Measure trial-to-paid conversion by attributed source

  • Compare customer lifetime value across channels (see the sketch after this list)
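Here's a minimal sketch of that per-channel comparison; the trial records, sources, and revenue figures are hypothetical.

```python
from collections import defaultdict

# A minimal sketch: trial-to-paid conversion and customer value by
# attributed source. The records below are hypothetical.

trials = [
    # (attributed_source, converted_to_paid, first_year_revenue)
    ("facebook ads", True, 1200),
    ("facebook ads", False, 0),
    ("organic search", True, 2400),
    ("organic search", True, 1800),
    ("referral", True, 3000),
    ("referral", False, 0),
]

stats = defaultdict(lambda: {"trials": 0, "paid": 0, "revenue": 0})
for source, converted, revenue in trials:
    stats[source]["trials"] += 1
    stats[source]["paid"] += int(converted)
    stats[source]["revenue"] += revenue

for source, s in stats.items():
    conv = s["paid"] / s["trials"]
    avg_value = s["revenue"] / s["paid"] if s["paid"] else 0
    print(f"{source:<16} trial-to-paid {conv:.0%}   avg value ${avg_value:,.0f}")
```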

For your Ecommerce store

For e-commerce stores:

  • Post-purchase surveys asking about discovery journey

  • Monitor repeat purchase rates by acquisition channel (see the sketch after this list)

  • Track brand search trends during ad campaigns

  • A/B test attribution windows vs actual customer behavior
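And a rough sketch of the repeat-purchase and survey comparison; every customer record below is invented.

```python
from collections import defaultdict

# A rough sketch: repeat purchase rate by acquisition channel, alongside
# post-purchase survey answers. All data below is invented.

customers = [
    # (acquisition_channel, survey_answer, number_of_orders)
    ("facebook ads", "saw an instagram ad", 1),
    ("facebook ads", "friend recommended it", 3),
    ("organic search", "googled the product", 2),
    ("organic search", "read a reddit review", 4),
    ("email", "newsletter", 5),
]

by_channel = defaultdict(lambda: {"customers": 0, "repeaters": 0})
answers_by_channel = defaultdict(list)
for channel, answer, orders in customers:
    by_channel[channel]["customers"] += 1
    by_channel[channel]["repeaters"] += int(orders > 1)
    answers_by_channel[channel].append(answer)

for channel, s in by_channel.items():
    rate = s["repeaters"] / s["customers"]
    print(f"{channel:<16} repeat rate {rate:.0%}  survey answers: {answers_by_channel[channel]}")
```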

Subscribe to my newsletter to receive business playbooks every week.

Sign me up!