Last year, I was called in to audit a B2C Shopify client's Facebook ads that were "failing miserably." Their dashboard showed a disappointing 2.5 ROAS with a €50 average order value. The marketing team was ready to pull the plug on their entire paid strategy.
But here's the thing - I've seen this story play out dozens of times. Most businesses are looking at their ad performance through the wrong lens, chasing metrics that don't actually matter for their bottom line.
After three months of digging deeper into their attribution data and implementing a comprehensive SEO strategy alongside their ads, something interesting happened. Facebook's reported ROAS jumped from 2.5 to 8-9. Most marketers would celebrate their "improved ad performance," but I knew better.
The reality? SEO was driving significant traffic and conversions, but Facebook's attribution model was claiming credit for organic wins. This experience taught me that most businesses oversimplify the customer journey and miss the real story behind their ad performance.
In this playbook, you'll learn:
Why your attribution model is probably lying to you
The hidden metrics that actually predict long-term success
How to audit paid ads without getting fooled by vanity metrics
When to ignore your dashboard and trust the dark funnel instead
The framework I use to separate real performance from attribution noise
Let's dive into why most paid ads audits miss the mark - and what you should be measuring instead.
Walk into any marketing conference or scroll through LinkedIn, and you'll hear the same advice about auditing paid ads performance. The industry has collectively agreed on a standard checklist that sounds logical on paper.
The conventional wisdom goes like this:
Check your ROAS - If it's above 3-4x, you're golden
Analyze click-through rates - Higher CTR means better creative
Monitor cost per acquisition - Lower CPA equals better performance
Review conversion rates - Optimize landing pages if they're low
Study audience insights - Double down on high-performing demographics
This framework exists because it's measurable, reportable, and gives everyone involved a false sense of control. CMOs love dashboards with green arrows, agencies can show clear optimizations, and everyone feels productive.
The problem? This approach assumes your attribution model is accurate, that customer journeys are linear, and that platforms like Facebook and Google are honest brokers of their own performance data.
In reality, most businesses are optimizing for metrics that have zero correlation with actual revenue growth. I've seen companies with "terrible" ROAS that were growing 50% year-over-year, and others with "amazing" metrics that were bleeding cash.
The conventional audit process also ignores the dark funnel - all those touchpoints that happen outside your tracking systems. When someone sees your Facebook ad, googles your brand name, reads reviews, checks your LinkedIn, and then finally converts through organic search, who gets the credit?
Spoiler alert: Facebook does, even though the entire journey involved multiple channels.
Who am I
7 years of freelance experience working with SaaS and e-commerce brands.
So here's the situation I walked into with that Shopify client. They were a B2C e-commerce store with over 1,000 SKUs, selling across multiple product categories. Their Facebook ads were their primary traffic source, generating what looked like decent volume but disappointing returns.
The marketing team showed me their dashboard: 2.5 ROAS, €50 AOV, and steadily rising costs per acquisition. By every standard metric, these ads were underperforming. They'd already tried the usual fixes - new creative, audience testing, landing page optimization - but nothing moved the needle.
What made this case interesting was the complexity of their product catalog. While most successful paid ads campaigns thrive on 1-3 flagship products, this client's strength was their variety. Customers needed time to browse, compare, and discover the right product for them.
This created a fundamental mismatch. Facebook Ads' quick-decision environment was incompatible with their customers' shopping behavior. People would see an ad, visit the site, but not purchase immediately. Instead, they'd bookmark products, compare options, maybe check reviews, and convert days or weeks later through organic search.
But here's where it gets interesting - I suspected this "poor performance" was actually hiding a much more successful acquisition engine. The attribution model was just too primitive to capture what was really happening.
Instead of immediately optimizing the ads (which every other consultant had tried), I took a completely different approach. I implemented a comprehensive SEO strategy to capture all that downstream demand the ads were generating.
The hypothesis was simple: if the ads were actually working but attribution was broken, then building organic visibility would reveal the true impact of that paid traffic.
My experiments
What I ended up doing and the results.
Here's the exact framework I used to audit what was really happening with their paid ads performance - and how you can apply this to your own campaigns.
Step 1: Map the Real Customer Journey
First, I completely ignored the platform dashboards and started tracking customer behavior manually. I set up UTM parameters not just for ad clicks, but for every possible touchpoint - email signatures, social profiles, content mentions.
I implemented a survey system asking new customers: "How did you first hear about us?" and "What convinced you to purchase?" The results were eye-opening. Over 60% mentioned seeing Facebook ads first, but only 15% actually clicked through and purchased immediately.
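If you want to run this comparison yourself, here's a minimal sketch in Python. It assumes your survey tool can export responses as a CSV; the filename and column names (first_heard, last_click_channel) are placeholders for whatever your export actually contains, not a real schema.

```python
# Minimal sketch: compare what customers say in a post-purchase survey
# ("how did you first hear about us?") against the channel that gets
# last-click credit. File and column names below are hypothetical.
import csv
from collections import Counter

first_touch = Counter()
last_click = Counter()

with open("post_purchase_survey.csv", newline="") as f:
    for row in csv.DictReader(f):
        first_touch[row["first_heard"].strip().lower()] += 1
        last_click[row["last_click_channel"].strip().lower()] += 1

total = sum(first_touch.values()) or 1  # guard against an empty export
print(f"{'channel':<20}{'first heard %':>15}{'last click %':>15}")
for channel in sorted(set(first_touch) | set(last_click)):
    print(f"{channel:<20}"
          f"{100 * first_touch[channel] / total:>14.1f}%"
          f"{100 * last_click[channel] / total:>14.1f}%")
```

A big gap between the two columns for any channel is your first signal that the platform dashboard isn't telling the whole story.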
Step 2: Build Organic Capture Systems
Instead of optimizing the ads themselves, I focused on building organic visibility for all the ways people were actually finding the brand. This meant:
Complete website restructuring for SEO
Development of content targeting product comparison keywords
Optimization for brand + product searches
Building review and social proof pages
Step 3: Track True Attribution Impact
Within 30 days of implementing the SEO strategy, something magical happened. Facebook's reported ROAS jumped from 2.5 to 8-9. But I knew this wasn't because the ads suddenly got better - it was because we were finally capturing the full funnel.
The real test came when we temporarily paused all Facebook ads for two weeks. Organic traffic dropped by 40%, proving that the ads were generating significant indirect demand that wasn't being attributed correctly.
Step 4: Develop Multi-Touch Attribution
I created a simple but effective attribution model that tracked:
First touch (usually paid ads)
Research behavior (organic searches, direct visits)
Final conversion channel
Time between first touch and conversion
This revealed that their average customer journey took 8-12 days and involved 3-4 different touchpoints. The Facebook ads were working - they were just the beginning of the story, not the end.
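If you can export a touchpoint log, a few lines of Python are enough to build this view. The sketch below assumes rows of (customer_id, channel, timestamp) where the last event per customer is the purchase; the field names and sample events are illustrative, not a real export.

```python
# Minimal sketch of the multi-touch view: first touch, converting channel,
# number of touchpoints, and days between first touch and conversion.
from datetime import datetime
from collections import defaultdict
from statistics import mean

events = [
    # (customer_id, channel, ISO timestamp); last event per customer = purchase
    ("c1", "facebook_ad", "2024-03-01T10:00"),
    ("c1", "organic_search", "2024-03-06T19:30"),
    ("c1", "direct", "2024-03-10T21:15"),
    ("c2", "facebook_ad", "2024-03-02T08:45"),
    ("c2", "organic_search", "2024-03-11T20:00"),
]

journeys = defaultdict(list)
for customer_id, channel, ts in events:
    journeys[customer_id].append((datetime.fromisoformat(ts), channel))

lags, touch_counts = [], []
for customer_id, touches in journeys.items():
    touches.sort()                       # order by timestamp
    first_channel = touches[0][1]        # first touch (often the paid ad)
    converting_channel = touches[-1][1]  # channel that gets last-click credit
    lag_days = (touches[-1][0] - touches[0][0]).days
    lags.append(lag_days)
    touch_counts.append(len(touches))
    print(f"{customer_id}: first={first_channel}, converted via={converting_channel}, "
          f"{len(touches)} touches over {lag_days} days")

print(f"avg touches: {mean(touch_counts):.1f}, avg lag: {mean(lags):.1f} days")
```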
Step 5: Audit Based on Incremental Lift
The final piece was measuring incremental lift rather than platform-reported performance. I used geo-testing, turning ads on and off in different regions to measure the true impact on overall revenue, not just attributed conversions.
This approach revealed that while Facebook showed a 2.5 ROAS, the true incremental value was closer to 4.2x when accounting for all downstream organic activity.
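The underlying arithmetic is simple enough to sanity-check by hand. The sketch below shows one common way to estimate incremental lift from a geo test: normalise the "ads on" regions against a matched "ads off" control and divide the extra revenue by spend. Every number in it is a made-up placeholder, not this client's data.

```python
# Minimal sketch of geo-test incrementality: incremental revenue is what the
# "ads on" regions earned above what the "ads off" control predicts, and
# incremental ROAS divides that by ad spend. All figures are illustrative.

# Revenue per region group during the test period
test_regions = {"ads_on": 120_000, "ads_off": 80_000}

# Same groups during a pre-test baseline period, used to normalise any
# existing size difference between them
baseline = {"ads_on": 78_000, "ads_off": 76_000}

ad_spend = 10_000

# Expected "ads on" revenue if ads had no effect: scale the treatment group's
# baseline by the control group's growth
control_growth = test_regions["ads_off"] / baseline["ads_off"]
expected_without_ads = baseline["ads_on"] * control_growth

incremental_revenue = test_regions["ads_on"] - expected_without_ads
incremental_roas = incremental_revenue / ad_spend

print(f"expected revenue without ads: {expected_without_ads:,.0f}")
print(f"incremental revenue:          {incremental_revenue:,.0f}")
print(f"incremental ROAS:             {incremental_roas:.2f}x")
```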
The results of this comprehensive audit approach were pretty dramatic. Instead of the "failing" 2.5 ROAS that Facebook reported, we discovered the true impact was much higher.
When I mapped the complete customer journey, here's what we found:
True ROAS: 4.2x when accounting for organic lift
Customer journey: Average 8-12 days from first ad exposure to purchase
Touchpoints: 3-4 different interactions before conversion
Indirect impact: 40% drop in organic traffic when ads were paused
The most interesting discovery was that their "high-performing" direct-response ads were actually the worst for long-term growth. The ads that looked terrible in the dashboard - broader targeting, lifestyle imagery, brand-focused messaging - were driving the most valuable customers.
Within three months of implementing the full strategy, overall revenue grew by 60% while maintaining the same ad spend. The difference wasn't optimizing the ads themselves, but understanding and capturing their true impact across the entire funnel.
This experience completely changed how I approach paid ads audits. The platforms will always optimize for their own attribution models, but the real value often lies in the unmeasurable interactions they create.
Learnings
Sharing these so you don't have to learn them the hard way.
Here are the key insights from auditing dozens of paid campaigns using this approach:
Attribution lies, but consistently - Every platform overclaims credit, but by understanding their bias, you can work backward to the truth
Customer surveys beat dashboards - A simple "How did you hear about us?" reveals more than any analytics tool
Brand searches are your best metric - Monitor organic brand search volume as a leading indicator of ad effectiveness
Geo-testing provides clean data - Turn ads on/off in different regions to measure true incremental lift
Time lag matters more than you think - Most valuable customers take 7-30 days to convert, breaking attribution windows
Product complexity changes everything - The more consideration required, the less reliable platform attribution becomes
Distribution trumps optimization - Sometimes the best way to "fix" paid ads is to build better organic capture systems
The biggest mistake I see is treating paid ads as a standalone channel when they're actually part of an ecosystem. Your audit should measure ecosystem health, not individual channel performance.
I'd also recommend building your measurement systems before you need them. By the time you're questioning your ad performance, you're already missing weeks or months of valuable data about how customers actually behave.
My playbook, condensed for your use case.
For SaaS companies auditing paid ads performance:
Track trial-to-paid conversion rates by original traffic source, not just last click
Monitor product usage metrics for users acquired through different channels
Survey churned users about their initial discovery method
Measure time-to-value for different acquisition channels
For e-commerce stores auditing paid ads performance:
Track customer lifetime value by initial acquisition source (see the sketch after this list)
Monitor repeat purchase rates for different traffic sources
Use post-purchase surveys to understand the complete customer journey
Implement geo-testing to measure true incremental revenue impact
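Here's a minimal sketch of the lifetime-value and repeat-rate breakdown from the first two bullets, assuming you can export orders joined to a first-touch source (from UTMs or a post-purchase survey). The sample rows are illustrative, not real order data.

```python
# Minimal sketch: average lifetime value and repeat-purchase rate,
# grouped by each customer's first-touch source.
from collections import defaultdict
from statistics import mean

orders = [
    # (customer_id, first_touch_source, order_value) -- illustrative rows only
    ("c1", "facebook_ad", 55.0),
    ("c1", "facebook_ad", 62.0),
    ("c2", "organic_search", 48.0),
    ("c3", "facebook_ad", 51.0),
]

# Group order values per customer, per first-touch source
by_source = defaultdict(lambda: defaultdict(list))
for customer_id, source, value in orders:
    by_source[source][customer_id].append(value)

for source, customers in by_source.items():
    ltv = mean(sum(values) for values in customers.values())
    repeat_rate = sum(len(v) > 1 for v in customers.values()) / len(customers)
    print(f"{source:<16} avg LTV €{ltv:.2f}  repeat rate {repeat_rate:.0%}")
```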