Meta, Google, TikTok, and other ad channels are misleading you. Third-party attribution tools like Triple Whale and Northbeam aren't better; they're flawed too. Tracking has always relied on estimated models, not hard numbers. After iOS 14, tracking became harder, leading to a surge in third-party solutions. But these also provide conflicting data, making it tough to find the truth.

So, what is the truth? The only reliable way to measure your marketing efforts is through incrementality tests. These tests answer the question, "What if this channel or ad never existed?" By showing ads to one group and withholding them from another, you can measure the true impact on revenue and profit.

For example, if you're running Facebook ads and selling on Shopify and Amazon, incrementality tests reveal how Facebook ads impact Amazon sales. Without the initial Facebook touchpoint, an Amazon purchase might never have happened, even though traditional attribution wouldn't show this. This is why ROAS and third-party attribution aren't accurate: they use models that can be thwarted by privacy settings and cross-channel purchases. By running incrementality tests, you discover the true impact of your marketing efforts. We ran a 14-day Meta holdout test and found that zip codes shown ads generated 50% more Amazon revenue than those not shown ads, despite sending traffic to Shopify.

Now is the perfect time to run these tests. Q3 is calm, free from major holidays that skew results. This is your chance to optimize before Q4. If your brand generates seven figures annually, this should be a top priority to grow profits in Q4.
Why Trust Incrementality Over Ad Platform Metrics
Explore top LinkedIn content from expert professionals.
Summary
Incrementality measures the true impact of your advertising by showing what results are directly caused by your marketing spend, rather than what ad platforms report based on clicks or impressions. Trusting incrementality over ad platform metrics gives you a clearer picture of how your ads drive real business growth, helping you make smarter investments.
- Question platform data: Understand that ad platforms often inflate their reported metrics, crediting themselves for sales that may have happened anyway.
- Run holdout tests: Use incrementality experiments to compare groups exposed to ads versus those who aren't, revealing which channels actually generate new customers or revenue (see the sketch after this list).
- Shift your mindset: Focus on measuring incremental revenue and net-new customers instead of just clicks or ROAS, so your marketing strategy aligns with real financial outcomes.
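As a rough illustration of the holdout idea summarized above, here is a minimal Python sketch, assuming a simple two-group test where one set of users or geos sees the ads and a matched set is withheld; all figures and names are hypothetical, not taken from any post below.

```python
# Minimal sketch of a holdout comparison: exposed (test) group vs. withheld (control) group.
# All numbers are made up for illustration.

def incremental_lift(test_revenue, test_users, control_revenue, control_users):
    """Return incremental revenue and percent lift of test over control."""
    rev_per_user_test = test_revenue / test_users
    rev_per_user_control = control_revenue / control_users
    # Revenue the exposed group generated above the control baseline
    incremental_revenue = (rev_per_user_test - rev_per_user_control) * test_users
    lift_pct = (rev_per_user_test / rev_per_user_control - 1) * 100
    return incremental_revenue, lift_pct

# Example: geos shown ads vs. matched geos withheld from ads
incremental_revenue, lift = incremental_lift(
    test_revenue=150_000, test_users=50_000,        # exposed geos
    control_revenue=100_000, control_users=50_000,  # holdout geos
)
print(f"Incremental revenue: ${incremental_revenue:,.0f}")  # $50,000
print(f"Lift vs. control:    {lift:.0f}%")                  # 50%
```

A real geo holdout would also need matched markets and a pre-period check, but the core read-out is this comparison of exposed versus withheld performance.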
-
Meta just charged you ₹200 per 1000 impressions. But what's your cost per 1000 INCREMENTAL impressions? This metric most marketers ignore could be the key to unlocking scalable growth.

I just audited an account where October's campaign reached 3.7M people at ₹0.90 per reach. Two months later, December's campaign reached 3.8M people at ₹3.20 per reach. But here's the shocking part: 85% of those impressions went to THE SAME PEOPLE. Their true incremental reach cost had skyrocketed by 13X! While most advertisers fixate on CPC and CPM, the platform's auction system is quietly driving up your cost to reach new audiences as you duplicate campaigns and creatives.

Three ways I'm changing client strategies based on this insight:
👉🏻 Track incremental reach cost as a core metric
👉🏻 Use Advantage+ Shopping Campaigns to let Meta handle incrementality
👉🏻 Create audience exclusions to force discovery of new prospects

The platform's objective is to maximize overall ad revenue, not your business growth. When advertisers optimize solely for conversions without considering incrementality, they train the algorithm to show ads to the same "easy" audiences. For established brands, the question isn't "What's my CPM?" but "What am I paying to reach each NEW potential customer?" This shift in perspective can transform how you scale campaigns beyond the initial performance plateau.

What metrics beyond the obvious ones have transformed how you approach campaign optimization?
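To make "cost per incremental reach" concrete, here is a hypothetical sketch; the reach figures and per-reach costs echo the post above, but the overlap assumptions (and therefore the exact multiplier) are illustrative, since the post doesn't break them out.

```python
# Hypothetical sketch: cost per person reached vs. cost per NEWLY reached person.
# Overlap shares are assumptions, so the resulting multiplier is illustrative.

def reach_costs(spend, total_reach, overlap_share):
    """Return (cost per reach, cost per incremental reach)."""
    new_reach = total_reach * (1 - overlap_share)
    return spend / total_reach, spend / new_reach

# October: mostly new audience (assume 5% repeat)
oct_cpr, oct_incr = reach_costs(spend=3_700_000 * 0.90, total_reach=3_700_000, overlap_share=0.05)
# December: duplicated campaigns, 85% of reach hitting the same people
dec_cpr, dec_incr = reach_costs(spend=3_800_000 * 3.20, total_reach=3_800_000, overlap_share=0.85)

print(f"Oct: ₹{oct_cpr:.2f} per reach, ₹{oct_incr:.2f} per incremental reach")
print(f"Dec: ₹{dec_cpr:.2f} per reach, ₹{dec_incr:.2f} per incremental reach")
```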
-
Stop minimizing the importance of TikTok. You're seeing low ROAS numbers? Unless you have post-purchase surveys set up and a crisp ability to measure offline impact, I'm not buying that your ads + organic efforts on the channel aren't bearing fruit.

poppi's success on TikTok is an example that we come back to all the time: "Poppi estimates that its digital marketing efforts in total are driving an 80% lift in offline sales, and that TikTok accounts for roughly 30% to 40% of that lift, based on other industry data the brand has seen."

This anecdote from Ben Dutter is also super powerful: "This past month a client drove a 7x incremental return on TikTok. Yeah, that's right, for every dollar of ad spend on the platform the brand got seven dollars back of true, incremental revenue. That's by FAR the best performing channel this brand has ever had. You know what the platform ROAS was? About 0.25x. So that means that TikTok only sees about 3.6% of their true contribution (at least for this client). If we were just looking at platform metrics, or even their MTA (with alleged MMM bolted onto it), they never would have continued to invest.

Here's the breakdown:
• Incrementality validated by a geo lift test
• Full funnel structure (LPV + purchase campaigns)
• Tight creative + audience alignment (UGC-style creative)
• Invested at enough weight to achieve meaningful reach + frequency

This isn't that much of an outlier. Right now we consistently see TikTok generate between 3 and 6x incremental return. Compare that to Meta (much more mature) which averages around 2-5x."
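The "3.6% of their true contribution" line above is just the ratio of platform-reported to incrementally measured return; a one-off sketch using the figures quoted in the anecdote:

```python
# How much of a channel's true (incremental) contribution the platform report "sees".
platform_roas = 0.25      # in-platform reported return
incremental_roas = 7.0    # validated via a geo lift test

visibility = platform_roas / incremental_roas
print(f"Platform sees roughly {visibility:.1%} of the true contribution")  # ~3.6%
```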
-
I upset a client by telling them their most profitable channel wasn't Meta.

It was my first week working with a new brand. They were proud of their Meta performance, reporting a 4.2X ROAS. Investors loved it and encouraged more spend on Meta. When I pulled up Northbeam, I saw what they'd overlooked: CTV campaigns barely getting any budget despite showing promising early signals.

I said, "Let's run a couple of incrementality tests." (on Stella, of course) They were skeptical. "Meta's our star performer."

Three weeks later, the answer came:
• Meta's incremental ROAS: 1.8X (vs. 4.2X platform-reported)
• CTV's incremental ROAS: 3.4X (vs. 1.3X platform-reported, view-based)

"That can't be right, Meta is literally a 4x ROAS" - the entire marketing team.

They were skeptical because their MTA tools were misleading them: Meta was taking credit for sales that would've happened anyway...but CTV was driving new customers they wouldn't have found otherwise. This pattern repeats with almost every client I work with. It reminds me of group projects in high school, where there's always someone who does minimal work but talks the loudest during the presentation. Then everyone thinks, "wow, this guy knows his stuff."

With this particular client, when we adjusted their spend based on incremental performance instead of platform metrics, their net profit started to increase by 5% month over month, and the trend seems to be continuing.

Platforms want your money, so there is a layer of illusion you need to strip away when it comes to platform metrics. That layer is correlation versus causation. Which of your ads are actually profitable? Incrementality testing is the only way to really know.
-
Your most effective channel is losing you sales. You can often make campaigns more effective by moving money to less effective channels. What? Marketing Science maestro Simon Toms explains how:

In the example image, the blue line represents a channel that's 2x more effective than the pink one at every spend level. $1M invested in Channel 1 returns $2M in incremental revenue (A). But split the $1M between Channel 1 and 2 (50:50) and you'd drive $2.5M total incremental revenue (B + C). That's 25% more revenue from investing in a "less effective" channel.

So what? Don't accept average metrics alone; always look to understand the marginal returns. Ideally you should know the curves for all your investments. MMM can obviously help with this, but incrementality testing typically provides more detailed curves based on actual sales rather than modelled ones.

Incrementality testing is not A/B testing. It's test and control: the test group sees the ad, the control group (who match the ad audience but are withheld from the ads) doesn't. The difference is the incremental impact. (In an A/B test you do not withhold a segment of your audience from seeing the ad, so it can't measure incremental impact.)

Here's where curves from incrementality testing can help:

1. Optimal full funnel. Different optimisations have very different curves. The curve for reach spend is very different to conversion spend, which can be very different to ASC activity, etc. Plotting curves helps you understand where you should pull back investment and where you should double down - critical insights for maximizing incremental returns.

2. Channel synergy. The curve for one channel changes depending on your investment in others. Charlie Oscar found that social reach improves paid search performance by 32%, YouTube improves email by up to 25%, and, most crazy of all, 70% of the value from social and video channels is their impact on other channels, with only 30% direct.

3. Plan at the margins. Don't use average ROIs to determine where to shift your budget. It depends on the curve, not the average. Incremental returns show which channels to invest in; marginal returns show how much. Your most effective channel often isn't where you should put your next $.

Bottom line: To make your campaigns work harder, you need to understand how each investment works at the margins. That's the route to higher returns across the mix.
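The budget-split result above falls out of any concave, diminishing-returns response curve. Here is a minimal sketch with made-up square-root curves (not the actual curves from Simon Toms' image); with these illustrative shapes the split yields about $2.1M rather than the $2.5M in the example, but the direction is the same.

```python
import math

# Illustrative concave (diminishing-returns) response curves.
def channel1_revenue(spend):            # the "2x more effective" channel
    return 2_000 * math.sqrt(spend)     # $1M in -> $2M incremental revenue

def channel2_revenue(spend):            # half as effective at every spend level
    return 1_000 * math.sqrt(spend)

budget = 1_000_000
all_in_channel1 = channel1_revenue(budget)
split_50_50 = channel1_revenue(budget / 2) + channel2_revenue(budget / 2)

print(f"All $1M in Channel 1: ${all_in_channel1:,.0f}")   # $2,000,000
print(f"$500K in each:        ${split_50_50:,.0f}")       # ~$2,121,000
```

The second $500K in Channel 1 buys less than the first, so moving it to the "weaker" channel, where it still sits on the steep part of its curve, adds more total incremental revenue.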
-
Most Meta advertisers are addicted to retargeting. Why wouldn't you? The performance looks incredible. But here's the hard truth: retargeting often gives the illusion of success, not the reality of incrementality.

Here are 3 uncomfortable truths about retargeting that should change how you buy media:

1. Retargeting happens anyway. Meta knows who's likely to convert. If you're using broad targeting, the platform is already prioritizing people who've engaged with you. Break down performance by audience segments; you might find 20-40% of your "broad" campaigns are actually hitting engaged users and existing customers.

2. It's not scalable. Audience pools for retargeting are inherently limited. You can't scale spend or results when you only target past visitors or past customers. Broad campaigns include those users plus net-new ones.

3. It inflates results and hides lack of incrementality. Retargeting conversions are often view-through. The user already knew you. They saw an email, a post, a recommendation. Your ad helped, but wasn't the reason. That's not incrementality. That's attribution noise.

Want to really grow? Start by trusting the algorithm more than your instincts. Curious how much of your "cold" campaign is actually warm? Run the breakdown. You might be surprised.
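One way to "run the breakdown" suggested above is to join your broad-campaign converters against existing-customer and engaged-user lists and measure the warm share. A hypothetical pandas sketch; the file names and the hashed-email column are assumptions about your own exports, not any platform's API.

```python
import pandas as pd

# Hypothetical sketch: what share of "broad" campaign conversions were already warm?
converters = pd.read_csv("broad_campaign_conversions.csv")             # one row per converter
existing_customers = set(pd.read_csv("customer_list.csv")["email_hash"])
engaged_users = set(pd.read_csv("engaged_visitors.csv")["email_hash"])

warm = converters["email_hash"].isin(existing_customers | engaged_users)
print(f"Warm share of 'broad' conversions: {warm.mean():.0%}")
```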
-
Always great to see Measured cited in Business Insider. Everyone knows that platform reporting is siloed and doesn't account for incrementality. Reality today is even more complicated:
- Every platform applies a different methodology to match impressions and clicks back to sales (pixel, CAPI, etc.).
- Many platforms supplement that with probabilistic conversions (made up).
- No two platforms are doing this consistently or in the same way.

That means comparing reported CPO or ROAS across these platforms has become nearly impossible. Brands need a way to understand the incremental impact of their marketing independent of tracking and technology and based on ground-truth experiments. It is a privilege for Measured to do this for 150+ brands today.

A few quotable insights from the article:
- Meta continues to drive efficient customer acquisition at a larger share of wallet and scale than any other platform; it is truly the 800lb gorilla.
- Pinterest is consistently a VERY strong performer in the cohort of home, fashion, beauty, and lifestyle brands focused on the female demo.
- TikTok may not always look most efficient against 30-60 day sales, but when measured against higher-funnel metrics and over the longer term it has proven to be a very effective channel for many of our brands.
- It's early, but many brands are starting to experiment with Reddit, Inc for acquisition - stay tuned.
- And yes… Snap Inc. appears to be a quiet secret for some in our portfolio. We have a handful of brands (~12%) using Snap to drive efficient customer acquisition.
-
We have run more than 100 Media Mix Models over the last year. If you aren't using MMM as part of your measurement approach then you are missing some crucial marketing insight. Understanding the incrementality of campaign performance is crucial, and it should be an always-on part of your measurement (i.e. not a once-a-year test or MMM meeting).

While every brand's performance is different, there are some common themes that everyone should be able to learn from:
• Paid Brand Search campaigns are typically lower than 10% incremental. So if you add a 0 to your AdWords CPA it is more reflective of the true performance of these campaigns (see the sketch after this post).
• 40-70% of organic site traffic is driven by upper-funnel marketing channels. This is usually the highest-converting site traffic, and it is usually driven by the channels which "aren't performance channels" (I hate this distinction..).
• Influencer campaigns drive 85% of their value through indirect broadcast effects which are not measured through clicks and code redemptions.
• Whitelisting assets outperform brand-built assets by 30%, and show a stronger indirect uplift on organic channels.
• YouTube activity significantly increases the ability to grow through search, with full-funnel impact 12x stronger than shown on last-touch metrics.
• Channels such as Meta and TikTok frequently drive more revenue through Amazon and Retail than the DTC channels which they actually direct traffic to.
• Brand Search and Retargeting channels show 4x stronger incrementality in new markets than in established markets, and are significantly more incremental during key promotional periods.
• Demand generation channels show stronger synergy impact from new product content than demand conversion channels.

Embedding these model learnings alongside your fast-moving performance metrics allows brands to build a robust, rounded measurement approach across all marketing levers.
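The "add a 0 to your AdWords CPA" point above is just dividing the platform-reported CPA by the measured incrementality rate; a quick sketch with hypothetical numbers:

```python
# Sketch: adjust a platform-reported CPA by measured incrementality (numbers are hypothetical).
platform_cpa = 12.00       # what the ads platform reports for paid brand search
incrementality = 0.10      # if fewer than 10% of those conversions are truly incremental

true_cpa = platform_cpa / incrementality
print(f"Incrementality-adjusted CPA: ${true_cpa:.2f}")  # $120.00, i.e. "add a 0"
```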
-
Tactic-level incrementality is very overlooked. Once you've established baseline platform incrementality, it's extremely important to drill in at a tactic or optimization level. Because of how different ad products optimize and find customers, they can have highly variable incrementality despite similar pixel- or GA-attributed ROAS. This is true across every major platform.

We recently ran a tactic-level lift study for an apparel brand on Meta. I don't think the results are terribly surprising if you have a deep understanding of the Meta algorithm:
- DABA - LEAST incremental. We see this somewhat frequently due to how DABA/DPA orients toward the highest-intent shoppers.
- ASC - Middle of the pack. Again, consistent with what we see, and the benefit of ASC is usually huge scale.
- Standard campaign (1% Lookalike) - MOST incremental. But we know this tactic does not scale particularly well.
In all cases, we have pretty tight customer and site visitor exclusions.

So what do you do with a result like this?
1. Make adjustments and retest - my favorite real-life comp for this is: you got a blood panel, made some dietary/supplement/exercise adjustments, and then re-test. Did the adjustments work? What changed? Rinse, repeat.
2. Revisit your bidding signal - if your exclusions are as tight as you can get via 1p and pixel data, consider passing back only new customer purchases or a conversion value more reflective of incrementality (see the sketch after this post).
3. Bolster your audience match rates - use 3p append to improve your match rates. This is especially valuable on newer platforms with worse match rates, like TikTok (LiveRamp, Hightouch come to mind).
4. Cut the obvious waste - while I put #1 first for a reason, it's important not to ignore the areas where you are wasting money. Adjust your investment strategy as necessary.

From my POV, tactic-level incrementality is an always-on best practice for most brands. Build a roadmap and start running monthly experiments. New Engen #Incrementality #Meta #Marketing
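As a sketch of what point 2 above ("pass back a conversion value more reflective of incrementality") could look like upstream of any actual conversions-API call: the tactic factors and the helper function here are purely illustrative assumptions, not measured values or a real platform API.

```python
# Hypothetical sketch: scale the conversion value you send back to the platform by each
# tactic's measured incrementality, so optimization leans toward incremental revenue.
TACTIC_INCREMENTALITY = {
    "daba": 0.25,            # least incremental in this kind of lift study (illustrative)
    "asc": 0.55,             # middle of the pack (illustrative)
    "lookalike_1pct": 0.85,  # most incremental, but harder to scale (illustrative)
}

def adjusted_conversion_value(order_value, tactic, is_new_customer):
    """Value to pass back in the conversion event instead of the raw order value."""
    if not is_new_customer:
        return 0.0  # option: only pass back new-customer purchases
    return order_value * TACTIC_INCREMENTALITY[tactic]

print(adjusted_conversion_value(100.0, "daba", True))            # 25.0
print(adjusted_conversion_value(100.0, "lookalike_1pct", True))  # 85.0
```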