Every day, I hear about some nifty new measurement technique: multi-touch algorithmic attribution, eye-tracking studies, MRI brain scans. You could be forgiven for thinking that we're living through a glorious revolution in marketing measurement.
But we're not.
Instead, we're living through one of the most elaborate collective delusions in business history. We have more data than ever, and yet we understand marketing effectiveness less than we did in the 1990s.
Let me explain.
The Measurement Paradox
Here's a question that should keep every CMO up at night: if marketing measurement has improved so dramatically, why can't anyone agree on basic questions like "does this campaign work?"
I've sat in rooms where attribution software says a campaign delivered 300% ROI, media mix modeling says it delivered 50% ROI, and incremental lift testing says it might have been negative. All three are "data-driven." All three are "scientifically rigorous." They cannot all be right.
The dirty secret is that most marketing measurement doesn't measure what it claims to measure. It measures proxies. And the gap between proxies and reality has been growing for years.
The Three Great Lies
Lie #1: Last-Click Attribution Was the Problem
The industry narrative goes like this: last-click attribution was bad because it over-credited the final touchpoint. So we invented multi-touch attribution to fairly distribute credit across the customer journey.
The problem: multi-touch attribution doesn't fairly distribute credit. It distributes credit according to models whose assumptions about causation cannot be verified. You're not getting a more accurate picture—you're getting a more complicated guess.
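To make this concrete, here's a toy sketch (the journey, the revenue figure, and the credit-splitting rules are all invented for illustration) showing three common attribution schemes applied to the exact same $100 conversion:

```python
# Three ways to split credit for one $100 sale across the same touchpoints.
# None of them measures causation; each is just a different allocation rule.

journey = ["display", "social", "search"]  # touches, in order, before the sale
revenue = 100.0

def last_click(touches):
    # All credit to the final touchpoint.
    return {t: (revenue if i == len(touches) - 1 else 0.0)
            for i, t in enumerate(touches)}

def linear(touches):
    # Equal credit to every touchpoint.
    return {t: revenue / len(touches) for t in touches}

def time_decay(touches):
    # Credit doubles with each step closer to conversion.
    weights = [2 ** i for i in range(len(touches))]
    total = sum(weights)
    return {t: revenue * w / total for t, w in zip(touches, weights)}

for model in (last_click, linear, time_decay):
    print(model.__name__, model(journey))
```

Same data, three different answers: last-click gives search $100, linear gives each channel $33.33, time-decay gives search roughly $57. The "insight" is entirely a property of the rule you picked, not of anything measured in the world.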
Lie #2: More Data Means More Accuracy
We've gone from measuring campaign exposure to measuring individual ad impressions, viewability, attention time, and dozens of other metrics. Surely all this data gives us a clearer picture?
Actually, it often gives us a noisier picture. More data means more opportunities for spurious correlations. Without proper experimental design, you can find "evidence" for almost any conclusion you want.
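A quick illustration of the spurious-correlation problem, using nothing but random numbers (the sample sizes here are arbitrary):

```python
# Generate 20 weeks of "sales" that are pure noise, then scan 500 candidate
# "metrics" that are also pure noise. With enough candidates, some will
# correlate strongly with sales by chance alone.
import math
import random

random.seed(7)

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

weeks = 20
sales = [random.gauss(100, 10) for _ in range(weeks)]

best = max(
    abs(pearson([random.gauss(0, 1) for _ in range(weeks)], sales))
    for _ in range(500)
)
print(f"strongest spurious correlation: r = {best:.2f}")
```

None of these 500 metrics has any relationship to sales, yet the best of them will typically show a correlation strong enough to headline a dashboard. That's what "more data" buys you without experimental design.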
Lie #3: Digital Is More Measurable Than Traditional
This is perhaps the most damaging myth. Digital advertising promised perfect measurement: we could track every click, every conversion, every dollar.
Except: we were measuring the click, not the impact. The person who clicked on your ad might have bought anyway. The person who saw your TV ad might have converted through search. Attribution software credits the click, not the actual influence.
What Actually Works
I'm not saying measurement is impossible. I'm saying we need to be more honest about what we can and cannot know.
Here's what actually works:
- Experiments with holdout groups: The only way to know if marketing caused something is to run it in some places and not others, then compare results.
- Long-term brand tracking: Measure what people remember and associate with your brand over time. This predicts future behavior better than any attribution model.
- Media mix modeling: With all its flaws, MMM at least attempts to isolate marketing effects from other business factors.
- Simple metrics honestly applied: Sometimes "did awareness go up?" is more useful than "what was the attributed ROI to three decimal places?"
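The first item above is simpler than vendors make it sound. Here's a minimal sketch of a geo-holdout readout; the populations and conversion counts are invented for illustration:

```python
# Geo holdout: run the campaign in test markets, withhold it in control
# markets, and compare conversion rates. Incremental lift is the relative
# difference between the two rates.

test_markets = {"conversions": 1200, "population": 50_000}     # campaign ran
holdout_markets = {"conversions": 1000, "population": 45_000}  # no campaign

test_rate = test_markets["conversions"] / test_markets["population"]
control_rate = holdout_markets["conversions"] / holdout_markets["population"]
lift = (test_rate - control_rate) / control_rate

print(f"test rate:        {test_rate:.4f}")
print(f"control rate:     {control_rate:.4f}")
print(f"incremental lift: {lift:+.1%}")  # prints incremental lift: +8.0%
```

No model, no assumptions about the customer journey: just a comparison against a counterfactual you actually created. (A real readout would also need matched markets and a significance test, but the logic stays this simple.)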
The Simulation Advantage
Here's where I'll connect this to SocioLogic, because I think synthetic users offer a genuinely different approach.
Instead of trying to measure what already happened (a fool's errand with observational data), synthetic users help you understand what might happen. You can test messaging, explore customer reactions, and identify potential failure modes—all before spending a dollar on media.
This isn't measurement—it's prediction and exploration. And it's more honest than pretending our attribution dashboards tell us the truth.
The Uncomfortable Conclusion
Marketing has always been part art, part science, part luck. The measurement industry has been selling a fantasy that we can reduce everything to precise numbers.
We can't. We probably never will. And the sooner we accept that, the sooner we can start making better marketing decisions—with appropriate humility about what we actually know.
Stop worshipping at the altar of attribution. Start making decisions based on evidence, judgment, and a healthy skepticism of anyone who claims to have it all figured out.
Including, for the record, me.