When it comes to data, do yourself a favour: play chess, not whack-a-mole.

It costs me an arm and a leg every time, but the kids love going to the arcade on Brighton pier. And they always want to play whack-a-mole.

Armed with a rubber mallet, you bash plastic moles that pop up from holes in a board. It’s fun because the little blighters move fast, and as soon as you’ve hit one you have to be straight onto the next.

There’s a less pleasurable and much more expensive version of this being played out in marketing.

Because making marketing decisions by simply reading data and reacting to it means always playing catch-up. There’s so much data that as soon as you react to one thing, something else appears.

It wouldn’t matter if data always pointed in the same optimal direction. But it doesn’t.

Most metrics only show today’s relationship between marketing and business outcomes, so by tomorrow there’s a new relationship and you have to react again.

The advice is simple: stop it, and focus on the bigger picture instead. Set out a strategy and use data only to check that the plan is working.

A tale of before and after

The chart below shows econometrics findings for the same business before and after they re-engineered their way of working with marketing data.

Over the course of two years, they moved from reacting to every bit of data that crossed their dashboards to a strategic approach. And they saw a phenomenal improvement in their ability to drive profit because of it: profit per £1 spent on media increased eightfold.

In the early days, decision-making was non-stop. They compared daily sales to daily targets, changing tactics immediately and often, repeatedly opting for actions that gave instant feedback in the data.

In advertising, they used data on clicks, conversions and ROAS to decide on everything from creative to messaging, media allocation, and how much to spend.

And they managed price this way too. If sales weren’t where they should be, they offered discounts wherever data said they’d get the best response.

Even though each individual decision looked right, away from the dashboards more and more of their sales went through on deal. And their marketing was dizzyingly changeable: they ran more than 20 completely different campaign ideas in just two years.

Despite all the frenetic activity and the best efforts of an increasingly frazzled team, the big picture numbers did not look good. Spend on Google and Meta was going up faster than sales, and profit margins were falling.

The online ad machine, just like the one in the arcade, took their money, but left them unsatisfied.

Whack-a-mole at work is stressful

It’s understandable why whack-a-mole has taken hold in marketing.

Senior people quite rightly want marketing to be numerate and accountable. So, marketing data scientists create reports and dashboards in good faith.

The trouble is that ups and downs in the metrics often don’t actually indicate ups and downs in how good your marketing is.

The chart above shows that, instead, ups and downs in ROAS are mainly just indications of the strength of general demand for the category.

The weather, the economy, PR, price, competitor campaigns, and a whole range of other things that are nothing to do with your marketing choices can make your online ads look good or bad.

And because you can’t control all those things, the result your dashboard showed today is typically not achievable tomorrow.

You act on the data, but you don’t get the result you’re expecting.

So, you spend time hunched over the dashboards, rubbing your temples and stroking your chin, coming up with hypotheses that might explain what you’re seeing, and feeling ever less in control.

And then, faced with poor outcomes, you start pointing the finger at your colleagues.

Marketing thinks the data itself can’t be wrong, so the problem must be the guys that built the dashboards. And analysts blame marketers right back, saying they just don’t get data.

And all the time, underneath the frenetic activity, there’s this nagging doubt.

You know that to get real business results, you need to manage progress over months and years. But you spend your days managing day-to-day wiggles in the data instead, and you worry because success is getting away from you.

Chess is better for your business and your people

The case-study business made the transition to a better use of data by flipping the script on hypotheses.

Instead of coming up with theories to explain data, the team were asked to write down their ideas for good marketing first, and to use data only to test whether they worked.

This led to a longer list of possible actions: not just spending more on online ads or offering bigger discounts, but also bigger-picture things like brand building and positioning. There was suddenly space for results that would appear in six months’ time, as well as tomorrow.

With their theories assembled, they wove existing knowledge in and came up with a strategy. They found a good positioning and aligned all assets and copy to it, and they reallocated media budgets too.

When they swapped attribution for a stable view of cause and effect – from econometrics – they saw their marketing was now working, and they further improved it using the findings.

For the first time in a long time, this team felt good.

And because they weren’t little kids, they didn’t celebrate in the arcade. They went to the pub.