When marketing budgets are scrutinised with more intensity than ever, effectiveness has become the ultimate currency for brands. But what does it really mean for a campaign to be “effective”? The IPA Effectiveness Awards provide a rare glimpse into the mechanics of successful campaigns, offering insights that go beyond surface-level metrics. As brands navigate shifting consumer behaviours, economic uncertainty, and media fragmentation, the questions of how to measure, optimise, and prove effectiveness have never been more critical.
From the jury room, a clear theme emerged. The most successful campaigns go beyond creativity and targeting – they are rooted in strategic clarity and deliver measurable impact. It’s no longer just about econometric models or advanced data tools; it’s about agility, context, and the ability to learn and pivot in real time. Whether employing sophisticated econometrics or straightforward A/B testing, effectiveness has evolved from an abstract ideal into a nuanced, data-driven discipline.
Ultimately, as the jurors emphasise, it’s not about winning awards or ticking boxes; it’s about embedding effectiveness into the DNA of an organisation. Brands that succeed in doing this aren’t just reacting to market conditions – they are shaping their future with intent, ensuring every campaign is a stepping stone towards sustainable growth.
Below, our experts from the brand and agency sides discuss their top learnings from the judging process, and what they believe to be the most important questions for brands to answer around effectiveness right now.
It’s always difficult to be a judge when there are a lot of very strong nominations to review and mark, which was definitely the case this year! You must keep a real focus on whether the campaign met its strategic objectives, demonstrated creativity and delivered measurable results. Effectiveness awards are about more than innovative concepts; they assess the true commercial impact of marketing efforts. If brands want to set themselves up for success by investing time at the start of the process, this is what I would encourage them to think about:
Are you really clear on the problem you are trying to solve? Do you have a clear, compelling direction that everyone is aligned on and inspired by (including the broader business)?
Have you invested in really understanding who your customers are and how you are targeting them? Do you understand how the unique strengths of your brand will serve those customers?
Have you agreed what you are going to stop doing, so that you can concentrate on the key activities that will make the campaign as successful and focused as possible, rather than trying to do everything?
Have you thought up front about the measurements you will be using, and put in place the mechanisms to track, test, experiment and learn? Can you optimise and learn from the data, and pivot when the environment around you changes?
My top takeaway is that the top-performing brands can draw a straight line between their advertising and its impact on commercial performance, and we have more tools at our disposal than ever to help with this. Creating a culture of effectiveness is key to ensuring we continue to prove the value of advertising as a worthwhile investment that delivers results.
There are three key questions advertisers are asking themselves right now about effectiveness. Firstly, how to get a quick read on effectiveness so that fast yet meaningful optimisations can be made in a time frame that best maximises ROI.
Secondly, beyond ‘full-fat’ marketing mix models (MMMs), what tools can be used to understand effectiveness. And thirdly, how much data is enough to support decision-making.
In the 2024 batch of IPA Effectiveness Awards papers, we have seen significant growth in the number of papers submitted and, alongside that, a wider and more varied sample of the advertising community entering the awards. Correspondingly, we have seen a much broader range of approaches to effectiveness.
Smaller advertisers, without the budgets to unlock the power of MMM, have turned to ‘MMM-lite’ solutions as well as employing good old-fashioned A/B testing. Mid-sized advertisers, perhaps with bigger analytics budgets but not much time, or in some cases not much data, have used Bayesian models and predictive machine-learning techniques to create a framework for understanding effectiveness, allowing them to test and learn, improve their models as they go, and be super-agile in budget (re)allocation. Larger advertisers continue to use detailed and powerful MMMs, and we are seeing many supplement the data they get from those with experimental design.
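To make the ‘MMM-lite’ idea concrete, here is a minimal, hypothetical sketch of the kind of lightweight regression a smaller advertiser might run: weekly sales regressed on adstocked media spend plus a baseline. The spend and sales figures, the single channel and the 0.5 adstock decay rate are all illustrative assumptions rather than a recommended model; a real analysis would add seasonality, price, distribution and other controls.

```python
# Illustrative "MMM-lite" sketch: weekly sales regressed on adstocked spend.
# All numbers below are hypothetical and chosen only to show the mechanics.
import numpy as np

def adstock(spend, decay=0.5):
    """Carry a share of each week's spend over into the following weeks."""
    carried = np.zeros_like(spend, dtype=float)
    for t, x in enumerate(spend):
        carried[t] = x + (decay * carried[t - 1] if t > 0 else 0.0)
    return carried

# Hypothetical weekly media spend (£k) and sales (units) over eight weeks.
spend = np.array([10, 0, 15, 5, 0, 20, 10, 0], dtype=float)
sales = np.array([120, 95, 140, 110, 90, 160, 130, 100], dtype=float)

# Design matrix: intercept (baseline sales) plus adstocked spend.
X = np.column_stack([np.ones_like(spend), adstock(spend)])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
baseline, uplift = coef
print(f"baseline ≈ {baseline:.1f} units/week, "
      f"uplift ≈ {uplift:.2f} units per £k of adstocked spend")
```

The appeal of this kind of model is exactly what the jurors describe: it is cheap, quick to refit as new weeks of data arrive, and good enough to support agile budget reallocation, even if it lacks the rigour of a full MMM.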
Beyond the maths, there is a real-worldism very much present in advertisers’ minds around effectiveness. Everything around us is changing, from the economy to social norms to the media landscape and media consumption. Advertisers are therefore redoubling efforts to ensure data observations are not just relevant to the conditions of the model or test, but broadly applicable in the real world.
There was an incredible breadth of entries in the IPA Effectiveness Awards – big brands, small brands, multinationals, and not-for-profits. The best effectiveness stories did not always come from the biggest budgets or the biggest businesses, nor from the most sophisticated use of econometrics.
Instead, the common thread among the winners was rigour – in particular, identifying and isolating the other factors that could have influenced the results, and painstakingly eliminating them. The winning entries read as if they came from brands concerned with understanding the truth about how their marketing was working – so that they could improve and scale it – rather than from brands selling their story in order to win an award. My favourite entries were from brands that had done this consistently over a period of time to create a sustained impact.
From my experience in the judging room, I think brands need to ask how they can create a culture of effectiveness that is owned across multiple teams, rather than led by a single specialist unit. They also need to start with their own unique, nuanced objectives and create hypotheses and bespoke learning plans to measure whether those objectives have been met, rather than relying on generic measures. The best effectiveness entries were very different from each other because they took this approach.
Most of the entries were deep in the data. They came with stacks of econometric analyses, including MMMs, and plenty of supporting tests and customer surveys to back them up. A lot of effort just to prove to us, the lowly judges, and perhaps their CFOs, that their campaigns did what they were supposed to do. Of course, I'm not sure anyone would ever submit a failed campaign to a contest like this, but I'd sure learn a lot from them if they did.
But here's where it gets interesting for me: what happens to all of this hard-earned effectiveness knowledge? It's like finishing the last scene of a movie that already has a sequel in the works. "Coming soon!" I'm looking at you, Marvel.
Some cases took a stab at it: here's how our insights could change the entire marketing community going forward. Noble, but since I also signed an NDA, I'm not sure these secrets are making it out any time soon. Others played it tactically, talking about re-establishing momentum after Covid. Not bad, but that's a crisis management plan for the one time everything went to shit. Or the playbooks on how to revitalise a brand after decades of neglect. Sure, they could be useful, but are you really going to dust them off again in the future? Would you want to be around when that happens?
The submissions that made my cynical heart skip a beat looked at this effectiveness data and offered: "This is how these insights will shape our company going forward." These weren't just campaign-based marketers looking to solve a problem; they were forward-thinkers trying to establish their work as a sustainable growth engine for their organisation. These teams didn't seem content with a quick win. They were using effectiveness data as a means to find the next opportunity. Those were the cases truly pushing boundaries worthy of recognition. And I loved them for it.
Effectiveness award shows celebrate work that works, but to evaluate the effectiveness of a campaign you first have to ask what the work was designed to do. As marketers increasingly look to do more with less, there can be pressure to expect each piece of content to do virtually everything. I’d wager there are a lot more kitchen-sink briefs than there were even a decade ago, without an adequate comms strategy to understand deeply what each piece of content is designed to deliver. This is particularly true with ever-shorter tenures of marketing professionals on the brand side, and less enduring agency partnerships. Jump balls and rosters don’t lend themselves to longitudinal and integrative thinking, so the context that helps build the most effective campaigns risks slipping away.
Correlation, causation or does it matter? Marketing has always been a mix of art and science, maths and magic – but the era of highly trackable and traceable digital tools has given us the impression that everything is knowable. Multi-touch attribution modelling, proprietary black-box measurement systems – even with all of the econometric modelling in the world, the jury room often comes down to a healthy debate about whether one can prove that X campaign asset caused Y result. For all the data analysis, often the closest one can get is correlation, and we have to ask ourselves if that’s okay. I would argue that in many cases it is. When you isolate the variables, a strong campaign with strong reception (engagement, buzz, brand fame) will lead to strong results.
The effectiveness conversations are starting too late. You can tell the difference in the jury room between cases that were built before the results rolled in and cases that were built around whatever results happened to be available. Few agencies (or marketing organisations) have truly built a culture of effectiveness, and it’s apparent from the submissions themselves which ones have. A culture of effectiveness puts rigour and intention on the front end and bakes them into the planning before the comms are launched; results roll in and you optimise toward what’s working. It’s far more common these days to see cultures obsessed with measurement – but often that measurement is built around risk mitigation, not iterating to create ever more effective work. The difference is palpable in the iconicity and quality of the campaign. The greatest results flow from there.
With so many current conversations about standing out in the sea of sameness, I think this is an important lesson for IPA Effectiveness paper authors. Judges spend weeks before the actual judging days reading papers at midnight or at dawn to get through nearly 40 entries, each with 30 pages of often dense copy and a multitude of graphs. After the 15th paper from an FMCG brand employing the same tried-and-tested techniques to drive some solid results, I was impressed by the papers that not only demonstrated effectiveness (a given) but also did something different.
And writing this, three months after judging, it’s interesting what I remember and what slips into the lake of lacklustre. An emotional opener from the CMO of a much-loved brand, showing how deeply proud she was of the work – I remember that. That same paper showed work that didn’t start with TV, but took an idea to interesting places. Papers that employed very specific channels – surprising for the IPA, but every bit as effective as the broader entries – I remember those. Papers that took on very specific audiences, with a pointy subject matter. Papers that displayed their insights in a really arresting way that landed like a punch to the heart. Papers that employed insight derived from customer data and used interesting data techniques.
That’s not to say that the papers I couldn’t play back to you today are not worthy. It’s more that they’re solid. But as a more diverse field enters this awards scheme, finding your paper’s peacock tendencies might pay back.
To learn more about how agencies and brands can usher in a golden age of effectiveness, get your tickets for the IPA Effectiveness Conference 2024 – a hybrid conference where creatives, strategists, brand marketers, and agency leaders alike can get involved in the timely and timeless conversation about effectiveness. The conference will also see the launch of a publication called ‘Making Effectiveness Work’, which delves into current methodologies of data measurement and advises on how to navigate them.