April 17, 2024 | 4 min.

Is your creative team still wasting time on A/B tests?

Generative AI tools can help make creative asset production more efficient.
Media

What makes an advertising image effective? It’s the multi-million-dollar question that marketers have struggled with for generations. 

Sure, creativity may be limitless, but production and media budgets are definitely not. Marketers need to test out a variety of creative assets to ensure that the brilliant ideas their colleagues fawned over are actually effective out there in the real world. 

Much of this testing happens downstream, whether through A/B testing or Dynamic Creative Optimization (DCO). If your team is relying on these basic tactics, you may be spending too much time and money on less-than-efficient solutions.

Spoiler alert: Generative AI is upending the marketing landscape here, bringing exponentially more data into the measurement of creative effectiveness than ever before. AI-enabled solutions like SmartAssets can make old-fashioned A/B testing look positively archaic. 

Let’s take a look at some of the familiar testing tactics in a marketer’s toolbox, and then unpack what the future might hold.

A/B testing

A quick refresher: A/B testing, also known as 'split testing,' is a controlled experiment where two variations of an advertising creative are presented to users simultaneously. 

The idea is to determine which version performs better in terms of achieving specific goals, such as higher click-through rates (CTR), more sign-ups, or increased sales.

A/B testing thus lets marketers test hypotheses about what makes a creative asset more effective by identifying which of two differing assets drives more engagement. 
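To make the mechanics concrete, here's a minimal sketch of how the winning variant is typically called: a standard two-proportion z-test on click-through rates. The variant tallies below are purely illustrative:

```python
from math import sqrt
from statistics import NormalDist

def ab_test_ctr(clicks_a, impressions_a, clicks_b, impressions_b):
    """Two-proportion z-test: did variant B's CTR differ from A's?"""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    # Pooled proportion under the null hypothesis (no real difference)
    p_pool = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / impressions_a + 1 / impressions_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, p_value

# Illustrative numbers: variant B edges out variant A on CTR
ctr_a, ctr_b, p = ab_test_ctr(clicks_a=480, impressions_a=40_000,
                              clicks_b=552, impressions_b=40_000)
print(f"CTR A: {ctr_a:.2%}, CTR B: {ctr_b:.2%}, p-value: {p:.3f}")
```

Notice what the output gives you: a winner and a confidence level, and nothing else.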

Sounds great, sounds simple, but there are obvious downsides.

  • A/B testing results are binary: “A worked better than B, or B worked better than A.”
  • Marketers don’t really learn what specific components of an asset drove the improved performance. Was it the size of the product image? The color palette? The model, the logo, the layout? Some complicated dance between all of the above?
  • Creating the various asset versions to test in the first place requires manual and time-consuming effort, as does analyzing the results to extract meaningful insights beyond “A is better than B.”

As ever, constraints drive innovation, and so A/B testing evolved into Dynamic Creative Optimization (DCO).

Let’s take a look.

Dynamic Creative Optimization (DCO)

DCO is a technology-driven approach to advertising that enables marketers to create highly customized and relevant ads for each viewer. It's built on machine learning technology and enabled by low-cost versioning, i.e., outsourcing some creative production to emerging economies at scale. 

It goes beyond traditional static advertising by dynamically assembling ad components based on user data, behavior, context, and other real-time variables.

DCO is a step up from A/B testing for a few basic reasons.

  • More versions can be tested, period.
  • The resulting findings are immediately and automatically acted upon, ensuring that any live assets are as effective as they can be in that moment.
  • It’s more efficient and scalable than A/B testing. 

That said, the sky isn’t the limit, and DCO does have its own constraints in terms of scalability and efficiency. Let’s look at why this is by walking through the process.

To begin with, multiple versions of an ad are created and put live. The model then looks at the performance of each of the assets and identifies, based on a set of marketing parameters, which asset is performing the best. 

It then prioritizes serving the highest-performing version, effectively throwing all the other ones away. 
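Real DCO platforms weigh user data and context as well, but the core "serve the current winner" loop resembles an epsilon-greedy bandit. Here's a stripped-down, hypothetical sketch; the variant IDs and tallies are illustrative:

```python
import random

def serve_dco(variants, epsilon=0.1):
    """Pick which ad variant to serve next.

    `variants` maps a variant ID to its running (clicks, impressions)
    tally. Mostly exploit the current best performer by CTR, but keep
    a small exploration budget so laggards still get some traffic.
    """
    def ctr(stats):
        clicks, impressions = stats
        return clicks / impressions if impressions else 0.0

    if random.random() < epsilon:
        return random.choice(list(variants))          # explore
    return max(variants, key=lambda v: ctr(variants[v]))  # exploit

# Illustrative running tallies for three live versions
variants = {"v1": (120, 10_000), "v2": (95, 10_000), "v3": (150, 10_000)}
print(serve_dco(variants))  # almost always "v3", the current leader
```

Note where the traffic goes: overwhelmingly to the leader, while the other versions, and the money spent making them, sit largely idle.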

That’s ridiculously inefficient, both in terms of the effort to create the “wasted” versions, and in terms of the media budget spent on suboptimal versions that floated around while the model worked out the best one.

Secondly, DCO disregards historical data. It only looks at how an ad is performing now. The model does not take into account all the information gathered from previous campaigns. That’s a pretty glaring oversight.

Lastly, DCO, like A/B testing, still operates at the creative version level. That is to say, it can tell which of a number of creatives performs best, but it doesn't look at why.

DCO can’t spotlight the key factor in the asset that drove engagement. Was it the angle of the product, the presence of a logo, the smile on a model’s face, or the surplus of red in the image? 

Why don’t DCO solutions drill down into the granular creative components of an asset? It’s because of the huge manual effort required to tag up ads to allow the model to “read” what is in them at a component level. 

Which means it's time for the next act. Drum roll, please...

The role of generative AI

Getting insight into what makes a creative asset effective can now be done at a much deeper level using gen-AI techniques, bringing more data to the table. 

Using image recognition, it is possible to automatically tag assets up at the creative component level, without human intervention, and at speed.

This can be done for all assets (static and video) on any given platform, both current and historical. Feed this into a large language model (LLM) like ChatGPT, and suddenly you can search and query creative assets at a more granular level than ever before. 

Using this new creative component dataset, cross-referenced with media performance data, marketers can extract meaningful trends and insights about what exactly in the asset is driving engagement. (They can also use these tools to analyze their competitors' assets.)
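As a hypothetical illustration of what that cross-referencing looks like, here's a toy version: auto-generated component tags per asset, joined with per-asset CTR, then aggregated to surface which components correlate with engagement. The tags, asset IDs, and numbers are all invented for the example:

```python
from collections import defaultdict

# Hypothetical output of an image-recognition tagging pass,
# joined with media performance data for each asset.
assets = [
    {"id": "ad_001", "tags": {"logo_top_left", "model_smiling", "red_dominant"}, "ctr": 0.021},
    {"id": "ad_002", "tags": {"logo_bottom_right", "product_closeup"}, "ctr": 0.012},
    {"id": "ad_003", "tags": {"model_smiling", "product_closeup", "red_dominant"}, "ctr": 0.019},
    {"id": "ad_004", "tags": {"logo_top_left", "outdoor_background"}, "ctr": 0.016},
]

# Average CTR across the assets containing each component tag
totals = defaultdict(lambda: [0.0, 0])
for asset in assets:
    for tag in asset["tags"]:
        totals[tag][0] += asset["ctr"]
        totals[tag][1] += 1

for tag, (ctr_sum, n) in sorted(totals.items(), key=lambda kv: -kv[1][0] / kv[1][1]):
    print(f"{tag:20s} avg CTR {ctr_sum / n:.2%} across {n} assets")
```

A real analysis would need far more assets and would control for confounds like audience and placement, but the shape of the exercise is the same.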

And most strikingly, these learnings can be used to evaluate new assets before they go live. With this pre-flight intel, brands can create assets they know are going to be effective, cutting down on excess production cost that would have been spent on ineffective versions, and saving the media spend that would have been wasted in a DCO test.
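Here's a toy sketch of that pre-flight idea, assuming historical tag-and-performance data like the above is available: fit a simple model from component tags to CTR, then score a new asset before committing media spend. This illustrates the concept only, not any platform's actual model, and it assumes scikit-learn is installed:

```python
# Toy pre-flight scoring: learn a tag-to-CTR mapping from historical
# assets, then predict a candidate asset's CTR before launch.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import Ridge

# Invented historical data: component tags and observed CTR per asset
history = [
    ({"logo_top_left": 1, "model_smiling": 1}, 0.021),
    ({"logo_bottom_right": 1, "product_closeup": 1}, 0.012),
    ({"model_smiling": 1, "product_closeup": 1}, 0.019),
    ({"logo_top_left": 1, "outdoor_background": 1}, 0.016),
]
X_dicts, y = zip(*history)

vec = DictVectorizer(sparse=False)
X = vec.fit_transform(X_dicts)
model = Ridge(alpha=1.0).fit(X, y)

# Score a candidate asset from its (hypothetical) auto-generated tags
candidate = {"logo_top_left": 1, "model_smiling": 1, "outdoor_background": 1}
predicted_ctr = model.predict(vec.transform([candidate]))[0]
print(f"Predicted CTR before launch: {predicted_ctr:.2%}")
```

In practice, a brand could set a threshold on scores like this and only put media budget behind assets that clear it.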

Where SmartAssets can help

SmartAssets generates creative component data and derives creative effectiveness insights by cross-referencing with performance data. 

These powerful insights then become the basis for actionable creative recommendations: Add the logo here, increase the product size there, change the background to be outdoors instead of urban, and so on.

Historically, making any changes to a creative asset has required multiple rounds of back-and-forth with a production team. 

But with the advances in generative AI, it is now possible to make many of the recommended changes automatically, within the SmartAssets platform, without returning to post-production. This increases speed to market and decreases production costs.

The takeaway here? Creative effectiveness is truly at an inflection point. Generative AI, by increasing data and automation, is enabling scale and efficiency that previous solutions like old-school A/B testing or even DCO cannot match. 

SmartAssets is excited to be part of this next wave of creative asset testing. Click here if you’re interested in learning more about how the platform can cut down on creative waste and supercharge your marketing efficiencies.

Daniel Purnell

Daniel Purnell is the Marketing Manager for Stagwell Marketing Cloud.
