Comment & Opinion

How to build a winning ad strategy through creative testing

LifeStreet CEO Levi Matkins on how to create the best ads possible

Advertising is the most effective way to bring your game to the attention of a wider audience. While it isn’t unheard of to see games become hits thanks to positive word of mouth, these games are the exceptions to the rule.

So how should game makers optimise ad development? Throwing money at the problem isn’t necessarily the solution – it’s more important to get the right ad than the most expensive one.

In this guest post, LifeStreet CEO Levi Matkins discusses how game makers can create the optimal ad strategy through the process of creative testing.

Advertising in the attention economy is never easy, but making ads truly successful is an even harder task. Running a top-performing mobile gaming campaign requires advertisers to convert impressions into customers and maximise the value of each impression by serving the 'optimal' creative.

Serving an ad that will attract the right users can be the difference between a campaign's success or failure. However, this raises two questions: what does the term 'the optimal creative' even mean? And how do we find out what this is?

The first question has a fairly easy answer: the optimal creative is the ad that generates the maximum ROAS for a given impression. However, we must acknowledge that we'll never truly know we've arrived at the optimal creative, given the infinite possible creatives that could exist. In addition, it can be prohibitively expensive to test creatives to the point of understanding ROAS at the creative level, so installs are more commonly used as a proxy for creative efficacy. Efficacy is measured by the percentage change in yield between the "control" and the "new" creative, where yield is the number of installs divided by the number of impressions, multiplied by 100. The resulting "lift" is how we determine a creative win.
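As a rough illustration (the numbers below are hypothetical, not from any real campaign), the yield and lift arithmetic described above can be sketched in a few lines of Python:

```python
def yield_pct(installs: int, impressions: int) -> float:
    """Yield: installs per impression, expressed as a percentage."""
    return installs / impressions * 100

def lift_pct(control_yield: float, new_yield: float) -> float:
    """Lift: percentage change in yield between the control and new creative."""
    return (new_yield - control_yield) / control_yield * 100

# Hypothetical figures for illustration only.
control = yield_pct(installs=120, impressions=100_000)     # 0.12% yield
challenger = yield_pct(installs=150, impressions=100_000)  # 0.15% yield
print(f"lift: {lift_pct(control, challenger):.1f}%")       # lift: 25.0%
```

A positive lift on this metric is what counts as a creative win in the process the article describes.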

It's easy to assume that attractive, 'well-designed ads' are far more likely to drive conversions than less polished ones.
Levi Matkins

The answer to the second question is more complex, and requires us to make deeper considerations about the human element of the creative process.

For example, it's easy to assume that attractive, 'well-designed ads' are far more likely to drive conversions than less polished ones. But this isn't always the case - some strategically 'less polished' ads are among our best performers, which is why you have to listen to the data to make sure you are serving the optimal ad, not just the one you think will perform best.

This need for regular testing of creative performance has serious implications for the way ads are designed, as there are a number of elements that can regularly be refined or revamped - and tested - to impact performance.

For example, at LifeStreet, we design the ads themselves to facilitate easy edits that allow us to quickly change their core components. Through this iterative process of testing different versions of an ad against the previous winner, we can ensure we are constantly striving to serve the optimal ad at every stage of a campaign's lifecycle. Due to the inevitability of ad fatigue and shifts in audience preferences, an ad can only remain optimal for so long before it needs updating. Then, the whole cycle begins again.

The role of creative testing

Creative testing is the critical method for maximizing ad performance: trialling a range of options to see what performs best with audiences. Consequently, advertisers need to decide on a creative testing methodology and make sure they have the time and resources to manage the testing process (or work with a partner who does). This is important, because in a post-iOS 14 world with limited access to the IDFA, creative testing is the one area where advertisers still have the control and leverage to create a competitive advantage.

Winning consumer attention is not a one-step process. Rather, when a consumer sees an ad, we have to take them on a journey from initial awareness through to the end result - an install, where a user downloads and opens the app.

Winning consumer attention is not a one-step process.
Levi Matkins

By testing and optimizing for creative effectiveness, advertisers boost their chances of success: grabbing their audience's attention, driving them to take a specific action, and maximizing the efficacy of their ad spend. Not having a rigorous creative testing process wastes media dollars and carries huge opportunity costs for advertisers. Conversely, a methodical creative process can directly drive down CPIs and help customer success managers meet the advertiser's KPIs.

How does creative testing work?

At LifeStreet, creative testing is run through a series of A/B tests. We choose this approach (as opposed to multivariate testing where a number of different elements are tested at the same time) due to the amount of data and traffic - and, by extension, time - required to glean any real learnings from multivariate tests. If you want to understand why a particular creative performs well, it's important to focus on testing only one element at a time in order to understand how each element influences an ad's performance.

By pitting multiple challenger versions of a creative against previous winners, over numerous A/B test phases, we can eliminate the versions that perform poorly to continually work towards that elusive optimal creative.
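One common way to decide such a champion-versus-challenger test phase is a two-proportion z-test on install rates. The sketch below is an illustrative assumption, not LifeStreet's actual methodology; the figures and the 95 per cent confidence threshold are hypothetical:

```python
from math import sqrt
from statistics import NormalDist

def ab_winner(champ_installs, champ_impr, chal_installs, chal_impr, alpha=0.05):
    """One-sided two-proportion z-test: does the challenger's install
    rate significantly beat the champion's?"""
    p1 = champ_installs / champ_impr
    p2 = chal_installs / chal_impr
    pooled = (champ_installs + chal_installs) / (champ_impr + chal_impr)
    se = sqrt(pooled * (1 - pooled) * (1 / champ_impr + 1 / chal_impr))
    z = (p2 - p1) / se
    p_value = 1 - NormalDist().cdf(z)  # one-sided: challenger > champion
    return ("challenger" if p_value < alpha else "champion"), p_value

# Hypothetical test phase: champion at 0.12% yield, challenger at 0.15%.
winner, p = ab_winner(120, 100_000, 150, 100_000)
```

With these illustrative numbers the challenger's higher yield clears the threshold and would become the new champion for the next test phase; a smaller gap, or less traffic, would leave the champion in place.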

Getting the edge over your competitors

Optimising the performance of your own campaigns is one thing, but improving your creative to gain an edge over competitors requires further work. Specifically, it requires continuous market research to understand which ads are performing best across particular categories.

By drawing on trending concepts from our proprietary creative repository and implementing them into our creative testing strategy, we have seen huge leaps in performance. For example, in one campaign the creative trend we applied drove a 62.57 per cent lift in yield compared to the previous best-performing ad; ROAS increased 3.7x and the CPI dropped. Ultimately, the advertiser was able to secure more installs and payers from their ad spend. In another campaign we tested, the "trending" creative concept increased the number of users purchasing in-app content by 357 per cent. Given that a campaign's performance is the measure of its success, these figures demonstrate the need to switch up creative solutions to maximize a campaign's results.

The message could not be clearer: test, learn and instil a deep sense of performance curiosity in every aspect of the design and implementation of creative testing. The advertisers who are willing to innovate and experiment are the ones who will have the greatest success - particularly in a privacy-first ad market where getting messages through to users has never been tougher.