How to A/B Test Your Text Ads
By 2018, the world will see 9.3 billion mobile data plans in use. When it comes to capitalizing on the vast potential of mobile ads and SMS campaigns, marketers have barely begun to scratch the surface. Mobile marketing budgets in the U.S. alone will soon reach $4.4 billion, but unless businesses measure the effectiveness of their messaging, they may be wasting money.
This is exactly why A/B split testing is an essential tool for successful mobile marketing. Instead of making educated guesses, marketers can analyze concrete evidence to assess which campaign messaging works. Using this method to fine-tune one of its own marketing campaigns, top Australian parenting site Kidspot, for example, saw user app engagement go up by 87% in just two weeks.
What is A/B Split Testing?
A/B split testing is a method for testing two variations of a campaign to compare their performances. The A/B split testing of SMS texting campaigns helps boost ROI, allowing more marketing dollars to be invested in the messaging that drives the most sales. It also makes optimizing future campaigns possible.
Regardless of whether the goal is increasing response rates, boosting sales, or simply reducing “unsubscribe” rates, the method is simple:
- Write two variations of the SMS message.
- Split recipients into two lists.
- Message variation A goes to one list, variation B to the other.
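The steps above can be sketched in a few lines of Python. This is a minimal illustration, assuming the recipient list is simply a list of phone numbers; the seeded shuffle keeps the split random but reproducible:

```python
import random

def ab_split(recipients, seed=42):
    """Randomly split a recipient list into two roughly equal groups."""
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)  # seeded so the same split can be re-created
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]  # group A, group B

# Hypothetical phone numbers for illustration only
numbers = [f"+1555000{i:04d}" for i in range(1000)]
group_a, group_b = ab_split(numbers)
```

A random split matters: dividing the list alphabetically or by sign-up date can quietly bake a demographic difference into the two groups, muddying the comparison.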
To track which campaign message generates which responses, variations A and B should ask consumers to text in to two different short codes. If two different short codes cannot be used, the two messages should at least be sent a few days apart, giving the first mobile ad's effect some time to wear off for a better read on the data, which may be viewed via Google Analytics or a similar tool.
How are A/B Split Test Results Used?
Once data is collected, these questions should be asked:
- Did one message have a higher “open” rate than the other?
- Of the messages opened, did one variation have a higher response rate?
- Did one message have a lower “unsubscribe” rate?
- Which message generated the highest number of sales or inquiries?
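One common way to check whether the gap between variation A and variation B is a real difference rather than noise is a two-proportion z-test. The sketch below uses only Python's standard library, and the counts are made up for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of two groups.

    conv_a / conv_b: number of conversions in each group
    n_a / n_b: number of messages sent to each group
    Returns the z-score and a two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: 120 of 1,000 converted for A, 90 of 1,000 for B
z, p = two_proportion_z(120, 1000, 90, 1000)
```

A p-value below 0.05 is the conventional threshold for calling the difference significant; with small lists, even a large-looking gap in rates can fail this test.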
If there is no significant difference between the two messages' performance, it's possible the SMS campaign messages were too similar. The following factors should be taken into account:
- Tone: Did both messages sound professional or friendly?
- Keywords: Was a compelling keyword used in one message or both?
- Price: Did only one version include the price, or did both? And were prices similarly presented?
- Calls to action: How were they different? For instance, did they both ask recipients to text, or did one ask consumers to call?
- Time of day: Consider testing to see if different times of day lead to higher open, response, and/or conversion rates as well.
- Subscriber lists: Consider testing one consumer demographic over another. For instance, some campaigns may be more effective with recipients living in suburban zip codes than those living elsewhere.
If one message significantly outperforms the other, principles used in the most successful message can be applied to future campaigns.
Companies may split test more than two variations (known as multivariate testing) so long as their recipient list is large enough for a good sampling. A/B split testing and multivariate testing are excellent ways to ensure marketing dollars are going where they pay off the most and that each mobile campaign is more effective than the last. No matter how well an SMS or text ad campaign performs, there is always room for improvement.