When it comes to reaching out to your app users it can be tempting to make assumptions about the types of content and messages each persona would be likely to engage with. But we all know that to assume something makes an ass out of u and me (yep...embarrassingly, I went there 🙄).
What’s the point in wasting time on assumptions when you have the ability to test what works best for each user segment and actually learn about their preferences? With all the time and effort you put into your app’s marketing communications, it is essential that you are winning over your users. You need to make sure you can actually increase app engagement and user retention as a result of your push notifications. That’s where A/B testing comes in...
What is A/B Testing?
A/B testing is the comparison of two or more versions of a push notification to see which one performs better. These tests are sent to small, randomly selected groups of users before the winning version is sent to the remaining majority of relevant users.
For example, let's say you have a list of 5,000 users you want to receive a certain push notification. Divide this list randomly in two, so you now have two lists of 2,500 users. Select one of the lists and divide it in two again (or more, depending on how many versions you are testing). You will now have three lists: two with 1,250 users and one with 2,500 users. Send your A/B test versions to the smaller lists and wait for the results. How long you wait is totally up to you and can be anywhere from 1 hour to 24 hours. After this time, evaluate the results and send the version that performed best to the remaining larger list of 2,500 users.
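If you want to automate that split, here's a minimal Python sketch. The function name and the 50/50 proportions are illustrative, not taken from any particular push platform:

```python
import random

def split_for_ab_test(users, num_variants=2):
    """Randomly split a user list into a holdout group (half the list)
    and equal-sized test groups for each variant (sharing the other half)."""
    shuffled = users[:]          # copy so the caller's list is untouched
    random.shuffle(shuffled)
    half = len(shuffled) // 2
    holdout = shuffled[half:]    # the larger list that later gets the winner
    test_pool = shuffled[:half]
    size = len(test_pool) // num_variants
    test_groups = [test_pool[i * size:(i + 1) * size]
                   for i in range(num_variants)]
    return test_groups, holdout

# Example: 5,000 users, two variants -> two groups of 1,250 and a holdout of 2,500
groups, holdout = split_for_ab_test(list(range(5000)))
print([len(g) for g in groups], len(holdout))  # [1250, 1250] 2500
```

Once the waiting period is over, you would send the better-performing variant to the `holdout` list.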
How long you choose to wait is obviously pretty dependent on the messaging you are testing. If the content is time sensitive, for example a discount code that expires in 24 hours, then of course you aren't going to wait much longer than an hour for your results to come in.
Related content: [Video] The Best Practices for Push Notification Permissions
One at a Time
Before we dive into the list of variables you can A/B test, there's one thing worth mentioning.
Test one variable at a time.
If you test emojis at the same time you test using humour, how will you know which variable contributed to the increase in engagement? Keep it simple so that your results are clear.
What Variables for Push Notifications Can You A/B Test?
Personalisation

Using a user's name in a push notification may catch their attention. However, with the right tool, personalisation can go way beyond a basic first name. Your user most likely handed you that information on a plate the moment they first registered for your app... so it's not really very impressive.
Show your users that you pay attention and that you care about their individual needs by sending them messaging based on their previous in-app activity or personalised information such as location (see Uber's example below). Test a personalised push notification against an unpersonalised one; I guarantee the personalised push message will come out on top and help you increase app engagement. Even basic push notification personalisation has been reported to increase open rates by nearly 10%.
Copy

Say the same thing… but a little differently. Test whether your users respond better to short sentences or long sentences, humour or no humour, capitals or no capitals… You'd be surprised at how the smallest edit can cause the biggest change in results.
Image credit: Swrve
Rich Media

You can send images and GIFs directly to your users' devices without any need for them to click into the app. It's super effective; media-rich push notifications have been found to generate a 25% increase in engagement. Test whether this has an impact on your click-through rates.
Emojis

Emojis and GIFs can play a massive part in your mobile marketing. However, this does not make them appropriate or necessary at all times. Your decision on whether to use emojis is largely dependent on the user segment you're working with or the message that is being pushed out.
Use A/B testing to discover when they work well, and when they don't. If they do work for your brand, you stand to increase your engagement by up to 20%!
Humour

Again, humour is something that can be sensitive depending on the user segment and the message. It may also not fit your brand image. But hey, there's no harm in testing it!
Related content: Blog - Can Humour Work for Push Notifications?
Send Time

Certain groups of users are more likely to engage with your push notifications at particular times of the day. Make sure to take time zones into consideration here too, and segment your users accordingly so you don't annoy anyone in the middle of the night. You can even go a step further with a platform that incorporates intelligent delivery: the platform learns the best times of day to send messaging to each user by analysing their previous in-app behaviour.
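To see why the time-zone part matters, here's a minimal Python sketch using the standard zoneinfo module (the function name is illustrative): the same "6 pm local" send time maps to a different UTC instant for each user's time zone, so a single global send time will inevitably hit some users at the wrong hour.

```python
from datetime import date, datetime, time
from zoneinfo import ZoneInfo

def local_send_time_utc(send_date, local_time, tz_name):
    """Convert a target local send time (e.g. 6 pm in the user's
    time zone) into the UTC instant to schedule the push for."""
    local_dt = datetime.combine(send_date, local_time, tzinfo=ZoneInfo(tz_name))
    return local_dt.astimezone(ZoneInfo("UTC"))

# Example: a "6 pm local" push lands at different UTC times per zone
for tz in ["Europe/Dublin", "America/New_York"]:
    print(tz, local_send_time_utc(date(2024, 3, 1), time(18, 0), tz))
```

Grouping users by time zone before scheduling is the simplest way to apply this; intelligent delivery platforms go further by learning a per-user send time.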
A/B testing push notifications can drive higher engagement and retention rates for your app as you continue to learn over time what sort of content resonates best with your users. Just don't view it as a one-off task, and definitely don't think that just because something worked for someone else, that it's going to work for you too. Keep evolving and testing new things to figure out what gives your push notification campaigns the best results.
If you still have any questions about how to A/B test push notifications, or need help with your app user engagement, don't hesitate to contact me directly via firstname.lastname@example.org. You may also be interested in downloading our free guide, The Ultimate Guide to Mobile App Marketing.