Hello cold emailers, today we are talking about A/B testing for cold email. In our experience, many folks are unclear on why A/B testing is so important, or on how A/B testing for cold email differs from A/B testing for landing pages.
In this episode we go over the definition of A/B testing, how to compare apples to apples when reading your testing results, and how to improve your positive reply rate week over week. TUNE IN!
HERE’S WHAT YOU WILL LEARN:
- What is A/B testing and why is it essential for your cold email, even if you are getting more replies than you can handle?
- What is the adjusted positive reply rate and why is it the only metric that counts when it comes to comparing A/B test results?
- How many prospects should you send to in order to run a proper A/B test for cold email?
- How to use A/B testing to increase the deliverability of your cold email?
- How many variations do you need per cold email and what elements do you need to change in them?
- What are the most common pitfalls with cold email A/B testing and how to avoid them?
To help you get the most accurate cold email A/B testing results, we want to share the formula for determining the adjusted positive reply rate.
Adjusted positive reply rate = (number of positive replies / number of unique opens) * 100.
- 100 sent (50% opened) -> 10 positive replies = 10% reply rate = 20% adjusted positive reply rate
- 100 sent (75% opened) -> 10 positive replies = 10% reply rate = 13.3% adjusted positive reply rate
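If you want to run the numbers on your own campaigns, the formula above is easy to script. Here is a minimal Python sketch (the function name and the second set of campaign numbers are just illustrative, not from the episode):

```python
def adjusted_positive_reply_rate(positive_replies: int, unique_opens: int) -> float:
    """(number of positive replies / number of unique opens) * 100."""
    if unique_opens <= 0:
        raise ValueError("unique_opens must be positive")
    return positive_replies / unique_opens * 100

# Campaign from the example above: 100 sent, 50% opened (50 unique opens), 10 positive replies.
print(adjusted_positive_reply_rate(10, 50))  # -> 20.0

# A hypothetical variation: 60 unique opens, 15 positive replies.
print(adjusted_positive_reply_rate(15, 60))  # -> 25.0
```

Note that the denominator is unique opens, not emails sent, which is what lets you compare variations with different open rates on equal footing.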
To conclude: ALWAYS use A/B testing when you want to take your cold email from a 5% to a 20% reply rate, go crazy with different variations to see which angle works best, and focus mainly on the adjusted positive reply rate when comparing your testing results.
Happy cold emailing,
Jeremy and Jack