
Prove yourself wrong with A/B testing.

We did some A/B testing on the Metia Newsletter this month.

For anyone who doesn’t know how this works: you take a small percentage of your list (usually about 20%), send one version of the newsletter to half of that group, and a version with one small change to the other half. You monitor performance for 48 hours, then send the winning version to the remaining 80% of the list. That way, 90% of recipients receive the ‘better’ version of the email: the winning half of the test group plus the 80%.
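The split described above can be sketched in a few lines of Python. This is a hedged illustration, not how any particular email platform does it: the list size, addresses, and 20% test fraction are assumptions taken from the paragraph.

```python
import random

def ab_split(recipients, test_fraction=0.2, seed=42):
    """Split a mailing list for an A/B test: a test group
    (here 20%, as in the post) divided evenly between two
    variants, plus a holdout that later receives the winner."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = recipients[:]
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    half = test_size // 2
    variant_a = shuffled[:half]
    variant_b = shuffled[half:test_size]
    holdout = shuffled[test_size:]
    return variant_a, variant_b, holdout

# Hypothetical 1,000-address list:
addresses = [f"user{i}@example.com" for i in range(1000)]
variant_a, variant_b, holdout = ab_split(addresses)
# 100 get variant A, 100 get variant B, 800 wait for the winner
```

After the 48-hour window, whichever variant performed better is sent to `holdout`, so 900 of the 1,000 recipients end up with the winning version.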

So we tested the ‘Friendly From’, which is the name that appears instead of the sender’s email address. We tested what we’ve been using in previous months (the head of department’s name for the relevant list) against ‘Metia’.

Every article I’ve seen claims that testing shows customers prefer to receive emails from the company name rather than an individual’s name. This was backed up by some testing I did at my previous company. So I was ready to see another win for the emails that went out from ‘Metia’. I’d have probably put money on it…

I would have lost that money! The open rates for the emails sent from individuals’ names were about four times higher than the emails sent from ‘Metia’.

Now, it could be down to the fact that many of the people who have subscribed to our newsletter know the senders personally, so seeing their name at the top makes the email a higher priority than seeing ‘Metia’. Our test group was also relatively small, and as we all know, the larger the number of recipients, the more reliable the test results. But that might just be me trying to save some face!
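The sample-size caveat above can be made concrete with a standard two-proportion z-test. The open counts below are hypothetical (roughly a four-to-one open-rate ratio, as in the result described, but invented numbers): the same ratio that is clearly significant on 200 test emails is inconclusive on 20.

```python
import math

def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-statistic for comparing open rates:
    a rough check of whether an observed difference could
    just be noise. Uses the pooled standard error."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / se

# Hypothetical figures: 40/100 opens vs 10/100 opens
z_large = two_proportion_z(40, 100, 10, 100)
# Same open-rate ratio on a tiny list: 4/10 vs 1/10
z_tiny = two_proportion_z(4, 10, 1, 10)
# |z| > 1.96 is the usual 95% confidence threshold; the tiny
# sample falls short of it even though the ratio is identical
```

With 100 recipients per variant the z-statistic is around 4.9, well past the 1.96 threshold; with 10 per variant it drops to about 1.5, so the identical-looking ratio proves nothing.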

So, it just goes to show: no matter your experience in the field, no matter what’s worked for you or anyone else on previous campaigns, every email and every audience is different. Unless you test, you’re just guessing.