Testing Optimization: Making Good Emails Better

This is week 7 of MDR’s Email Best Practices series

Emails talk back instantly.

Did they open it? Did they click through? Did they visit your landing page? Click the CTA button? For email marketers, these insights give us the opportunity to adapt our campaigns based on what works and what doesn’t. And the low cost of email vs. other media means you can experiment freely to fine-tune campaigns for the best results.

Knowing the education market as well as we do, MDR is able to advise our clients on how best to apply the insights from campaigns to continually improve performance and recipient experience. After all, happy email recipients make for happy email marketers. A/B testing combined with our Campaign Analyzer tool gives clients insights into the performance of individual campaigns. We aggregate data across campaigns to identify larger trends in response rates. Using these models, we’ve cataloged best practices in email testing to help you make good emails better.

What is A/B Testing?

A/B testing is the practice of splitting an email list and sending a different variation of an email to each audience to gauge reaction. You can test any element of the email itself, as well as your audience segmentation and the timing of when you send the email. Below are some of the A/B tests email marketers are using to improve campaign performance.

It is best to test only one variable at a time so you can draw a clear conclusion. In most testing scenarios you split your list into equal segments, both in volume and demographics, so any difference in results can be attributed to the email version seen rather than the audience who saw it. Here are the major categories of email A/B testing you can try:
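The even, randomized split described above can be sketched in a few lines of Python. This is an illustrative helper, not an MDR tool; the function name and sample addresses are hypothetical:

```python
import random

def split_ab(recipients, seed=42):
    """Randomly split a mailing list into two near-equal segments
    for an A/B test. Shuffling first helps keep the segments
    demographically balanced on average."""
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)  # fixed seed so the split is repeatable
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]  # group A, group B

# Hypothetical list of 1,000 recipients
group_a, group_b = split_ab([f"user{i}@example.com" for i in range(1000)])
```

With a true random assignment like this, any measured difference between the two sends can more safely be attributed to the email variation rather than to who received it.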


Messaging: Are you saying the right things in the right way?

In Week 4 of this series, we talked about how important it is to make every word count in the limited real estate of an email. Testing is how you measure the effectiveness of the words you choose. Some of the messaging elements you can test are:

  • Subject Line (phrasing, length, is location added, is it personalized, etc.)
  • Message (e.g., greeting, headlines, body, closing)
  • Call to Action (How long, location within the email, language)
  • From Line (From a person or a brand)
  • Landing Page (The same message as the email, or a variation)

Not all words are equal. People react differently to certain words or phrases. Some verbs incite people to action, and some adjectives resonate more than others. As we discussed in Week 4, just changing the wording of a CTA button (e.g., from Order Super Product to Get Super Product) transforms that action from something that benefits you to an action that benefits them. Small word changes, big change in response.

Key Takeaway: Testing can reveal how small wording changes make big impacts.   

Design: Does the look of the email support the objective?

A poorly designed email can undermine its effectiveness, while a well-designed one can boost performance. Testing the elements of your design can reveal whether your design choices are helping or hurting. You can try testing:

Layout and images used
• Is a complex layout with tables better than a simpler design?
• Will a photo with people do better than a product image?
• Is your CTA a link or a button?

Mobile Layout
• In what order do elements appear?
• Are your buttons or links large enough?
• Are you using less copy than the desktop version?

Good design is subjective, so start with a design that follows best practices (as we covered in Week 5), then test to see whether small adjustments can tune it to the tastes of your audience.

Key Takeaway: Testing can identify the most effective design for an email.

Targeting: Are you reaching the right people?

Is your offer the right fit for this person? The broad categories we use to create an email campaign segment still leave room for tightening the focus to make the offer more appealing. Testing your targeting can take a number of forms, but some common approaches include:

Purchasing History: The most frequently used targeting test is based on past purchase behavior: have they purchased from you before, how long ago, and which products? Repeat purchasers should have their loyalty acknowledged through the familiarity of your tone and the level of incentive you offer. People who have never purchased may need more information and proof points in the copy to convert. While segmenting by purchase history may take more back-end data effort, these tests can yield the greatest results.

State of Mind: How did this person come to be on your email list? And how long has it been since they engaged with you? For example, if you acquired their name through a list broker, they may need more incentive to engage than if they gave you their name at a trade show.

Geography: The U.S. is a big country with distinct regional cultures. Just as you might adapt how you speak about a topic depending on where in the country you are, your email may need to do the same. Being sensitive to unique state standards, different education policies, state adoption vs. open territory, your product category, recent events or issues, and even the correct use of colloquial phrases can all help put your email in the local language and make it more appealing.

Another testing technique to verify audience targeting is personalization by discipline or interest. Calling out that interest in a subject line, headline, or with an image or graphic will focus the topic of the email and quickly reveal if it catches your audience’s eye or not. Learn more about email personalization by referring back to our Week 2 article.

Key Takeaway: Testing can help you fine-tune your audience targeting.

Scheduling: Does when you send your email matter?

When your email arrives in the inbox will have a big impact on how it is received. Testing is a great way to get to know the email habits of your audience and when they are most receptive. Try testing:

  • Days of the week sent
  • Time of day sent
  • Particular months

Educators in particular follow a pattern in the times of the year they are researching and purchasing for the next school year, as MDR’s research shows.


Key Takeaway: Testing can reveal when your audience is most receptive to your email.

A/B testing allows marketers to add science to art, learning which tweaks yield better results. Results don’t have to be random and variable: by reviewing what your audience does and does not react to, you can construct an email that will garner results. What you learn from each test should be applied to the next deployment for continuous improvement.
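Adding science to art also means checking whether a difference between version A and version B is real or just noise. One common way to do that (a sketch under our own assumptions, not an MDR tool) is a two-proportion z-test on a metric such as open rate; the function name and campaign numbers below are hypothetical:

```python
from math import sqrt, erfc

def compare_open_rates(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test sketch: is version B's open rate
    significantly different from version A's? Returns (z, p_value).
    Assumes reasonably large, independently assigned segments."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)       # combined open rate
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se                                   # standardized difference
    p_value = erfc(abs(z) / sqrt(2))                       # two-sided p-value
    return z, p_value

# Hypothetical campaign: 18% vs. 22% open rate on 1,000 sends each
z, p = compare_open_rates(180, 1000, 220, 1000)
```

A p-value below a threshold such as 0.05 suggests the winning version's lift is unlikely to be chance, so its change is worth carrying into the next deployment.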


About MDR: Build your brand, champion a cause and/or inspire a child by leveraging our suite of email marketing, market intelligence, creative services, database and technology. Deliver relevant content and create meaningful connections with millions of educators, parents, and children who can benefit from your organization with our unique connectivity to the market. Speak directly to your desired audience through our WeAreTeachers, WeAreParents and EdNET communities. Align your brand with the most trusted partner in education marketing, MDR. Contact us today.

Let's Talk

Contact MDR for a free consultation. Call us or we can call you…