
5 Tips to Help You Think About the Big Picture with Test Results

By Andrea Amboyer

Marketers use data to interpret test results, but sometimes you need to look beyond the numbers to understand how your tests fit into the overall customer experience.

Our recent work with a client that provides background screenings for landlords and property managers generated some fascinating data. It also helped illuminate 5 tips you can use in your own email marketing work to understand your data better.

Our client provides screening services that landlords use to review background information about potential tenants. Customers buy screenings as needed, with one landlord often screening multiple applicants for multiple properties.

The purchase cycle for screenings is based on when a lease expires. Customers who converted in the past are sent targeted messages one year after their last screening date.

All customers and prospects typically receive one marketing message per month: a newsletter-style template highlighting new blog articles. Our client also sends occasional promotional emails with an offer for a free or reduced-cost screening.

First, some background on what our client was looking for:

For over two years, 90% of their marketing emails focused on blog content and used the newsletter template. Our client came to us wanting to know if this was the best approach but didn’t know how to start testing.

Some of the remaining 10% of marketing emails used the more promotional template. We started with that template for our test variation, as the content was focused on the screening services and was already built. This minimised additional production work.

Tip 1: Always start with a clear customer behaviour-driven hypothesis.

We started with the following hypothesis: A blog email with editorial content will drive more subscriber engagement than an email about the features and benefits of screening potential tenants.

We based our hypothesis on the following information:

  • Most leases run for at least a year, and most landlords in the database have either a small number of properties or just one rental unit.
  • Most landlords need screening services only one to two times a year.
  • The promotional emails generally don’t perform well for engagement.

Holistic recommended starting with some dramatic A/B creative tests to explore how their subscriber list would respond to different types of email content.

Before working with Holistic, our client had tested only single elements, such as subject lines or CTA language. To make this test efficient for the production team, we used the current blog template as the control and modified the promotional email template.

We renamed the modified promo template the “feature template” and used it as the variation creative.

Over three months, we ran the test four times. The control blog templates consistently had better click-through rates (CTR) and click-to-open rates (CTOR).

We assumed our data told the whole story …

…until we added in Google Analytics data.

Tip 2: Make sure you have access to all relevant metrics and data.

Ask your analytics team if you have access to all tracking platforms – you might be missing out on valuable data.

Although our client adds UTM tags (e.g., “utm_source”) to all emails, its system of record for revenue is its internal database. Google Analytics estimates revenue using the average value of a screening (because the different screening options are priced in tiers).
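As a rough illustration of this kind of tagging, here is a minimal Python sketch that appends UTM parameters to an email link so Google Analytics can attribute the resulting traffic. The URL, source, and campaign names are hypothetical, not our client’s actual values:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def add_utm_tags(url: str, source: str, medium: str, campaign: str) -> str:
    """Append UTM parameters so the analytics platform can attribute email traffic."""
    parts = urlsplit(url)
    utm = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    # Preserve any query string already on the link.
    query = f"{parts.query}&{utm}" if parts.query else utm
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

# Hypothetical link from a feature-template email.
tagged = add_utm_tags(
    "https://example.com/screening",
    source="newsletter",
    medium="email",
    campaign="feature-template-test",
)
print(tagged)
```

Tagging every link consistently is what makes it possible to reconcile click data in the ESP with session and revenue data in Google Analytics later.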

We initially focused on ESP-only data for this test because our hypothesis concentrated on subscriber engagement.

The control blog template did drive more clicks. However, the variation feature template had a much higher conversion rate (i.e., landlords were screening at a much higher rate), and the difference in estimated revenue between the two templates was less than $1,000 for three of the four tests.

Campaign 2’s control blog template led with an article about rental scams. This is a hot topic and consistently drives much higher traffic than average. Even with this buzzy rental scam article, the variation feature template still converted at a 92% higher rate.
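To make the arithmetic behind a “92% higher rate” concrete, here is a small Python sketch. The conversion rates used are hypothetical, chosen only to reproduce that lift:

```python
def conversion_lift(rate_control: float, rate_variation: float) -> float:
    """Relative lift of the variation over the control, as a percentage."""
    return (rate_variation - rate_control) / rate_control * 100

# Hypothetical rates: 0.50% for the control, 0.96% for the variation.
print(f"{conversion_lift(0.0050, 0.0096):.0f}% lift")  # → 92% lift
```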

Tip 3: Understand how your creative layout and click paths connect to your test results

If we had focused only on clicks without looking at revenue and the number of screenings (which drive our client’s business), we wouldn’t have had a clear picture of what happened after customers clicked on the call to action. But because we had access to screening and revenue data, we could dig deeper and understand the results in the context of the customer journey.

The test results became even more compelling once we considered where the creatives linked on our client’s website:

  • The control blog template links to three to five blog articles as well as the screening landing page.
  • Fewer than 20% of the links in the control blog template pointed to the screening landing page, and only about 10% of total clicks went there.
  • In contrast, all CTAs in the variation feature template link to the screening landing page. Thus, almost 100% of the clicks went directly to the screening landing page.
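The click-path breakdown above boils down to a simple share calculation. This Python sketch uses hypothetical click counts chosen to roughly match the 10% and near-100% figures described:

```python
def landing_page_click_share(clicks_by_link: dict[str, int], landing_page: str) -> float:
    """Fraction of all clicks in an email that went to the landing page."""
    total = sum(clicks_by_link.values())
    return clicks_by_link.get(landing_page, 0) / total if total else 0.0

# Hypothetical click counts per link (not our client's actual data).
blog_clicks = {"blog-article-1": 450, "blog-article-2": 300,
               "blog-article-3": 150, "screening-lp": 100}
feature_clicks = {"screening-lp": 400}

print(landing_page_click_share(blog_clicks, "screening-lp"))     # 0.1
print(landing_page_click_share(feature_clicks, "screening-lp"))  # 1.0
```

Even with fewer total clicks, the template that routes every click to the conversion page can send more traffic where it matters.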

Yes, the blog control template drove more clicks in general. But that’s not the whole picture. The variation feature template drove more clicks to the screening landing pages.

This highlights the rule that your success metric must map back to the objective for your email or program. If you focus on the wrong success metric, you will optimise for the wrong result, thus sacrificing untold conversions and revenue.

Tip 4: Calculate statistical significance whenever possible.

We ran our test four times, which gave us enough data for the results to reach a 99% confidence level. That volume also let us spot clear patterns and account for aberrations (the “buzzy” scam article).
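For readers who want to run the numbers themselves, a common way to check significance on a conversion-rate comparison like this is a two-proportion z-test. This Python sketch uses only the standard library; the conversion counts are hypothetical, not our client’s actual figures:

```python
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-tailed z-test comparing conversion rates of two email variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis that the rates are equal.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical: 30 conversions from 5,000 sends vs 55 from 5,000.
z, p = two_proportion_z_test(conv_a=30, n_a=5000, conv_b=55, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the 99% level when p < 0.01
```

A p-value below 0.01 corresponds to the 99% confidence level described above.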

Use our Statistical Significance Calculator

Tip 5: Consider how the test results fit into the big picture, such as a typical customer journey.

Both content types—blog content and product content—are valuable. Because a landlord doesn’t need screening services every month, our next tests will explore the optimal cadence for the variation feature template.

We aim to learn more about the subscribers who converted from each email type:

  • Were they first-time customers?
  • If they were prospects, how long after their opt-in dates did they buy for the first time?

We used the following information to create a new hypothesis:

  • Our client sends customers a lease-expiration reminder 11 months after their first screening. This email performs well.
  • Most customers buy their first screenings within two days after opting into emails, suggesting opt-in date is significant.

Here is our new hypothesis: Prospects will respond well to a variation feature template email 11 months after their opt-in date, as this aligns with their rental property lease expiration.

Stay tuned to discover if we were right! We’ll reveal the complete results soon in a blog post.

Subscribers to our biweekly Holistic Insights newsletter will get a detailed sneak peek!

Sign up now so you’re among the first to find out. Plus, you’ll get a digest of the latest digital marketing news, first crack at registering for events like our award-winning Email & More webinar series, and much more.