5 Advanced Email A/B Testing Best Practices

Learn how you can maximize the performance of your email marketing campaigns by adopting these 5 top A/B testing best practices.


We know email marketing is effective. According to Copyblogger, email marketing yields an average ROI of 4,300% and is nearly 40X more effective at new customer acquisition than Facebook or Twitter. With 85% of American adults checking their email at least once per day, it’s a channel that can’t be ignored.

That said, you aren’t going to see big numbers like that if you aren’t actively testing the performance of your email campaigns. A/B testing is a great tool to help improve your email marketing performance – but only if you know what you’re doing.

Email A/B Testing Basics

A/B testing, as you may already know, involves presenting users with two options in order to see which alternative performs better. In the case of email A/B testing, that might mean sending half of your list one version of an email and the other half a different version, while you watch for changes in your open rate, click-through rate or other KPI.

The best practices described below represent the foundation of an effective A/B testing program. If you’re already familiar with the general structure of A/B testing campaigns, feel free to skip to the next section. Otherwise, make sure you’ve mastered these basics before increasing the complexity of your program.

  • Set a control version against which tests can be run. Don’t just pit two random emails against each other, then start fresh with two new emails. Always have a control version (often, the winner of previous tests) so that you’re working off of baseline performance values.
  • Test a single variable at one time. If you change five variables in each email version you send out, you won’t know which of your changes actually contributed to any performance improvements you see.
  • Make sure you’ve hit statistical significance before declaring a winner. Statistical significance helps you determine how likely it is that any lift you’re seeing is the result of the changes you’ve made rather than random chance. Use a calculator to make sure your results are legit (see the sketch just after this list).
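If you don’t have a significance calculator handy, the math behind most of them for open or click rates is a two-proportion z-test. The snippet below is a minimal, illustrative sketch in Python; the function name and the example numbers are hypothetical, and you should still sanity-check results against the calculator built into your email tool.

```python
from statistics import NormalDist

def ab_significance(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test on open rates; returns the two-sided p-value."""
    rate_a, rate_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)           # pooled open rate
    std_err = (pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b)) ** 0.5
    z = (rate_b - rate_a) / std_err
    return 2 * (1 - NormalDist().cdf(abs(z)))                  # two-sided p-value

# Hypothetical example: 5,000 recipients per variant, 1,000 vs. 1,080 opens
p_value = ab_significance(1000, 5000, 1080, 5000)
print(f"p-value: {p_value:.3f}")  # declare a winner only if p is below your threshold (commonly 0.05)
```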

Your email marketing solution should offer you A/B testing functionality, but even if it doesn’t, you can create your own testing protocols by manually segmenting lists and creating separate campaigns for each.
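A manual protocol can be as simple as shuffling your list, cutting it in half, uploading each half as its own segment, and sending one version to each. The sketch below is a minimal illustration in Python; split_list is a hypothetical helper, not part of any email platform.

```python
import random

def split_list(recipients, seed=42):
    """Randomly split a recipient list into two roughly equal segments (A and B)."""
    shuffled = recipients[:]                  # copy so the original list stays untouched
    random.Random(seed).shuffle(shuffled)     # fixed seed keeps the split reproducible
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Segment A receives the control email, segment B receives the variation
segment_a, segment_b = split_list(["ann@example.com", "bob@example.com",
                                   "cai@example.com", "dee@example.com"])
```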

Advanced Email A/B Testing

Once you’ve mastered the basics, you’re ready to expand on your campaign’s fundamental elements. Review the following best practices for opportunities to improve your email A/B testing campaigns.

Tip #1: Start with a hypothesis and a desired outcome

If you make changes to an email and find that one variation performs better than another, that’s a start. But if you don’t know what you’re testing for, you can’t know if you have a winner.

Instead, start every campaign by defining what you hope to improve and why you think the changes you’re testing will contribute positively to your desired outcome.

Tip #2: Test high-impact elements

Sure, you might be able to prove that a blue button in your email newsletter gets more clicks than a red one. But does that really matter to your business’s overall performance?

If you’re going through the trouble of setting up an A/B test for your email message, make sure that you’re testing elements – such as the wording of your CTA or the specific offer you make – that have the potential to provide a significant uplift to your business.

Tip #3: Test more than your subject line and body copy

Although these elements represent natural starting points, don’t stop there. Once you feel you’ve gone as far as you can with tests on your subject lines and body copy variations, expand your testing program to encompass the timing of your email automation flows, the actions you use as triggers, or the way you segment your recipients.

Tip #4: Test broadcast, segmented, automated and transactional messages

According to Litmus’ 2018 State of Email Survey, “Nearly 39% of brands never or rarely A/B test their broadcast and segmented emails. More than 65% of brands never or rarely A/B test their automated emails, and 76% never or rarely A/B test their transactional emails.”

That’s a big deal – and it’s a huge amount of money left on the table. Assuming you’ve mastered the basics of testing your broadcast and segmented messages, make sure you extend both the practice of A/B testing and the habit of recording what you learn to the other types of emails you send.

Tip #5: Consider the potential impact of timing on email performance

Email Monks contributor Kevin George makes an important point: “Email marketing metrics are subjected to volatility based on time period. Comparing your results of the post-holiday slump i.e. January with the results of the pre-holiday rush won’t give you substantial result.”

No matter how excited you are to kick off a new email A/B testing program, be cautious if that means starting around a period of irregular seasonal or industry-specific activity. Reaching incorrect conclusions from abnormal spikes of activity won’t do your future testing any good.

Getting Started with Email A/B Testing

You may already be carrying out A/B tests on your website. If so, it should be an easy transition to start building out testing workflows on your email campaigns.

If you’re totally new to A/B testing, don’t let the more advanced tips above scare you off. Email A/B testing is a necessary part of maximizing the performance of your email marketing campaigns. Get started today, and remember that you can always increase the complexity and sophistication of your programs as you start seeing results.

What other advanced email A/B testing tips would you add to this list? Leave a note below with your suggestions.


Dynamic Heatmaps — The New E-commerce Data Gathering Weapon

Dynamic heatmaps give e-commerce companies the leverage of studying real-time customer behavior on pages with dynamic URLs.


E-commerce is a pure service business that demands utmost precision, patience, and persistence to survive the industry’s ever-growing competition. The goal is not just to help shoppers find good products, but also to provide an exceptional customer experience that turns them into repeat buyers.

Customer actions also serve as a valuable source of data at every stage of an e-commerce funnel. What you do with these customer insights to improve the overall shopping experience and conversion rate is typically what makes or breaks the deal for your business.

Most marketers suggest using multiple qualitative and quantitative analytical tools to do so. But not all of them give you the kind of information you need to study user experience.

This is where heatmaps come into play. Let’s see how heatmaps can improve your e-commerce business!


What are heatmaps?

Heatmaps serve as one of the best qualitative tools to collect relevant customer data, especially in terms of understanding their actual behavior across your web platform.  

Think of heatmaps as X-Ray films. They show a detailed picture (in the form of graphical representation) of how users interact with your site or store. You can observe how far users scroll, where they click the most, and the products or pages they like or dislike.

Such data is precisely what you need to make your platform more user-friendly, drive more traffic and improve your conversion optimization strategies.

But if you think that just any heatmap will work wonders for your e-commerce site, you’re absolutely wrong!

E-commerce websites are highly dynamic in nature. They have more interactive elements and behind-the-login pages, such as orders and the cart page, than most other sites. These pages generate visitor interaction data, which typically serves as the primary input businesses use to draw conclusions about page performance, find distracting elements, and improve the overall customer experience.

This is where Dynamic Heatmaps save the day!

What are dynamic heatmaps?

Unlike static heatmaps, which can only be plotted on static web pages such as the Home Page, Product Pages, Landing Pages, Category Pages, etc., dynamic heatmaps give you the leverage of studying real-time customer behavior on pages that are beyond the scope of static heatmaps. In other words, dynamic heatmaps can be easily plotted on live pages with dynamic URLs, such as My Profile, Orders, Cart, Account Settings, etc., to gather in-depth customer activity data.

In general, a typical dynamic heatmap offers four primary features: click maps, scroll maps, click areas, and element lists. Each of these allows you to look at a web page’s hot spots in detail.

Let’s now understand the scope of dynamic heatmaps for e-commerce using some examples!

Studying Cart Page Insights Using Dynamic Heatmaps

The image below gives us an insight into a dynamic heatmap plotted on an e-commerce site’s Cart page. It shows that, in general, most people, after adding products to their cart, click on the product image, the cancel button, the postcode section, the ‘Go To Checkout’ icon, and discount codes.

Such information helps you draw multiple conclusions. Some of which are as follows:

  • Customers might want to see the product images once more before proceeding to the final payment gateway.
  • They may not like their chosen product(s) and hence remove the item(s) from the cart. Or, since they’re not able to view the product image (zoomed in) on the cart page, they abandon their cart.
  • They’re most interested in availing discounts, which is why that area is hotter than other page elements.

Furthermore, such information also helps you draw hypotheses on how to improve the performance of pages with dynamic URLs, while finding the right ways to enhance customer experience and increase your conversions.

“VWO’s dynamic heatmaps allow you to access information from those web pages which are, in general, very difficult to access using regular heatmaps.”

Studying Order Page Insights Using Dynamic Heatmaps

The “My Orders” page of an e-commerce site is an important page that is less explored in terms of gathering customer behavior data.

The page allows users to look at their current and previous orders, check delivery status, seek assistance, and even browse through their order history. As an e-commerce company, plotting heatmaps on such pages and mapping the number of clicks can significantly help you study which page elements are getting the most customer attention.

For example, the heatmap plotted above shows that most people are clicking on the “Track” icon, followed by “Need Help” and the product images. They’re hardly clicking on the “Rate & Review Product” icon, which, for your platform, can be an important form page for collecting product reviews and other essential information. You can further use such qualitative data to make the necessary amendments and nudge customers to fill out the form, for example by:

  • adding product review pop-ups or call-to-action buttons on the Order page
  • making the section omnipresent
  • giving rewards for adding reviews

So now that you know what your static heatmap tools are lacking, it’s time to upgrade your e-commerce platform with VWO’s dynamic heatmaps and make the most of them.  



Death by CRO: 4 Common & Deadly CRO traps to avoid (includes free survival tips)

Experimentation has been the cornerstone of all successful organizations. But many fall prey to obvious yet deadly traps when running or building a CRO programme. Find out how you can avoid these traps in this post.


Experimentation has always been the driving force behind challenging the status quo – whether it’s on the battlefield, where a change in strategy has altered the course of history, or in a product change that separates successful products from thousands of failures. For online businesses, this translates into improving customer experiences, leading to an increase in conversions at the lowest possible risk. Rome wasn’t built in a day, and neither will your CRO programme be.

So before you gear up for the ultimate showdown using CRO in your organization, let me help you navigate the various challenges you might face during this journey.

1. Don’t break the news, already!

Case 1: Omg! Just 3 days and my variation is performing way better than the control. Let me ring my CEO and tell her that I was right about this change.

If this is you, congratulations: you just got killed by CRO. One of the most important ground rules for testing is to be patient. Initial test results might excite you to go out there and proclaim victory, but wait for the test to conclude before you declare one. Setting high expectations for the success of an experiment after seeing initial traction may do more harm than good.

Expectation setting may not be directly linked to website optimization, but trust me, when the results don’t come out as well as expected (thanks to your initial excitement), you won’t get team buy-in for bigger experiments.

Case 2: Damn! It’s been 5 days and there is no conclusion I can draw from this A/B Test. What will I tell my CMO if he asks me how the test is coming along?

Initial test results might put you on the back foot if you see little or no movement in your conversion graph to justify the CRO effort. The answer to all these worries is patience. Big changes or small, it takes time for your results to reach statistical significance, given factors such as the number of visitors being tested, the number of variations, etc.

To help you avoid getting excited or demotivated too early, we have built a calculator to help you determine the duration of your A/B tests here.
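If you want a rough back-of-the-envelope check before reaching for the calculator, a standard sample-size approximation for a two-variation test gives a feel for how long you’ll need. The sketch below is illustrative only: it uses the common rule-of-16 approximation (roughly 80% power at 5% significance), and the numbers in the example are hypothetical; use the calculator for actual planning.

```python
def estimate_test_days(baseline_rate, relative_mde, daily_visitors, variations=2):
    """Rough A/B test duration estimate using the rule-of-16 approximation:
    visitors needed per variation is about 16 * p * (1 - p) / delta**2."""
    delta = baseline_rate * relative_mde                      # absolute lift you want to detect
    per_variation = 16 * baseline_rate * (1 - baseline_rate) / delta ** 2
    visitors_per_variation_per_day = daily_visitors / variations
    return per_variation / visitors_per_variation_per_day

# Hypothetical example: 3% baseline conversion, 10% relative lift, 2,000 visitors a day
print(f"~{estimate_test_days(0.03, 0.10, 2000):.0f} days")    # prints roughly 52 days
```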

2. ‘This isn’t working’ syndrome

5 tests. But no major change in conversions. But company X whose case study I read did 2x better in conversions. What am I missing?

Let’s assume you ran an on-page survey for an e-commerce site and found that people who were not completing the purchase were skeptical about the security of the checkout page (even though it may actually be safe). This stopped them from entering their card details and led them to abandon their carts. As an obvious next step, you form a hypothesis backed by solid data and create a test variation with more security certification badges, testimonials, etc. The result: no difference at all!

So, does that mean you crafted a wrong hypothesis?

The answer is maybe. But take a step back and think: how many ways can you improve the security perception of your checkout page? Or make people trust your payment processes?

Answer: More than we care to count.

And this is true for your first successful test as well, the one that may have got you a 10% lift. You still have to think of ways to improve that number. There are better alternatives out there. You just need to keep testing.

Your optimization army should be inspired by the losses to dig deeper and find richer insights to create that one victory that will change the course of your business. Ask yourself how many iterations you tried before arriving at the conclusion that ‘it isn’t working’!

3. Monkey see, monkey do

When our neighbors fought their first conversion battle, they just changed their website CTA color to green and camouflaged their way to get better conversion rates! Let us paint our own checkout CTA green!

They say ‘Imitation is the sincerest form of flattery’, but not when it comes to CRO. ‘Best practices’ may not be the best for you. Hard data and the ground reality on your website may be poles apart from the case study you read. Do not expect similar results from experiments run by others in the arena. Use quantitative and qualitative research methods to devise a unique hypothesis, and then launch your test. Also choose the right weapons (A/B testing or multivariate testing) and a structured CRO plan to execute it.

All businesses are different, and so are their visitors’ behavior and their experiments. What worked for one may or may not work for others; the idea is to always keep testing until you succeed.

4. The show must go on!

CRO should go on. I will make sure that when I retire or lose a limb in the battle for conversion throne, my army is ready to fight without me.

At VWO, we come across customers who suddenly stop testing, and the main reason they cite is that the person who was carrying the CRO baton has quit. Find it shocking? So do we!

We need to understand that CRO is not just a one-person or even a one-team job. Building an organization that thrives on CRO requires not just education and training; it requires a change in the cultural fabric of the company. In a CRO-friendly culture, the HiPPOs take a backseat and soldiers from different teams (product, marketing, design, and so on) are invited to draw up the battle plan. Don’t take anyone’s word at face value; test everything! Celebrate successes and publicize results to get team-wide buy-in for experimentation. It is an uphill battle and hence requires you to plan ahead and plan properly. Remember Rome?

Find some excellent tips to build a culture of experimentation in your organization here and build a CRO army to continue the battle for conversion even if someone calls it a day.

Parting words

While you wear your shining armor as a CRO catalyst, believe in yourself and don’t let ‘Death by CRO’ scare you, because with the right attitude you are going to win it not just for yourself but for future teams within your organization. Don’t take my word for it (though you might; I have seen 5,000+ customers across 90 countries succeed); test it!

PS: I hope to save some lives with this post. Tell me in the comments section if you survived.
