One misstep we often see companies make: investing in a massive website redesign without validating the changes they’re making by looking at their customer data first. The downside of this mistake is massive: wasted time and money, to be sure, but also a huge missed opportunity for making positive and impactful website […]
In January, I shared a post on Crazy Egg’s new priority as a company: our mission is to help you get better at paying attention to the people visiting your website, so you can improve their experience. We believe (and have seen firsthand) the undeniable downstream impact of placing the focus on paying attention. Paying […]
We are all united in a common struggle: designing products, apps, and websites that engage, entice, and delight our customers. So how do you know for sure that you are actually providing what they’re looking for? You research, experiment, and launch A/B tests. A/B testing can be a lightweight, sustainable, and time-saving activity that helps […]
We know email marketing is effective. According to Copyblogger, email marketing yields an average ROI of 4,300% and is nearly 40X more effective at new customer acquisition than Facebook or Twitter. With 85% of American adults checking their email at least once per day, it’s a channel that can’t be ignored.
That said, you aren’t going to see big numbers like that if you aren’t actively testing the performance of your email campaigns. A/B testing is a great tool to help improve your email marketing performance – but only if you know what you’re doing.
Email A/B Testing Basics
A/B testing, as you may already know, involves presenting users with two options to see which one performs better. In the case of email A/B testing, that might mean sending half of your list one version of an email and the other half a different version, while watching for changes in your open rate, click-through rate, or another KPI.
The best practices described below represent the foundation of an effective A/B testing program. If you’re already familiar with the general structure of A/B testing campaigns, feel free to skip to the next section. Otherwise, make sure you’ve mastered these basics before increasing the complexity of your program.
Set a control version against which tests can be run. Don’t just pit two random emails against each other, then start fresh with two new emails. Always have a control version (often the winner of previous tests) so that you’re working from baseline performance values.
Test a single variable at one time. If you change five variables in each email version you send out, you won’t know which of your changes actually contributed to any performance improvements you see.
Make sure you’ve hit statistical significance before calling a winner. Statistical significance helps you determine how likely it is that any lift you’re seeing is the result of changes you’ve made rather than random chance. Use a significance calculator to make sure your results are legit.
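If you’d rather check significance yourself than rely on an online calculator, a two-proportion z-test is the standard approach for comparing open or click rates. Here’s a minimal sketch in Python; the sample numbers are made up for illustration:

```python
from math import sqrt, erf

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-tailed z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B's open rate looks higher -- but is it significant at the 5% level?
z, p = z_test_two_proportions(conv_a=200, n_a=1000, conv_b=240, n_b=1000)
print(f"z = {z:.2f}, p = {p:.3f}, significant: {p < 0.05}")
```

A 20% vs. 24% open rate on 1,000 recipients each clears the 5% threshold here; a much smaller gap on the same sample sizes would not, which is exactly why you shouldn’t call winners by eyeballing the raw rates.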
Your email marketing solution should offer you A/B testing functionality, but even if it doesn’t, you can create your own testing protocols by manually segmenting lists and creating separate campaigns for each.
Advanced Email A/B Testing
Once you’ve mastered the basics, you’re ready to expand on your campaign’s fundamental elements. Review the following best practices for opportunities to improve your email A/B testing campaigns.
Tip #1: Start with a hypothesis and a desired outcome
If you make changes to an email and find that one variation performs better than another, that’s a start. But if you don’t know what you’re testing for, you can’t know if you have a winner.
Instead, start every campaign by defining what you hope to improve and why you think the changes you’re testing will contribute positively to your desired outcome.
Tip #2: Test high-impact elements
Sure, you might be able to prove that a blue button in your email newsletter gets more clicks than a red one. But does that really matter to your business’s overall performance?
If you’re going to the trouble of setting up an A/B test for your email message, make sure you’re testing elements – such as the wording of your CTA or the specific offer you make – that have the potential to provide a significant uplift to your business.
Tip #3: Test more than your subject line and body copy
Although these elements represent natural starting points, don’t stop there. Once you feel you’ve gone as far as you can with tests on your subject lines and body copy variations, expand your testing program to encompass the timing of your email automation flows, the actions you use as triggers, or the way you segment your recipients.
Tip #4: Test broadcast, segmented, automated and transactional messages
According to Litmus’ 2018 State of Email Survey, “Nearly 39% of brands never or rarely A/B test their broadcast and segmented emails. More than 65% of brands never or rarely A/B test their automated emails, and 76% never or rarely A/B test their transactional emails.”
That’s a big deal – and it’s a huge amount of money left on the table. Assuming you’ve mastered the basics of testing your broadcast and segmented messages, make sure you extend both the practice of A/B testing and the lessons you’ve learned to the other types of emails you send.
Tip #5: Consider the potential impact of timing on email performance
Email Monks contributor Kevin George makes an important point: “Email marketing metrics are subjected to volatility based on time period. Comparing your results of the post-holiday slump i.e. January with the results of the pre-holiday rush won’t give you substantial result.”
No matter how excited you are to kick off a new email A/B testing program, be cautious if that means starting around a period of irregular seasonal or industry-specific activity. Reaching incorrect conclusions from abnormal spikes of activity won’t do your future testing any good.
Getting Started with Email A/B Testing
You may already be carrying out A/B tests on your website. If so, it should be an easy transition to start building out testing workflows on your email campaigns.
If you’re totally new to A/B testing, don’t let the more advanced tips above scare you off. Email A/B testing is a necessary part of maximizing the performance of your email marketing campaigns. Get started today, and remember that you can always increase the complexity and sophistication of your programs as you start seeing results.
What other advanced email A/B testing tips would you add to this list? Leave a note below with your suggestions.
Often, people who manage websites are excited to dig into analytics, but they’re not sure how to bridge the gap between observing their website visitors’ actions and acting on those insights themselves. To get valuable pointers from someone who runs website optimization experiments as part of his job, we […]
Continuous experimentation and testing increases conversions at high-growth, data-driven organizations. Scaling a conversion rate optimization (CRO) program requires not only building a long-term testing roadmap but also building a culture of experimentation involving multiple teams and stakeholders across the company. As the scope of CRO projects increases, so does the difficulty of management and the tediousness of the manual effort required. To overcome these pain points, digital marketing agency Oogst (a Merkle company) collaborated with HEMA, a leading Dutch retail chain, to create a chatbot that brings together the power of the VWO REST API and Slack to scale CRO efforts at HEMA.
This blog post delves into why and how we leveraged the power of the VWO API and Slack to scale our CRO program and help build a culture of experimentation.
The challenges of scaling CRO
For Oogst and HEMA, CRO involved constantly juggling the interests of CRO specialists, analysts, clients, website developers and other relevant stakeholders. Each decision and action affects many people, which is why the process needs to run as smoothly as possible.
For example, whenever a test is activated or stopped, many people need to be informed at the very moment it happens, without leaving anyone out. When you scale a CRO program at your organization, informing everybody manually becomes a hassle, especially with new tests being turned on and off frequently. This process needed to be automated to bring efficiency and agility to the CRO initiatives.
The Birth of Rover
We discovered that the VWO REST API makes it easy to monitor tests for changes in status, and that we could use this information to make announcements easily. As a result of these discoveries, the idea for a chatbot was born. By linking the VWO API, Slack webhooks and a local database through Python, we were able to notify the right people about test status changes the moment they occurred. We named him Rover: a dog/bot hybrid that barks at us when he needs our attention. Rover also sends notifications to relevant parties about when to check the preliminary results of a test. A huge burden was lifted from our shoulders, leaving us with more time to think about the next set of strategic experiments.
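The core of a bot like this can be sketched roughly as below. The VWO URL, token and status values shown are illustrative placeholders rather than the exact API shape, and the Slack part uses the standard incoming-webhook JSON payload:

```python
import json
import urllib.request

VWO_URL = "https://app.vwo.com/api/v2/..."      # placeholder VWO endpoint
VWO_TOKEN = "your-vwo-api-token"                 # placeholder token
SLACK_WEBHOOK = "https://hooks.slack.com/services/..."  # placeholder webhook

def detect_status_changes(previous, current):
    """Compare two {campaign_id: status} snapshots and return the changes.

    `previous` would come from the local database, `current` from the API.
    """
    changes = []
    for cid, status in current.items():
        if previous.get(cid) != status:
            changes.append((cid, previous.get(cid), status))
    return changes

def notify_slack(message):
    """Post a plain-text message to a Slack incoming webhook."""
    body = json.dumps({"text": message}).encode()
    req = urllib.request.Request(
        SLACK_WEBHOOK, data=body,
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

# Polling loop (run on a schedule):
#   for cid, old, new in detect_status_changes(db_snapshot, api_snapshot):
#       notify_slack(f"Woof! Test {cid} changed: {old} -> {new}")
```

The local database only needs to hold the last-seen status per campaign; everything else is stateless.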
Growing the bot
As these simple adjustments to our process proved to be quite helpful to us and HEMA, we began exploring the VWO API to see what else we could do. We came up with a number of areas where the bot could help:
We run a high number of tests and it is of utmost importance to us and HEMA that everything goes according to plan at all times. We have set up quality control checks internally to limit the risks. There are processes we follow before publishing a test to make sure it’s safe to go live and we double check right after publication in case anything was missed.
For example, some of the things we check for on the VWO side include whether traffic has been set to 100%, whether GTM integration has been enabled, and whether the campaign’s name follows our naming conventions. Luckily, the VWO API made it possible for us to automate checks like these, so we can be more certain a test is ready to be published, with much less effort.
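A sketch of what such automated pre-publication checks might look like. The field names and the `ABT-` naming convention here are hypothetical stand-ins for whatever your campaign settings and conventions actually are:

```python
def preflight_checks(campaign):
    """Return a list of issues found in a campaign's settings.

    `campaign` stands in for a campaign record fetched from the API;
    an empty list means the test looks safe to publish.
    """
    issues = []
    if campaign.get("percent_traffic") != 100:
        issues.append("traffic is not set to 100%")
    if not campaign.get("gtm_integration"):
        issues.append("GTM integration is disabled")
    if not campaign.get("name", "").startswith("ABT-"):  # hypothetical convention
        issues.append("name does not follow the naming convention")
    return issues
```

The bot can run these checks both before publication and right after, and bark about anything that was missed.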
Monitoring live A/B tests
After publishing any test, we monitor events in Google Analytics to make sure we didn’t miss anything and the test is, in fact, running correctly. Although it allows us to maintain the level of quality we desire, this process is also very time-consuming, tedious and prone to human error, much like the announcements.
To deal with this, we added the Google Analytics API into the mix to get Rover to check the number of VWO events for a particular test and notify us about his findings. The absence of VWO events likely means the test isn’t running (correctly), which is something we always had to check manually before.
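The logic behind that check can be sketched as follows, with the Google Analytics API call itself omitted; `event_counts` stands in for whatever per-campaign VWO event totals you pull from GA:

```python
def check_vwo_events(event_counts, campaign_id, min_events=10):
    """Flag a campaign whose VWO events are missing or suspiciously low.

    Returns a warning string for the bot to post, or None if all looks fine.
    `min_events` is an arbitrary illustrative threshold.
    """
    n = event_counts.get(campaign_id, 0)
    if n == 0:
        return f"Test {campaign_id}: no VWO events found -- likely not running"
    if n < min_events:
        return f"Test {campaign_id}: only {n} events -- check the setup"
    return None
```

Any non-None result gets forwarded to Slack, so a silently broken test surfaces within the first polling interval instead of days later.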
Conversational A/B test management
However, at this point, Rover could only transmit messages; he could not process the ones he received. If we wanted a user to intuitively tell Rover to check a test, we’d have to add an element of interactivity.
That’s why we hooked up our bot to artificial intelligence using IBM Watson, which allows it to naturally process language. With this integration in place, a user is able to ask “Rover, is VWO test 244 ready to go live?” and it will perform the checks. Then, the user can simply publish the test by saying “Rover, publish 244”.
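As a toy stand-in for that interactivity, the flow looks something like this; real intent classification is handled by Watson, while this sketch uses simple keyword matching to show the shape of the command handling:

```python
import re

def parse_command(message):
    """Map a chat message to an (intent, test_id) pair.

    Keyword matching is a crude stand-in for the natural-language
    processing that IBM Watson provides in the real bot.
    """
    m = re.search(r"\b(\d+)\b", message)     # pull out a test ID, if any
    test_id = m.group(1) if m else None
    text = message.lower()
    if "publish" in text:
        return ("publish", test_id)
    if "ready" in text or "check" in text:
        return ("check", test_id)
    return ("unknown", test_id)
```

The returned intent then dispatches to the appropriate action, such as running the pre-flight checks or publishing the test through the API.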
Right now, Rover has truly become part of the team, at both Oogst and HEMA. His contribution to the speed and quality of our process is well recognized. We have since added even more functionality, such as the ability to pause the manual activation tags we built in Google Tag Manager to trigger VWO tests, thereby linking VWO and GTM together.
But we’re not finished yet!
Scoring points: Building a CRO culture through gamification
Although the power and effectiveness of CRO and A/B testing are well established, building support for it throughout the organization remains difficult.
To combat this, we found yet another role for Rover: quizmaster!
First, we pull basic information about the test and its variants from VWO. We then use this data to extract more detailed information about the experiment from our project management board. This includes background information, hypotheses, descriptions of the control and variants as well as screenshots.
From this knowledge, we dynamically generate polls on Slack where users vote on the landing page variant they think will outperform the rest. These users, generally members of the client organization, then battle each other in a CRO tournament to determine who knows the website visitor best.
The introduction of a gamification element has made CRO more exciting to those not generally involved in it, made our efforts more visible throughout the organization, and highlighted our successes. Furthermore, it builds team spirit and generates more ideas for future testing. All because of our data-driven quizmaster: Rover!
To sum up:
Here’s a quick summary of the benefits we have already achieved:
· Notify stakeholders of starting/stopping of tests;
· Notify stakeholders to check preliminary results;
· Check Google Analytics for presence of VWO events;
· Perform quality control checks;
· Pause Manual Activation tags in GTM;
· Start/stop tests through chat;
· Perform the role of quizmaster, allowing stakeholders to vote on which variant they expect to outperform the others.
More to come!
We’re in touch with VWO to expand the abilities of the REST API even further to make Rover more powerful. This collaboration is a very exciting one to us. The API has already proven to be invaluable in our current operations which are now running more smoothly than ever before.
This bot is the collaborative effort of Gino Renardus, Martijn Heerema and Thom van Engelenburg (consultants at Oogst, a Merkle company) and Floor Hickmann and Raun Sips (UX at HEMA).
Oogst, a Merkle company, is the leading digital marketing, analytics and optimization agency in the Netherlands. Based in Amsterdam, its team of over 70 experts provides leading brands with digital marketing services aimed at utilizing customer data in the best possible way to achieve the highest returns.
The company has partnered with VWO since 2012 to test and optimize some of the most popular websites of the country for their clients. Their extensive knowledge of, and experience with the VWO platform has led to these operations continually growing in size. Moreover, this has also led to increased stakeholder involvement and more intensive test management. In order to better deal with this increased scope, the Oogst team was recently joined by a new member: a CRO Chatbot that operates through the VWO Application Programming Interface (API). This chatbot is able to assist in many of the activities surrounding the CRO process. Do you want to meet Rover or get to know more? Get in touch with Oogst’s Data, Tech and Optimization team! Reach out via email@example.com.
Since 1926, HEMA has made the everyday life of its customers easier and more fun through products that positively stand out: due to their quality, design and price. HEMA offers over 30,000 of its own products and services, has over 750 stores in nine countries and 19,000 employees. As consumers move more towards online, HEMA recognizes the importance of developing a digital strategy as progressive as the brand is.
HEMA’s webshop is widely recognized as one of the best in the Netherlands, holding a top position in the Twinkle 100 and frequently receiving awards such as Best Department Store Webshop. HEMA and Oogst collaborate on CRO to ensure its Dutch and international webshops remain among the best.
As a destination marketer, one of your main challenges is turning your website visitors into destination visitors. Before a visitor comes to your destination, they compare you to their other options. During this research phase, you tailor your ads to match their interests, you utilize search engine marketing tools to make sure your advertising and social content is targeted to their search results, and you hope these visitors click through to your site to consume and engage with the top content you’ve created.
But what are the best practices in turning these online visitors into destination visitors?
Leading destination marketers from Explore Branson, Elkhart County, Indiana, and Visit Williamsburg believe website personalization is a cost effective way to turn their website visitors into destination visitors. In Time to Get Personal, these three destinations highlight some of the ways Bound’s personalization solution has helped them stand out amongst their peers and convert their online visitors into destination visitors. Some of their results include the following:
Explore Branson has seen a 560% increase in e-newsletter sign-ups by using a personalized pop-up targeted to different website audiences.
Elkhart County, Indiana used Bound’s A/B testing capabilities to increase travel guide conversions by 253%.
Visit Williamsburg used Bound to maximize the value of their paid media campaigns. Since targeting paid media visitors to the website with personalized landing pages, they have seen a 41% increase in time on site.
Read more in this report to learn how these destinations got these results and to see if now is the right time for you to explore personalization for your destination’s website.
The question “How to test if my website has a small number of users” comes up frequently when I chat with people about statistics in A/B testing, online and offline alike. There are different opinions on the topic, ranging from altering the significance threshold, statistical power or the minimum effect of interest all the way […]
When I speak with our clients, it often strikes me how many of them feel overwhelmed by the very idea of personalization.
Our imagination, often fueled by the marketing teams of various software companies, creates a perfect world where personalization enables every interaction to be completely custom for every individual. In this dreamland, artificial intelligence and machine learning solve all our problems. All you have to do is buy a new piece of software, turn it on, and…BOOM: 1:1 personalization.
As a data scientist, I’ll let you in on a little secret: that software only provides the technological capability for personalization. Even further, the algorithms found within these tools simply assign a probability to each potential experience that maximizes the desired outcome, given the data they have access to. Suffice it to say, they’re not as intelligent as you are led to believe.
If you caught our first post in this series, you already know that we define personalization a bit more broadly, as any differentiated experience that is delivered to a user based on known data about that user. This means personalization exists on a spectrum: it can be one-to-many, one-to-few, or one-to-one.
And while there are many tools that enable you to do personalization from a technical standpoint, they don’t solve for one of the main sources of anxiety around personalization: strategy.
Most personalization campaigns fail for lack of a strategy that defines who, where and how to personalize. So I’ve put together a free downloadable guide to help you do just that. This seven-page guide is packed full of guidelines, templates and best practices to strategize and launch a successful personalization campaign, including:
Major considerations and things to keep in mind when developing your personalization strategy.
More than 30 data-driven questions about your customers to identify campaign opportunities.
A template for organizing and planning your personalization campaigns.
Guidelines for determining whether to deliver your campaigns via rule-based targeting or algorithmic targeting.
Free Download: Plan & Launch Profitable Personalization Campaigns.
It’s January 3, and if you’re like us, you’re already heads down at your desk and neck deep in emails. But we’d be remiss if we didn’t take a minute to reflect on the previous year.
In November of 2018, we quietly celebrated 15 years of being in business. When Brooks Bell was founded, experimentation was in its infancy. But despite all the changes we’ve experienced since then, one thing remains true: the opportunity to connect with so many interesting people who are solving big problems for their businesses is what makes our work worthwhile. Thanks for walking with us.
In October, things got a little spooky around the office, and it had everything to do with Scott, our Director of Sales, who decided to channel his inner Ellen DeGeneres for the day (much to our colleagues’ horror). Watch the video if you dare.
Making Bacon for our Clients
Back in 2014, we set a Big Hairy Audacious Goal to achieve $1 billion in projected revenue for our clients. By the end of 2017, we’d reached $500 million. And this past December, we hit $1 billion. (cue ::gong::)
But we’re not resting on our laurels. We’ve set some aggressive goals for 2019, with a focus on personalization, and we’re pumped to get to work.
Brooks Bell takes the Bay Area
In September, we officially opened the doors to our San Francisco office. This decision came after years of working with clients on the West Coast and our desire to work even more closely with them. And with the Bay Area’s rich history of innovation, we can’t think of a better place to help more companies push their boundaries through experimentation.