Changing Your A/B Testing Software? Read These Tips First.

With the number of testing and personalization tools available, it can be difficult to choose one to invest in. But once you’ve selected a tool, deciding to transition to a different one altogether can feel overwhelming.

Yet it happens quite often. For many of our clients, cost is the deciding factor in switching testing tools: several tools offer similar capabilities at a lower price point. On the flip side, if your program’s budget and capabilities have grown, it may be time for an upgrade.

And although all testing tools offer similar core functions, each has unique features that are important to consider. Personalization, for example, has become a point of focus for many testing programs, so perhaps you’re interested in transitioning to a tool such as Evergage or Dynamic Yield that puts personalization at the forefront. Or your testing program has enough velocity to run multiple experiments simultaneously, and you’d make good use of Optimizely’s built-in mutually exclusive experiments feature. Maybe your company uses other Adobe products, like Adobe Experience Manager, making Adobe Target a natural fit.

Whichever tool you select, the next major obstacle is implementing it. Here are our tips for navigating the process:

First, examine your testing roadmap.

Take inventory of the tests that will still be running close to the date you plan to stop using your current tool. Make sure they will have reached significance and can be turned off before you lose access.
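
If it’s unclear whether a test will wrap up in time, a quick back-of-envelope check can help. Below is a minimal sketch using the standard two-proportion sample-size approximation at 95% confidence and 80% power; the baseline rate, target lift, and daily traffic are hypothetical placeholders for your own numbers.

```typescript
// Rough estimate of whether a running A/B test will reach significance before
// a cutoff date, using the standard two-proportion sample-size approximation.
function requiredSamplePerVariation(baselineRate: number, minRelativeLift: number): number {
  const zAlpha = 1.96; // 95% confidence, two-sided
  const zBeta = 0.84;  // 80% power
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + minRelativeLift);
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (p2 - p1) ** 2);
}

// Hypothetical example: 4% baseline conversion, detecting a 10% relative lift,
// 6,000 visitors per day split evenly across two variations.
const perVariation = requiredSamplePerVariation(0.04, 0.1);
const daysNeeded = Math.ceil((perVariation * 2) / 6000);
console.log({ perVariation, daysNeeded }); // compare daysNeeded against your cutoff date
```

If the estimated runtime extends past your access window, either reprioritize the test or plan to relaunch it in the new tool.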

If your budget allows for it, we recommend giving your team a period during which both tools are available. This keeps your testing cadence intact while your team gets up to speed on the new tool and makes for a smoother transition: current tests can run their course in the old tool while new ones launch in the new one.

Then, test your testing software.

While you might be excited to dive in and start launching tests left and right, it’s important to take the time to ensure your new tool is implemented correctly.

Run a QA test that visually changes the page to confirm that the code is being delivered and that any flicker looks reasonable. If flicker is excessive, you may need to move the testing tool’s tag higher in the head of your HTML so it loads before the page renders.

We also recommend running a live test without visual changes, purely for the purpose of checking metrics. This enables your analyst to confirm that metrics are being tracked correctly within the testing tool and, if you’re using an outside analytics tool, that those metrics are being passed to it accurately.
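
To illustrate, a “metrics-only” variation might change nothing visible and simply record its exposure and a downstream goal, so the numbers in the testing tool can be cross-checked against your analytics platform. The event names, CSS selector, and sendToAnalytics helper below are hypothetical placeholders for whatever your stack actually uses.

```typescript
// Sketch of a "metrics-only" variation: no visual changes, just tracking calls,
// so you can verify that data flows into both the testing tool and an outside
// analytics platform. Event names and the sendToAnalytics helper are
// hypothetical placeholders.
function sendToAnalytics(eventName: string, payload: Record<string, string>): void {
  // e.g. forward to your analytics tag; here we just log for illustration.
  console.log("analytics event:", eventName, payload);
}

function runMetricsOnlyVariation(): void {
  const exposure = {
    campaign: "new-tool-qa-metrics-check",
    variation: "metrics-only",
  };

  // Record exposure in the external analytics tool for cross-checking.
  sendToAnalytics("experiment_viewed", exposure);

  // Track a downstream goal (e.g. add-to-cart clicks) without altering the page.
  document.addEventListener("click", (e) => {
    const target = e.target as HTMLElement | null;
    if (target?.closest(".add-to-cart")) {
      sendToAnalytics("experiment_goal", { ...exposure, goal: "add_to_cart" });
    }
  });
}

runMetricsOnlyVariation();
```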

Once you’ve confirmed that visual changes are showing up as expected and metrics are tracking accurately, you’re ready to start using your new tool!

Switching testing software comes with its challenges. However, in the right circumstances, it can offer substantial benefits to your testing program. Taking the time to pinpoint your reasons for switching, to plan your testing roadmap carefully around the transition, and to be patient as the new tool is implemented will ensure the change goes smoothly.


Brooks Bell has over 15 years of experience working with enterprise brands to establish and scale their experimentation programs. We take a holistic approach to our technical diagnostics and analytics services, providing technology and data recommendations based on your business, your goals, your team, and your unique challenges.

What can Brooks Bell do for you?
✓   Clean, organize and centralize your customer data.
✓   Help you select the right A/B testing and personalization tools.
✓   Ensure your tools and systems integrate with one another.
✓   Train your developers and analysts.

Contact us to learn more.

You’ve got a winner – now what?

Winner winner chicken dinner! Discovering a winning variation is one of the most exciting moments for an optimization program. It’s the moment when all the work that went into creating a test finally pays off. But while you should always take time to celebrate your team’s accomplishment, hold off on busting out the champagne just yet. Your work has really only just begun.

Winning A/B tests can tell you a lot about your customers: what’s important to them and why they respond the way they do. These results also enable you to quantitatively estimate the impact the winning experience will have on your business’s bottom line (typically revenue) and to project what that impact looks like over the next year.
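
As a rough illustration of that projection, here’s a minimal back-of-envelope sketch. The traffic, conversion, lift, and order-value figures are hypothetical, and real projections usually discount the observed lift to account for decay over time.

```typescript
// Back-of-envelope projection of a winning test's annualized revenue impact.
// All inputs are hypothetical; substitute your own analytics numbers.
interface ImpactInputs {
  annualVisitors: number;          // visitors exposed to the experience per year
  baselineConversionRate: number;  // control conversion rate (e.g. 0.03 = 3%)
  observedLift: number;            // relative lift from the test (e.g. 0.05 = +5%)
  averageOrderValue: number;       // revenue per conversion
}

function projectAnnualImpact(i: ImpactInputs): number {
  const baselineRevenue =
    i.annualVisitors * i.baselineConversionRate * i.averageOrderValue;
  // Incremental revenue attributable to the lift, assuming it holds year-round.
  return baselineRevenue * i.observedLift;
}

// Example: 2M visitors/year, 3% baseline conversion, +5% lift, $80 AOV
console.log(
  projectAnnualImpact({
    annualVisitors: 2_000_000,
    baselineConversionRate: 0.03,
    observedLift: 0.05,
    averageOrderValue: 80,
  }),
); // ≈ $240,000 in projected incremental revenue
```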

Once you attribute a value to a winning experience, it’s critical that you also get the experience live on your site. This ensures you’re not leaving money on the table and also maximizes the impact of your testing program.

But to do this, you and your engineering team have to be on the same page. That is, you not only have to understand the way they work; you also have to deliberately establish a process for implementing winners into your codebase.

Most engineering teams operate using the Agile Method.
If you’re unfamiliar with Agile…well, first, what rock have you been living under? (Just kidding. But really?) Agile is a project management method that relies on incremental, iterative work sequences called sprints. For website developers and engineers, shorter sprints usually last 1-2 weeks and longer sprints last 3-4 weeks.

Most Agile engineering teams organize their projects by way of a prioritized backlog. This backlog is often managed by the product team, though other teams can request additions as needed. During each sprint, developers will work to add features and make other site updates based on what’s listed in the backlog.

During sprint planning meetings, it’s important to communicate the value and impact of your winning experience. The higher the impact, the higher the priority, and the more likely it is to be included in the upcoming sprint.

Of course, delays are common, particularly when your shared development resources are balancing many different priorities.

As an interim fix, you can use your testing tool to push the winner to production.
To do this safely, end the test campaign and duplicate the code into a new campaign, allocating 100% of traffic to the winner. We advise this method because pushing the winner through the original test campaign would risk displaying the losing experience to returning visitors who previously qualified for that experience.
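
Conceptually, the rollout campaign looks something like the configuration sketched below. The shape of this object is hypothetical rather than any particular vendor’s API; most tools expose the same settings through their UI, but the sketch shows what you’re actually setting up.

```typescript
// Hypothetical rollout-campaign configuration; real testing tools expose this
// through their UI or their own APIs, so treat this purely as an illustration.
interface RolloutCampaign {
  name: string;
  audience: string;           // who should see the experience
  trafficAllocation: number;  // fraction of matching traffic, 0–1
  variations: Array<{ name: string; weight: number; changeScript: string }>;
}

// The original A/B campaign is ended; the winner is duplicated into a new
// campaign that serves the winning change to 100% of traffic.
const winnerRollout: RolloutCampaign = {
  name: "hero-redesign-winner-rollout", // hypothetical campaign name
  audience: "all-visitors",
  trafficAllocation: 1.0,
  variations: [
    {
      name: "winning-variation",
      weight: 1.0, // 100% of allocated traffic sees the winner
      // The same change code that ran in the winning variation of the test.
      changeScript: "document.querySelector('.hero')?.classList.add('v2');",
    },
  ],
};

console.log(JSON.stringify(winnerRollout, null, 2));
```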

Of course, there are risks to using a testing tool in this way, even if it’s only a short-term solution. While you might be able to cash in quickly on your winning test, you could also face interference with future tests, maintenance issues, and reduced page performance.

Beyond analyzing your results and getting your winner into production, there’s one final step following the identification of a winning test: capitalize on the win within your organization.

Communicating big wins for the business and sharing customer insights drives momentum and support for experimentation within your company. Create powerful case studies; hone your storytelling technique to ensure you leave a memorable impression. Share your successes on Slack, by email, or at town halls, or host a webinar…the opportunities are endless. Find the communication channel that catches the most attention in your organization, and run with it!

In our experience, cross-functional alignment is the largest contributor to the success of an optimization program, and the lack of it is the biggest barrier. Have any additional ideas or examples of ways to create alignment around testing between engineering, product, and optimization teams? Let us know in the comments!

Does your optimization process feel less like a process and more like organized chaos? We’d love to help. Learn more about our services or contact us today.
