Changing Your A/B Testing Software? Read These Tips First.

With the number of testing and personalization tools available, it can be difficult to choose one to invest in. But once you’ve already selected a tool, the decision to transition to a new one altogether can feel overwhelming.

But this happens quite often. For many clients, cost is the deciding factor in switching testing tools: several tools offer similar capabilities at a lower price point. On the flip side, if you’ve increased your program’s budget and capabilities, it may be time for an upgrade.

And although all testing tools offer similar functions, each has unique features that are important to consider. Personalization, for example, has become a point of focus for many testing programs – perhaps you’re interested in transitioning to a tool such as Evergage or Dynamic Yield that puts personalization at the forefront. Or your testing program has enough velocity to run multiple experiments simultaneously, and you feel you’d make good use of Optimizely’s built-in mutually exclusive experiments feature. Maybe your company uses other Adobe products, like Adobe Experience Manager, so you feel Adobe Target is a good fit.

Regardless of which tool you choose, once you’ve made your selection, the next major obstacle is implementing it. Here are our tips for going about the process:

First, examine your testing roadmap.

Take inventory of the tests that will be running close to the date when you plan to stop using your previous tool. Make sure they will have reached significance and be ready to be turned off before you lose access. 
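
If it helps to make this concrete, here’s a rough back-of-the-envelope check (in TypeScript, with made-up numbers) for whether each running test can reach its required sample size before the cutover date. The test names, traffic figures, and sample sizes are purely illustrative.

```typescript
// Rough cutover check: will each in-flight test reach its target sample size
// before you lose access to the old tool? All numbers here are hypothetical.
interface RunningTest {
  name: string;
  visitorsSoFar: number;      // visitors bucketed into the test to date
  requiredSampleSize: number; // total sample size from your power calculation
  dailyTraffic: number;       // average visitors entering the test per day
}

function daysUntilComplete(test: RunningTest): number {
  const remaining = Math.max(0, test.requiredSampleSize - test.visitorsSoFar);
  return Math.ceil(remaining / test.dailyTraffic);
}

const daysUntilCutover = 30; // e.g., the old contract ends in 30 days

const roadmap: RunningTest[] = [
  { name: "PDP hero test", visitorsSoFar: 42000, requiredSampleSize: 60000, dailyTraffic: 1500 },
  { name: "Checkout CTA test", visitorsSoFar: 5000, requiredSampleSize: 80000, dailyTraffic: 1200 },
];

for (const test of roadmap) {
  const needed = daysUntilComplete(test);
  const verdict =
    needed <= daysUntilCutover ? "should finish in time" : "at risk: consider launching it in the new tool instead";
  console.log(`${test.name}: ~${needed} more days of traffic needed (${verdict})`);
}
```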

If your budget allows for it, we recommend giving your team a period of time where both tools are available. This will ensure your testing cadence isn’t affected while your team gets up to speed on using the new tool and allows you to transition more seamlessly – you’ll be able to let current tests run their course in the old tool while launching new ones in the new tool.

Then, test your testing software.

While you might be excited to dive in and start launching tests left and right, it’s important to take the time to ensure your new tool is implemented correctly.

Run a QA test that visually changes the page to check that the code is being delivered and that any flicker looks reasonable. If the flicker is significant, you may need to move the testing tool tag higher up in the head of your HTML.
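
The exact anti-flicker approach depends on your tool, and most vendors document their own snippet. Purely as an illustration of the underlying idea (hide the page briefly, then reveal it once the tool has applied its changes or a short timeout expires), here is a generic sketch; the `window.__testingToolReady` flag is a hypothetical hook, not any vendor’s API.

```typescript
// Generic anti-flicker sketch: hide the page until the testing tool has run,
// or until a short timeout expires. Vendors ship their own versions of this;
// the __testingToolReady flag below is a hypothetical hook, not a real API.
const MAX_HIDE_MS = 1000;

// Inject a style that hides the page as early as possible (ideally inline in <head>).
const hide = document.createElement("style");
hide.id = "anti-flicker";
hide.textContent = "body { opacity: 0 !important; }";
document.head.appendChild(hide);

function reveal(): void {
  document.getElementById("anti-flicker")?.remove();
}

// Reveal once the tool signals it has applied changes, or after the timeout.
const timer = window.setTimeout(reveal, MAX_HIDE_MS);
const poll = window.setInterval(() => {
  if ((window as any).__testingToolReady === true) {
    window.clearTimeout(timer);
    window.clearInterval(poll);
    reveal();
  }
}, 50);
```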

We also recommend running a live test without visual changes, just for the purpose of checking metrics. This enables your analyst to see that metrics are being tracked correctly within the testing tool, or if you’re using an outside analytics tool, that those metrics are being passed accurately to it. 
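
As one illustration of what that check might exercise, here is a hedged sketch that pushes experiment metadata to a generic `dataLayer` so an outside analytics tool can pick it up. The event and property names are hypothetical; Adobe Analytics, Google Analytics, and the various testing tools each have their own integration conventions.

```typescript
// Hypothetical sketch: surface experiment metadata to an outside analytics tool
// so your analyst can verify the integration end to end. The event and property
// names below are illustrative, not any specific vendor's API.
type DataLayerEvent = Record<string, unknown>;

function reportExperimentExposure(experimentId: string, variationId: string): void {
  const w = window as unknown as { dataLayer?: DataLayerEvent[] };
  w.dataLayer = w.dataLayer ?? [];
  w.dataLayer.push({
    event: "experiment_exposure", // illustrative event name
    experiment_id: experimentId,  // e.g., "qa-metrics-check"
    variation_id: variationId,    // e.g., "control" (no visual change)
  });
}

// In an A/A-style QA campaign both variations fire the same events, so any
// discrepancy between the testing tool and the analytics tool points to an
// integration problem rather than a real difference in behavior.
reportExperimentExposure("qa-metrics-check", "control");
```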

Once you’ve confirmed that visual changes are showing up as expected and metrics are tracking accurately, you’re ready to start using your new tool!

Switching testing software comes with its challenges. However, in the right circumstances, switching can offer substantial benefits to your testing program. Taking the time to pinpoint your reasons for switching, planning your testing roadmap carefully around the transition, and having patience as the new tool is implemented will ensure the transition goes smoothly.


Brooks Bell has over 15 years of experience working with enterprise brands to establish and scale their experimentation programs. We take a holistic approach to our technical diagnostics and analytics services, providing technology and data recommendations based on your business, your goals, your team, and your unique challenges.

What can Brooks Bell do for you?
✓   Clean, organize and centralize your customer data.
✓   Help you select the right A/B testing and personalization tools.
✓   Ensure your tools and systems integrate with one another.
✓   Train your developers and analysts.

Contact us to learn more.

You’ve got a winner – now what?

Winner winner chicken dinner! Discovering a winning variation is one of the most exciting moments for an optimization program. It’s the moment when all the work that went into creating a test finally pays off. But while you should always take time to celebrate your team’s accomplishment, hold off on busting out the champagne just yet. Your work has really only just begun.

Winning A/B tests can tell you a lot about your customers: what’s important to them, and why they respond the way they do. These results also enable you to quantitatively predict the impact the winning experience will have on your business’s bottom line (typically revenue), and to project what that impact looks like over the next year.
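
As a simplified example of that kind of projection, the sketch below annualizes the observed lift against baseline revenue on the tested experience. The figures, the flat extrapolation, and the haircut factor are all illustrative assumptions, not a standard formula.

```typescript
// Back-of-the-envelope annualized impact projection. All figures are
// illustrative; real projections should account for seasonality, traffic
// coverage, and decay of the observed effect over time.
const observedLift = 0.03;                   // 3% relative lift from the winning variation
const annualRevenueOnTestedFlow = 5_000_000; // baseline yearly revenue through the tested experience
const trafficCoverage = 0.8;                 // share of traffic the winner will actually reach
const confidenceHaircut = 0.5;               // discount for regression to the mean / novelty effects

const projectedAnnualImpact =
  annualRevenueOnTestedFlow * observedLift * trafficCoverage * confidenceHaircut;

console.log(`Projected annual impact: ~$${projectedAnnualImpact.toLocaleString()}`);
// => Projected annual impact: ~$60,000
```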

Once you attribute a value to a winning experience, it’s critical that you also get the experience live on your site. This ensures you’re not leaving money on the table and also maximizes the impact of your testing program.

But to do this, you and your engineering team have to be on the same page. That is, you not only have to understand the way they work, you also have to deliberately establish a process for implementing winners into your code base.

Most engineering teams operate using the Agile Method.
If you’re unfamiliar with Agile…well, first, what rock have you been living under? (Just kidding. But really?) Agile is a project management method that relies on incremental, iterative work sequences called sprints. For website developers and engineers, shorter sprints usually last 1-2 weeks and longer sprints last 3-4 weeks.

Most Agile engineering teams organize their projects by way of a prioritized backlog. This backlog is often managed by the product team, though other teams can request additions as needed. During each sprint, developers will work to add features and make other site updates based on what’s listed in the backlog.

During a sprint planning meeting, it’s important that you communicate the importance and impact of your winning experience. The higher the impact, the higher the priority, and the more likely it’ll be included in the upcoming sprint.

Of course, delays are common, particularly when your shared development resources are balancing many different priorities.

As an interim fix, you can use your testing tool to push the winner to production.
To do this safely, end the test campaign and duplicate the code into a new campaign, allocating 100% of traffic to the winner. We advise this method because pushing the winner through the original test campaign would risk displaying the losing experience to returning visitors who previously qualified for that experience.
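
How you set this up depends entirely on your testing tool’s UI or API. Purely as a conceptual sketch of the end state (a fresh campaign serving the winning variation to 100% of traffic), the shape below uses hypothetical field names that don’t correspond to any vendor’s schema.

```typescript
// Conceptual sketch of the interim rollout: a brand-new campaign that serves
// the winning variation to 100% of traffic. Field names are hypothetical and
// do not correspond to any particular testing tool's API.
interface CampaignConfig {
  name: string;
  audience: string;                                             // same targeting as the original test
  variations: Array<{ id: string; trafficAllocation: number }>; // allocations sum to 1
}

const interimRollout: CampaignConfig = {
  name: "winner-rollout-hero-redesign", // a new campaign, not the original test
  audience: "all-visitors",
  variations: [
    { id: "winning-variation", trafficAllocation: 1.0 }, // 100% of traffic
  ],
};

// The original test campaign is ended (not reused) so returning visitors who
// previously saw the losing variation are not re-bucketed into it.
console.log(JSON.stringify(interimRollout, null, 2));
```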

Of course, there are risks to using a testing tool in this way, even if it’s only a short-term solution. While you might be able to cash in quickly on your winning test, you could also face interference with future tests, maintenance issues, and reduced page performance.

Beyond analyzing your results and getting your winner into production, there’s one final step following the identification of a winning test: capitalize on the win within your organization.

Communicating big wins for the business, along with customer insights, drives momentum and support for experimentation within your company. Create powerful case studies and hone your storytelling technique to ensure you leave a memorable impression. Share your successes on Slack, by email, or at town halls, or host a webinar...the opportunities are endless. Find the communication channel that catches the most attention in your organization, and run with it!

In our experience, cross-functional alignment (or the lack of it) is both the biggest barrier to and the largest contributor to the success of an optimization program. Have any additional ideas or examples of ways to create alignment around testing between engineering, product, and optimization teams? Let us know in the comments!

Does your optimization process feel less like a process and more like organized chaos? We’d love to help. Learn more about our services or contact us today.

Adobe is Killing Ad Hoc Analysis & Everything is Going To Be Fine

Where were you when you heard the news? I was checking my analytics team’s Slack channel at work when my teammate shared this screenshot:

At first, my thoughts went to Java. Then, it really hit me.

Ad Hoc?!

No.

That’s not possible.

But instead of letting this news ruin my week, I thought to channel my mom’s advice from my junior high school days: “Don’t get upset about things that are outside of your control.”

If I could influence Adobe, I would definitely try.  Actually, is anyone at Adobe reading this? Is there any chance that you could reverse the decision? No? Ok, that’s fine, too.

So, deep breaths. I’m going to jot down why this gave me feelings and try to determine whether my perceived issues are real problems at all. And, I thought, why not bring you all along for this personal therapy session of mine?

First, let’s unpack why this is such a big deal.

If you’re unfamiliar, Ad Hoc Analysis is a tool within Adobe Analytics. It’s used by analysts like me to analyze and report on website performance: engagement, conversions, eCommerce, and so on.

Adobe has another reporting and analysis tool within Analytics, Analysis Workspace. It operates in a similar fashion to Ad Hoc, but in a more visual way. The company is already encouraging Ad Hoc users to make the switch over to Workspace. 

[Screenshots: Ad Hoc Analysis and Analysis Workspace]

Rather than using the reporting dashboards available within various testing platforms, most analysts connect their test results to analytics tools like Adobe Analytics or Google Analytics.

Using these more sophisticated tools enables us to view A/B test results in conjunction with any one of the dimensions, segments, time periods, or metrics that exist within the analytics tools. 
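
To make that concrete, here is a small, purely illustrative sketch of the kind of cut an analytics tool makes easy: conversion rate by variation and device type, computed from made-up visit-level rows.

```typescript
// Illustrative sketch: slicing A/B test results by a dimension (device type),
// the kind of cut an analytics tool makes easy. All data below is made up.
interface VisitRow {
  variation: "control" | "challenger";
  deviceType: "desktop" | "mobile";
  converted: boolean;
}

const visits: VisitRow[] = [
  { variation: "control", deviceType: "desktop", converted: true },
  { variation: "challenger", deviceType: "desktop", converted: true },
  { variation: "control", deviceType: "mobile", converted: false },
  { variation: "challenger", deviceType: "mobile", converted: true },
  // real analyses would use thousands of rows pulled from the analytics tool
];

const buckets = new Map<string, { visits: number; conversions: number }>();
for (const row of visits) {
  const key = `${row.variation} / ${row.deviceType}`;
  const bucket = buckets.get(key) ?? { visits: 0, conversions: 0 };
  bucket.visits += 1;
  if (row.converted) bucket.conversions += 1;
  buckets.set(key, bucket);
}

for (const [key, { visits: n, conversions }] of buckets) {
  console.log(`${key}: ${((100 * conversions) / n).toFixed(1)}% conversion (${conversions}/${n})`);
}
```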

So if Workspace exists, you’re probably wondering why we’re still stuck on Ad Hoc over here. 

First, Ad Hoc is flexible.  Once you’ve learned its capabilities, and assuming your site has the proper tagging in place, Ad Hoc enables you to answer nearly any business question. Business questions set our little analyst gears turning, and assembling a report, custom segments, or calculated metrics becomes a neat little puzzle to solve.

Also, analysts are creatures of habit.  When it comes to doing our jobs, we like to stick with what we know. Solving problems or answering questions is enough of a challenge, and we don’t want to spend extra time figuring out where to find segments or buttons in a tool. Those of us who live in Ad Hoc on a regular basis will need a little time to adjust. Bear with us.

The impact of this on my own day-to-day wasn’t lost on me. So, even as I mourned the loss of Ad Hoc, I also began to consider the challenges ahead.

Here were my concerns about switching from Ad Hoc to Workspace

Can I create all of my complex segments and calculated metrics in Workspace? Even though the two products look different, the functionality seems to be all there.  In general, Workspace is a prettier product; Ad Hoc just feels more real. And let’s face it: when things look a little too pretty, analysts become skeptical.

Doesn’t Workspace have limits on the number of rows you can export? Yes. Today, Workspace only enables you to view and export up to 400 rows at a time (though the default view is 50). So, while larger exports aren’t available today, Adobe does have plans to support downloads of up to 50,000 rows from a freeform table. Cue: huge sigh of relief.


Should I use Data Warehouse instead? 
Adobe Analytics’ Data Warehouse tool is better suited for setting up large and/or scheduled data pulls. It’s not a good option as an exploratory tool.

Isn’t Workspace buggy and slow? When I asked my colleagues what they thought about Workspace, many of them used the word “clunky.” This impression exists because Workspace is a browser-based tool. It also automatically reloads your report every. single. time. you make a change. Compare this to Ad Hoc, where you can change as many elements as you want, but the report will only refresh when you hit that magical little “Replace Table” button.

Maybe a fix for the constant reloading is on the list of upcoming upgrades, but I haven’t come across any mention of it yet.

How will I explore my data?
Short answer: Differently.

Long answer: Neither Workspace nor Data Warehouse is ideal for exploring new datasets. If you’re already completely up to speed with your dataset’s tagging, metrics, props, and eVars, you’re fine. However, when you get into new datasets, data exploration is critical to ensure that you are getting the most out of your data and analysis. This will be a bigger challenge for agencies and consultancies (like Brooks Bell), as data exploration is key to kicking off our work with new clients.

Workspace isn’t a bad option. It’s just different.

While there are definitely redundancies between Workspace and Ad Hoc, there are actually quite a few benefits to switching to Workspace.

First, Workspace is good for ongoing test reporting.  Here at Brooks Bell, we can set up and share dashboards with both our colleagues and our clients, enabling everyone to actively monitor test results. This is particularly nice at the beginning of a test’s lifecycle and allows for transparency throughout the entire process.

It also has an undo option.  Many Ad Hoc power users can relate to the combination of defeat and hope they feel after accidentally closing the wrong tab, then attempting the “close without saving” trick while praying they hadn’t changed too much since their last save.

Finally, any changes you make to segments will automatically update in Workspace. Meanwhile, in Ad Hoc, you have to remove your segment from the work area, and then add it back in from the full list of segments. Fewer steps = less time.

Finally, how to prepare for a world without Ad Hoc

1. Start your transition today.  I opened up Workspace for the first time in a while just last week. I’m now using it to do most of what I would normally do in Ad Hoc. So long as I don’t update Java, I know I can always fall back on Ad Hoc for large data pulls and data exploration until those features are in place in Workspace. For now, though, it’s all about building new “muscle memory” as I incorporate Workspace into my workflow.

2. Check out this process documentation for making the transition. I read through this a few days after hearing the news and wish I had read it sooner.

3. If you plan to continue to use Ad Hoc for the time being, don’t update Java. Ad Hoc will no longer work with future Java updates.

4. Give feedback!  Adobe is soliciting feedback all over the place right now! This shows that they care about their users and want Workspace to be a useful tool. Don’t hold back on feature requests—it never hurts to ask.

Ultimately, we all had a sense this day would come, especially as data and analytics technologies continue to develop. I feel much better now that I’ve dived into this change headfirst, and I hope you do too!
