You’ve got a winner – now what?

Winner winner chicken dinner! Discovering a winning variation is one of the most exciting moments for an optimization program. It’s the moment when all the work that went into creating a test finally pays off. But while you should always take time to celebrate your team’s accomplishment, hold off on busting out the champagne just yet. Your work has really only just begun.

Winning A/B tests can tell you a lot about your customers—what’s important to them, and why they respond the way they do. These results also enable you to quantitatively estimate the impact a winning experience will have on your business’s bottom line (typically revenue), and to project what that impact looks like over the next year.
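To make that projection concrete, here is a back-of-the-envelope sketch. All of the numbers and the function name are hypothetical, and a real projection would also account for seasonality, novelty effects and the confidence interval around the measured lift:

```typescript
// Rough annualized-impact projection for a winning experience.
// Inputs are illustrative placeholders; substitute your own program's numbers.
function projectAnnualImpact(
  baselineConversionRate: number, // e.g. 0.03 means a 3% conversion rate
  observedRelativeLift: number,   // e.g. 0.05 means a 5% relative lift
  monthlyVisitors: number,        // traffic eligible for the experience
  averageOrderValue: number       // revenue per conversion, in dollars
): number {
  const incrementalConversionsPerMonth =
    monthlyVisitors * baselineConversionRate * observedRelativeLift;
  return incrementalConversionsPerMonth * averageOrderValue * 12;
}

// Example: 500k monthly visitors, 3% baseline conversion, 5% lift, $80 AOV
console.log(projectAnnualImpact(0.03, 0.05, 500_000, 80)); // => 720000 ($720k/year)
```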

Once you attribute a value to a winning experience, it’s critical that you also get the experience live on your site. This ensures you’re not leaving money on the table and also maximizes the impact of your testing program.

But to do this, you and your engineering team have to be on the same page. That is, you not only have to understand the way they work, but you also have to deliberately establish a process for implementing winners into your codebase.

Most engineering teams operate using the Agile Method.
If you’re unfamiliar with Agile…well, first, what rock have you been living under? (Just kidding. But really?) Agile is a project management method that relies on incremental, iterative work sequences called sprints. For website developers and engineers, shorter sprints usually last 1-2 weeks and longer sprints last 3-4 weeks.

Most Agile engineering teams organize their projects by way of a prioritized backlog. This backlog is often managed by the product team, though other teams can request additions as needed. During each sprint, developers will work to add features and make other site updates based on what’s listed in the backlog.

During sprint planning meetings, it’s important that you communicate the impact and urgency of your winning experience. The higher the impact, the higher the priority, and the more likely it’ll be included in the upcoming sprint.

Of course, delays are common, particularly when your shared development resources are balancing many different priorities.

As an interim fix, you can use your testing tool to push the winner to production.
To do this safely, end the test campaign and duplicate the code into a new campaign, allocating 100% of traffic to the winner. We advise this method because pushing the winner through the original test campaign would risk displaying the losing experience to returning visitors who previously qualified for that experience.
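As a generic illustration of that setup, here is a sketch of what the duplicated campaign might look like. The shape and field names are invented for this example and do not correspond to any particular vendor’s API; the two details that matter are the new campaign and the 100% allocation to the winning variation.

```typescript
// Hypothetical testing-tool campaign config (illustrative only, not a real vendor API).
// Key ideas: a NEW campaign, so returning visitors who were bucketed into the old
// test's losing arm aren't shown it again, and a 100/0 traffic split.
interface CampaignConfig {
  name: string;
  variations: { id: string; trafficAllocation: number }[]; // allocation in percent
  pages: string[]; // URL patterns where the experience runs
}

const winnerRollout: CampaignConfig = {
  name: "pdp-cta-winner-rollout", // duplicated from the concluded test campaign
  variations: [
    { id: "winning-experience", trafficAllocation: 100 }, // everyone sees the winner
  ],
  pages: ["/product/*"],
};
```

Because the winning experience is still being injected client-side by the testing tool, treat this as a stopgap rather than a true implementation; that is where the risks described next come from.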

Of course, there are risks to using a testing tool in this way—even if it’s only a short-term solution. While you might be able to cash in quickly on your winning test, you could also face interference with future tests, maintenance issues and reduced page performance.

Beyond analyzing your results and getting your winner into production, there’s one final step following the identification of a winning test: capitalize on the win within your organization.

Communicating big wins and customer insights drives momentum and support for experimentation within your company. Create powerful case studies and hone your storytelling technique to ensure you leave a memorable impression. Share your successes on Slack, by email, at town halls, or host a webinar…the opportunities are endless. Find the communication channel that catches the most attention in your organization, and run with it!

In our experience, cross-functional alignment can be either the biggest barrier to or the largest contributor to the success of an optimization program. Have any additional ideas or examples of ways to create alignment around testing between engineering, product and optimization teams? Let us know in the comments!

Does your optimization process feel less like a process and more like organized chaos? We’d love to help. Learn more about our services or contact us today.

Adobe is Killing Ad Hoc Analysis & Everything is Going To Be Fine

Where were you when you heard the news? I was checking my analytics team’s Slack channel at work when my teammate shared this screenshot:

[Screenshot: Adobe’s announcement of the end of support for Ad Hoc Analysis]

At first, my thoughts went to Java. Then, it really hit me.

Ad Hoc?!

No.

That’s not possible.

But instead of letting this news ruin my week, I thought to channel my mom’s advice from my junior high school days: “Don’t get upset about things that are outside of your control.”

If I could influence Adobe, I would definitely try.  Actually, is anyone at Adobe reading this? Is there any chance that you could reverse the decision? No? Ok, that’s fine, too.

So, deep breaths. I’m going to jot down why this gave me feelings and try to determine whether my perceived issues are real problems at all. And, I thought, why not bring you all along for this personal therapy session of mine?


First, let’s unpack why this is such a big deal.

If you’re unfamiliar, Ad Hoc Analysis is a tool within Adobe Analytics. It’s used by analysts, like myself, to analyze and report on website performance—engagement, conversions, eCommerce, etc.

Adobe has another reporting and analysis tool within Analytics, Analysis Workspace. It operates in a similar fashion to Ad Hoc, but in a more visual way. The company is already encouraging Ad Hoc users to make the switch over to Workspace. 

[Screenshots: the Ad Hoc Analysis and Analysis Workspace interfaces]

Rather than using the reporting dashboards available within various testing platforms, most analysts connect their test results to analytics tools like Adobe Analytics or Google Analytics.

Using these more sophisticated tools enables us to view A/B test results in conjunction with any one of the dimensions, segments, time periods, or metrics that exist within the analytics tools. 
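As a rough sketch of what that buys you: once each hit records the test variant alongside other dimensions (the data shapes below are hypothetical), results can be cut by any segment after the fact.

```typescript
// Minimal illustration: conversion rate by variant and device type,
// computed from hypothetical hit-level data.
interface Hit {
  visitorId: string;
  variant: "control" | "challenger";
  deviceType: "desktop" | "mobile";
  converted: boolean;
}

function conversionBySegment(hits: Hit[]): Map<string, number> {
  const counts = new Map<string, { conversions: number; total: number }>();
  for (const hit of hits) {
    const key = `${hit.variant} / ${hit.deviceType}`;
    const entry = counts.get(key) ?? { conversions: 0, total: 0 };
    entry.total += 1;
    if (hit.converted) entry.conversions += 1;
    counts.set(key, entry);
  }
  // Convert raw counts into a conversion rate per "variant / device" cell.
  const rates = new Map<string, number>();
  for (const [key, { conversions, total }] of counts) {
    rates.set(key, conversions / total);
  }
  return rates;
}
```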

So if Workspace exists, you’re probably wondering why we’re still stuck on Ad Hoc over here. 

First, Ad Hoc is flexible. Once you’ve learned its capabilities, and assuming your site has the proper tagging in place, Ad Hoc enables you to answer nearly any business question. Business questions cause our little analyst gears to turn, and assembling a report, custom segments, or calculated metrics becomes a neat little puzzle to solve.

Also, analysts are creatures of habit.  When it comes to doing our jobs, analysts like to stick with what we know. Solving problems or answering questions is enough of a challenge, and we don’t want to spend extra time thinking about where to find segments or buttons in a tool. Those of us who live in Ad Hoc on a regular basis will need a little time to adjust. Bear with us.

The impact of this on my own day-to-day wasn’t lost on me. So, even as I mourned the loss of Ad Hoc, I also began to consider the challenges ahead.

Here were my concerns about switching from Ad Hoc to Workspace

Can I create all of my complex segments and calculated metrics in Workspace? Even though the two products look different, the functionality seems to be all there.  In general, Workspace is a prettier product; Ad Hoc just feels more real. And let’s face it: when things look a little too pretty, analysts become skeptical.

Doesn’t Workspace have limits on the number of rows you can export? Yes. Today, Workspace only enables you to view and export up to 400 rows at a time (though the default view is 50). That’s a real constraint for large data pulls right now, but Adobe does have plans to raise the export limit to 50,000 rows from a freeform table. Cue: huge sigh of relief.


Should I use Data Warehouse instead? 
Adobe Analytics’ Data Warehouse tool is better suited for setting up large and/or scheduled data pulls. It’s not a good option for exploratory analysis.

Isn’t Workspace buggy and slow? When I asked my colleagues what they thought about Workspace, many of them used the word “clunky.” This impression exists partly because Workspace is a browser-based tool. It also automatically reloads your report every. single. time. you make a change. Compare this to Ad Hoc, where you can change as many elements as you want, but the report will only refresh when you hit that magical little “Replace Table” button.

Maybe this is in the list of upcoming upgrades, but I haven’t come across any mention of it yet.

How will I explore my data?
Short answer: Differently.

Long answer: Neither Workspace nor Data Warehouse is ideal for exploring new datasets. If you’re already completely up-to-speed with your dataset’s tagging, metrics, props and eVars, you’re fine. However, when you get into new datasets, data exploration is critical to ensure that you’re getting the most out of your data and analysis. This will be a bigger challenge for agencies and consultancies (like Brooks Bell), as data exploration is key to kicking off our work with new clients.

Workspace isn’t a bad option. It’s just different.

While there are definitely redundancies between Workspace and Ad Hoc, there are actually quite a few benefits to switching to Workspace.

First, Workspace is good for on-going test reporting.  Here at Brooks Bell, we can set up and share dashboards with both our colleagues and our clients, enabling everyone to actively monitor test results. This is particularly nice at the beginning of a test’s lifecycle and allows for transparency throughout the entire process.

It also has an undo option. Many Ad Hoc power users can relate to that combination of defeat and hope you feel after accidentally closing the wrong tab: attempting the “close without saving” trick while praying you didn’t change too much since your last save.

Finally, any changes you make to segments will automatically update in Workspace. Meanwhile, in Ad Hoc, you have to remove your segment from the work area, and then add it back in from the full list of segments. Fewer steps = less time.

How to prepare for a world without Ad Hoc

1. Start your transition today. I opened up Workspace for the first time in a while just last week. I’m now using it to do most of what I would normally do in Ad Hoc. So long as I don’t update Java, I know I can always fall back on Ad Hoc for large data pulls and data exploration until those features are in place in Workspace. For now, though, it’s all about building new “muscle memory” as I incorporate Workspace into my workflow.

2. Check out this process documentation for making the transition. I read through this a few days after hearing the news and wish I had read it sooner.

3. If you plan to continue to use Ad Hoc for the time being, don’t update Java. Ad Hoc will no longer work with future Java updates.

4. Give feedback!  Adobe is soliciting feedback all over the place right now! This shows that they care about their users and want Workspace to be a useful tool. Don’t hold back on feature requests—it never hurts to ask.

Ultimately, we all had a sense this day would come, especially as data and analytics technologies continue to develop. I feel much better now that I’ve dived headfirst into this change—I hope you do too!

Unlocking the True Power of Testing & Other Takeaways from Brooks Bell’s Interview With Ambition Data

Recently, our Founder and CEO, Brooks Bell, sat down with Allison Hartsoe, host of the Customer Equity Accelerator—a podcast produced by Ambition Data. Listen to the full podcast or read on for a few highlights from their conversation:

On what inspired her to build an experimentation consultancy…

Originally, Brooks founded Brooks Bell Inc. in 2003 as a website development agency. After working with a few local clients, a chance introduction led to her first major experimentation client, AOL.

Today, you might think of AOL as one of the [now-extinct] internet dinosaurs, but even back in the early 2000s, the media giant was facing its fair share of challenges. According to one story by Time Magazine, despite having 34 million members in 2002, AOL was battling slowing subscriber growth, falling ad revenue and exorbitant operational costs. 

So, the company turned to experimentation. “AOL had the right environment to build a testing culture,” said Brooks. “They had a closed technology environment, their own analytics platform, and their data was clean and connected.”

Back then, AOL relied on pop-ups to drive new subscriptions. Working with Brooks, the company issued a challenge: design a new subscription pop-up that would beat the control experience. And so, drawing from her background in design and psychology, she did—and then she did it again, and again, and again.

But that was just the start. As other large companies began to rely more on the digital space to drive their business, Brooks saw an opportunity to help them tap into the power of experimentation.

“We realized that no one was testing!” said Brooks. “No other large companies had the data, culture and processes in place to test. So we set out to help them build the data fidelity and really recreate what we saw at AOL in those early years.”

On the difference between optimization and experimentation…

It’s one of the more common questions we get: “Brooks Bell is an experimentation consultancy. What’s that? What’s the difference between experimentation and optimization?” As Brooks explains it, it all comes down to science.

By definition, experimentation is the application of the scientific method to test a hypothesis. And while optimization is one potential outcome of an experiment, true experimentation requires running tests without a prescriptive outcome or application.

To put it simply: you’re testing to learn. And as long as your results are statistically significant, there is always something to be learned from experiments—even those with flat or negative results.
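For anyone wondering what “statistically significant” means in practice here: for conversion rates it is commonly checked with a two-proportion z-test, sketched below with hypothetical example numbers.

```typescript
// Two-proportion z-test for an A/B conversion-rate comparison.
function twoProportionZ(
  conversionsA: number, visitorsA: number,
  conversionsB: number, visitorsB: number
): number {
  const pA = conversionsA / visitorsA;
  const pB = conversionsB / visitorsB;
  // Pooled conversion rate under the null hypothesis of no difference.
  const pooled = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const standardError =
    Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / standardError; // |z| > 1.96 ≈ significant at the 95% level
}

// Example: 300 of 10,000 control visitors vs. 360 of 10,000 challenger visitors
console.log(twoProportionZ(300, 10_000, 360, 10_000)); // ≈ 2.37, significant
```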

On how to unlock the real power of experimentation…

Today, in the age of Amazon, a customer-centric experience is critical. But for some established companies, this requires a bigger paradigm shift in culture and processes.  

“Customer-centricity requires rethinking metrics, the type of data you collect, how teams are organized, how teams are incentivized, how you communicate and also your core values,” said Brooks.

The true power of experimentation lies in its ability to align your customer needs with your company’s strategic goals and your program’s agenda. Furthermore, you can use experimentation to learn new things about your customers in a scientific way.

“Having statistically-sound customer insights can totally change how you organize your store, how you train your team, and how you structure your website,” said Brooks. “This is where testing programs can really drive change.”

To that end, we recently celebrated the launch of Illuminate, our customer insights software for testing teams and executives. Illuminate not only provides a place to store, share and learn from your experiments, but also a means to develop impactful customer insights.

“We launched Illuminate to provide a repository of great test examples, to learn from each other, and to build a library of great test case studies,” said Brooks. This is because, outside of the testing program, key learnings from an experiment can get lost within the data. Illuminate solves this by encouraging deeper thinking about customers and their needs, preferences and behaviors.

Learn more about Brooks Bell’s experimentation consulting services. 
