A Copy Testing Methodology for the Digital Age

It takes one wrong word to put your foot in your mouth. We’ve all done it and, in the process, squandered an opportunity to impress someone (or some crowd).

With copy, you have a chance to slip up on every homepage, product page, or ad.

Copy is a bridge between your product and your customers. Design matters, too, but it’s context for the message, not the message itself. It’s why copy is twice as important as design.

But how do you improve it? How do you know which word or phrase might tank a sale? Or what missing detail preserves just enough uncertainty to keep someone from clicking “Add to cart”?

While copy testing has been a decades-long standard for brand spots, it wasn’t built for the modern age. And A/B testing, another way to pit phrase against phrase, doesn’t tell you why a version won. That leaves you with a lot of uncertainty, which HiPPOs are all too happy to resolve.

A modern copy testing methodology, by contrast, delivers fast, affordable bursts of quantitative and qualitative feedback for direct-response copy.

Pre-testing: great for singular brand campaigns—and little else 

Copy testing isn’t new. It came from “pre-testing,” and it made more sense when companies ran singular brand campaigns.

If you needed to find the pitch with the highest “day-after recall” because you were running the same TV ad for weeks (or months), pre-testing helped protect you from a total flop.

You gathered a “consumer jury,” exposed them to variations of an ad, then measured the likeability, persuasion, and recall of each ad with quantitative (e.g., “On a scale of 1 to 100…”) and qualitative (i.e., open-ended questions) methods.

Modern versions show ads to consumers who watch them online, from their homes. But pre-testing is still slow and expensive: even contemporary, streamlined methods carry price tags between $2,000 and $6,500 per ad.

That’s no problem for months-in-development campaigns and seven-figure ad budgets. It’s ludicrous for a startup, or for today’s rapid-fire environment of rotating social media ads and landing page tweaks.

As Frito-Lay CMO Ram Krishnan concedes, “It’s very tough to test just because of the volume of content we are putting out.” Even if you’re spending millions on Google and Facebook, the traditional methodology has pitfalls.

What copy testing will (and won’t) achieve

If you ask a focus group to choose a color for your brand, you won’t end up with fuchsia.

The ad that performs best on average, especially on quantitative metrics, delivers your typical inoffensive McDonald’s spot: the safety scissors of the ad world. Sacrifice reward; avoid risk.

Some marketers take the opposite approach by avoiding copy testing. Neither Allstate’s “Mayhem” campaign nor Old Spice’s “The Man Your Man Could Smell Like” campaign went through copy testing. (There was “a lot of pressure to kill” the former, according to Allstate’s former VP of marketing.)


Being weird was the point. As Oscar Mayer’s former ad lead Tom Bick recalls, “We literally used what I fondly called the F-me test. Is it bold, will it possibly ruffle feathers internally, will consumers say, ‘I can’t believe they did that’?”

Testing can lead to false confidence, cautions Bick:

It gives you the illusion that you are being a disciplined marketer and it gives you a sense of confidence, be it false, that you are doing the right thing.

Advertising is about building trust and a feeling about a brand that predisposes people to liking you [. . .] that then allows more rational messaging maybe to come through the filter. And most copy tests don’t reward you for that.

Copy testing, in other words, won’t help you differentiate—it will help you know if you’re conveying that differentiation effectively, in a brand spot or on a long-form sales page.

A/B testing copy has limitations, too.

Why A/B testing isn’t a replacement

A/B testing can tell you which version of copy generated more leads or sales. It tells you nothing about why a given variation won. 

For ad campaigns, A/B testing also risks spending a lot of cash on a losing variation, one whose shortcomings you could’ve sussed out in advance.

A/B testing further assumes that you have enough traffic to test in the first place, which becomes less and less likely as users move down the funnel. (A blog post earns more traffic than a product page, which earns more traffic than a checkout page…)
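
To make the traffic constraint concrete, here’s a minimal back-of-the-envelope sketch, using only Python’s standard library, of how many visitors each variation needs before a simple two-proportion A/B test can detect a lift. The 3% baseline conversion rate and 10% relative lift are illustrative assumptions, not figures from the article.

```python
from statistics import NormalDist

def visitors_per_variation(baseline_rate, relative_lift, alpha=0.05, power=0.80):
    """Rough sample size per variation for a two-sided, two-proportion z-test
    (normal approximation)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for a 5% significance level
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int(((z_alpha + z_power) ** 2) * variance / (p2 - p1) ** 2) + 1

# A 3% checkout conversion rate and a hoped-for 10% relative lift:
print(visitors_per_variation(0.03, 0.10))  # roughly 53,000 visitors per variation
```

At those volumes, a checkout page that sees a few thousand visitors a month can’t settle a test in any reasonable time frame, which is exactly why down-funnel pages make poor candidates for copy A/B tests.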

Even then, do you really want to test your wild ideas on purchase-ready buyers? As Unilever’s Elliot Roazen notes, that’s an expensive, haphazard experimentation process:

Creative and product teams will work to put together sales pages and then launch the pages with paid media behind them, tweaking the page’s copy and design based on performance metrics.

The problem is that these assumptions, more often than not, are merely hunches, and paid traffic isn’t exactly cheap.

There’s also a risk of lost context. If you’re testing a brand new value proposition on your homepage, what happens if the product page alludes to the value prop of your control? Or your drip campaign touts unrelated benefits? A/B testing your messaging carelessly can turn your marketing copy into a patchwork quilt.

If the copy in a variation is ignored or contradicted elsewhere in the funnel, how will you know the impact of copy changes to one page? You won’t. 

A modern approach to copy testing

Direct-response copy is the driving force of modern marketing. Compared to pre-testing of TV campaigns, it has different needs. Recall is less important—attention (e.g., on a landing page) is already won and doesn’t need to be maintained for long.

A modern, data-driven approach blends quantitative and qualitative data to tell you:

  1. What is or isn’t working (quantitative). How easy is it to understand your message? How much do people care about that pitch? How badly do they want to keep reading?
  2. Why it is or isn’t working (qualitative). Which words and phrases make a difference? Which are missing? What turns people off?

Peep Laja, founder of CXL Institute, explains what that looks like for us:

CXL Institute has 100+ landing pages—one for each course and training program, and a number of PPC landing pages. These pages are copy-heavy, with hundreds of words because CXL Institute is a complicated, expensive product.

The way to increase the conversion rate on those pages is to improve the copy. But web analytics or heat maps can’t tell you anything about the quality of your words.

Most get by with opinions from their colleagues—because they’re easy to source. Of course, the constituent whose opinion matters most is the customer.

The process for getting answers for any use case breaks down into four steps.

A four-step copy testing methodology

Copy testing is a research methodology, not a set-in-stone process. There’s flexibility based on who you are and the questions you need to answer.

You can tailor these broad steps to your needs.

Step 1: Develop research goals and questions.

Make a list of the things you want to learn from copy testing. What do you want to know? Typically, you’ll want to focus on uncovering friction and copy blind spots.

You might have research questions about the overall copy (“What doubts and hesitations do you have about this?”) or a specific section of the page (“On a scale of 1 to 5, how interesting is this?”). 

“There’s no limit to what questions you might ask,” says Laja. “You start with research goals. Then, you formulate the question accordingly. Few do copy testing after a page is live, although this is low-hanging fruit.”
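
To see the two question types side by side, here’s a minimal sketch of how you might organize a research plan; the goals, page sections, and the third question are hypothetical examples, not a prescribed template.

```python
# Hypothetical copy-testing plan: each question maps to a research goal,
# targets the overall copy or one section, and is either scaled or open-ended.
research_plan = [
    {
        "goal": "Uncover friction and hesitations",
        "target": "overall copy",
        "type": "open-ended",
        "question": "What doubts and hesitations do you have about this?",
    },
    {
        "goal": "Check whether the headline earns further reading",
        "target": "headline",
        "type": "scale (1-5)",
        "question": "On a scale of 1 to 5, how interesting is this?",
    },
    {
        "goal": "Spot missing information",
        "target": "pricing section",
        "type": "open-ended",
        "question": "What would you still need to know before buying?",
    },
]

for q in research_plan:
    print(f"[{q['type']}] {q['target']}: {q['question']}")
```

However you organize it, keeping every question tied to a goal makes the later analysis much easier.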

As Roazen has found, copy testing can also help refine product messaging prior to a launch: 

Our mandate has recently switched to the creation of new brands, which (roughly speaking) follows a workflow of ideation, validation, launch, and optimization. Between each of these stages, we sense check our communications with feedback from target consumers.

For some concepts, the feedback from these tests results in a serious pivot. You really have only a few seconds to communicate the “what” you are selling, the value that this product provides, and how much you’re selling it for. In rounds of copy testing, consumers have said our product pages do not clearly articulate one of these key communication points, meaning we have to figure out a change that makes this clearer.

By rigorously copy testing your sales page, you ensure that you are getting verbatim, qualitative feedback to refine your copy further. This gives you a head start when you finally do launch. Essentially, you’re starting on second or third base.

Step 2: Recruit panelists.

You need folks to be part of your study. This is qualitative research, so as few as five people will add value, but the optimal number is around 15 people.

Find people who are interested in your offer (i.e., your target audience) but aren’t customers yet (so they’re unbiased).

For consumer products, interest-based Facebook groups are a good place to find people. For specific B2B folks (targeting by title + industry), LinkedIn is a good bet. 

You need to compensate the panelists for their time (e.g., gift cards, real money). The more niche or hard-to-get people are, the more you need to pay. 

Step 3: Facilitate research sessions.

Run individual sessions with each panelist. Any video conferencing tool with screen-sharing functionality works. As the panelist reads the copy, ask the research questions you’ve prepared. 

Step 4: Gather all the research data in one place.

The simplest way to analyze the data is with a spreadsheet. Gather all the questions and answers you got from the panelists. 

Like any research, it takes time and effort. (The way around it is to use a tool like Copytesting, which automates all of that for you.)
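
As a sketch of what that spreadsheet step can look like, the snippet below reads exported responses from a CSV and splits them into averaged scores and verbatim quotes. The file name and column layout (panelist, question, type, answer) are assumptions about how you might structure the sheet, not a format any particular tool produces.

```python
import csv
from collections import defaultdict
from statistics import mean

scores = defaultdict(list)     # question -> numeric ratings
verbatims = defaultdict(list)  # question -> open-ended answers

# Assumed columns: panelist, question, type ("scale" or "open"), answer
with open("copy_test_responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["type"] == "scale":
            scores[row["question"]].append(float(row["answer"]))
        else:
            verbatims[row["question"]].append(row["answer"])

for question, ratings in scores.items():
    print(f"{question}: {mean(ratings):.1f} average from {len(ratings)} panelists")

for question, answers in verbatims.items():
    print(f"\n{question}")
    for answer in answers:
        print(f"  - {answer}")
```

The averages give you the quantitative story at a glance, and the verbatims supply the “why” behind it.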

If you’re wondering what you’ll learn, here are some examples, broken down by quantitative and qualitative results.

Quantitative and qualitative results from copy testing

Examples of quantitative feedback

Quantitative feedback from copy tests tells you:

  • How clear a message is (e.g., via a Likert scale). Do users understand your headline? Your value proposition? Is jargon or awkward phrasing getting in the way? Clarity beats persuasion. 
  • How much people care. Are you talking about the things that people care about? A clear pitch for the wrong benefit isn’t persuasive. 
  • How much people want to keep reading. If the goal of a headline is to interest people in reading what comes after, are you doing a good job?

For example, in a test on Copytesting, Kion Flex scored well (4.3 out of 5) for clarity. The product describes the problem it solves—“mild, temporary joint discomfort”:

Kion Flex product.

But while clear, the messaging is general. Is it best as a daily supplement? For injury recovery? Aging joints? 

Readers cared less (in Copytesting parlance, the “CareScore” was lower) about the points made. A generic use case robs readers of the “this is exactly what I’m looking for” moment; a supplement for any joint discomfort doesn’t sound like the one they need for their issue.

Compare that to Lambda School, which scored exceptionally well on CareScore:

Lambda school copytesting results.

The headline certainly helps—they’re “the school that invests in you.” But they back that up by addressing a primary anxiety in higher education and a barrier for many: “pay no tuition until you’re hired.”

These interpretations, of course, would be speculative without the qualitative feedback to support them. 

Examples of qualitative feedback

When it comes to copy, the problem often isn’t the wrong words but the missing ones—the specifics of which you won’t uncover without qualitative feedback.

That lack of information costs you sales, found Nielsen Norman Group:

In our e-commerce studies, we found that 20% of the overall task failures in the study—times when users failed to successfully complete a purchase when asked to do so—could be attributed to incomplete or unclear product information.

Supply sells a $75 single-blade razor. But its copy promises the same benefits as every other razor: less irritation, fewer nicks and bumps.

Supply razor product.

Why this razor—at this price point? Consumers are unsure:

  • “I’d ultimately like to ask what makes this better than other similar products on the market?”
  • “I’d like to know more about the design of the handle and why it looks the way it does. Is it made to be disposable, or how long will it last?”
  • “Do people who bought it think it’s worth $75? How much are extra blades?”

The feedback is a laundry list of questions that crave specifics on exactly how it works, the materials of its construction, and performance differences between a single-blade razor and the ubiquitous three-blade varieties.

Incredible outcomes can also seem, well, incredible. That’s what Chris Rost of SwipeGuide learned after running a copy test on the site’s primary benefit page:

Process improvement and waste reduction.

Despite the real, ROI-focused outcomes on display, testers were skeptical. “This sounds great,” Rost heard again and again, “but we don’t believe it.”

Rost and his team realized they needed to embed details about who achieved those results (e.g., testimonials from real people at real companies) and explain how they did it—the “meat and potatoes” of the process.

In other instances, the words that are present cause problems.

Take CXL Institute. We’re not afraid to lean into hostile brand territory—especially because completing a Minidegree or learning A/B testing statistics does take commitment.

But how far is too far? 

CXL become great at what you do copy.

Quantitative feedback alone about whether the above copy was persuasive (or, in the context of an A/B test, whether that variation converted better) wouldn’t reveal which phrases resonated or put people off.

Here’s what people had to say about the paragraphs above:

  • “To me that sounds really militant.”
  • “It sounds rather elitist.”
  • “This section turned me off. It comes across as haughty and unnecessarily arrogant.”
  • “I don’t like that it says too hard for most, because that sounds a bit snobby.”

If we were going for elitist or arrogant, we nailed it. But we weren’t. 

Some respondents were “intrigued” by the pitch of the courses as “challenging,” but, overall, the aggressiveness of the copy made us seem like jerks. So we dialed it back:

CXL become great at what you do additional copy.

A new group of testers validated those changes:

  • “The tone of the rest of the text does a good job of implying the type of commitment that’s needed to learn something of value via their course and website.”
  • “It’s refreshing to see a program that discloses that effort is required.”
  • “I think this program can be trusted since it says that people should only take this course if they are serious about their careers.”

Some people still thought it was a bit much, but then again, CXL Institute isn’t for everyone :)

At an even more granular level, we identified trigger words that really turned people off. “Badass,” apparently, is one of those words:

CXL personalization.

  • “I think this part of the segment is unnecessary: ‘It takes real badass.’ I honestly think this is pretty cheesy and takes away from the professionalism of the program.”
  • “I don’t like the swear word. It sounds like it takes effort to do the program but it could have been more professionally said.”
  • “The whole ‘badass’ phrasing is so dodgy it doesn’t feel like the big names that train with the company can be legitimate.”

For SwipeGuide, Rost catalogs such keywords, good and bad:

We’ve gotten great insight into what kind of language turns people on, sparks interest, or makes them skeptical.

We now have a list of keywords that B2B-minded people are looking for in a benefits page. When I go back to implement revisions, I can target keywords that are unclear.

If you’re writing copy—or accountable for its success—you want to know this stuff. Otherwise, you risk two things:

  1. Running an expensive ad campaign with copy that doesn’t work.
  2. Throwing out a bunch of good copy because you don’t realize that one word is poisonous.  

Because qualitative feedback helps you understand exactly where you’ve gone too far, you can take risks—rather than staying in the safe center. 

As Chef Sean Brock writes, “Overseason something with salt and acid just so you know what is too much. Then ride the line, and you’ll find your balance.” Without qualitative feedback, you’re throwing out the whole plate of food, still none the wiser. 

Conclusion

Most executives can’t code—HiPPOs are unlikely to challenge a line of JavaScript. But they can put together a sentence, and they absolutely have opinions on the sentences written by others.

Modern copy testing delivers data to support—or challenge—choices for direct-response copy. It also gives marketers the qualitative feedback to know what needs to be changed, be it a single word or whole paragraphs.

You may be happy that a percentage of reviewers think your copy is weird. Success, as with pre-testing, may not be about maxing out your quantitative scores. But, armed with information on why reviewers think what they think, you’ll know the risks and rewards you’re choosing. 

Who’s Hiring in January 2019?

Here are our picks:

Website Optimization Specialist – In Atlanta, SunTrust is looking for a specialist to be responsible for “developing and executing business strategies, processes and policies to enhance the sales and service experiences intrinsic to SunTrust’s digital spaces.”

A/B Testing & Personalization Analyst – Join Barnes & Noble’s Optimization team in New York to “improve bn.com’s content, design, and usability for customers and to create unique experiences based on customers’ preferences and behaviors.”

Director-Digital Product Analytics & Testing – Join the Enterprise Digital and Analytics team at American Express in New York. They are looking for a leader to “provide value to the online card shopping experiences within the Global Consumer and Commercial businesses through customer data and measurement, insights through analytics techniques and experimentation.”

Marketing Manager, International Conversion – Ancestry is looking for a candidate to join their Conversion Marketing team in San Francisco. This person is “responsible for improving and optimizing the user experience at each step in the conversion funnel with the end goal of maximizing revenue from visitors in each of Ancestry’s key global markets.”

Marketing Manager, A/B Testing & Optimization – Join Auth0’s Growth Team in “driving improvement in key engagement metrics and customer experience throughout the customer lifecycle.”

Director of B2B Marketing, Demand Generation – Join Vimeo’s B2B marketing team in New York to “scale qualified lead acquisition, build and continuously optimize digital marketing, account-based marketing (ABM), email automation, social, and event-based marketing channels.”

Sr. Analyst, eCommerce Direct to Consumer Analytics – Newell Brands is looking for a senior analyst in Hoboken, New Jersey, to drive “sustainable growth online through the best-in-class use of data and analytics.”

Digital Marketing Leader – Website Optimization – Join GE Healthcare in Wauwatosa, Wisconsin to “develop a rigorous testing and experimentation framework, and conceive, scope and implement experimentation initiatives to improve the website user experience and drive conversion rate optimization.”

Manager, Marketing Planning, Test & Analysis – Express is looking for an individual to lead the testing and optimization program in Columbus, Ohio, “starting with A/B & multivariate testing taking us into experience optimization and eventually personalization.”

Looking for a job or to fill a position? Give us a shout and we’ll help spread the word in our next careers blog post.

Free Guide: How to Strategize & Execute Profitable Personalization Campaigns

When I speak with our clients, it often strikes me how many of them feel overwhelmed by the very idea of personalization.

Our imagination, often fueled by the marketing teams of various software companies, creates a perfect world where personalization enables every interaction to be completely custom for every individual. In this dreamland, artificial intelligence and machine learning solve all our problems. All you have to do is buy a new piece of software, turn it on, and…BOOM: 1:1 personalization.

As a data scientist, I’ll let you in on a little secret: that software only provides the technological capability for personalization. What’s more, the algorithms inside these tools simply assign a probability to each potential experience and favor whichever one maximizes the desired outcome, given the data they have access to. Suffice it to say, they’re not as intelligent as you’re led to believe.
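
To make that less mysterious, here’s a minimal sketch of the core idea: estimate each candidate experience’s conversion rate from past data and serve whichever scores highest. The experience names and numbers are invented for illustration; real tools layer on segmentation, context, and exploration, but the principle is the same.

```python
# Invented data: impressions and conversions observed per candidate experience.
history = {
    "hero_copy_a": {"shown": 4200, "converted": 126},
    "hero_copy_b": {"shown": 3900, "converted": 140},
    "hero_copy_c": {"shown": 1100, "converted": 30},
}

def estimated_rate(stats):
    # Lightly smoothed conversion-rate estimate so small samples aren't over-trusted.
    return (stats["converted"] + 1) / (stats["shown"] + 2)

best = max(history, key=lambda name: estimated_rate(history[name]))
print(best, round(estimated_rate(history[best]), 4))  # hero_copy_b, ~0.0361
```

Useful, but it only chooses among the experiences you give it. Deciding who to personalize for, and with what, is still on you.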

If you caught our first post in this series, you already know that we define personalization a bit more broadly, as any differentiated experience that is delivered to a user based on known data about that user. This means personalization exists on a spectrum: it can be one-to-many, one-to-few, or one-to-one.

And while there are many tools that enable you to do personalization from a technical standpoint, they don’t solve for one of the main sources of anxiety around personalization: strategy.

Most personalization campaigns fail for lack of a strategy that defines who, where, and how to personalize. So I’ve put together a free downloadable guide to help you do just that. This seven-page guide is packed with guidelines, templates, and best practices to strategize and launch a successful personalization campaign, including:

  • Major considerations and things to keep in mind when developing your personalization strategy.
  • More than 30 data-driven questions about your customers to identify campaign opportunities.
  • A template for organizing and planning your personalization campaigns.
  • Guidelines for determining whether to deliver your campaigns via rule-based targeting or algorithmic targeting.

Free Download: Plan & Launch Profitable Personalization Campaigns.

Thank You + Brooks Bell’s Best of 2018

It’s January 3, and if you’re like us, you’re already heads down at your desk and neck deep in emails. But we’d be remiss if we didn’t take a minute to reflect on the previous year.

In November of 2018, we quietly celebrated 15 years of being in business. When Brooks Bell was founded, experimentation was in its infancy. But despite all the changes we’ve experienced since then, one thing remains true: it’s the opportunity to connect with so many interesting people who are solving big problems for their businesses that makes our work worthwhile. Thanks for walking with us.

A look back at some of our big moments from 2018

Winning like Winona

In January, our Founder & CEO, Brooks Bell, was recognized as one of 25 women who rocked digital marketing in 2017. Later in the year, she was also announced as a Southeastern Finalist for EY’s Entrepreneur of the Year award. 

We also celebrated 2017’s record-breaking growth, were recognized as Optimizely’s North American Partner of the Year, and garnered our local business journal’s Best Places to Work award.

Getting Lit with Illuminate

Fun fact: We originally built Illuminate to help us better manage and iterate upon our clients’ tests. Over time, we got so much great feedback that we decided to make it available to everyone this year.

Now, with a successful beta launch under our belt and even more new features being added to the software, we’re excited to see where this new endeavor takes us in 2019.

F is for Friends, Fun and…Fear?

In October, things got a little spooky around the office, and it had everything to do with Scott, our Director of Sales, who decided to channel his inner Ellen DeGeneres for the day (much to our colleagues’ horror). Watch the video if you dare.

Making Bacon for our Clients

Back in 2014, we set a Big Hairy Audacious Goal to achieve $1 billion in projected revenue for our clients. By the end of 2017, we’d reached $500 million. And this past December, we hit $1 billion. (cue ::gong::)

But we’re not resting on our laurels. We’ve set some aggressive goals for 2019, with a focus on personalization, and we’re pumped to get to work.

Brooks Bell takes the Bay Area 

In September, we officially opened the doors to our San Francisco office. This decision came after years of working with clients on the West Coast and our desire to work even more closely with them. And with the Bay Area’s rich history of innovation, we can’t think of a better place to help more companies push their boundaries through experimentation.

Still Clickin’ 

Last May, we hosted our annual Click Summit conference. We might be biased, but this remains one of our favorite events, as it’s filled with meaningful connections and seriously impactful takeaways. 2019 marks our 10th Click Summit, and we’ve got big plans. Request your invite today.

New Features in Illuminate: Impact Analysis, Enhanced Filters, Updated Dashboard & More

Since we launched Illuminate back in May, our team has been working around the clock to develop even more features to help optimization teams better organize experiments, report performance and maximize impact. Today, we’re excited to share a few of these with you.

What’s new in Illuminate?

Show impact and determine priority

Use our new Impact Analysis to show the overall impact of your tests by page type and identify where you should be focusing your testing efforts.

Sort and filter by what matters most

Filter your tests by 15 attributes including target audience, page type, start and end date, KPIs, revenue impact and more. Not seeing what you need? Add your own using our new custom tagging feature.

Keep sight of the bigger picture

Our new dashboard view enables you to view your program’s overall performance or view performance by a specific team or line of business.

+ a new tiled layout

If you love a good masonry layout (à la Pinterest), then you’re going to love our updated experiment view. Easily switch between a basic list of your experiments and a super slick-looking tiled layout.

Many of these features were developed in response to feedback from our beta users, bringing more of Brooks Bell’s advanced experimentation methodologies directly into the software.

“With Illuminate, you’re not just getting another test repository,” said Suzi Tripp, Senior Director of Innovative Solutions at Brooks Bell. “You’re getting 15 years of experimentation expertise and proven frameworks to help you do more, and do it better.”

Interested in learning more about Illuminate? Visit our website or schedule a demo using the form below.

What are your website visitors doing?

Chances are that you’re tracking your website visitors en masse. You’re probably tracking acquisition sites, tallying up conversions and working to optimize your pages for the best success. But with all of that quantitative research, do you know about each individual user’s journey, and where they are struggling on your site? If not, you should check out one of our partners: SessionCam.

Jonathan Hildebrand, Brooks Bell’s Sr. Director of UX & Design, spoke at SessionCam’s user conference last week in Chicago. If you’re unfamiliar with SessionCam, the company began with a mission of building the best session replay solution on the market. Over time, it has grown into a fully fledged behavioral analytics platform, including heatmaps, conversion funnels, form analytics, and more.

We’ve been blown away by the machine learning algorithms that identify signs of customer struggle and frustration on a website. We sat down with Jonathan to ask him for a couple of takeaways from the event.

As a UX expert, what do you appreciate most about SessionCam?

Where SessionCam really shines is in the qualitative data it provides, which can uncover major hurdles on your site in ways that quantitative data could never reveal. SessionCam’s recordings allow customers to watch a complete play-by-play of a visitor’s experience on the site, whether it’s through a mobile device or desktop.

What about specific to testing?

From a testing perspective, SessionCam can be great for post-test analysis since it allows you to watch videos from the live test experiences. The Customer Struggle Score is also a great way to understand where problems are occurring.

Any interesting case studies?

Definitely. One that comes to mind is a retailer that has a buy online, pick up in store (BOPUS) program. They were using SessionCam to uncover the source of order mistakes. When there was an error at pickup, they would go back and watch that customer’s online session to see if a problem occurred during the online order process and determine if there were any improvements they could make.

And you only need to check out their website to see the kind of value that SessionCam has added to many of the world’s leading brands.

If you’re interested in finding out more about SessionCam, give us a shout.
