Call analytics platforms expand their utility

Advances in machine learning are enabling call analytics platforms to do more than ever before.


Call analytics platforms have become important tools that help marketers identify and activate the rich data hidden in the growing volume of inbound calls. These platforms track both online and offline leads, following a call from its source (e.g., a website, social media, or a click-to-call search or display ad) to a sales representative (e.g., routed by geographic location or product line).

The ability to track calls is a core use case of call analytics technology. However, advances in machine learning and artificial intelligence (AI) are driving more sophisticated applications, including the following:

  1. First-party database-building: As marketers lose access to third-party cookie data, first-party data sources such as phone calls are becoming more valuable in brand efforts to build privacy-compliant customer databases. Call analytics platforms facilitate the scaled collection and analysis of caller data.
  2. Customer journey attribution: Call analytics platforms provide online-to-offline attribution across media channels, helping marketers understand the role that each customer touchpoint plays in a conversion. The result is more efficient resource allocation and more relevant messaging based on customer preferences.
  3. Marketing campaign optimization: Call analytics platforms connect calls to the search keywords, social or display ads, or webpages that drove them. Marketers can use unique phone numbers for each website visitor to understand which pages and elements are driving the highest-quality calls, as well as which ones are causing visitors to leave. Call data, including demographics, product interests and buying stage, can also be used to optimize search bids or make on-the-fly changes to campaign messaging and creative.
  4. Audience segmentation and targeting: Call analytics platforms record and transcribe calls, then apply AI-based models to the results to determine the characteristics of the highest-performing callers or leads. Using the data, marketers can build personas or look-alike audiences to create high-performing customer segments.
  5. Personalized, intelligent lead routing: Call analytics platforms use machine learning to score and route calls based on factors including call source, geography, demographics, purchase history or intent. Tools such as whisper messages arm sales reps with known customer information that personalizes the caller experience.
  6. Sales rep coaching and development: Many call analytics platforms include automated sales performance and evaluation tools to provide scoring/grading systems, script optimization and real-time alerts that flag lost opportunities.

Learn more about call analytics by downloading our Martech Intelligence Report.


2021 outlook for digital marketing agencies: the future is bright

The ability to prove the value they’re creating for clients is contributing to agencies’ resilience in 2020 and a positive outlook for next year.


No one was quite prepared for what 2020 had in store — including digital marketing agencies. When business shutdowns occurred in the spring, 66% of marketing agencies said they experienced a decrease in overall revenue. CallRail’s call data report showed a similar trend, with call volume in the advertising and marketing industry falling by 29% from pre-COVID levels. 

While 2021 still holds some economic uncertainty, digital marketing leaders indicate that their agencies have been remarkably resilient, according to CallRail’s fall survey of 167 global marketing leaders who use its services. The findings show that most agencies will finish 2020 with higher annual revenues than in 2019. 

Overall, agency leaders reported satisfaction across many different business areas, such as their talent and expertise, customer service and positioning. When asked about their ability to attain key growth metrics — like generating new client leads and closing new business — the majority of agencies also expressed strong confidence that they could.

While CallRail’s 2021 Outlook Report takes a much deeper dive into agency business practices, this article will highlight some of the most interesting aspects of the research. It also includes agency partners’ perspectives on just why and how 2021 is shaping up to be a strong year for digital marketing agencies.

Agencies’ 2021 financial outlook is strong

In CallRail’s survey, 88% of agencies indicated they are satisfied to extremely satisfied with their agency’s financial health. Only 3% were extremely unsatisfied. In line with these findings, most agencies also anticipate exceeding their 2019 revenue by the end of 2020. 

Due to the shift to digital channels during the pandemic, Kyle Shurtz, vice president of performance at Avalaunch Media, reported an increase in business. “Because we focus strictly on online advertising, we had more business come through as people shifted from offline to online. We anticipate 2021 to be much more of the same. Our goal is to grow by about 20% year-over-year.” 

Even for those agencies who saw a decline in revenue in the spring or whose business models are more mixed between online and offline advertising, there’s still been a silver lining to 2020. Molly Randolph, vice president of client services at The Barbauld Agency, said the pandemic forced them to look at their business and make decisions they wouldn’t have made otherwise. 

“We became more profitable because our overhead shrunk so much. But, we would never have gotten rid of our offices or our subscriptions. We would have let them linger without COVID,” said Randolph.

Dale Powell, managing director of Atomic Marketing, concurs. “We had a lot of time to think about the direction of our business and the types of clients we wanted to work with,” says Powell. “So we made some educated decisions to move pricing up and take a more firm hold to say ‘this is our price,’ which has eliminated the time wasters.” Powell also predicts that his agency’s revenue will exceed last year’s.

Agencies are delivering strategic value to clients

Being seen as a strategic partner is one of the primary ways agencies can increase their value to clients — and, in this department, most agencies thought they were doing quite well. According to 67% of agencies, the primary reason that clients choose their agency is because they’ve established themselves as strong strategic partners.

The longevity most agencies have with clients is another indicator that clients believe agencies are delivering enough value to continue to use their services. Long-term relationships of two years or more were common for 69% of agencies. Only 4% said the client relationship lasted less than one year.

In conversations with agency partners, a key reason agencies felt they were delivering value to their clients and were seen as strategic partners was their ability to show real results.

“We earn our new business by providing results. We do a lot of competitive research and look for ways to break the molds,” said Powell, whose agency uses call tracking and form tracking as one way to track and report results to clients.

“Without call tracking, we really wouldn’t have the business we have,” says Powell. “Our model is built upon full transparency, and call tracking lets us measure how many phone calls came from a landing page. Ultimately, this is what our clients want to know – how many leads are we generating for them, and we can show them.”

Survey results show other agencies agree. Almost all (95%) agency leaders reported that call tracking and lead form tracking were very important to their business. Call recording analysis was also very important to 85% of agencies.

“Having tracking tools like call tracking and call analysis is the way we keep our clients happy,” says Shurtz. “We show that we’re not an expense, but an investment. Call tracking shows who’s called and what keyword came from where. Call analysis we use often with big law firms to create hotspot keywords, such as ‘appointment’ or ‘claim,’ to qualify leads.” 

Challenges remain, but agencies are confident 

Despite well-established client relationships and a strong financial foundation, challenges remain for agencies. The top two challenges agencies listed were finding new clients (48%) and generating more revenue from existing clients (42%).

While agencies say it’s gotten harder to find new clients (53%), retain current clients (52%), and grow revenue with existing clients (62%), they are also confident they can overcome these challenges. 

A strong majority of agencies (74%) said they are confident they can generate new leads, 75% are confident they can close new clients and 59% are confident they can grow revenue with existing clients.

What’s driving these high levels of confidence despite challenges? It seems to be a mix of employing strategies that help grow the agency’s business and, at the same time, showing clients that they are getting value from the services the agency provides.

For instance, Avalaunch Media felt it could continue to grow its business because it has a robust referral program in place. “We’re definitely confident that we can grow and sustain our business. We have a really strong partnership referral program that gives us good business,” said Shurtz. “We also have a strong employee referral program where they get a lifetime commission on any referrals, so we get a lot of deals from internal employees.”

The Barbauld Agency also talked about how it has helped clients turn their businesses around. For example, it helped clients make strategic advertising decisions that have resulted in some of them going from experiencing some of the worst months of their business to having some of the best months ever.

“Our July client meetings were very difficult and pessimistic, and we were doing all kinds of brainstorming about how to pivot clients’ businesses without the wheels coming off. But by September or October, our clients were saying, ‘Wow, we can’t believe how well it’s going,’ ” said Randolph.

Looking ahead

As agencies look ahead to 2021, there’s every reason to believe they will continue to realize business growth, increased revenue, and overall strong financial health. Generating new clients and growing revenue with existing clients will have to remain a priority to achieve these goals, as will continuing to focus on delivering results. Consequently, call tracking, form tracking, and other reporting and analytics software will remain essential tools.

Uncertainty for 2021 remains due to COVID-19, but the grass truly is looking greener on the other side of 2020. And there seems to be agreement among agencies that even if more lockdowns occur, it likely won’t be a repeat of spring.

“My outlook, overall, is much more positive for 2021 than even four months ago,” says Randolph. “I’m cautiously optimistic.” 

For more insights, download CallRail’s 2021 Outlook Report.


Adopting a CDP is just the beginning: How Fingerhut’s parent planned a successful onboarding process

From tapping internal resources to building off quick wins, Bluestem Brands’ CDP champions slowly began to realize the tool’s potential.


We hear a lot about how to choose a solution for your martech stack, but less about what you do once you’ve made a decision. At the recent MarTech virtual event, one session took on that topic: “So, you have a new CDP… Now what?” 

“We chose a vendor, everything’s going to be awesome,” said Ben Thompson, director of e-commerce analytics and tag management for Bluestem Brands, the parent of Fingerhut. “But it doesn’t always go that smoothly, does it?”

Even after Bluestem committed to adopting Tealium’s Audience Stream CDP, Thompson described a situation in which key stakeholders were anything but enthusiastic about getting the technology into place. 

“We had some pretty strong internal resistance to the CDP,” said Thompson. Bluestem’s IT group wanted to maintain tight control over the data and the processes around it; the legal department was worried about GDPR and CCPA. So Thompson shared how Bluestem overcame these challenges and explained what he learned along the way. 

The current process 

Before you adopt a technology, there’s likely someone at your company whose job it is to perform the ugly, ungainly process of bringing together data and making sense of it before it can be used in marketing. 

Here’s how Thompson described the status quo at Bluestem: “The usual process for one of these campaigns, whether it’s email, social or other media looks like this — you have all of these silos, and you need to get something from each of them. So what you’re going to do is you’re going to query it and combine it using SAS SQL or whatever your favorite tool is. You’ll export it from there. You’re going to move the file around on FTP sites, etc. You’re going to import it into another system.” Then and only then could you activate and run the campaign that you were planning. 

“For us, assembling campaigns like this meant we needed to invest a lot of time and money just to create a one-off campaign that didn’t help us build a unified [customer] profile,” said Thompson. “It required skilled coders. And finally, it was just plain slow.” 

At Bluestem, the person in charge of social media was performing that process, and you might think he would feel threatened by a new technology coming along to take over. Instead, advised Thompson, you need to enlist that person to help identify what data elements should be included in the CDP. That person was also key to helping measure and evangelize the great results achieved by the technology, given all the time saved. In Bluestem’s case, said Thompson, they saved that person 40 days every year by automating the process of gathering the targeting list. And he got to spend his time perfecting the social media presence instead. 

The use cases

The second important element Thompson described is the assembly of use cases to prove the value of the technology. 

“Someone in your marketing org has wanted to do something awesome for a long time, but has probably hit technical walls,” said Thompson. At Bluestem, they wanted to identify people who had abandoned a cart or performed a similar activity, then email them a custom 10% discount that could only be used by the recipients. But it wasn’t possible with their existing tech stack. 

“Audience Stream’s Webhook integration talking to our internal promo service was able to accomplish this,” said Thompson. “So now, when you abandon on Fingerhut.com, Audience Stream sees that, tells our promo service to tie you, tells our ESP to email you that promotion. And we have a happy customer who can come back and complete their purchase with a nice discount that’s not going to get out to the masses.”
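
To make that flow concrete, here is a minimal, purely illustrative sketch of the kind of internal promo-service endpoint a CDP webhook (such as Audience Stream’s Webhook connector) could call on a cart abandonment. None of this is Bluestem’s or Tealium’s actual code; the route, payload fields and discount logic are all hypothetical.

const express = require('express');
const { randomUUID } = require('crypto');
const app = express();
app.use(express.json());

// In-memory stand-in for a real promo-code store
const singleUseCodes = new Map();

app.post('/promos/cart-abandon', (req, res) => {
  const { customerId } = req.body; // assumed payload shape sent by the CDP webhook

  // Mint a 10% discount code that only this customer can redeem, once
  const code = `SAVE10-${randomUUID().slice(0, 8).toUpperCase()}`;
  singleUseCodes.set(code, { customerId, percentOff: 10, redeemed: false });

  // A downstream step (e.g. the ESP integration) can email this code to the customer
  res.status(201).json({ code });
});

app.listen(8080);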

This single use case brought many in Bluestem’s marketing organization onto the CDP bandwagon, because it was something they’d wanted to accomplish for a long time. 

Thompson described how accomplishments like this helped win over key decision makers who’d been preventing the project from moving forward. 

The steering team 

As you roll out the solution within your organization, Thompson recommended assembling a steering team that’s accountable for providing regular updates to stakeholders and leadership. 

Thompson recommended that this group have a couple of marketing folks, including a key decision maker. Additionally, you’ll want team members from web development, legal and email operations, as well as whoever is running display and social campaigns and whoever is running the website from day to day. 

The data cleanup

To be able to fully utilize a solution like a CDP, you need to clean up and organize your data. Specifically, Thompson advised looking for data that isn’t used or isn’t accurate, and eliminating any stray sources of PII that could cause trouble down the line. 

Then you want to design the framework for bringing in data, including offline data. The person who was performing the manual processes previously will be a great resource in this stage. 

“Avoid being tempted to just toss everything in, as it will cost you more,” he said. “And you’ll have a lot of information just sitting there that you may not use.” 

Share your results 

“So you’ve done a lot of work to line up your quick wins, build a strong steering team, you’re actually firing up a few use cases and development,” said Thompson. “Make sure that along the way, you’re really showing these results to your stakeholders and your partners.”

Thompson said Bluestem developed some key reports that could be shared widely to help gain momentum around the implementation of the CDP. 

Don’t stop iterating

Once you’ve gotten some quick wins under your belt, it’s important to continue innovating. 

“It’s easy to rest back on the initial wins that you’ve had,” said Thompson. “But it’s really tricky to think of new ways to win with your CDP.”

Thompson said Bluestem had been successful with personalizing their homepage based on a shopper’s previous behaviors, so they’re shown products they’ve demonstrated their interest in. 

“So we took this building block and let it shape some new use cases,” Thompson said.
“We took those audiences and actually started shaping whole campaigns around them. We made cold weather campaigns for people who were fans of coats, fireplaces, boots, other wintery gear. We also made a toy campaign for anyone who our model said could be toy buyers or who Audience Stream had seen browse or buy them in the past. And we made a cleanup event targeted at customers who are browsing tools.”

“We continue to run these campaigns and shape new ones as we’ve improved bounce rates,” Thompson continued. “We’ve improved our revenue per visits or funnel depth from all of these.”

Thompson also recommended evolving the steering team over time, tackling new channels, and introducing the CDP to different departments within the company. As you do that, he suggested you develop a ticketing system to handle all of the incoming requests. 

“Make sure, especially early on, that you sit down and walk through step by step with the requester,” so you understand what they’re looking to achieve, advises Thompson. “So many people think of the CDP as the magic box because you’ve done some magic things with it. And they’d be surprised at how many options they have and how specific their requests may need to be.” 

Finally, Thompson encouraged marketers to continue to explore the functionality within the CDP, noting that tools he wasn’t even aware of initially — Audience Sizing and Jobs — have become his favorite features.


Building connections between siloed channels, technologies and teams

Integrate CMO Deb Wolf shares her views on the challenges facing today’s marketers and the ways they can be overcome.


Oftentimes, when we visualize the perfect buyer’s journey, we’re looking at it from a marketer’s perspective, imagining what we’re trying to achieve with our initiatives. But, ultimately, the definition of the perfect journey is in the eye of the buyer. The buyer doesn’t really care about your programs, your channels or what technologies you’re using; they’re just trying to get enough information to make a purchase decision.

We recently spoke to Integrate CMO Deb Wolf about the perfect buyer’s journey and the obstacles marketing teams face when trying to deliver the ideal customer experience. In the lightly-edited conversation below, you’ll find Wolf’s specific tips for building connections between siloed channels, technologies and teams, as well as the reasons why this is so important today.   

Q. As a B2B buyer yourself as well as a seasoned marketer, could you share your thoughts on the perfect buyer’s journey? 

A. When we buy marketing technology or services as a customer, we want to understand the mission of the company we’re doing business with, we want to understand their products and their functionality. We want to understand which customers actually use their solutions and what value they gain from them.

There’s a natural progression of the information customers are looking for and it’s not linear. Just like any B2B purchase, we may have 16 to 20 potential people who are involved in the decision process and we all have different needs. My perspective as the approver of the buying decision is different than that of the user of the system. They’re going to look for more details on the functionality, whereas I’m going to look for more value. And what procurement needs, or our privacy people or security people need, are entirely different than what we need as the users. 

So tailoring the experience to whoever it is that’s looking for information about your company is what makes it perfect for that buyer. As marketers, we need to treat our buyers the same way we’d want to be treated.

Where are we failing to make connections?

Q. Can you describe some of the different silos that we see in marketing today that prevent us from delivering the ideal buyer journey?

A. Silos tend to exist across four different areas in marketing: channels, technology, data and your own team. They’re ultimately interwoven, but I think it really starts with the teams and the way in which we work. 

Marketing teams have many specialists and few generalists. Event marketers plan events. Demand generation professionals drive leads. PR folks have been focused on earned media. And few, if any, of those marketers are looking at that entire buyer or account journey. 

We don’t really have a role within the marketing organization whose job it is to build a horizontal buyer’s journey. That has to be done through a collaboration across teams in order to create the experience we’re trying to provide. There are few people who are really thinking about the impact that the entire experience leaves on our potential customers. 

If we think about our demand channels as swim lanes — with each different specialist area in its own lane — it seems like sometimes our teams are in a race against each other. Everyone wants to be the first to have a conversation with the customer, the first to get credit for driving the lead, etc.  

Nine times out of ten what you’ll hear from marketers is they have this desire to delight the customer with the right content in the right channel at the right time. And they can describe what they think of as the ultimate buyer’s journey. But they lose their way when it comes to executing it.

There’s so much technology involved. That’s one of the other challenges. Each of those channels is associated with a different part of the marketing technology stack. Many marketing organizations can have upwards of 50 to 60 different pieces of technology in their stack today. 

When you ask a marketer what’s core to their system, they’ll tell you it’s a marketing automation system. But they’ll also tell you that their comms team is using a different piece of technology to monitor coverage, their event organizers have technology they use for registering people at events and scanning badges on the floor. 

The biggest challenge is all of the data that this technology creates. Data comes from all of those different siloed technology channels, and campaigns and, at the end, a marketing operations person has the goal of trying to make sense of it all. 

When we think about all of these silos, you can sum them up as the way your team operates, the technology from which they’re operating, the channels across which they’re driving, and then, ultimately, the data that it creates.

How did we get here? 

Q.  So how do you think we got here? How did we get into this position where we have all these silos?

A. My theory is that we have a lot of high performing marketers that are just driven to succeed — it’s one of the natural traits that you see across the marketing persona in any of the different areas that we’ve talked about. 

So, typically what happens is you end up having a marketer who thinks: “My job is to do this. I have budget aligned to do this. And, ultimately, I live in a world where I’m heads-down on trying to accomplish that thing, so I can be successful.” 

Part of this disconnect between disciplines stems from marketing teams being decentralized — they could live in business units, they could be regionally based, and now we’re all living remotely. So the discussions that used to happen over a water cooler don’t even happen over a water cooler anymore. I think this starts with our teams, and how we align work and think about getting work done.

Q. That makes a lot of sense. So what are the consequences of this situation for the buyer?

A. When I think about these poor buyers, they’re really focused on one thing and one thing only, and that’s finding the right solution for the problem they’re trying to solve. 

In the past, a traditional B2B sales engagement had buyers working one-to-one with the salesperson and it was very personalized. Salespeople would answer questions and get buyers the kind of information they needed. But now, marketing has filled in a lot of that space. 

But so many times, we are not providing buyers with the kind of information they want, which means that, ultimately, they’re not going to believe in our brand. This is a brand experience from the moment they start looking at your organization. And if you can’t provide them with a great customer experience, I’m not sure that they think you’re going to be a very good vendor for them to deal with. 

A lot of B2B buyers today have become highly consumerized. They expect the B2B buying process to be like the B2C buying process, only it’s not. When you look at B2C and you think about how advanced we’ve gotten in understanding the buying needs of the consumer, then you try to mirror that in the account needs or the B2B buyer needs within a larger decision-making process, I think we’ve failed the buyer altogether. Ultimately, it leaves a bad taste in their mouth and a bad first impression of your brand.

First steps toward building necessary connections

Q. So, do you think marketers want to break down those silos that are causing these disconnects?

A. I do. When you ask marketers what they’re trying to achieve today — and we just did some research in the August timeframe — the one thing they’ll tell you is that they have more data than they know what to do with. They say: “Don’t give us more data; we have data coming out of every part of every piece of technology that we have. How can you help us piece that data together?” A better buying experience, that’s what we’re really trying to do. We’re trying to get as much information to those buyers as we can, so that we provide them with that optimal experience. 

Most marketers are pretty brand savvy, so they want the relationship that a potential customer has with their organization to be very positive. But what’s holding them back are these organizational structures that we talked about, the technology that we talked about, and this mindset that focuses on single channel execution. 

Rather than thinking “I’m driving this campaign or event or webinar” they need to think “I’m part of this customer journey. I need to help the customer achieve what they want to achieve.” And that requires a lot more work cross-functionally to bring technology together in a place where you can actually understand the performance of specific campaigns and activate an omni-channel buyer’s journey. It’s only then that you can provide those buyers with the next best thing to do. When you’re pulling so much data out of so many different types of technology, it’s hard to activate anything and move them along the funnel. 

A new definition of success

Q. So how can marketers begin to break through? What are the steps that they need to take?

A. First, this is about getting your data together and really understanding who you’re even marketing to. If you have incomplete or inaccurate data from any of these different campaigns, and we get a lot of that, that’s the first problem you need to solve. 

I think a lot of marketers are dealing with marketing databases that are somewhere in the range of 40% marketability — meaning that only 40% of the records have all of the information you’d want to know about a buyer in order to be able to market to them. If you don’t have all that, if you have incomplete and inaccurate data, that’s no way to make a first impression. 

Nobody wants to get an email or an invitation to an event that says “Dear D. Wolf.” What about my first name? It’s so impersonal. That kind of information is key as a first step in starting off a great customer journey. 

Q. What excites you about the opportunities for a great buyer’s journey?

A. One of the most important and interesting things about what’s going on in marketing teams today is the future of marketing work. What are the roles that we don’t have today that will be more focused on the entire buying journey? You’ve seen this with things like account based marketing. Five years ago, we didn’t have an account-based marketing manager — that title did not exist in a marketing team. 

And today, you’re starting to see roles that originate maybe in demand gen, but really touch an integrated function across all of the different channels that we’re using today. That’s one of the super exciting things I’m seeing. What is it going to mean for the future of our teams and the future of people that are just coming into marketing today? 

Perhaps it won’t have occurred to them to think about marketing more from a specialist standpoint, and they’ll be asking questions like: 

  • How do our top-of-funnel demand marketers expand their efforts into mid funnel? 
  • How do they use all their channels to digitally nurture? 
  • How do they quit thinking about email as the one way to get in front of their prospects and move them along the funnel? 
  • How do we use things like intent data and the buying signals that buyers are giving us? Today, we score these leads based on who the person was and what they did, but this is just two dimensional scoring based on what the marketer thinks. 
  • How can we start using the signals the buyer is giving us to actually point us toward how these campaigns should be run — to infuse more intelligence into what we’re doing from a marketing standpoint? 

Those are all super exciting because we’re going to have to conquer and figure out and understand and experiment with our marketing and see where we go. 

Q.  It seems like one of the challenges might be the psychology of that very driven specialty marketing person who really wants success and wants all the budget to come to their area. 

A. Today, we KPI our employees based on a lot of output, like “how many events did you complete? How much press did you get? How many demand campaigns did you run?” But what we’re really more interested in is the outcome. 

You can’t look at the outcome in one single channel; the outcome is a revenue-based outcome for the organization. And so you have to look at all of it together, and it shouldn’t be done retrospectively, as it is today. Today, you’ll have a marketing ops person who takes all these different channels and pulls them together to get some picture of what actually happened in this account that closed. 

Instead, we should be looking at marketing success metrics like how many accounts we got to and what the outcomes were across those accounts. How many new buyers did we bring in? How did we expand business? These are new outcomes that you can’t answer just by looking at channels or technology. You have to change the mindset of the marketer. 


How Autodesk overhauled its marketing to align for better performance management

Speaking at the MarTech virtual conference, Zoe Marquardt described how the company overcame stubborn challenges.


Autodesk faced a challenge with which many marketers are familiar. Teams that were expert in their various disciplines — channel marketing, regional marketing, industry marketing, etc. — had diverged into silos in the absence of a strong marketing performance management program. There was no unified view of how marketing spend affected program goals across all of these initiatives. 

Overcoming the pain point trifecta

In short, the company faced three major challenges: 

  1. Various marketing teams were operating in silos.
  2. Tracking of marketing spend was poor.
  3. The marketing team was unable to track spend toward performance. 

In a recent talk at the MarTech virtual conference, Autodesk’s Zoe Marquardt, the company’s consultant on marketing performance management (MPM), explained how the company managed to tackle these three interrelated problems by instituting an MPM program. 

“Without breaking down those silos, you can’t get to a point where you’re tracking spend in a unified way across campaigns,” said Marquardt. “You can’t even develop unified campaigns across the company.”  

Alignment and unity to drive success with MPM

As it undertook this initiative, Autodesk had four goals: 

  1. To get more visibility into marketing tactics and their performance. 
  2. To enable a quarterly planning process with a steering-level review to ensure alignment.
  3. To define expectations for next quarter’s execution and spend.
  4. To enable the company to validate if expectations are met after executing.

“We have achieved [these objectives] to some extent,” said Marquardt. “Of course, we’re always looking to improve and build upon these goals, to get more visibility into marketing tactics and their performance.”

To help, Autodesk adopted Allocadia’s MPM software and, by getting all marketing groups across the company to utilize the platform, has been able to achieve an overview of all of the company’s campaigns and tactics.


Marquardt says the company ties tactics with campaigns and performance using the software, and can also tailor dashboards to address the needs of various constituents within the company.

Autodesk now conducts quarterly planning meetings to ensure that all spending aligns with the company’s overarching goals.

“It’s one thing to be able to track your tactics and their performance, it’s another thing to be able to then, return to those individuals who are setting strategy at a higher level to say, ‘you know, does this make sense? This appears to be doing well, but is it really in line with our business objectives?'” she said. “So we do that on a quarterly basis with a steering team.”

At each quarterly meeting, marketers come in with already-defined expectations for the next quarter’s execution and spend, which makes it easier for the steering team to come in and validate those decisions.

Additionally, the quarterly planning session is a time to look over the previous quarter and see whether the campaigns and tactics actually performed as expected.

“Exceeding the numbers you expected for a quarter is one thing, but it’s another to really understand why you’re doing well,” said Marquardt. “And, of course, to do that you have to understand your audiences, your campaigns, and everything that’s in that intersection when you’re actually executing marketing strategy.”

Measuring what was intangible


At the quarterly meetings, and on a regular basis, marketers go into Allocadia and view the marketing performance summary dashboard.

“It’s kind of that quintessential image of MPM at Autodesk at this point,” Marquardt said. “Because it’s this entry point right into the MPM program and marketing execution that would have been impossible a year ago.”

Marquardt notes that the left side of the dashboard shows planned spend rolled up by campaign. On the right, users can click into the Demand Generation Performance charts to view the campaigns and spend that drove that performance.

Each marketer enters their quarterly plan for a campaign and how much will be spent; the information on that tactic then carries through to execution, and results and actuals are pumped back in to feed the dashboard summary.

“So we really do have this holistic view,” said Marquardt. “Holistic in the sense that it covers marketing at Autodesk and holistic in the sense that it covers that beginning to end process when a marketer is planning, and then seeing the results and then adapting performance from there, of course.”

Marquardt says the dashboard, and the multiple other dashboards related to it, is a framework that the company established.

Allocadia is a central component and, within that interface, the marketers create tactics — a paid media execution, a webcast, or any other sort of digital marketing execution — and then, depending on what kind of activity it is, connect it to external software.



How to show Lighthouse Scores in Google Sheets with a custom function

Learn how to use machine learning to streamline your reporting workflows right within Google Sheets.


Automation and machine learning have tremendous potential to help all of us in marketing. But at the moment a lot of these tools are inaccessible to people who can’t code or who can code a bit but aren’t really that comfortable with it.

What often happens is that there ends up being one or two people in the office who are comfortable with writing and editing code and then these people produce scripts and notebooks that everyone else runs. The workflow looks a bit like this:

I will show you a simple way to streamline this workflow to remove the steps where people need to run a script and format the output. Instead they can run the automation directly from within Google Sheets.

The example I will show you is for a Sheets custom function that returns the Lighthouse score for a URL like in this gif:

The method I will show you isn’t the only way of doing this, but it does illustrate a much more general technique that can be used for many things, including machine learning algorithms.

There are two parts:

  1. A Google Cloud Run application that will do the complicated stuff (in this case run a Lighthouse test) and that will respond to HTTP requests.
  2. An Appscript custom function that will make requests to the API you created in step 1 and return the results into the Google Sheet.

Cloud run applications

Cloud Run is a Google service that takes a docker image that you provide and makes it available over HTTP. You only pay when an HTTP request is made, so for a service like this that isn’t being used 24/7 it is very cheap. The actual cost will depend on how much you use it, but I would estimate less than $1 per month to run thousands of tests.

The first thing we need to do is make a Docker image that will perform the Lighthouse analysis when we make an HTTP request to it. Luckily for us there is some documentation showing how to run a Lighthouse audit programmatically on GitHub. The linked code saves the analysis to a file rather than returning the response over HTTP, but this is easy to fix by wrapping the whole thing in an Express app like this:

const express = require('express');
const app = express();
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

app.get('/', async (req, res) => {
    // Check that the url query parameter exists
    if (req.query && req.query.url) {
        // Decode the url
        const url = decodeURIComponent(req.query.url);
        // Launch headless Chrome for Lighthouse to drive
        const chrome = await chromeLauncher.launch({chromeFlags: ['--headless', '--no-sandbox', '--disable-gpu']});
        const options = {logLevel: 'info', output: 'html', port: chrome.port};
        const runnerResult = await lighthouse(url, options);

        await chrome.kill();
        // Return the full Lighthouse result object (lhr) as JSON
        res.json(runnerResult.lhr);
    } else {
        // Without this branch a request with no url parameter would hang without a response
        res.status(400).send('Missing url query parameter');
    }
});

const port = process.env.PORT || 8080;
app.listen(port, () => {
  console.log(`Listening on port ${port}`);
});

Save this code as index.js.

Then you will also need a file called package.json which describes how to install the above application and a Dockerfile so we can wrap everything up in Docker. All the code files are available on Github.

package.json
{
    "name": "lighthouse-sheets",
    "description": "Backend API for putting Lighthouse scores in Google sheets",
    "version": "1.0.0",
    "author": "Richard Fergie",
    "license": "MIT",
    "main": "index.js",
    "scripts": {
        "start": "node index.js"
    },
    "dependencies": {
        "express": "^4.17.1",
        "lighthouse": "^6.3"
    },
    "devDependencies": {}
}
Dockerfile
# Use the official lightweight Node.js 12 image.
# https://hub.docker.com/_/node
FROM node:12-slim

# Our container needs to have chrome installed to
# run the lighthouse tests
RUN apt-get update && apt-get install -y \
  apt-transport-https \
  ca-certificates \
  curl \
  gnupg \
  --no-install-recommends \
  && curl -sSL https://dl.google.com/linux/linux_signing_key.pub | apt-key add - \
  && echo "deb https://dl.google.com/linux/chrome/deb/ stable main" > /etc/apt/sources.list.d/google-chrome.list \
  && apt-get update && apt-get install -y \
  google-chrome-stable \
  fontconfig \
  fonts-ipafont-gothic \
  fonts-wqy-zenhei \
  fonts-thai-tlwg \
  fonts-kacst \
  fonts-symbola \
  fonts-noto \
  fonts-freefont-ttf \
  --no-install-recommends \
  && apt-get purge --auto-remove -y curl gnupg \
  && rm -rf /var/lib/apt/lists/*


# Create and change to the app directory.
WORKDIR /usr/src/app

# Copy application dependency manifests to the container image.
# A wildcard is used to ensure copying both package.json AND package-lock.json (when available).
# Copying this first prevents re-running npm install on every code change.
COPY package*.json ./

# Install production dependencies.
# If you add a package-lock.json, speed your build by switching to 'npm ci'.
# RUN npm ci --only=production
RUN npm install --only=production

# Copy local code to the container image.
COPY . ./

# Run the web service on container startup.
CMD [ "node", "--unhandled-rejections=strict","index.js" ]

Build the docker image and then you can test things locally on your own computer like this:
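
The build command itself isn’t shown above; assuming you run it from the directory containing the Dockerfile and use the same image name as the commands below, it would be something like:

docker build -t lighthouse-sheets .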

First start the image:

docker run -p 8080:8080 lighthouse-sheets

And then test to see if it works:

curl -v "localhost:8080?url=https%3A%2F%2Fwww.example.com"

Or visit localhost:8080?url=https%3A%2F%2Fwww.example.com in your browser. You should see a lot of JSON.

The next step is to push your image to the Google Container registry. For me, this is a simple command:

docker push gcr.io/MY_PROJECT_ID/lighthouse-sheets

But you might have to set up the docker authentication first before you can do this. An alternative method is to use Google Cloud Build to make the image; this might work better for you if you can’t get the authentication working.
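
If you built the image locally under the short name used earlier, a typical sequence (assuming the gcloud CLI is installed and MY_PROJECT_ID is replaced with your own project ID) looks something like this:

gcloud auth configure-docker
docker tag lighthouse-sheets gcr.io/MY_PROJECT_ID/lighthouse-sheets
docker push gcr.io/MY_PROJECT_ID/lighthouse-sheets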

Next you need to create a Cloud Run service with this docker image.

Open Cloud Run and click “Create service”

Name and adjust settings. You must give your service a name and configure a few other settings:

It is best to pick a region that is close to where most of the audience for your sites live. Checking the site speed for a UK site from Tokyo won’t give you the same results as your audience gets.

In order for you to call this service from Google Sheets it must allow unauthenticated invocations. If you’re worried about locking down and securing the service to prevent other people from using it, you will have to do this by (for example) checking for an API secret in the HTTP request or something like that.

Next you must select the container you made earlier. You can type in the name if you remember it or click “Select” and choose it from the menu.

Then click “Show Advanced Settings” because there is further configuration to do.

You need to increase the memory allocation because Lighthouse tests need more than 256Mb to run. I have chosen 1GiB here but you might need the maximum allowance of 2GiB for some sites.

I have found that reducing the concurrency to 1 improves the reliability of the service. This means Google will automatically start a new container for each HTTP request. The downside is that this costs slightly more money.

Click “Create” and your Cloud Run service will be ready shortly.
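
If you prefer the command line, roughly the same service can be created with the gcloud CLI. This is only a sketch: the service name, image path and region below are placeholders that should match your own project and settings.

gcloud run deploy lighthouse-sheets \
  --image gcr.io/MY_PROJECT_ID/lighthouse-sheets \
  --region europe-west2 \
  --memory 1Gi \
  --concurrency 1 \
  --allow-unauthenticated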

You can give it a quick test using the URL. For example:

curl -v "https://lighthouse-sheets-public-v4e5t2rofa-nw.a.run.app?url=https%3A%2F%2Fwww.example.com"

Or visit https://lighthouse-sheets-public-v4e5t2rofa-nw.a.run.app?url=https%3A%2F%2Fwww.example.com in your browser.

The next step is to write some Appscript so you can use your new API from within Google Sheets.

Open a new Google Sheet and then open up the Appscript editor.

This will open a new tab where you can code your Google Sheets custom function.

The key idea here is to use the Appscript UrlFetchApp function to perform the HTTP request to your API. Some basic code to do this looks like this:

function LIGHTHOUSE(url) {
  const BASE_URL = "https://lighthouse-sheets-public-v4e5t2rofa-nw.a.run.app"
  var request_url = BASE_URL+"?url="+encodeURIComponent(url)
  var response = UrlFetchApp.fetch(request_url)
  var result = JSON.parse(response.getContentText())
  return(result.categories.performance.score * 100)
}

The last line returns the overall performance score into the sheet. You could edit it to return something else. For example to get the SEO score use result.categories.seo.score instead.

Or you can return multiple columns of results by returning a list like this:

[result.categories.performance.score, result.categories.seo.score]

Save the file and then you will have a custom function available in your Google Sheet called LIGHTHOUSE.
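
To use it, call the function in a cell just like a built-in formula, passing either a literal URL or a reference to a cell that contains one:

=LIGHTHOUSE("https://www.example.com")
=LIGHTHOUSE(A2)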

The easiest way to get started with this is to copy my example Google Sheet and then update the code yourself to point at your own API and to return the Lighthouse results you are most interested in.

Enhance your spreadsheet know-how

The great thing about this method is that it can work for anything that can be wrapped in a Docker container and return a result within 30 seconds. Unfortunately Google Sheets custom functions have a timeout so you won’t have long enough to train some massive deep learning algorithm, but that still leaves a lot that you can do.

I use a very similar process for my Google Sheets addon Forecast Forge, but instead of returning a Lighthouse score it returns a machine learning powered forecast for whatever numbers you put into it.

The possibilities for this kind of thing are really exciting because in Search Marketing we have a lot of people who are very good with spreadsheets. I want to see what they can do when they can use all their spreadsheet knowledge and enhance it with machine learning.

This story first appeared on Search Engine Land.

https://searchengineland.com/how-to-show-lighthouse-scores-in-google-sheets-with-a-custom-function-343464


How to make your data sing

Stop reporting absolute numbers and put your data into context with ratios to engage your stakeholders with the smaller, but important, data points.


It is amazing what a horrible job many digital marketers do when reporting their work to clients. This includes both internal and external clients. Just think about how many marketing presentations and reports you’ve seen that simply contain screenshots from Google Analytics, Adobe Analytics, AdWords, Google Search Console, or reports from a backend ecommerce system. This isn’t the way to influence people with your data.

The biggest issue is that most marketers are not analytics people. Many marketers do not know how to collect all of the necessary data or how to leverage that data and, to a lesser degree, do not know how to present it in a meaningful way. Typically, this is the job of a data analyst. The same way purchasing a pound of nails, a hammer and a saw doesn’t make you a carpenter, gaining access to your analytics reporting tool does not make you a data analyst. This is why many reports contain those convoluted screenshots and present data out of context, contributing little to no meaning. 

Data out of context

Many reports merely state the facts (the data) with a number and no context. Data out of context is just data. For example, simply stating that AdWords generated 5,000 sessions to a website last month is meaningless without context. The number 5,000 is neither a good nor a bad data point without a reference point or a cost factor. It’s not until you add in other factors (open the box) that you can demonstrate whether or not your efforts were a success. If the previous month’s AdWords campaign only drove 1,000 sessions, then yes, without other data, 5,000 sessions looks good. But what if the cost to drive those additional 4,000 sessions was tenfold the previous month’s spend? What if the previous month, AdWords drove 5,000 sessions but at double the spend?

It is only by adding in the additional information in a meaningful way that marketers can turn their reporting from a subjective presentation into an objective one. In order to do this, stop reporting absolute numbers and put your data into context with ratios. For example, when assessing cost per session, toss in a third factor (goal conversions, revenue, etc.) and create something similar to “Cost per Session : Revenue.” This puts the data into context. For example, if every session generated costs $1 : $100 (cost per session : revenue) vs. $2.25 : $100, the effectiveness of a marketing spend becomes self-evident. In this example, it is clear the first result is superior to the second. By normalizing the denominator (creating the same denominator) the success or failure of an effort is easily demonstrated. 
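
To make the normalization concrete, here is a minimal sketch in JavaScript, using purely hypothetical figures, of expressing spend per $100 of revenue so two efforts can be compared on the same denominator:

// Express cost as dollars spent per $100 of revenue generated
function costPerHundredRevenue(spend, revenue) {
  return (spend / revenue) * 100;
}

// Hypothetical example: two campaigns generating the same revenue at different costs
console.log(costPerHundredRevenue(500, 50000));  // 1    -> "$1 : $100"
console.log(costPerHundredRevenue(1125, 50000)); // 2.25 -> "$2.25 : $100"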

Data is boring

Yes, presenting data is boring. Simply looking at a mega table of a collection of data will cause many to lose interest and tune out any message you might be trying to present. The best way to avoid this is to make your data sing!

Make your data sing

Just like in the marketing world, the easiest way to grab someone’s attention and make your message sing is with imagery. Take all that great data in your mega table, and turn it into an easy to understand graph, or when necessary, simplified data tables. Even better, (if you can) turn it into interactive graphs. During your presentation, don’t be afraid to interact with your data.  With some guidance, your audience can dive into the data they are most interested in.

Learn to use data visualization tools like Data Studio, Tableau, DOMO, Power BI and others. Leveraging these tools allows you to take boring data and not only give it meaning but to make the data sing, which will turn you into a data hero.

Interacting with your data

Back at the end of July 2019, my firm acquired an electric vehicle. We wanted to know if the expense was worth it. Did the cost savings of using electricity over gasoline justify the difference in the ownership cost of the vehicle (lease payments plus or minus insurance and maintenance costs)?

Below is a typical data type report with all the boring detailed data. This is a mega table of data and only those truly interested in the details will find it interesting. If presented with this table most would likely only look at the right-hand column to see the total monthly savings. If presented with just this data, many will get bored, and will look up and start counting the holes in the ceiling tiles instead of paying attention.

The following graphs demonstrate many of the ways to make this data sing, by putting all of the data into context through interactive graphics.

The above graph (page 1 of the report) details the cost of operating the electric vehicle. The first question we were always curious about was how much it was costing us to operate per 100 km. By collecting data on how much electricity was used to charge the car, how many kilometers we drove in a given month and the cost for that electricity, we are able to calculate the operating cost. In the graph you can easily see the fluctuation in operating costs, with costs going up in winter months (cost of operating the heater in the car) and again in June & July (cost of running the AC). You can also see the impact of increases in electricity prices.

To truly evaluate the big question “Was acquiring an electric vehicle worth it?” we’d need to estimate how much gasoline would have been consumed by driving the same distance against the average cost for gas during the same months. On page 2 of the report the data is now starting to sing as the difference in the savings of electrical over gas becomes clear. The chart becomes interactive and allows the user to hover over any column to reveal the data details.

To make the data truly sing, we’d need to not just compare the operating costs, but the costs of ownership. Do the savings in the operating costs justify the price difference between the vehicles? We know that the difference in lease costs, insurance and annual maintenance is in the range of $85-$90/month.

The above graph (page 3 of the report) demonstrates the impact of plummeting gas prices and the reduced driving done during April 2020 due to the COVID-19 shutdown. In April 2020 a mere monthly savings of approximately $41 was achieved. Therefore, there were no savings in owning a more expensive electric vehicle over an equivalent gas-powered vehicle (the difference in lease costs, insurance, etc. is in the range of $85-90/month). While the data might not have been singing, it definitely was screaming out when we saw it. 

Check out the entire report for yourself. It is accessible here so you can view all the pages/charts. The report is interactive, allowing you to hover over given months to see data details or even change the reporting date range.

By embracing not only data visualization but the visualization of meaningful data, we as marketers can raise the bar and increase engagement with our audience. Think of the four pages of this report: which page speaks most to you? Which way of presenting the data makes it sing for you? Odds are it was not the first table with all the detailed data.


Here’s an alternative to cookies for user tracking

Instead of having your analytics toolset read a cookie, pass a unique identifier associated with the user ID. Learn how to do it and keep it privacy compliant.


For over 20 years, website analytics has leveraged persistent cookies to track users. This benign piece of code was a massive improvement over using a user’s IP address or even the combination of IP and browser. Since it was first introduced, however, the use of these cookies has become the focus of privacy legislation and paranoia. So what alternative is there?

If your website or mobile application requires the creation of user accounts and logins, it’s time to plan to transition away from cookie-based tracking to user ID tracking. In simple terms, instead of having your analytics toolset read a cookie, you pass a unique identifier associated with the user ID and then track the user via this identifier. Typically the identifier is the login ID.

Preparing for advanced tracking

Step 1

Ensure that the user ID you’ve deployed doesn’t contain Personally Identifiable Information (PII). Too often, sites require users to log in with their personal email address or even their account number, both of which are PII. If this is the case with your organization, the trick is to assign a random, unique client identifier to all existing accounts and to any future accounts as they are created.
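A minimal sketch of what this step might look like in code, using TypeScript on Node and its built-in crypto module; the Account shape and field names are hypothetical.

```typescript
import { randomUUID } from "crypto";

// Hypothetical account record: the login may be an email address (PII),
// so analytics uses a separate random identifier instead.
interface Account {
  loginEmail: string;
  analyticsId?: string;
}

// Backfill existing accounts and call this on account creation, so every
// account carries a random identifier that exposes no PII to analytics.
export function ensureAnalyticsId(account: Account): string {
  if (!account.analyticsId) {
    account.analyticsId = randomUUID();
  }
  return account.analyticsId;
}
```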

Step 2

Have your developers start pushing the user ID to the data layer. This way, the variable will be there waiting for your analytics software to read once you’re ready to implement the new tracking method. Check your analytics software's documentation for the expected variable name, as it varies from one package to another.
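A minimal sketch of this step for a site using a Google Tag Manager-style data layer; the variable name userId is an assumption, so confirm the expected name with your analytics vendor.

```typescript
// Make the dataLayer array visible to TypeScript.
declare global {
  interface Window { dataLayer?: Record<string, unknown>[]; }
}

// Push the non-PII identifier once the visitor is authenticated; the
// analytics tag can then read it when the new tracking method goes live.
export function pushUserId(analyticsId: string | null): void {
  if (!analyticsId) return; // nothing to push for anonymous visitors
  const layer = (window.dataLayer = window.dataLayer ?? []);
  layer.push({ userId: analyticsId });
}
```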

Step 3

Create a new view/workspace within your analytics software and configure it to track users by their user ID. Most analytics packages will still set a temporary cookie to track behavior prior to login and then connect the sessions, so you can see what a user does on your site before logging in, as well as what visitors who never log in do.

Benefits of tracking users by user ID

Improved accuracy

The use of cookies is flawed in many ways. If users jump between devices (from desktop to mobile to tablet, or from an office computer to a home computer), you can’t tell it was the same user. This inflates unique user counts.

What if a user clears their cookies (perhaps they’re utilizing antivirus software that purges all cookies every time the browser is closed)? Once again this leads to inflated user count data.

By tracking a user via their user ID, you’ll obtain a more accurate count of unique users on your site.
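A minimal sketch of why the counts differ, using hypothetical hit records: several cookie IDs collapse to one known user once the user ID is available.

```typescript
interface Hit {
  cookieId: string;   // device/browser-scoped identifier
  userId?: string;    // present only for logged-in sessions
}

function uniqueCounts(hits: Hit[]): { byCookie: number; byUserId: number } {
  const byCookie = new Set(hits.map((h) => h.cookieId)).size;
  const byUserId = new Set(hits.filter((h) => h.userId).map((h) => h.userId)).size;
  return { byCookie, byUserId };
}

// One person on three devices: three "users" by cookie, one by user ID.
console.log(uniqueCounts([
  { cookieId: "desktop-1", userId: "u42" },
  { cookieId: "phone-1", userId: "u42" },
  { cookieId: "tablet-1", userId: "u42" },
]));
```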

Cross Device Tracking

This is perhaps one of the greatest benefits of tracking users by their user ID. You can now see how users interact with your site and/or mobile app across devices: how many use a combination of devices, and whether one type of device is typically used to add items to a shopping cart while the order is processed on another.

Greater Analytics Insight

Armed with enhanced analytics data, you can harvest new and potentially powerful insights. With this knowledge, you can better direct internal resources toward enhancing the user experience and optimizing the user flow for greater profits.

Real life examples

The following examples demonstrate the power of tracking users by their user ID. 

Overview – Device Overlap

The following image shows what percentage of accounts use which type of device and the percentage that use a combination of devices. For example, while 66.6% use only a desktop, 15.8% use a combination of Mobile and Desktop.

User Behavior – Device Flow

Reviewing the device flow leading up to a transaction can provide some of the greatest insights from this enhanced analytics tracking methodology.

While it might not be surprising that the two most common device paths (by number of users) were desktop only and mobile only, what surprised both me and the client was number three: the path of desktop -> mobile -> desktop is followed by only about 3% of users, yet it accounts for approximately 8% of all transactions and over 9% of all revenue generated.

The minimal overall use of tablets was also a bit surprising. Of course the mix of devices does vary from client to client.

Assisted conversions

Dropping cookies in favor of user IDs also significantly improves the quality of assisted-conversion data. For example, how many people read an email on a mobile device (easily tracked when opened and attributed to a user ID), click through to the site, browse and review the items being promoted (maybe even adding them to the cart), think about it for a while, and then log in later on a desktop to complete the transaction?

From the above report, one can objectively assign a more accurate value to SEO efforts by examining the role organic search traffic played in generating sales. In this case, organic search was the immediate source of only 1.3% of total revenue, but as an assist in the sales cycle it played a role in over 10.4% of generated revenue.
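A minimal sketch of that comparison, assuming "played a role" means last-click plus assisted revenue; the dollar figures are invented purely to reproduce the 1.3% and 10.4% shares from the example.

```typescript
interface ChannelRevenue {
  lastClick: number;  // revenue where the channel was the final source
  assisted: number;   // revenue where the channel appeared earlier in the path
}

function revenueShares(channel: ChannelRevenue, totalRevenue: number) {
  return {
    lastClickShare: channel.lastClick / totalRevenue,
    anyRoleShare: (channel.lastClick + channel.assisted) / totalRevenue,
  };
}

// e.g. $13k last-click plus $91k assisted on $1M total: 1.3% vs 10.4%
console.log(revenueShares({ lastClick: 13_000, assisted: 91_000 }, 1_000_000));
```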

Enhanced user insights

In this example, the client allows its customers to have multiple logins per account, so a user ID essentially represents a customer/client rather than a single user. The client operates in the B2B world, where multiple people within each customer's organization may require unique logins and rights (who can order, who can only view product details, who can view or add to the cart but not place an order, and so on). By tracking by user ID and also recording the unique login ID within the analytics, these additional insights can be obtained.


The above report not only breaks down revenue by division, but demonstrates how users in different divisions use the site differently. In Division 1, there is almost a 1:1 relationship between user IDs and login IDs. Yet in Division 3 the ratio is over 4:1, meaning that for every customer there are, on average, more than four logins in use.

How can they leverage this data for more effective marketing? By understanding that divisions differ, they can craft email marketing that targets multi-login customers differently from single-login customers.

A further dive into the data could also distinguish the login IDs that only recommend products (view only) from those that make specific product requests (add to the shopping cart but never place the order), those that only process orders and those that do it all. Each needs to be marketed to with different messaging to optimize the effectiveness of the marketing effort, and it's through detailed analytics that this audience definition can be obtained.

Is tracking by user ID right for me?

Deciding to change how you track your users is a difficult choice. First, does your site or mobile app require users to log in reasonably early in their journey? This approach is ideal for e-commerce sites and sites where the vast majority of user interaction takes place after the user logs into the site or application.

If you’re running a general website with the goal to merely share information and generate “contact us” type leads, the answer to making this switch is no.

If you have a combination of a general information site plus a registered-user section, then yes, you might want to consider making this change, perhaps just for the registered-user section.

If you do make this change, don’t stop running your other analytics views/workspaces that use cookies. By operating two different views, you’ll eventually be able to reconcile the differences between them, and it makes it easier to explain to those you report to why you’ll be showing a dramatic drop in the number of users. Of course, when you first make the switch, all users will be first-time users, so expect a major spike in new visitor traffic.

If you decide to make this change, don’t forget to review the impact of the change with your legal department. They will tell you if you need to update your privacy policy.

The post Here’s an alternative to cookies for user tracking appeared first on Marketing Land.

The importance of valuing latent orders to successful Amazon Sponsored Products management

Advertisers must consider the lag time between ad click and conversion as well as historic performance around key days to estimate shift.

The post The importance of valuing latent orders to successful Amazon Sponsored Products management appeared first on Marketing Land.

Sponsored Products is the most widely adopted Amazon search ad format, and typically accounts for more than six times as much ad spend as Sponsored Brands ads for the average Tinuiti (my employer) advertiser. As such, it’s incredibly important for advertisers to understand the full value that these ads drive.

Part of this is understanding the click-to-order period between when a user clicks on an ad and when that user ends up converting. Given how Amazon attributes orders and sales, it’s crucial that advertisers have an idea of how quickly users convert in order to value traffic effectively in real time.

Amazon attributes conversions and sales to the date of the last ad click

When assessing performance reports for Sponsored Products, advertisers should know that the orders and sales attributed to a particular day are those tied to an ad click that happened on that day, not simply the orders and sales that occurred on that day.

Advertisers viewing Sponsored Products conversions and sales in the UI are limited to seeing only the orders and sales attributed within the seven days following an ad click. However, marketers pulling performance through the API have greater flexibility and can choose conversion windows from one to thirty days, which is how the data included in this post was assembled.

In the case of Sponsored Display and Sponsored Brands campaigns, performance can only be viewed using a 14-day conversion window, regardless of whether it is being viewed through the UI or through an API connection.

For marketers who wish to use a thirty-day conversion window in measuring Sponsored Products sales and conversions attributed to advertising, this means it takes thirty days after the day in question to get a full picture of all conversions. Looking across Tinuiti advertisers in Q2 2020, the first 24 hours after an ad click accounted for 77% of all conversions and 78% of all sales that occurred within 30 days of the click.

Unsurprisingly, the share of same-SKU conversions that happen in the first 24 hours is even higher, as shoppers are more likely to consider other products the further removed they become from an ad click.

For the average Amazon advertiser, we find that more than 20% of the value that might be attributed to ads happens more than one day after the ad click, meaning advertisers must bake the expected value of latent orders and sales into evaluating the most recent campaign performance. The math of what that latent value looks like varies from advertiser to advertiser.
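One minimal sketch of how that baking-in might work is to gross up same-day results by the one-day shares cited above (77% of orders, 78% of sales); these defaults are assumptions drawn from the aggregate figures and should be replaced with your own advertiser-level averages.

```typescript
interface DayPerformance { orders: number; sales: number; }

// Estimate what a day's performance will look like once latent orders
// arrive, given the share of 30-day value typically seen within 24 hours.
function estimateFullWindow(
  sameDay: DayPerformance,
  oneDayOrderShare = 0.77,
  oneDaySalesShare = 0.78,
): DayPerformance {
  return {
    orders: sameDay.orders / oneDayOrderShare,
    sales: sameDay.sales / oneDaySalesShare,
  };
}

// 100 orders and $5,000 reported on day one imply roughly 130 orders and
// about $6,400 once the full attribution window has played out.
console.log(estimateFullWindow({ orders: 100, sales: 5000 }));
```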

Factors like price impact the length of consideration cycles

The time it takes consumers to consider a purchase is naturally tied to the type of product being considered, and price is a huge factor. Breaking out the share of 30-day conversions that occur more than one day after the click by the advertiser's average order value (AOV), the share rises as AOV rises. In Q2 2020, advertisers with an AOV over $50 saw 25% of orders occur more than 24 hours after the ad click, whereas advertisers with an AOV under $50 saw 22%.

Put simply, consumers generally take longer to consider pricier products before purchasing than they do cheaper ones. Other factors can also affect the length of the average click-to-order cycle for a particular advertiser.

In addition to latent order value varying by advertiser, there can also be meaningful swings in what latent order value looks like during seasonal shifts in consumer behavior, such as during the winter holiday season and around Prime Day.

Key shopping days speed up conversion process

The chart below depicts the daily share of all conversions attributed within seven days of an ad click that occurred during the first 24 hours. As you can see, one-day order share rose significantly on Black Friday and Cyber Monday as users launched into holiday shopping (and dropped in the days leading into Black Friday).

After these key days, one-day share returned to normal levels, then rose in the weeks leading up to Christmas, peaking on December 21 at a level surpassing even what was observed on Cyber Monday. December 21 was the last day many shoppers could feel confident an order would arrive in time for the Christmas holiday, and it showed in how quickly shoppers moved from click to purchase for many advertisers.

Of course, Amazon created its own July version of Cyber Monday in the form of Prime Day, and we see a similar trend around one-day conversion share around the summer event as well.

This year’s Prime Day has been postponed, but reports indicate that the new event might take place in October.

As we head into Q4, advertisers should look at how the click-to-order window shifts throughout key times of the year in order to identify periods in which latent order value might meaningfully differ from the average.

Conclusion

As with any platform, advertisers are often interested in recent Amazon Ads performance to understand how profitable specific days are. This is certainly important in identifying shifts and situations in which budgets should be rearranged or optimization efforts undertaken, and that's even more true now given how quickly performance and life are changing for many advertisers, as well as for the population at large.

However, to do so effectively, advertisers must take into consideration the lag that often occurs between ad click and conversion. Even on a platform widely regarded as the final stop for shoppers, such as Amazon, more than 20% of 30-day conversions occur after the first 24 hours following the click, and this share can be much higher for advertisers selling products with longer consideration cycles.

Further, advertisers should look to historic performance around key days like Cyber Monday and Prime Day to understand how these estimates might shift. Depending on product category, other holidays like Valentine’s Day or Mother’s Day might also cause shifts in latent order value.

Not all advertisers necessarily want to value all orders attributed to an ad over a month-long (or even week-long) attribution window equally, and particularly for products with very quick purchase cycles, it might make sense to use a shorter window. That said, many advertisers do find incremental value from orders that occur days or weeks removed from ad clicks, and putting thought into how these sales should be valued will help ensure your Amazon program is being optimized using the most meaningful performance metrics.

The post The importance of valuing latent orders to successful Amazon Sponsored Products management appeared first on Marketing Land.

Analytics Audit 101: Identify issues and correct them to ensure the integrity of your data

Marketing and analytics professionals need to work together to not only increase the accuracy of our data, but to educate people about how to leverage it.

The post Analytics Audit 101: Identify issues and correct them to ensure the integrity of your data appeared first on Marketing Land.

One thing online businesses should use as a cornerstone of their decision-making process is their digital analytics data (data from a variety of sources: web analytics, search console, paid search, paid social, social media, etc.). Yet, according to an MIT Sloan Management Review survey, only 15% of more than 2,400 business people surveyed trust their data. While no analytics method can guarantee 100% accuracy, auditing your data will ensure it is as accurate as possible. That gives you the confidence not only to trust your data but to leverage it in making objective, rather than subjective, business decisions. It is this lack of trust that explains why a mere 43% (in the same survey) say they can frequently leverage the data they need to make decisions. Such low confidence in one's data is a recipe for failure.

As marketing and analytics professionals, we need to work together not only to increase the accuracy of our data, but to educate people about the data and how to leverage it. The first step in this process is auditing your analytics configurations to identify issues and correct them, ensuring the integrity of the data.

The analytics audit process

Step 1: Acknowledge analytics data isn’t perfect

When you start your audit, gather together everyone who has a stake in the outcome and find out why they don't trust the data; most likely they have good reasons. Don't claim that your goal is to make the data 100% accurate, because that is impossible. Instead, use the opportunity to explain at a high level that analytics captures a sampling of user activity and that, for various technical reasons, no system will be perfect, which is why they are seeing differences between things like their AdWords account and their web analytics data. Use polling as an example: pollsters take a sample of 1,000-2,000 people out of a US population of more than 300 million and then state their data is accurate within a few percentage points four times out of five. In other words, they are way off 20% of the time. Yet businesses, politicians and the general public respond to and trust this data. Even at the low end of accuracy, your web analytics data is likely still capturing an 80% sample, far larger than what pollsters work with, yet it is trusted less. Let the stakeholders know that, as a result of the audit and the fixes that follow, you could improve data capture accuracy to 90% or even 95%, and that is data you can trust 100%.

Step 2: Identify what needs to be measured

One of the biggest issues with analytics data is that the analytics software isn't configured to collect only the correct data; it becomes a general catch-all. While on the surface it sounds ideal to just capture everything you can, when you cast a huge net you also capture a lot of garbage. The best way to ensure the right data is being captured and reported on is to review the current marketing and measurement plans. Sadly, too few organizations have these, so during your meeting make sure to ask the stakeholders what their primary items to be measured are.

Identify and gather all the key performance indicators (KPIs) currently being reported on; you'll need this list before you start the audit. Verify that each KPI is still valuable to your organization and not just a legacy bit of data that has been reported for years. Sadly, many organizations are still reporting on KPIs that hold little to no value to anyone within the organization.

Step 3: Review the current analytics configuration

Now is the time to roll up those sleeves and get dirty. You'll need admin-level access to everything (where possible) or, at a minimum, full view rights. Next, you'll need a spreadsheet listing the specific items to review and confirm are configured correctly, with a place to note what is wrong and a column to set the priority for getting each item fixed.

The spreadsheet I've developed over the years has over 100 standard items to review, grouped into specific aspects of a digital analytics implementation, with additional items added depending on the client. The following eight are some of the most critical items that need to be addressed.

  1. Overall Analytics Integrity: Is your analytics code (or tag manager code) correctly installed on all pages of your website? Check that it is correctly deployed. Far too often the JavaScript (or code snippet) is located in the wrong place on the page, or it is missing some custom configuration. Simply placing the code in the wrong place can cause some data not to be captured.

    Verify the code is on all pages/screens. Too often, sections of a site are missed or the code doesn't behave the same on every page, resulting in lost data or potentially double counting.

    If you run both a website and an app, are their analytics data properly synced for data integration, or is it best to run them independently?
  2. Security: Review who has access to the analytics configuration and determine when individual access and rights were last reviewed. You'd be surprised how often former employees are discovered to still have admin access. Access should be reviewed regularly, and a system needs to be in place to notify the analytics manager when an employee departs so their access can be terminated. You may assume all is fine because HR will cancel the former employee's email address, but they may still have access: many analytics systems are cloud-based and do not operate within your corporate environment, so as long as that former employee remembers their email address and the password for that analytics account, they'll have access.
  3. Analytics Data Views: This is an especially critical feature when it comes to web analytics (e.g., Google Analytics, Adobe Analytics). Is your analytics system configured to segregate your data into at least three different views? At a minimum, you need “All Data” (no filtering), “Test” (only analytics test data or website testing) and “Production” (only customer-generated data). Many organizations also segment their data further into “Internal Traffic” (staff using the website) and “External Traffic” (primarily external users).

    If these views don't exist, you are likely collecting and reporting on test traffic and internal users. How your employees use the website is completely different from how customers use it, and their traffic should at a minimum be excluded or segmented into its own data set.
  4. Review Filters: Filters are a common analytics tool used to exclude or include specific types of activity. Most filters don't need frequent review, but the ones that include or exclude data based on a user's IP address do, because IP addresses have a habit of changing over time (for example, when a branch location switches ISPs and receives a new IP address). IP-based filters should be reviewed every six months, or at least once per year if that isn't possible. As a tip, after a filter has been reviewed and verified, rename it by appending the date it was last reviewed.

    Don't forget to ensure that exclude filters are in place for search engine bots and any third-party utilities used to monitor the website. This machine-generated traffic has a habit of getting picked up and reported on, which skews all the data.
  5. Personally Identifiable Information (PII): Too many developers have a habit of passing PII via the data layer, especially on e-commerce sites, unaware that this data can end up in the company's analytics database. If you store PII on a third-party system, you need to disclose this in your privacy policy, and even then you may be in breach of various privacy laws. One of the most common errors is passing a user's email address as a URI parameter. The easiest way to check for this is to run a pages report for any URI containing an “@” symbol (a scripted version of this check is sketched after this list). Over the years, I've seen customers' names captured and much more.

    If this happens, ideally your developers should fix what is causing this, but at a minimum you’ll need a filter to strip this type of information from the URI before it is stored in your analytics database.
  6. E-commerce Data: This is the most common issue we hear from organizations: “The sales figures reported in the analytics don't match our e-commerce system!” As stated above, analytics isn't perfect, nor should it be treated as a replacement for an e-commerce/accounting backend. However, if you are capturing 85-95% (or more) of the transactional data, you can effectively leverage it to evaluate marketing efforts, sales programs, A/B tests, etc.

    From an e-commerce perspective, the easiest way to audit this is to compare the reported data in a given time period to what the backend system reports. If it is near 90%, then don’t worry about it. If it is below 80%, you have an issue. If it is somewhere in between, then it is a minor issue that should be looked into but is not a high priority.
  7. Is everything that needs to be tracked being tracked: What does your organization deem important? If your goal is to make the phones ring, you need to be tracking clicks on embedded phone numbers. If your goal is form-driven submissions, are you tracking form submissions correctly? If you're trying to direct people to local locations, are you capturing clicks on location listings, embedded maps, etc.?

    What about all those social media icons scattered on your website to drive people to your corporate Twitter, LinkedIn, Facebook accounts? Are you tracking clicks on those?
  8. Campaigns: Is there a formal process in place to ensure links on digital campaigns are created in a consistent manner? As part of this, are your marketing channels correctly configured within your analytics system?
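As referenced in item 5, here is a minimal sketch of the “@ in the URI” check, assuming you can export a list of page paths from your analytics tool; it also catches the URL-encoded form of the symbol.

```typescript
// Flag page paths that appear to carry an email address: either a raw "@"
// or its URL-encoded form "%40".
const ENCODED_AT = /%40/i;

function findPiiPages(pagePaths: string[]): string[] {
  return pagePaths.filter((path) => path.includes("@") || ENCODED_AT.test(path));
}

// e.g. "/checkout?email=jane%40example.com" would be flagged; anything
// returned here should be fixed at the source and filtered before storage.
console.log(findPiiPages([
  "/products/widget",
  "/checkout?email=jane%40example.com",
]));
```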

You now have an outline for where to start your analytics audit. Think of your organization's analytics data and reporting systems like a car: everything seems fine until it stops working, and you need to take the car in from time to time for a tune-up. That is what an analytics audit is. The audit will identify things that need to be fixed immediately (some small, some big) plus other items that can be fixed over time. If you don't fix the items discovered during the audit, your analytics system won't operate optimally and people won't want to use it. How frequently should an audit be conducted after everything has been fixed? Unlike a car, there is no recommended interval between audits. However, every time your digital properties undergo a major update, or a series of minor updates that together amount to a major update, it is time to repeat the audit process.

The post Analytics Audit 101: Identify issues and correct them to ensure the integrity of your data appeared first on Marketing Land.