Analytics Audit 101: Identify issues and correct them to ensure the integrity of your data

Marketing and analytics professionals need to work together to not only increase the accuracy of our data, but to educate people about how to leverage it.


One cornerstone of an online business’s decision-making process should be its digital analytics data (analytics data from a variety of sources: web analytics, search console, paid search, paid social, social media, etc.). Yet, according to an MIT Sloan Management Review survey of more than 2,400 business people, only 15% trust their data. While no analytics method can guarantee 100% accuracy of your digital analytics data, auditing your data ensures it is as accurate as possible. That accuracy gives you the confidence not only to trust your data but to leverage it to make objective business decisions instead of subjective ones. It is that lack of trust that explains why a mere 43% (according to the same survey) say “they frequently can leverage the data they need to make decisions.” This low confidence in one’s data equals failure.

As marketing and analytics professionals, we need to work together not only to increase the accuracy of our data, but to educate people about the data and how to leverage it. The first step in this process is auditing your analytics configurations to identify issues and correct them, ensuring the integrity of the data.

The analytics audit process

Step 1: Acknowledge analytics data isn’t perfect

When you start your analytics audit, gather together everyone who has a stake in the outcome and find out why they don’t trust the data. Most likely they have good reasons. Don’t claim that your goal is to make the data 100% accurate, because that is impossible. Use the opportunity to explain, at a high level, that analytics data captures a sampling of user activity and, for various technical reasons, no system will be perfect; that’s why they are most likely seeing differences between things like their AdWords account and their web analytics data. Use the example of taking a poll. Pollsters sample 1,000-2,000 people out of a total U.S. population of more than 300 million and then state their data is accurate within a few percentage points 4 out of 5 times. In other words, they are way off 20% of the time. However, businesses, politicians and the general public respond to and trust this data. Web analytics data, even at the low end of accuracy, is likely still capturing an 80% sample, which is far more accurate than the data presented by pollsters, yet it is less trusted. Let the stakeholders know that, as a result of the audit and the fixes it produces, you could be improving data capture accuracy to 90% or even 95%, and that is data you can trust 100%.

Step 2: Identify what needs to be measured

One of the biggest issues with analytics data is that the analytics software isn’t configured to collect only the correct data. The software becomes a general catch-all. While on the surface it sounds ideal to capture everything you can, when you cast a huge net you also capture a lot of garbage. The best way to ensure the right data is being captured and reported on is to review the current marketing and measurement plans. Sadly, too many organizations don’t have these, so during your meeting make sure to ask the stakeholders what the primary items they want measured are.

Identify and gather all the key performance indicators (KPIs) currently being reported on. You’ll need this before you start the audit. Verify all KPIs are still valuable to your organization and not just legacy bits of data that have been reported for years. Sadly, many organizations are still reporting on KPIs that hold little to no value to anyone within the organization.

Step 3: Review the current analytics configuration

Now is the time to roll up your sleeves and get dirty. You’ll need admin-level access (where possible) to everything, or at a minimum full view rights. Next you’ll need a spreadsheet that lists the specific items to review and confirm are configured correctly, with a place to note what is wrong and a column to set a priority for getting them fixed.

The spreadsheet I’ve developed over the years has over 100 standard individual items to review, grouped into specific aspects of a digital analytics implementation, plus additional items that may be added depending on the specific client. The following eight are some of the most critical items that need to be addressed.

  1. Overall Analytics Integrity: Is your analytics code (or tag manager code) correctly installed on all pages of your website? Check and make sure your analytics code is correctly deployed. Far too often the JavaScript (or the code snippet) is located in the wrong place on the web page, or it is missing some custom configuration. Simply placing the code in the wrong place can cause some data not to be captured.

    Verify the code is on all pages/screens. Too often sections of a site are missed or the code doesn’t work the same on all pages, resulting in lost data or potential double counting.

    If you run both a website and an app, is their analytics data properly synced for data integration, or is it better to run them independently?
  2. Security: Review who has access to the analytics configuration and determine when each individual’s access and rights were last reviewed. You’d be surprised how often it is discovered that former employees still have admin access. Access should be reviewed regularly, and a system needs to be in place to notify the analytics manager when an employee departs the organization so their access can be terminated. You may think that because access is tied to the former employee’s email address all is fine, since HR will cancel that address, but they may still have access. Many analytics systems are cloud-based and do not operate within your corporate environment. As long as that former employee remembers their email address and the password for that specific analytics account, they’ll have access.
  3. Analytics Data Views: This is an especially critical feature when it comes to web analytics (i.e., Google Analytics, Adobe Analytics, etc.). Is your analytics system configured to segregate your data into at least three different views? At a minimum, you need “All Data” (no filtering), “Test” (only analytics test data or website testing) and “Production” (only customer-generated data). Many organizations also segment their data further into “Internal Traffic” (staff using the website) and “External Traffic” (primarily external users).

    If these views don’t exist, it is likely you are collecting and reporting on test traffic and internal users. How your employees use a website is completely different from how customers use it, and their activity should at a minimum be excluded or segmented into its own data set.
  4. Review Filters: Filters are a common analytics tool used to exclude or include specific types of activity. Most filters don’t need to be reviewed often, but some do. The ones that need the most frequent review are those that include or exclude data based on a user’s IP address. IP addresses have a nasty habit of changing over time; for example, a branch location switches ISPs and receives a new IP address. IP-based filters should be reviewed every six months, or at least once per year if that isn’t possible. As a tip, after a filter has been reviewed and verified, rename it by adding the date it was last reviewed.

    Don’t forget to ensure that exclude filters are in place for search engine bots and any third-party utilities used to monitor the website. This machine-generated traffic has a nasty habit of getting picked up and reported on, which skews all the data.
  5. Personally Identifiable Information (PII): Too many developers have a habit of passing PII via the data layer, especially on e-commerce sites, unaware that this data can end up in the company’s analytics database. If you store PII on a third-party system, you need to reference this in your privacy policy, and even then you might be in breach of various privacy laws. One of the most common errors is passing a user’s email address as a URI parameter. The easiest way to check for this is to run a pages report for any URI containing an “@” symbol (see the sketch after this list). Over the years, I’ve seen customers’ names captured and much more.

    If this happens, ideally your developers should fix what is causing it, but at a minimum you’ll need a filter to strip this type of information from the URI before it is stored in your analytics database.
  6. E-commerce Data: This is the most common issue we hear from organizations: “The sales figures reported in the analytics don’t match our e-commerce system!” As stated above, analytics isn’t perfect, nor should it be treated as a replacement for an e-commerce/accounting backend. However, if you are capturing 85-95% (and possibly more) of the transactional data, you can effectively leverage it to evaluate marketing efforts, sales programs, A/B tests, etc.

    From an e-commerce perspective, the easiest way to audit this is to compare the reported data for a given time period to what the backend system reports. If it is near 90%, don’t worry about it. If it is below 80%, you have an issue. If it is somewhere in between, it is a minor issue that should be looked into but is not a high priority.
  7. Is everything that needs to be tracked being tracked: What does your organization deem important? If your goal is to make the phones ring, you need to be tracking clicks on embedded phone numbers. If your goal is form submissions, are you tracking them correctly? If you’re trying to direct people to local locations, are you capturing clicks on location listings, on embedded maps, etc.?

    What about all those social media icons scattered on your website to drive people to your corporate Twitter, LinkedIn, Facebook accounts? Are you tracking clicks on those?
  8. Campaigns: Is there a formal process in place to ensure links on digital campaigns are created in a consistent manner? As part of this are your marketing channels correctly configured within your analytics system?
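A minimal sketch of the PII check from point 5, assuming a hypothetical export of page URIs from your analytics pages report (a real export will have a different layout):

```typescript
// Flag page URIs that appear to carry an email address, either as a raw "@"
// or as a URL-encoded value such as "%40".
const emailPattern = /[^\/?&=\s]+@[^\/?&=\s]+\.[a-z]{2,}/i;

function findPiiUris(pagePaths: string[]): string[] {
  return pagePaths.filter(
    (path) => path.includes("@") || emailPattern.test(decodeURIComponent(path))
  );
}

// Example with made-up paths:
console.log(
  findPiiUris([
    "/checkout/confirmation?email=jane.doe%40example.com",
    "/products/widgets",
  ])
); // -> ["/checkout/confirmation?email=jane.doe%40example.com"]
```

Any URI this surfaces is a candidate for the developer fix and the URI-scrubbing filter described in point 5.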

You now have an outline for where to start your analytics audit. Think of your organization’s analytics data and reporting systems like a car: everything seems to be working fine until it stops working. You need to take your car in from time to time for a tune-up, and that is what an analytics audit is. The audit will identify things that need to be fixed immediately (some small and some big) plus other items that can be fixed over time. If you don’t fix the items discovered during the audit, your analytics system won’t operate optimally and people won’t want to use it. How frequently should an analytics audit be conducted after everything has been fixed? Unlike with a car, there is no recommended interval between audits. However, every time your digital properties undergo a major update, or a series of minor updates that together amount to one, it is time to repeat the audit process.


Soapbox: Is my data telling me the truth?

How email security programs affected my perception of data integrity.


As marketers, we face the overwhelming challenge of demonstrating proof that our tactics are effective. But how can we convince management if we are not convinced of our own data?

Here’s the reality, which I recently learned for myself: if you’re running email marketing, it’s very likely that your performance reports are not disclosing the full truth, with inflated CTRs (click-through rates) and open rates being the main culprits.

Email security programs – loved by recipients, hated by senders

Barracuda. SpamTitan. Mimecast. Email security bots that serve a single purpose: to protect users from unsafe content. These programs scan inbound emails and attachments for possible threats, including viruses, malware and spammy content, and they click on links to test for unsafe destinations.

For email marketers, this creates several challenges:

  • Inflated CTRs and open rates due to artificial clicks and opens
  • Disrupted sales-team lead follow-up caused by false signals
  • Lost confidence in data quality (quantity ≠ quality)

Real or artificial clicks?

In reviewing recent email marketing performance reports, I noticed an unusual pattern: some leads were clicking on every link in the email (header, main body, footer, even the subscription preferences link), yet they were not unsubscribing. Not only that, but this suspicious click activity was happening almost immediately after the email was deployed. I speculated that these clicks were not “human,” but rather “artificial” clicks generated by email filters.
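One way to make that speculation concrete is to scan the raw click log for recipients who hit many distinct links within moments of the send. A minimal sketch, assuming a hypothetical export format (real ESP exports will use different field names):

```typescript
// One click event per row in a hypothetical export from the email platform.
interface ClickEvent {
  email: string;     // recipient address
  url: string;       // link that was clicked
  clickedAt: Date;   // click timestamp
}

// Flag recipients whose behavior looks machine-generated: several distinct
// links clicked within a short window after the send.
function flagSuspectedBots(
  clicks: ClickEvent[],
  sentAt: Date,
  maxSecondsAfterSend = 60,
  minDistinctLinks = 4
): string[] {
  const linksByRecipient = new Map<string, Set<string>>();
  for (const c of clicks) {
    const seconds = (c.clickedAt.getTime() - sentAt.getTime()) / 1000;
    if (seconds >= 0 && seconds <= maxSecondsAfterSend) {
      const links = linksByRecipient.get(c.email) ?? new Set<string>();
      links.add(c.url);
      linksByRecipient.set(c.email, links);
    }
  }
  return Array.from(linksByRecipient.entries())
    .filter(([, links]) => links.size >= minDistinctLinks)
    .map(([email]) => email);
}
```

The thresholds are arbitrary starting points; tune them against addresses you know belong to real people.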

Hidden pixels are your frenemy

To test my hypothesis, I implemented a hidden 1×1 pixel in the header, main body, and footer sections of the next email. The pixels were linked and tagged with UTM tracking, and they were only visible to bots.
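For illustration only, a hidden trap link along those lines might be generated like this; every URL and UTM value here is hypothetical, and (as noted later in this piece) hidden links carry deliverability risk:

```typescript
// Returns markup for a hidden, UTM-tagged 1x1 "bot trap" for one email section.
// A human reader never sees or clicks it; a link-scanning security filter might.
function hiddenBotTrap(section: "header" | "body" | "footer"): string {
  const href =
    "https://www.example.com/?utm_source=newsletter" +
    "&utm_medium=email&utm_campaign=bot-trap" +
    `&utm_content=${section}`;
  return (
    `<a href="${href}" style="display:none" aria-hidden="true">` +
    `<img src="https://www.example.com/pixel.gif" width="1" height="1" alt="">` +
    `</a>`
  );
}
```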

Sure enough, several email addresses were flagged as clicking on the hidden pixels.

All that brings me back to the question of whether or not marketing data can be trusted. It’s critical to “trust, but verify” all data points before jumping to conclusions. Scrutinizing performance reports and flagging unusual activity or patterns helps. Don’t do an injustice to yourself and your company by sharing results that they want (or think they want) to hear. Troubleshoot artificial activity and decide on a plan of action:

  • Use common sense and always verify key data points
  • Within your email programs, identify and exclude bots from future mailings
  • Share results with management, sales, and other stakeholders

A word of caution… 

Tread carefully before you start implementing hidden pixels across your email templates. Hiding links might appear to email security programs as an attempt to conceal bad links. You could be flagged as a bad sender, so be sure to run your email through deliverability tools to check that your sender score isn’t affected.

As the saying goes, “There are three kinds of lies: lies, damned lies, and statistics.” Sigh.

With different solutions circulating within the email marketing community, this is likely the “best solution out of the bad ones”. It all depends on what works best with your scenario and business model. 

Godspeed, marketer! 


Soapbox: Marketing data overload? Answer 2 questions that will instantly help

Success means knowing why the data matters and what next steps are needed to turn it into action.


Marketers are continuously dealing with a never-ending tsunami of data from multiple sources and wondering what to do with it all. It’s easy to come down with a bad case of “analysis paralysis” if you aren’t careful.

Whenever I feel overwhelmed by too much data, I’m reminded of Andy Pearson, the longtime president of PepsiCo, named by Fortune in 1980 as “one of the ten toughest bosses in America.” That distinction was well deserved, because God help anyone in a meeting with Andy who wasn’t well prepared. If you just shared a bunch of facts and figures with him, he would scream out in the boardroom, “What? So what? Now what?!” Many PepsiCo careers were made or destroyed based on what the presenter did next.

I have a vivid image of Andy in my mind any time I’m in a meeting where too much data is being shared. I get the urge to (politely) ask Andy’s questions to the presenter: “What? So what? Now what?!” Marketers deal with unprecedented amounts of data today, so it’s critical to keep this admonition top-of-mind when analyzing and sharing key data with your team and management.

The “what” is in ample abundance: audience demographics, attitudes, behaviors, digital preferences, competitive data, plus all the operational and financial data at your disposal. The key to success is the “so what” and the “now what,” if all that wonderful data is going to be turned into consumer insights and meaningful action to build your business.



Don’t misinterpret the data: Evidence-based advertising needs experience-based context

Data will always be the foundation of our evidence, but we need to consider our experience to structure and interpret this raw information.


“That sounds like a great idea, but what does the data tell us?” In recent years, the principle of evidence-based advertising has taken hold of the industry, bringing the tension between advertising as a science and an art to the foreground. For some, like Professor of Marketing Science Byron Sharp, the answer is clear: in the world of Big Data, evidence must take precedence over conventional wisdom. But what exactly is evidence, and how is it best used?

Broadly speaking, evidence is information that provides a foundation for some belief. As such, raw data can certainly be used as evidence, but so can intuition. After all, I don’t need to consult a spreadsheet to be confident that the sun will rise each morning; my belief is founded on years of experience stored in a mental “database.” Advertising is no different.

This is not to say that hard numbers are gratuitous, or that industry expertise holds all the answers. We must recognize, however, that there are different kinds of evidence, and each serves a distinct function in the analytic and decision-making process. To tease out these distinctions, we can turn to the academic world, where evidence is precisely classified.

In academic work, evidence is divided into three categories: primary, secondary, and tertiary. These are distinguished by 1) where they come from, and 2) how they are stored. Primary sources are typically original accounts or physical evidence, while secondary and tertiary evidence layers interpretation and analysis over primary information.

Primary evidence

Raw consumer data is primary evidence: it expresses documented consumer behavior or literal consumer responses to some prompt. The results of ad effectiveness research, like breakthrough, branding, and brand-impact are all primary evidence; assuming participants responded honestly, they reflect the ad’s impact on consumer perspectives.

Unlike most primary evidence, ad data is typically modeled to make interpretation clear: likeability goes up when more respondents like the ad, brand, or product. But knowing how an ad affects research metrics isn’t in itself an insight, much less a strategy. Consequently, primary evidence must always be subjected to analysis, whose product is secondary evidence.

Secondary evidence

Analysis synthesizes research results into insights – actionable data-driven learnings – through deductive or inductive logic. The resulting secondary evidence is an interpretation of primary data: it uses facts to support conclusions about why consumers responded as they did, and what that suggests about the ad’s performance.


Secondary sources represent what analysts think primary evidence (consumer responses) meant; they can guide our interpretation thereof, but their validity is contingent on whether the analysis was correct. A quarterly report, for example, may draw on primary evidence to more accurately represent consumer sentiment, but only by taking the risk of misinterpreting the data.

Tertiary evidence

This article is a primary source, as it expresses my perspective on evidence-based ads, but my article on the science of storytelling is a tertiary source: it brings together different analyses to make a broader point about the intersection of neuroscience and advertising. A year-end report that synthesizes quarterly analysis into an overarching narrative is also tertiary evidence.


Tertiary sources are an aggregation of primary and secondary data, forming a curated body of knowledge; they allow us to compare analyses, and to develop a more authoritative interpretation. Our experience in the ad industry can be thought of as tertiary evidence: all the ads and research we have been exposed to nuance our interpretation of new data.

Using primary evidence

Despite its name, primary evidence is almost never the right place to start an investigation. Secondary and tertiary evidence can help us define our questions and develop hypotheses before we begin collecting primary data. When we finally do sit down with a dataset, we should again look to secondary and tertiary sources for context.

In fact, this is a process that most of us undertake without thinking. We design research with the aid of past successes and failures, and when we analyze results, we look for patterns and flags that we’ve seen before. Unfortunately, in doing so implicitly we may draw on personal assumptions or misguided conventional wisdom instead of evidence-based learnings.

It is here that the balancing act between evidence and experience begins. Disregarding experience squanders years of information we have collected in our mental database; taking its veracity and logical coherence for granted, however, can lead us to baseless conclusions. Secondary documents draw on primary evidence to help substantiate and vet our intuition.

Using secondary evidence

Secondary evidence is both the product of analysis, and a vital tool in the analytical process itself. When we interpret primary data, we create secondary evidence. As primary evidence is rarely contextualized, and often does not lend itself immediately to interpretation, it is usually necessary to draw on existing secondary sources to guide or corroborate our analysis. 

When looking at consistent consumer behavior, for example, we understand it likely represents a trend. We’ve seen this type of pattern before: regular observations of some phenomenon have led us to draw a reliable conclusion. In this case, we can rely on evidence-based intuition to analyze the data, but more complex conclusions may require additional evidence.

As noted, our intuition itself is often implicitly secondary evidence. It is important to remember, though, that all secondary evidence must point back to primary sources. It may seem obvious, but this distinction is what separates analysis from assumption. Tertiary evidence is a good way to evaluate secondary sources and the analysis they rely on.

Using tertiary evidence

Tertiary evidence is often a review of secondary sources that evaluates their merit and posits a higher-order conclusion. A collection of case studies in some research methodology or advertising practice is a prime example. Consistency in their results may suggest the existence of an advertising principle, while discrepancies could indicate methodological errors.


The movement from primary to tertiary evidence is one of synthesis and abstraction. This can be extremely useful, but it also distances us from raw data: primary evidence. Tertiary sources are typically arranged to evidence a specific argument. Necessarily, this is to the exclusion of other insights that may be gleaned from the initial evidence.

In other words, tertiary sources result from the analysis of analysis. They can be useful both in evaluating completed research and identifying new research questions. Above all, they unite historical learnings to substantiate industry principles. By evidencing or challenging conventional wisdom, they advance our understanding of consumer behavior and advertising techniques.

Putting it all together

When we commission a new piece of research, we should always begin with extant secondary and tertiary evidence. Whether it is stored in slide decks or our memory, this historical research can inform what questions we ask, and how. That way, we won’t collect data that isn’t useful, and the data we do collect will be optimized to support the answers we need.


Through analysis, we then convert these primary sources into secondary evidence. Again, our interpretation of the raw data will be nuanced by our experiences with similar ads or research. The insights we develop will thus yield usable and reliable conclusions about the ad, brand, or product in question. To present them, we should group them into a piece of tertiary evidence.

Much like the Pyramid Principle, different kinds of evidence build on one another to add meaning and mitigate misinterpretation. Hard data will always be the foundation of our evidence, but without drawing on the secondary and tertiary evidence of experience we are left without a reliable way to structure and interpret this raw information.


The data behind incrementality on Amazon

The key to driving incremental sales combines segmented bidding strategies, contextualizing ACoS metrics and proper campaign structure.


Every marketer worth their salt is concerned about incrementality. Companies have been, and should be, reevaluating their budgets, channels and service providers based on the ability of their advertising to drive sales they wouldn’t have captured otherwise.

When it comes to Amazon, this issue is particularly important, because current SERPs can naturally cannibalize an otherwise organic sale with an ad for the same product that shows up prior to the organic result. For marketers, the key to managing this issue and driving incremental sales is through a combination of segmented bidding strategies for brand, category, and competitor key terms, contextualizing Advertising Cost of Sale metrics, and proper campaign structure.

The non-incremental trap of branded keywords

Each of the larger term segments – branded, generic, and competitor – needs to be thought of in terms of the consumer’s place in the purchase funnel:

  • Brand keywords capture shoppers deepest in your purchase funnel
  • Competitor keywords capture shoppers that are deep in someone else’s purchase funnel
  • Category keywords capture shoppers at the top of your purchase funnel 

On Amazon, when a user searches for some variation on a brand name, the A9 algorithm generally does whatever it can to surface as many of those brand’s products as possible on that search page. That includes both top sellers, which will get the top organic placements, along with the long tail, which will occupy spots further down the page. Amazon takes the intent of the user – “I want to see products from this brand” – very seriously. This is borne out in the underlying data – it is much, much harder to rank organically on a category term, as compared to a branded term, as seen in the examples below.


This shows why it’s a real challenge for your brand’s keywords to be incremental. It’s almost guaranteed that your relevant products will show up organically on the SERP – with your top sellers showing up high in the results. Additionally, consumers are more likely to click on the top few results of a branded search, as compared to generic searches.


The biggest takeaway here is that advertising your top products on your own branded terms is a particularly bad practice. You’re capturing sales via paid placements you were likely to capture organically anyway. If brand defense is imperative, consider advertising new or longer-tail products on your branded terms instead. That way you’re defending your brand term, but you’re doing so by helping to sell products that aren’t yet ranking well organically, while still not cannibalizing sales of your top products.

Maximizing sales across category and competitor terms

In terms of incrementality, nothing is better than capturing a sale from your competitor. However, you’re likely to find that the ACoS of competitor keywords is significantly worse than that of generic or category keywords, as shown in the example below.


The best strategy here depends a lot on your competitive landscape. Conquesting your competitor’s terms means having a deep understanding of the terms which you can reasonably bid against successfully. Terms relating to stronger competitors with deeper brand loyalty/recognition may necessitate a less aggressive strategy to control costs, while it may be well worth your while to bid forcefully against terms related to relatively weak competitors where it’s easier to pick off customers with your top products.

As opposed to branded terms, the universe of category keywords is understandably the largest on Amazon, with new relevant terms developing over time. In this larger and more dynamic environment, it’s important that you set bids based on the expected conversion rate of a given term.
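A back-of-the-envelope way to turn an expected conversion rate into a bid (a generic break-even calculation, not the author’s bidding model) works backward from a target ACoS, since ACoS is roughly CPC divided by the product of conversion rate and average order value:

```typescript
// ACoS = spend / ad sales = CPC / (conversionRate * averageOrderValue),
// so the highest CPC that still hits a target ACoS is:
function maxCpcForTargetAcos(
  expectedConversionRate: number, // e.g. 0.08 = 8% of clicks convert
  averageOrderValue: number,      // revenue per converted order
  targetAcos: number              // e.g. 0.25 = 25% advertising cost of sale
): number {
  return targetAcos * expectedConversionRate * averageOrderValue;
}

// Example: 8% conversion rate, $40 order value, 25% target ACoS
console.log(maxCpcForTargetAcos(0.08, 40, 0.25)); // ≈ $0.80 max bid per click
```

The hard part, as the next paragraphs explain, is estimating that conversion rate for low-volume keywords in the first place.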


The issue here, which I’ve written about in a previous column, is that over 90% of category keywords do not get more than one click per day at a constant bid. Additionally, with roughly 80 clicks needed to get a confident estimate of the true conversion rate of a given keyword, running this test could take nearly two and a half months. Meanwhile, conversion rates change roughly every month; as one extreme example, think of the expected conversion rate for “easter candy” in April versus May or June.
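To see why click volume matters so much, here is a rough sketch of the uncertainty around a conversion rate measured from a limited number of clicks, using a simple normal-approximation interval (a simplification that ignores seasonality):

```typescript
// 95% normal-approximation (Wald) interval for a conversion rate.
function conversionRateInterval(conversions: number, clicks: number) {
  const p = conversions / clicks;
  const halfWidth = 1.96 * Math.sqrt((p * (1 - p)) / clicks);
  return {
    estimate: p,
    low: Math.max(0, p - halfWidth),
    high: Math.min(1, p + halfWidth),
  };
}

// Example: 8 conversions from 80 clicks -> roughly 10%, plus or minus
// about 6.6 percentage points.
console.log(conversionRateInterval(8, 80));
```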

To succeed with category keywords you must have an exploration strategy that values the rate of data acquisition. At my current workplace, we use a probabilistic binary search model that adjusts bids from very high to low in order to more quickly determine the expected conversion rate.

Outside of this more refined statistical method, what marketers can do to better find and exploit meaningful keywords on Amazon is deploy a more granular campaign structure. Because keywords define audience segments, each audience segment needs a different set of considerations in terms of aggressiveness and expectations.


To spell this out, brand keyword campaigns should have high ROAS expectations, focus on emerging products, and be tested for incrementality. Competitor keyword campaigns should have the lowest ROAS expectations and focus on launching and dominant products. Finally, category keyword campaigns should be expected to deliver a break-even ROAS and must be handled with a strong exploration strategy. These overarching themes are important to keep in mind as you scale your marketing efforts on Amazon because they are critical to driving incremental sales growth.


Align your marketing plan with your analytics measurement plan

Audit your analytics on a regular basis to ensure the data you are collecting is as accurate as possible and that all data that should be collected is reported.


All great marketing departments and teams should have a marketing plan and know it intimately. What is surprising is how often, when I conduct an analytics audit and the first thing I ask for is a copy of the marketing plan, I’m met with a “deer in headlights” look or, at best, the response: “Oh, we have one, but we haven’t updated it in years. We just know it!”

Why is a current marketing plan, disseminated to and known by your entire marketing team and other teams within your organization, so critical, and what does it have to do with your corporate analytics? For a simple reason: without one, how does the marketing team actually know what it should be doing? More importantly, how can success be measured?

The key to marketing success is to merge a marketing plan with a measurement plan into a unified plan.

Key measurement elements of a marketing plan

The first step in developing or validating a marketing plan is ensuring the marketing department’s mission statement aligns with the corporate mission statement. This is where many marketing teams make their first mistake. If the two don’t align properly, then how can the marketing department effectively obtain corporate buy-in and ensure their marketing efforts are effective in helping the organization meet its overall goals?

Once the marketing department has validated its mission statement, it needs to define specific objectives. Then, it is time to merge these elements with a measurement plan that defines specific marketing tactics that can be planned, budgeted for, approved, executed and measured.

 Clearly define these tactics. Examples can be:

  • More posts (paid and non-paid) on specific social apps (i.e. Facebook, Twitter, Reddit, etc.)
  • More engagement with the public on social media sites
  • Creation of branded ads 

Perhaps the most difficult task in this process is defining the appropriate key performance indicators to measure how these tactics perform. Some KPIs in support of the above examples might be:

  • Increase in branded organic search traffic
  • Overall increase in organic search traffic
  • Increased activity/engagement on corporate social media accounts including click-throughs on posts
  • Increased click-through rates on branded ads
  • Increase in online sales by specific channels (organic search, branded campaigns, social accounts, etc.)

When defining your KPIs, keep in mind the following four factors that make a KPI useful:

  1. Must utilize obtainable data
  2. Must relate directly to the marketing objective
  3. Should not be an absolute value but a ratio or comparison. For example, a KPI for improved customer engagement might be an increase in average session duration, or average time on site compared between period one and period two; a KPI for measuring effective campaigns might be the average number of orders per 100 sessions, by campaign (see the sketch after this list)
  4. Must be easily reportable and understandable by the target audience.
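A minimal sketch of the ratio-style KPI from point 3, “orders per 100 sessions” by campaign; the field names are hypothetical:

```typescript
interface CampaignStats {
  campaign: string;
  sessions: number;
  orders: number;
}

// Convert raw counts into a comparable ratio per campaign.
function ordersPer100Sessions(rows: CampaignStats[]): Record<string, number> {
  const kpi: Record<string, number> = {};
  for (const row of rows) {
    kpi[row.campaign] =
      row.sessions > 0 ? (row.orders / row.sessions) * 100 : 0;
  }
  return kpi;
}

// Example: 45 orders from 1,800 sessions -> 2.5 orders per 100 sessions.
console.log(
  ordersPer100Sessions([{ campaign: "spring-sale", sessions: 1800, orders: 45 }])
);
```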

With the KPIs in place, ensure your analytics account is configured correctly. Ensure that you can accurately – without too much effort – report on the identified KPIs. Have all the required channels been defined? Don’t just rely on default channels from your analytics tool. Make sure the marketing activities to support the marketing department’s mission statement are realistic and approved.

Marketing measurement plans are typically laid out in a grid format. Do a quick search and you’ll discover many suggested layouts. My favorite is a simple one.

With a merged marketing measurement plan in place, the next task, aligning your corporate analytics to capture the appropriate data and make it reportable, becomes much easier, as does getting buy-in from other departments.

Imagine if your marketing plan didn’t include a KPI on sales by channel. Could you get your IT group to make the necessary coding changes to push transaction values to your analytics tool? Frequently they simply default to saying: “Sales information is available from our e-commerce tool.” While it is true that you can extract sales data from a backend tool, in virtually all cases you can’t attribute those sales to specific marketing efforts. Only with an approved marketing plan in place can you apply leverage and get this data integrated with your analytics.

What about custom channels? Do you need to segregate paid social from non-paid social (your team’s participation on social sites and your own posts) and from public sharing of your content? Yes, these are three unique social channels that should be tracked and reported on if your company is using them as part of its marketing plan.

You can create custom analytics reports that demonstrate how effective various marketing efforts are in supporting not only the marketing department’s mission statement but also the corporate mission statement. This allows you to evaluate and adjust objectively, and to demonstrate to the C-suite just how successful these efforts are.

Remember that a marketing mission statement is a living, breathing thing. The world of online marketing is constantly changing, as are the tools that help execute marketing plans and those that measure results. Plan on reviewing the mission statement at least annually, and quarterly or semi-annually if appropriate for the organization. Don’t forget to have your analytics audited by an independent auditor on a regular basis to ensure the data being collected is as accurate as possible, and that all data that should be collected is reported.


What are analytics experts looking to in 2020 with data and privacy?

Logan Gordon, Simo Ahava, Astrid Illum, Abby Matchett and Sayf Sharif share insights to help you gain executive buy-in about privacy policy issues this year.


While researching the state of tracking and data privacy, I talked to many smart industry experts and asked several to share their advice for 2020. It’s one thing for me to offer their executive summary, it’s another to hear it directly from them.

Plus, these folks will be helpful as you look for executive buy-in. “But Simo Ahava and Abby Matchett said…”

What do the experts think?

This must begin with a huge thanks to the following smart folks who shared their time and talent with us as we, collectively, prepare for the upcoming year. One of the best things about web analytics and digital marketing communities is the perspective that we are all in it together. I would encourage you to follow these fearless leaders, contribute to the conversation with them, and don’t be afraid to reach out for guidance.

Logan Gordon

The changes aren’t over yet, and I would expect continual developments geared toward greater privacy and greater transparency for the foreseeable future.

My advice is to color inside the lines. Those who attempt to work around or even toe the line will find themselves having to reinvent their approach on a regular basis as new privacy protections take effect. Privacy-first approaches, by contrast, will require less effort to comply with the changing data landscape.

Simo Ahava

This is the time to build a solid and robust benchmark. Go through your data from the past two years and try to identify the rate of cookie loss. The longer the period of time you’re investigating the higher the cookie loss.

Similarly, if you’re not already doing so, implement an ad block detection system. The best way to do this is to run some client-side JavaScript that uses a namespace of a known tracker — name it e.g. “ads.js” — and then send hits to some custom data store you own (so not Google Analytics) if that file is blocked by the browser.

Then, segment your data by browser. Check especially the usage statistics for Firefox and Safari, as they are the most prominent tracking prevention browsers out there. Note that this isn’t an exact science. Especially Chromium-based browsers (Chrome, Edge, Brave) might make it difficult to distinguish one browser from the other.

Once you have a benchmark, you know the scope of the problem. You can apply these numbers to your analyses by introducing margins of error based on the cookie loss statistics and the amount of ad blocking in use. For example, if your data shows that 20% of all visitors to your site block Google Analytics, you can be less worried about the 10% of the discrepancy between transactions collected by GA vs. your backend.
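A minimal sketch of the ad block detection Simo describes, assuming a hypothetical decoy file at /ads.js on your own domain (one that only sets a global flag) and a hypothetical /collect endpoint you control:

```typescript
// Not production code: probes whether a decoy script named after a known
// tracker is blocked, then reports the result to your own data store
// (not Google Analytics, which the same extension may block).
function detectAdBlock(): void {
  const report = (blocked: boolean) =>
    navigator.sendBeacon(
      "/collect",
      JSON.stringify({ event: "adblock_check", blocked, ts: Date.now() })
    );

  const probe = document.createElement("script");
  probe.src = "/ads.js"; // decoy file; /ads.js sets (window as any).__adsLoaded
  probe.onload = () => report((window as any).__adsLoaded !== true);
  probe.onerror = () => report(true); // request was blocked or failed
  document.head.appendChild(probe);
}

detectAdBlock();
```

Comparing the blocked share recorded at /collect against your analytics traffic gives you the benchmark and margin of error Simo mentions.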

Astrid Illum

I believe that the current quickening pace towards restrictions on storing and using data will continue – involving both tech providers and the judiciary. But local rulings will provide interpretations on application to specific cases pointing in different directions since there is a lack of understanding of the basic issues at stake in the technical underpinnings of modern websites. Rulings in some countries will point in one direction, and in another direction in another country. This will make the situation a difficult one to operate in for most companies.

While we are waiting for the ramifications of existing laws to unfold and while a deeper understanding of the basic issues at stake is not yet widely held by the people applying said laws – marketers have to adopt a dual strategy: First off keep to the strictest interpretation of the laws to mitigate risk and secondly work to create a language around use of data that showcases the major part of why sharing data is important: To improve our digital products. Current language lumps together all kinds of data collection in one big suspect pot – in large part due to specific types of tools, practices and methods that are unduly invasive or boundless. Marketers and their technical colleagues in analytics should work together to rescue all the valiant uses of data our modern world is built on.

Abby Matchett

I think 2020 is going to be the year of evaluation. Marketing strategies, data collection strategies, and platforming strategies are all going to be called into question as regulations tighten and browsers participate more actively in privacy regulation.

For marketers dealing with data loss and other privacy concerns, this change is an opportunity to re-evaluate their initiatives. This is a time to take stock of their programs, and identify their key objectives – ensuring that their marketing initiatives are aligning with the overall business objectives. Marketers will need to adapt to the changing environment, which really will be the new norm!

Sayf Sharif

You are not an attorney so do not feel like you need to tell your bosses or clients what to do. Give them the breadth of options and the strengths and weaknesses of the approaches to how they deal with privacy, GDPR, web tracking implications, etc. Stay on top of what options there are, and how those options impact negatively or positively your ability to provide an ROI on analytics work. Offer to speak with their attorneys and provide them technical advice/guidance on what you can do, and how you can do it, but ultimately let the attorneys make the decisions on how they want to proceed.

As an aside, I see many consultants making recommendations of what to do, what not to do at conferences for instance, and at the end of the day a consultant should not be making a specific recommendation here, only providing options and advice on impact for their clients, rather than legal advice as in “this is what you need to do” because that liability for the decision lies at the feet of the consultant. It’s not our responsibility to determine what moral/ethical/legal direction their company can go, we should focus on what we can technically do, what the new limitations of browsers are, and then provide those options to our clients to make the decisions themselves, while also being aware of what the laws are, and ultimately doing our best to not break any laws knowingly ourselves even at the direction of our clients.


The state of tracking and data privacy in 2020

Here’s where search marketers find themselves in the current entanglement of data and privacy and where we can expect it to go from here.


January 2020 felt like a turning point. CCPA went into effect, Google Chrome became the latest browser to commit to a cookie-less future and, after months of analytics folks sounding the alarm, digital marketers sobered to a vision of the future that looks quite different than today.

This article is not a complete history of consumer privacy nor a technical thesis on web tracking, although I link to a few good ones in the following paragraphs.

Instead, this is the state of affairs in our industry, an assessment of where search marketers find themselves in the current entanglement of data and privacy and where we can expect it to go from here.

This is also a call to action. It’s far from hyperbole to suggest that the future of digital and search marketing will be greatly defined by the actions and inactions of this current calendar year.

Why is 2020 so important? Let’s assume with some confidence that your company or clients find the following elements valuable, and review how they could be affected as the associated trends unfold this year.

  1. Channel attribution will stumble as tracking limitations break measurability and show artificial performance fluctuations.
  1. Campaign efficiency will lose clarity as retargeting efficacy diminishes and audience alignment blurs.
  1. Customer experience will falter as marketers lose control of frequency capping and creative sequencing. 

Despite the setbacks, it is not my intention to imply that improved regulation is a misstep for the consumers or companies we serve. Marketing is at its best when all of its stakeholders benefit and at its worst when an imbalance erodes mutual value and trust. But the inevitable path ahead, regardless of the destination, promises to be long and uncomfortable unless marketers are educated and contribute to the conversation.

That means the first step is understanding the basics.

A brief technical history of web tracking (for the generalist)

Search marketers know more than most about web tracking. We know enough to set people straight at dinner parties — “No, your Wear OS watch is not spying on you” — and follow along at conferences like SMX when a speaker references the potentially morbid future of data management platforms. Yet most of us would not feel confident in front of a whiteboard explaining how cookies store data or advising our board of directors on CCPA compliance. 

That’s okay. We’ve got other superpowers, nice shiny ones that have their own merit. Yet the events unfolding in 2020 will define our role as marketers and our value to consumers. We find ourselves in the middle of a privacy debate, and we should feel equipped to participate in it with a grasp of the key concepts. 

What is a cookie?

A cookie stores information that is passed between browser and server to provide consistency as users navigate pages and sites. Consistency is the operative word, and it can benefit consumers; take the common shopping cart example.

Online shoppers add a product to the cart and, as they navigate the site, the product stays in the shopping cart. They can even jump to a competitor’s site to compare prices and, when they return, the product is still in the shopping cart. That consistency makes it easier for them to shop, navigate an authenticated portion of a site, and exist in a modern multi-browser, multi-device digital world.

Consistency can also benefit marketers. Can you imagine what would happen to conversion rates if users had to authenticate several times per visit? The pace of online shopping would grind to a crawl, Amazon would self-combust, and Blockbuster Video would rise like a phoenix.

But that consistency can violate trust. 

Some cookies are removed when you close your browser. Others can accrue data over months or years, aggregating information across many sites, sessions, purchases and content consumption. The differences between cookie types can be subtle while the implications are substantial.

Comparing first- and third-party cookies

It is important for marketers to understand that first- and third-party cookies are written, read and stored in the same way. Simo Ahava does a superb job expanding on this concept in his open-source project that is absolutely recommended reading. Here’s a snippet.

It’s common in the parlance of the web to talk about first-party cookies and third-party cookies. This is a bit of a misnomer. Cookies are pieces of information that are stored on the user’s computer. There is no distinction between first-party and third-party in how these cookies are classified and stored on the computer. What matters is the context of the access.

The difference is the top-level domain that the cookie references. A first-party cookie references and interacts with one domain and its subdomains.

  • searchengineland.com
  • searchengineland.com/staff
  • events.searchengineland.com

A third-party cookie references and interacts with multiple domains. 

  • searchengineland.com
  • events.marketingland.com
  • garberson.org/images

Marketing Land has a helpful explainer, aptly called WTF is a cookie, anyway? If you’re more of a visual learner, here is a super simplistic explanation of cookies from The Guardian. Both are from 2014 so not current but the basics are still the basics.

Other important web tracking concepts

Persistent cookies and session cookies refer to duration. Session cookies expire at the end of the session when the browser closes. Persistent cookies do not. Data duration will prove to be an important concept in the regulation sections. 
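For the generalist, here is what that difference looks like in the browser; the cookie names and values below are placeholders, not a recommendation of what to store:

```typescript
// Session cookie: no Expires/Max-Age, so the browser discards it when the
// session ends.
document.cookie = "cart_id=abc123; path=/; SameSite=Lax";

// Persistent cookie: Max-Age (in seconds) keeps it for roughly 30 days,
// subject to any limits the browser itself imposes (see ITP below).
document.cookie =
  "visitor_id=xyz789; path=/; Max-Age=" + 60 * 60 * 24 * 30 + "; SameSite=Lax";

// Reading them back: the browser exposes all cookies for this context as
// a single string.
console.log(document.cookie); // e.g. "cart_id=abc123; visitor_id=xyz789"
```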

Cookies are not the only way to track consumers online. Fingerprinting, which uses the dozens of browser and device settings as unique identifiers, has gotten a lot of attention from platform providers, including a foreshadowed assault in Google’s Privacy Sandbox announcement.

Privacy Sandbox is Google’s attempt at setting a new standard for targeted advertising with an emphasis on user privacy. In other words, Google’s ad products and Chrome browser hope to maintain agreeable levels of privacy without the aggressive first-party cookie limitations displayed by other leading browsers like Safari and Firefox.

Storage is a broad concept. Often it refers to cookie storage and how browsers can restrict it, but there are other ways to store information. LocalStorage uses JavaScript to store information in the browser. Alternate storage approaches appeared to offer hope for web analysts and marketers affected by cookie loss, until recent browser updates made those tactics instantly antiquated.
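As a minimal sketch of that approach, a site can write a value to localStorage instead of a cookie; unlike a cookie, the value is not sent to the server with each request, and in some browsers script-written storage is now subject to its own expiry limits:

```typescript
// Store a first-seen timestamp client-side without using a cookie.
localStorage.setItem("first_seen", new Date().toISOString());

// Read it back on a later page view (null if never set, or if the browser
// has since purged script-writable storage).
const firstSeen = localStorage.getItem("first_seen");
console.log(firstSeen); // e.g. "2020-01-15T12:34:56.000Z"
```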

Drivers: How we got here

It would be convenient if we could start this story with one event, like a first domino to fall, that changed the course of modern data privacy and contributed to the world we see in 2020. For example, if you ask a historian about WWI, many would point to a day in Sarajevo. One minute Ol’ Archduke Ferdinand was enjoying some sun in his convertible, the next minute his day took a turn for the worse. It is hard to find that with tracking and data privacy. 

Facebook’s path to monetization certainly played a part. In the face of market skepticism about the social media business model, Facebook found a path to payday by opening the data floodgates.

While unfair to give Facebook all the credit or blame, the company certainly supported the narrative that data became the new oil. An iconic Economist article drew several parallels to oil, including the consolidated, oligopolistic tendencies of former oil giants.

“The giants’ surveillance systems span the entire economy: Google can see what people search for, Facebook what they share, Amazon what they buy,” the Economist wrote. “They own app stores and operating systems, and rent out computing power…”

That consolidation of data contributed to an increase in the frequency and impact of data leaks and breaches. Like fish in a bucket, nefarious actors knew where to look to reap the biggest rewards on their hacking efforts.

It was a matter of time until corporate entities attempted to walk the blurring line of legality, introducing a new weaponization of data that occurred outside of the deepest, darkest bowels of the internet.

Enter Cambridge Analytica. Two words that changed the way every web analyst introduced themselves to strangers. “I do analytics but, you know, not in, like, a creepy way.”

Cambridge Analytica, the defunct data-mining firm entwined in political scandal, shed a frightening light on the granularity and unchecked accessibility of platform data. Investigative reporting revealed to citizens around the world that their information could not only be used by advertising campaigns to sell widgets, but also by political campaigns to sell elections. For the first time in many homes, the effects of modern data privacy became tangible and personal.  

Outcomes: Where we are today

The state of data privacy in 2020 can perhaps best be understood by framing it in terms of drivers and destinations. Consumer drivers, like those mentioned in the previous section, created reactions from stakeholders. Some micro-level outcomes, like actions taken by individual consumers, were predictable. 

For example, the #deletefacebook hashtag first trended after the Cambridge Analytica story broke and surveys found that three-quarters of Americans tightened their Facebook privacy settings or deleted the app on their phone. 

The largest outcomes are arguably happening at macro levels, where one (re-)action affects millions or hundreds of millions of people. We have seen some of that from consumers with the adoption of ad blockers. For publishers and companies that live and die with the ad impression, losing a quarter of your ad inventory due to ad blockers was, and still is, devastating. 

Political Outcomes

Only weeks after Cambridge Analytica found its infamy in the headlines, the European Union adopted GDPR to enhance and defend privacy standards for its citizens, forcing digital privacy discussions into both living rooms and board rooms around the world.  

Let’s use the following Google Trends chart for “data privacy” in the United States to dive deeper into five key outcomes.

The General Data Protection Regulation (GDPR) has resulted in more than €114 million in fines for companies doing business in the EU since becoming enforceable in May 2018. It’s been called “Protection + Teeth” in that the law provides a variety of data protection and privacy rights to EU citizens while allowing fines of up to €20 million or 4 percent of revenue, whichever hurts violators the most.

Months later, the United States welcomed the California Consumer Privacy Act (CCPA), which went into effect in January 2020 — becoming enforceable in July. Similar to GDPR, a central theme is transparency, in that Californians have the right to understand which data is collected and how that data is shared or sold to third parties.

CCPA is interesting for a few reasons. California is material. The state represents a double-digit share of both the US population and gross domestic product. It is also not the first time that California’s novel digital privacy legislation influenced a nation-wide model. The state introduced the first data breach notification laws in 2003, and other states quickly followed.

California is not alone with CCPA, either. Two dozen US state governments have introduced bills around digital tracking and data privacy, with at least a dozen still pending. That includes Nevada’s SB220, which was enacted and became enforceable within a matter of months in 2019.

Corporate Outcomes

Corporate responses have come in many forms, from the ad blockers I mentioned earlier to platform privacy updates to the dissolution of ad-tech providers. I will address some of these stories and trends in the following section, but, for now, let's focus on the actions of one technology that promises to trigger exponential effects on search marketing: web browsers.

The Safari browser introduced Intelligent Tracking Prevention (ITP) in 2017 to algorithmically limit cross-site tracking. Let’s pause to dissect the last few words in that sentence.

  • Algorithmically = automated decisions that prioritize scale over discernment
  • Limit = block immediately or after a short duration
  • Cross-site tracking = first- and third-party cookies

ITP 1.0 was only the beginning. The iterations that followed tightened cookie duration and storage and curbed the role of first-party cookies in web analytics. Abigail Matchett explains the implications for users of Google Analytics.

“All client-side cookies (including first-party trusted cookies such as Google Analytics) were capped to seven days of storage. This may seem like a brief window as many users do not visit a website each week. However, with ITP 2.2 and ITP 2.3… all client-side cookies are now capped to 24-hours of storage for Safari users… This means that if a user visits your site on Monday, and returns on Wednesday, they will be granted a new _ga cookie by default.”
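To make the mechanism concrete, here is a minimal sketch of the workaround many analytics teams discussed at the time: because ITP 2.x capped cookies written via document.cookie but (then) left cookies set by the first-party server alone, the server can re-issue the _ga cookie with an HTTP Set-Cookie header and a long expiry. This is an illustration rather than a recommendation from the article; it assumes a Node/Express server fronting the site, and example.com is a placeholder.

```typescript
import express, { NextFunction, Request, Response } from "express";
import cookieParser from "cookie-parser";

const TWO_YEARS_MS = 2 * 365 * 24 * 60 * 60 * 1000;

// Re-issue the Google Analytics client ID cookie from the server so its lifetime
// is governed by an HTTP Set-Cookie header rather than a document.cookie write,
// which ITP 2.x capped at 7 days or 24 hours.
function reissueGaCookie(req: Request, res: Response, next: NextFunction) {
  const ga = req.cookies["_ga"];
  if (ga) {
    res.cookie("_ga", ga, {
      maxAge: TWO_YEARS_MS,
      sameSite: "lax",
      secure: true,
      httpOnly: false,        // analytics.js still reads this cookie client-side
      domain: ".example.com", // placeholder: your site's own domain
    });
  }
  next();
}

const app = express();
app.use(cookieParser());
app.use(reissueGaCookie);
app.get("/", (_req, res) => res.send("ok"));
app.listen(3000);
```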

You are beginning to see why this is a big deal. Whether intended or not, these actions reinforce the use of quantitative metrics rather than quality measures by obstructing attribution. There is far more that can be said on ITP, so if you are ready for a weekend read, I recommend this thorough technical assessment of the ITP 2.1 effects on analytics.

If ITP got marketers' attention, Google reinforced it by announcing that Chrome would stop supporting third-party cookies within two years, codifying for marketers that cookie loss was not a can to be kicked down the road.
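For context, a third-party cookie is simply a cookie set by a domain other than the one in the address bar. A hypothetical tracker pixel embedded on publisher pages might set one like the sketch below (domain and path are placeholders). Note that Chrome 80, shipping in early 2020, already required cross-site cookies to declare SameSite=None and Secure; the two-year deadline means even compliant cookies like this will eventually stop being sent.

```typescript
import express from "express";

const app = express();

// Hypothetical endpoint served from tracker.example and loaded as a pixel on
// publisher pages. From the publisher's point of view, the cookie it sets is
// a third-party cookie.
app.get("/pixel", (_req, res) => {
  res.cookie("visitor_id", "abc123", {
    maxAge: 365 * 24 * 60 * 60 * 1000,
    sameSite: "none",        // required by Chrome 80+ for cross-site cookies...
    secure: true,            // ...and only over HTTPS
    domain: ".tracker.example",
  });
  res.status(204).end();     // a real pixel would return a 1x1 GIF instead
});

app.listen(8080);
```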

“Cookies have always been unreliable,” Simo Ahava told me. “To be blind-sided by the recent changes in web browsers means you haven’t been looking at data critically before. We are entering a post-cookie world of web analytics.”

Where it goes from here

The state of tracking and data privacy can take several paths from here. I outline a few of the most plausible, then ask others in the analytics and digital space to offer their insights and recommendations.

2020 Path A: Lack of clarity leads to little change from search marketers

This outcome seemed like a real possibility in the first week of January, as CCPA took effect in California while enforcement was not set to begin until mid-year. It was not yet clear what enforcement would look like later in the year, and it appeared, despite big promises, that tomorrow would look a lot like today.

This path looked less likely after the second week of January. That leads us to the next section.

2020 Path B: Compounding tracking limitations keep marketers on their heels

Already in 2020 we have seen CCPA take effect, Chrome put cookies on notice, stocks tumble for companies that rely on third-party cookies, and data providers that threatened consumer trust get sacrificed.

And that’s just January.

2020 Path C: Correction as consumer fear eases in response to industry action

The backlash over tracking and data privacy is a reaction to imbalance. Consumers are protecting their data, politicians are protecting their constituents, and platforms are protecting their profits. As difficult as it is to see from our vantage point today, it is most likely that these imbalances will normalize as stakeholders feel safe. The question is how long it will take and how many counter-adjustments will be required in the wake of over- or under-correcting.

As digital marketers, we in some ways represent both the consumers with whom we identify and the platforms on which we depend, which puts us in a unique position to expedite the correction and return to balance.

The post The state of tracking and data privacy in 2020 appeared first on Marketing Land.

Soapbox: Are ad blockers breaking the foundations of digital marketing?

Now is the time to start rethinking what’s next for website analytics.

The post Soapbox: Are ad blockers breaking the foundations of digital marketing? appeared first on Marketing Land.

I work for a B2B SaaS product, and one of my tasks is to produce a monthly analytics report that breaks down our lead conversion rates. I track the conversion rate of website landings > demo requests > trials > closed deals.

Last month I was asked to create a Zapier integration that sent an alert to a Slack channel every time someone requested a demo through the website.

When doing the month's report, I saw that our Google Analytics demo request events were 22% lower than the number of messages sent to the Slack channel. It turns out roughly 20% of our visitors were blocking Google Analytics tracking.
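The arithmetic behind that estimate is simple: treat the server-side count (here, the Slack alerts from Zapier) as the source of truth and compare it with what Google Analytics recorded. A quick sketch with hypothetical counts:

```typescript
// Estimate the share of demo requests that Google Analytics never saw,
// using a server-side count (Slack alerts via Zapier) as the source of truth.
function blockedShare(gaEvents: number, serverEvents: number): number {
  return 1 - gaEvents / serverEvents;
}

// e.g. 78 GA events against 100 Slack alerts means roughly 22% of demo
// requests were invisible to Google Analytics.
console.log(blockedShare(78, 100)); // ≈ 0.22
```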

After some research, I found that an average of 24% of internet users run an ad blocker. With more users frustrated by ads, and with Safari looking to win the war for user privacy through Intelligent Tracking Prevention built directly into the browser, ad blocker adoption will only keep rising.

Are ad blockers going to break all of our analytics at some point soon? It's looking like they will, and we need to start rethinking how we use our website analytics.
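One direction worth exploring is first-party, same-origin measurement: ad blockers mostly filter requests to known third-party analytics domains, so events sent to your own domain are less likely (though not guaranteed) to be dropped. A minimal sketch, with hypothetical endpoint and field names:

```typescript
import express from "express";

const app = express();
app.use(express.json());

// Same-origin collection endpoint. In production you would validate the
// payload and write it to a database or log pipeline; here we log to stdout.
app.post("/events", (req, res) => {
  const { name, page, ts } = req.body ?? {};
  console.log(JSON.stringify({ name, page, ts, ua: req.get("user-agent") }));
  res.status(204).end();
});

app.listen(3000);

// Browser side (same origin), e.g. when a demo is requested:
//   fetch("/events", {
//     method: "POST",
//     keepalive: true,
//     headers: { "Content-Type": "application/json" },
//     body: JSON.stringify({ name: "demo_request", page: location.pathname, ts: Date.now() }),
//   });
```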

Soapbox is a special feature for marketers in our community to share their observations and opinions about our industry. You can submit your own here.


Machine learning will free up time to be more strategic with accounts in 2020

Data is giving us an opportunity to look at bigger picture decisions in our accounts rather than the day-to-day work we have been doing.

The post Machine learning will free up time to be more strategic with accounts in 2020 appeared first on Marketing Land.

Contributor and SMX speaker Brooke Osmundson explains how machine learning is changing our account work and why we need to be smarter about layering our campaign assets in 2020.

Below is the video transcript:

Hi, my name is Brooke Osmundson and I am the associate director of research for NordicClick Interactive. And today I want to talk about the top things that marketers should focus on the most for 2020.

The first thing I want to talk about is machine learning. It's no secret that it is part of our lives right now, and part of your jobs. But what I think is going to happen is that it's going to shift your focus on what you're doing day-to-day within your accounts. We're going to see fewer tactical pieces that we have to focus our time on, and it's really going to help you be more strategic in your account. So with machine learning, think about what the data can give you and what it can do to free up more of your time, so you can start thinking bigger picture and focus on those bigger-picture decisions.

The second piece I want to talk about is layering audiences on top of your campaigns, given the differences in search match types. You know, match types are kind of not a thing anymore, so we've got to be smarter about layering on the assets we have available to us in our campaigns in order to really reach the right customer based on what we know about them.
