Bootstrapping Relevance: Making Web Conversions Meaningful for Long Sales Cycles

The post Bootstrapping Relevance: Making Web Conversions Meaningful for Long Sales Cycles appeared first on CXL.

Most hurricanes that reach the United States start off the coast of West Africa. Those storms join and split with other minor systems as they move across the Atlantic. Some dissipate into a mild breeze; others devastate coastal areas along the Eastern seaboard.

So what does an afternoon rainshower over Cape Verde tell you about the next Category 5 hurricane? Often, little more than a form fill tells you about the potential for a five-figure sale months down the road.       

Google Analytics insights frequently end with raw counts of goal completions, leaving a yawning gap between on-site behavior and sales for companies with long sales cycles.

More challenging still, the gap between marketers’ realities and available solutions is equally vast: For most, seamless integration of marketing and sales data—or a Google Analytics 360 subscription—is aspirational.

This post details four steps that any organization can follow to estimate the value of on-site conversions more accurately:

  1. Identify every potential touchpoint.
  2. Organize existing data into an idealized customer journey.
  3. Integrate data into goal completions.
  4. Analyze and act on that data.

No solution is perfect, but incremental progress is possible—and worthwhile.

Why bother? Analytics incentivize behavior

The data-related challenges of long sales cycles are well known: Between a form fill and a sale, there may be dozens of touchpoints spanning weeks or months. Those interactions occur across teams (marketing, sales, customer support) and platforms (analytics, CRM, email).

The challenge of joining those datasets resigns many marketers to limited measurement: We know our data is incomplete, so we might as well just count form fills.

Yet analytics incentivize behavior, and if marketing teams can’t see past total goal completions (euphemistically, “leads”), they’ll devote resources to those efforts—even if a painfully low percentage ever become sales-qualified leads.

The limits of attribution

A common focus for companies with long sales cycles is attribution. But even data-driven attribution, robust as it may be, usually improves attribution of form fills or PDF downloads—marketing metrics that may be weak indicators of sales.

Goal completions can become stronger predictors of sales by pushing data about the relative value of each goal completion back into analytics.

Attribution’s relevance depends on the known value of the conversion.

Regardless of how much data you have, you will make decisions on how to allocate marketing resources. Partial data—or even anecdotal data—can, at the bare minimum, form the basis for experimentation and a means to test your assumptions.

It starts with a survey of all known customer data.

Step 1: Identify every potential touchpoint.

“Long lead time before the sale is an opportunity to do more data collection,” offered Snowplow Analytics’ Anthony Mandelli, “which will ultimately help you in the long run.”

Compare the number of touchpoints in a year-long sales process to the purchase of novelty socks (Mandelli’s example). The latter is a single image; the former is a feature film—a complete narrative with deep insight into what influences consumer behavior.

“It’s a long sales cycle for a reason,” Mandelli continued. “Leads are conducting online and offline research.” The starting point, then, is to “get all your data together somewhere—start with the first interaction, then all the way to purchase.”

That data may include:

  • Form fills
  • PDF downloads
  • Phone calls
  • Email opens/clicks
  • Webinar signups/views
  • Demo requests
  • Free trial signups, etc.

It may also include reports from your sales team, estimates by executives, or other offline sources. At the outset, you simply want to know all the potential sources of data (regardless of whether you’re able to gather them into a Customer Data Platform that curates “a single source of truth”).

You may be missing key data or may not be able to integrate it in future steps, but knowing what exists—and what is or isn’t accessible—helps establish the immediate path forward and guides future improvements.

Step 2: Organize existing data into an idealized customer journey.

Sketching an idealized user journey—or reviewing one already created—is not about forcing users into a linear funnel but about creating a structure to help organize your data.

A customer journey map, Hull’s Ed Fry explains, “highlights the macro-conversions that many teams in the company optimize for (like a new user signing up) vs. micro-conversions that concern few other people.” Each stage in the journey, in turn, is delineated by a conversion:

In a customer journey, the step-by-step progress of a user usually includes a measurable conversion in a digital channel. (Image source)

In an example Mandelli shared, a flooring company had no visibility into what happened between a potential buyer’s $10 sample purchase and a $10,000 sale. Building an idealized user journey—based on data from a real customer—helped the company organize the data they had by the steps the customer took:

  1. Web ad (Google AdWords or Bing)
  2. Visit the website
  3. Order a sample from the website
  4. Review samples
  5. Receive drip email marketing campaign
  6. Purchase flooring (through the web or on the phone)

With existing data points plotted along the idealized user journey, ask yourself: “Where are the biggest gaps between touches?” (In the above example, it’s Step 4.) “The goal is not to sink under analysis paralysis,” writes Fry. “It is to simply understand the backbone of your customer journeys.”

A data gap does not invalidate conversion values for long sales cycles. Charles Farina of Analytics Pros explained:

If you are able to qualify a lead quickly, work to connect your metrics to center on qualified leads. From there, try and work further down the funnel.

In other words, if a form fill can be qualified with a second interaction (say, responding to a phone call), that data—the percentage of form fills who become qualified leads—can guide conversion valuation, even if months pass before those qualified leads become sales.

Even with complete data, Farina suggested, you’ll rarely optimize based on close-of-sale metrics: It simply takes too long. If you make changes to service pages today, would you put everything on hold for months while you waited to see how many leads from the updated pages became customers?

What you really need, Farina suggested, is a two-stage optimization process:

Focus on bringing more quality into your funnel, then use the fully connected journey to make additional optimizations on top.

For many, the perspective is liberating: Data points from one or two steps post–form fill can make conversion data vastly more relevant, no matter how long the sales cycle stretches past the initial conversion.

Step 3: Integrate data into goal completions.

There are elegant solutions for integrating Analytics data with CRM data and similar sources:

The potential value of an integration—like pulling Salesforce data into Google Analytics—is clear, but securing the budget is, for most, unrealistic. (Image source)

In the prior example of the flooring company, Snowplow joined the data from web analytics and marketing automation tools to provide ongoing visibility about how users progressed through the journey. But that ongoing portrait—while closer to the ideal—isn’t mandatory.

If you don’t have a sizeable analytics budget or an in-house team of developers to manage multiple connections, use a snapshot of your post-conversion data to adjust Goal Values in Google Analytics.

1. Make periodic calculations for Google Analytics Goal Values

Goal Values assign dollar values to conversions—replacing the faulty “a conversion is a conversion” logic with estimated revenue from on-site actions.

To set Goal Values, you need to calculate the value of a lead on a goal-by-goal basis. In its simplest form, the process divides the revenue from those conversions by the total number of goal completions.

  • 100 form fills
  • 5 form fills convert to sales
  • Each sale generates $10,000 in revenue

Thus, a form fill is worth $500. The calculation requires two data points outside Google Analytics: The number of web leads who became customers, and the value of each sale. (If you don’t have access to both, skip to the second option.)
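The arithmetic above can be sketched as a small helper function (the numbers are the hypothetical ones from the example):

```javascript
// Estimate a Goal Value: revenue attributable to a goal type,
// divided by the number of goal completions of that type.
function goalValue(completions, closedSales, revenuePerSale) {
  return (closedSales * revenuePerSale) / completions;
}

// From the example: 100 form fills, 5 close, $10,000 per sale.
console.log(goalValue(100, 5, 10000)); // 500
```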

In a perfect world, the calculations are exact enough to establish ROI for marketing efforts. However, for long sales cycles, obtaining that degree of accuracy is almost impossible—but that shouldn’t keep you from using Goal Values.

Goal Values are fixed numbers…with relative value

When it comes to long sales cycles, setting the Goal Value of a form fill is less about ROI and more about weighting the impact of on-site behavior. Relative differences in dollar values, as detailed in the fourth step, allow for better comparisons of how each page or channel performs.

For example, if a lead who initiates an engagement with a phone call—tracked via CallRail or Marchex—closes at twice the rate of a form fill, that difference will be reflected in the Goal Value. Likewise, a newsletter signup from a blog post will probably be weighted less (by using sales data from newsletter subscribers).

To think of it another way, not assigning Goal Values gives every goal the same value: $0. If your Goal Values aren’t accurate enough to determine ROI—whether left as $0 or calculated based on sales data—you might as well go with the calculated estimate that at least has a chance of being directional.

Note: If seeing “inaccurate” Goal Value figures will ruffle feathers in other departments, create a new View with the same Goals and add estimated Goal Values.

Use Lookup Tables to generate dynamic Goal Values

Not all form fillers—even of the same form—are equal. A Lookup Table in Google Tag Manager (GTM), as Bounteous details, can set dynamic Goal Values based on form inputs.

So, for example, if a form question includes the size of the company, you can adjust the Goal Value based on the likelihood of conversion, average order value, or lifetime value of that demographic.

Set a different Output (Goal Value) for each based on Input (the form-field options):

The Default Value is used if none of the other criteria are met.

Create a Data Layer variable to capture the business category data (the Input field) upon submission. Then, create an Event that pulls in the business category information and the associated lead value from the Lookup Table.

Finally, use the Event value as the Goal Value for that conversion:
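Stripped of GTM specifics, the Lookup Table amounts to a map with a default. The field name, answer options, and dollar amounts below are hypothetical:

```javascript
// Hypothetical Lookup Table: company-size answer -> estimated lead value.
const LEAD_VALUES = {
  '1-10 employees': 50,
  '11-100 employees': 250,
  '100+ employees': 1000,
};
const DEFAULT_VALUE = 100; // used when no other criterion matches

function lookupLeadValue(companySize) {
  return LEAD_VALUES[companySize] !== undefined
    ? LEAD_VALUES[companySize]
    : DEFAULT_VALUE;
}

// On submit, the GTM Event would push the category and value
// into the Data Layer, roughly like:
// dataLayer.push({ event: 'leadFormSubmit',
//                  leadValue: lookupLeadValue(selectedSize) });
```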

Even if you don’t know the value of a given type of lead—or any lead at all—you still have another option.

2. Estimate the relative value of online touchpoints

If quantitative data on lead conversion rates and order value isn’t available, you can add relative values. Branko Kral of Orbit Media detailed the process for a stem-cell clinic with a long sales cycle and limited data.

They identified the primary touchpoints, then assigned relative values from $100 to $10—the actual dollar values were irrelevant—to gauge the impact of campaigns that spurred a range of micro- and macro-conversions:

  • First-time calls – lead to most new business
  • Repeating calls – also highly valuable
  • Call-back requests – capture contact info and explicitly ask to be contacted
  • Blog subscriptions – capture contact info and indicate trust
  • Video views > 50% of the video length – patients who book often mention they’ve watched the patient testimonial videos
  • Email link clicks – typical for inquiries higher up the funnel
  • Social share clicks – spread the word
  • Views of a Contact Us page – a subtle but valuable indicator of interest

It’s easy to poke holes in the process: How do you know that a social share click is worth, say, half that of a video view? You don’t. However, that initial, heuristic estimate is a baseline for hypothesis development and testing.

After all, if you don’t assign Goal Values, you’re still allocating resources based on which actions you perceive to be most valuable. Adding relative Goal Values to on-site conversions makes it easy to visualize the implications of your assumptions throughout your site.

Step 4: Analyze and act on that data.

Adding calculated or relative Goal Values to conversions populates one metric (Page Value) and makes others—even basic channel grouping reports—more instructive.

Page Value   

The Page Value metric provides URL-by-URL valuations of every page. (Image source)

In Google Analytics, Page Value “is the average value for a page that a user visited before landing on the goal page or completing an Ecommerce transaction (or both).” As Effin Amazing notes:

Goals are a Session dimension metric, which means that you cannot use them in a Hit dimension report like Pages report, Event reports, or any type of Custom report built around a Hit dimension.

Page Value bridges the gap between these Session dimensions and Hit dimensions by tying a specific page URL to a monetary value when users complete a goal or transaction.

It’s one way to see the value of content at a URL level. With a Goal Value calculated from actual sales data, the Page Value metric may (roughly) estimate revenue; without it, it still offers a weighted estimate of importance for pages in the conversion process.

That URL-by-URL view can break down further into:

  • Mediums (e.g. organic vs. direct visits to the same page or group of pages)
  • Website sections (e.g. /case-studies/ vs. /whitepapers/)
  • Anything else you can think to add as a secondary dimension.
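For a sense of the metric's scale, Google's documented formula for Page Value divides the revenue and Goal Value generated in converting sessions by the page's unique pageviews; a minimal sketch, with illustrative numbers:

```javascript
// Page Value, per Google's documentation, is roughly:
//   (ecommerce revenue + total Goal Value) / unique pageviews
// for sessions in which the page was viewed before the conversion.
function pageValue(transactionRevenue, totalGoalValue, uniquePageviews) {
  if (uniquePageviews === 0) return 0;
  return (transactionRevenue + totalGoalValue) / uniquePageviews;
}

// Illustrative: $5,000 in Goal Value across 250 unique pageviews.
console.log(pageValue(0, 5000, 250)); // 20
```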

A caveat on taking action

A one-time estimate of close rates or average order value is good for only so long. The more often (monthly, quarterly) those calculations can be reworked—and Goal Values adjusted—the more reliable that data will be. (Goal Values are not assigned retroactively.)

Further, if an initial estimate suggests that email visitors are more lucrative than those from other channels, that may justify a push to acquire more email addresses—only to capture the addresses of less-relevant, less ready-to-buy visitors.

Every update of your Goal Values, then, is an opportunity to spot diminishing returns and shift marketing resources to another channel or site section. Disappointing as it may be to realize that you’ve exhausted a strategy, you’ll never notice unless you rerun the numbers—all you’ll see is conversions trending up, a vanity metric reaching ever-higher to nowhere.

Conclusion

When it comes to long sales cycles and web conversions, “perfect” is often the enemy of anything. But just because you don’t have uninterrupted lead-to-sale data doesn’t mean you can’t make your web analytics more meaningful.

Indeed, the second and third interactions after an on-site conversion—those you’re most likely to have on hand—may be the most influential metrics no matter how much data you accumulate.

Importing calculated Goal Values based on those metrics back into Google Analytics offers a more accurate valuation of the actions that take place on your website.

Even if those values are relative, you gain visibility into the assumptions you have about your site. Whether or not they hold true, the outcome will improve your marketing.


4 Essential Methods of Session Stitching in Google Analytics

The post 4 Essential Methods of Session Stitching in Google Analytics appeared first on CXL.

With a single click, a user can destroy Google Analytics data: Moving from an AMP page to the main site or the main site to a payment processor can turn one visit into multiple sessions, mucking up source data along the way.

Critically, those clicks often happen at high-value transition points—from anonymous visitor to logged-in user or from the pre- to post-purchase moment.

Session stitching repairs technical fault lines, preserving clean analytics data and rescuing attribution information. This post covers four common use cases:

  1. User ID tracking
  2. AMP tracking
  3. Subdomain tracking
  4. Cross-domain tracking

What is session stitching?

In Google Analytics, session stitching connects user activities that occur within a single session but—because of technical tracking limitations—incorrectly generate multiple sessions.

Any effort to stitch sessions hinges on two elements, which Simo Ahava details:

  • A tracker object that tracks to the same Google Analytics Property ID (UA-XXXXXX-Y)
  • A _ga cookie that has the same Client ID

Modern browsers do not allow sites from one domain to share cookies with another domain. Overcoming that limitation is central to patching together sessions that otherwise rip apart.
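As a quick illustration of the second element: the _ga cookie's value looks like GA1.2.123456789.1550000000, and the Client ID is the final two dot-separated fields. A small helper to read it out:

```javascript
// Extract the Client ID from a _ga cookie value.
// Format: GA<version>.<domain depth>.<random>.<first-visit timestamp>;
// the Client ID is "<random>.<timestamp>".
function clientIdFromGaCookie(cookieValue) {
  return cookieValue.split('.').slice(-2).join('.');
}

console.log(clientIdFromGaCookie('GA1.2.123456789.1550000000'));
// "123456789.1550000000"
```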

The term “session stitching” is used by digital marketers more often than Google, which, most recently, has preferred two other terms:

  1. Session unification. “Session unification allows hits collected before the User ID is assigned to be associated with an ID.”
  2. Site linking. “Cross domain tracking makes it possible for Analytics to see this as a single session by a single user. This is sometimes called site linking.”

Notably, the Google Analytics support article on session unification retains the phrase “session stitching” in image alt text and titles (for the curious); a stray reference to session stitching also appears in the support article for the Google AMP Client ID API.

In this article, I use “session stitching” as an overarching term that includes all efforts to correctly group user activities that occur within a single session. As Yehoshua Coren notes, the underlying technical reality is more important than the terminology: “It’s really more than that. It’s a clientId integrity issue.”

The challenge—no matter what you call it—impacts data quality and attribution for almost every site.

Who should care most about session stitching?

Session stitching is useful for every site but essential for a few:

  • Sites with logins. Sites with logins rely on “session unification” to gather data on events that lead to a user login.
  • AMP-heavy sites. Proper tracking preserves attribution data when users migrate from an AMP page to a locally hosted non-AMP page.
  • Large, multi-domain organizations. Multiple domains require cross-domain tracking to preserve attribution information as users migrate across domains (or subdomains).
  • Sites with third-party payment processors. Without session stitching, sites that rely on third-party payment processors may lose attribution data for ecommerce conversions.
  • Sites that use social logins. Like third-party payment processors, social logins can incorrectly reclassify post-login users as referrals from the social network.
  • Sites with iframe forms. Iframes embed a cross-domain tracking challenge within a page on your site.

1. User ID tracking

For sites with a log-in feature, User ID tracking connects multiple visits over time—allowing a business to see, for example, which SaaS features resonate with customers.

Session unification joins the pre-login activity with the post-login User ID—creating a single session from the two. That way, you can see which behaviors precede a log-in, something especially valuable if that login represents a point of conversion.

Thus, instead of capturing only part of the second session (first image), session unification joins pre-login hits (white) with post-login hits (blue):

no session unification

with session unification
(Image source)

Importantly, session unification connects hits only if “those hits happen within the same session in which a specific ID value is assigned for the first time.” In other words, it includes data from the session that precedes the login—not previous sessions.

Google Analytics applies session unification during daily analytics processing—“at 5am each day, based on the western most timezone selected in any reporting view that is associated with the property.”

That time lag can lead to “higher direct sessions and direct revenue during intra-day dates because [. . .] campaign referral information is sent during the first hit of a session where the user has not yet logged in.”

By default, session unification toggles On when you set up User ID tracking. Why would you ever want to turn it off? I asked Silver Ringvee, one of our CXL agency analysts. While he didn’t see an obvious use case, he speculated on a potential one:

There could be some occasions, though, where you really want to focus on the actual logged in users (and not users that logged in at some point of the journey). So, if you don’t care about what happened before they got the ID, you might want to turn it off.

You can turn off session unification in Admin > Property > Tracking Info > User-ID:

user-id setting in google analytics

While User ID tracking is most relevant for sites with logins—SaaS products, ecommerce sites—there are other ways to incentivize login. Sites like Quora and Glassdoor gate high-value content behind log-in walls, for example.

For those login types, session unification delivers important data on the most engaging content—the answers or articles that catalyze logins and signups.

2. AMP tracking

Google’s AMP rollout created tracking issues: AMP clicks from search take users to the “AMP on cache” version, which is hosted on Google’s CDN.

As Perficient’s Eric Enge told me, “A lot of people still don’t get this correct. The subtleties of tracking cross domains (from the Google cache to your actual domain) are lost on most publishers.”

Ultimately, users can access AMP pages in one of three ways; each impacts where the Client ID is stored:

  1. Google Search. AMP page is accessed via a Google Search result and displayed inside an “AMP viewer.” The Client ID is stored on google.com.
  2. Proxy/Cache. AMP page is accessed from a proxy/cache. The Client ID is stored on cdn.ampproject.org.
  3. Direct AMP. AMP page is accessed directly on the publisher domain. The Client ID is stored on the publisher domain.

In either of the first two cases, a click to another page on the publisher’s site from the AMP page generates a referral and a new session—rather than counting the click as the second interaction in a single session.

amp tracking issue diagram
Stone Temple details the delivery of information from an in-search AMP click to a non-AMP page. (Image source)

Left unmanaged, the resulting analytics data suffers from several issues:

  • Inflated session counts
  • High bounce rate on AMP pages
  • Low pages per session/session duration for AMP pages

As with other session stitching issues, the solution is to pass the same Client ID between pages on different domains, which Google makes possible via the AMP Client ID API.

How to use the Google AMP API to pass the same Client ID

Setting up AMP tracking has two steps: Analytics code changes and Referral Exclusions.

client id sharing amp tracking
Passing the Client ID from the host of the AMP page to the publisher’s domain preserves source information and combines the user activities into a single session. (Image source)

1. Analytics code changes. Proper AMP tracking starts with small additions to the Google Analytics code on AMP and non-AMP pages. Google provides details on how to make changes for analytics.js, gtag.js, and Google Tag Manager.
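For analytics.js, the documented change is a single field on the tracker object (the property ID below is a placeholder); gtag.js and GTM have equivalent settings:

```javascript
// Opt in to the Google AMP Client ID API so AMP and non-AMP
// pageviews share one Client ID.
ga('create', 'UA-XXXXX-Y', 'auto', {'useAmpClientId': true});
```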

Because some browsers refuse third-party cookies, Google announced the AMP Linker in September 2018, which decorates URLs with Client ID information, bypassing the cookie-based limitation. AMP Linker does not require additional setup if you’ve already enabled the Google AMP Client ID API.

amp link decorator
(Image source)

2. Referral Exclusions. Additionally, you need to add ampproject.org as a Referral Exclusion. If you serve AMP content from multiple subdomains, Google recommends adding a Referral Exclusion for each.

referral exclusion list settings google analytics

As Enge details, the current solution isn’t perfect: “You can’t see the difference between your own hosted AMP pages vs. Google CDN hosted pages.”

That limitation affects sites with “canonical AMP pages”—AMP pages hosted on the publisher domain that are the standard (canonical) version of the mobile page. A solution, the same article offers, is to create a Hit-level custom dimension.

After initial installation, the changes will affect near-term Google Analytics data:

  • Total users and sessions will decline. Stitching AMP and non-AMP sessions will combine wrongly separated users and sessions.
  • Related metrics will become more accurate. Bounce rate on AMP pages, for example, will drop.
  • New Users will rise. The Google AMP API makes a one-time reset of the Client ID for AMP visitors. As Google notes: “Depending on the frequency with which users visit your site(s), this could cause a noticeable, temporary fluctuation in your New Users metric and related reporting.”

3. Subdomain tracking

Subdomain tracking has gotten considerably easier and relies on a setting for the Cookie Domain. Previously a manual step, setting the Cookie Domain (cookieDomain) to “auto” is now the default option in Google Analytics scripts and the Google Analytics Settings variable in Google Tag Manager.

Simo Ahava explains that setting the Cookie Domain to “auto” applies a recursive algorithm that

tries to write the cookie, starting from the most generic domain-level (the top-level domain), and stopping once it succeeds. What should be left is the root domain, and thus the cookie will be available to all subdomains.

Because the algorithm sets the cookie at the highest possible level (the root domain), a user who lands on a subdomain and later migrates to the core domain won’t generate a new Client ID—or initiate a new session.
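With analytics.js, that default looks like this (placeholder property ID):

```javascript
// "auto" writes the _ga cookie at the highest level the browser
// allows—normally the root domain—so every subdomain reads the
// same Client ID.
ga('create', 'UA-XXXXX-Y', 'auto');
// Equivalent long form:
// ga('create', 'UA-XXXXX-Y', {'cookieDomain': 'auto'});
```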

A second step is to add your root domain to the Referral Exclusion list so that visits between subdomains and the core domain don’t initiate new sessions. (The first step ensures only that Google sees the visitor as the same user.) Google automatically adds the root domain to the Referral Exclusion list when you create your Google Analytics property, but the setup is worth double-checking.

In theory, these updates automate subdomain tracking—the Cookie Domain and Referral Exclusion lists are set, by default, to the correct values.

4. Cross-domain tracking


Cross-domain tracking is the most complex of any session-stitching process because many solutions are bespoke: Proper implementation depends on the setup of your site, payment processor, log-in tool, or—Lord help you—iframe.

If multiple sites share the same tracking code but no other technical changes are made:

  1. Analytics will duplicate sessions between domains (since the Client ID won’t transfer from one to the other).
  2. The original attribution information will be lost, converted into a referral from the other domain, which, since it shares the same tracking code, will appear as a self-referral.

As with AMP tracking, successful cross-domain tracking requires passing the Client ID from one site to another without passing the cookie itself. There are several core use cases, each with unique solutions.

Intra-company cross-domain tracking

cross-domain tracking diagram
(Image source)

Large organizations often manage several domains but want to track visitors as they move from one to another. Assuming the sites share the same Google Analytics code, tracking users across multiple domains has three additional steps.

The first two steps alter the tracking code to allow domains to pass and receive client IDs via links:

  • Auto Link Domains. Add all domains as a comma-separated list within the Google Analytics Settings variable in Google Tag Manager or amend your Google Analytics code to include those domains.

autolink-domains screenshot

  • allowLinker. To ensure domains can receive Client IDs passed via links, add a field in the Google Analytics Settings variable in Google Tag Manager named “allowLinker,” and set the value to “true.” (If the user flow is one directional, you need to allow the linker only on destination—not source—domains.)

allowlinker screenshot

The linker appends a timestamp and other metadata to validate the Client ID, which reduces the likelihood that a shared link with the Client ID affects Analytics data.
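Outside GTM, a minimal analytics.js equivalent of those two steps looks like this (the property ID and domain are placeholders):

```javascript
// On the source domain: allow incoming linker parameters, then
// enable the linker plugin and list destination domains to decorate.
ga('create', 'UA-XXXXX-Y', 'auto', {'allowLinker': true});
ga('require', 'linker');
ga('linker:autoLink', ['another-example.com']);
```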

The final step is to add all domains to your Referral Exclusion list. Otherwise, you’ll generate mountains of self-referrals—Google Analytics will correctly recognize one user between domains but will still generate a new session.

To analyze data gathered from cross-domain tracking efficiently, prepend the hostname to the URL path. Otherwise, paths shared by multiple domains will be grouped together. Both URLs below would show only /about-us/ in page-level reports:

https://example.com/about-us/

https://another-example.com/about-us/

You can prepend the hostname by setting up a custom filter with the following values:

custom filter to show domain name

(If you’re trying to rescue historical data that wasn’t properly filtered, you can use a secondary dimension with the hostname to differentiate URLs in a view.)
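The filter itself is simple concatenation: typically, an advanced filter extracts the Hostname and Request URI and joins them in the output. A sketch of the transformation it performs on each hit:

```javascript
// What the hostname-prepend filter does to a hit's page path.
function prependHostname(hostname, requestUri) {
  return hostname + requestUri;
}

console.log(prependHostname('example.com', '/about-us/'));
// "example.com/about-us/"
```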

Third-party payment processing

With third-party payment processing, a correct setup is vital: Without it, you’ll lose attribution data for all transactions, which will show up as referrals from the payment processor. However, you have limited control over the payment processor’s page.

One solution is to set up a Referral Exclusion for the domain of your payment processor; however, that effort—a manual one—could become a whack-a-mole task if:

  • You work with many payment processors.
  • Payment processors frequently change domains.
  • Excluding a processor’s domain also risks excluding other referral traffic (e.g. you get referral visits from a link on PayPal’s blog).

third-party payment processing
The comprehensive answer? Exclude all referrals to your receipt page. (Image source)

Ahava details a creative, comprehensive solution: creating a Referral Exclusion for all traffic to your receipt or “thank you” page. The Referral Exclusion preserves the original source data and prevents Google Analytics from generating a new session when users return to your site from the payment processor’s domain.

Implementing Ahava’s solution has two steps:

  • Creation of a custom JavaScript variable. Set a referrer value of “null” for the URL of your thank you page.
  • Modification of the tags that fire on the thank-you page. For any tag that fires on your thank you page, set the “referrer” field to the recently created variable.
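A minimal sketch of the custom variable's logic, assuming a hypothetical /thank-you/ path; factoring the decision into a pure function keeps it testable, and in GTM the Custom JavaScript variable would wrap it with document.location.pathname and document.referrer:

```javascript
// Return null as the referrer on the thank-you page so the session's
// original source survives; pass the real referrer through elsewhere.
function referrerFor(pathname, realReferrer) {
  return pathname.indexOf('/thank-you/') === 0 ? null : realReferrer;
}
```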

A blanket ban on referrals to a given page may seem risky, but thank you pages are accessible (or should be accessible) only within the checkout funnel—no one starts their user journey on the thank you page—so there’s no risk of losing valuable source data.

Social logins

social login page

Social logins can’t rely on a blanket domain Referral Exclusion—while a Google login may come from accounts.google.com (a subdomain you could safely exclude), others, like Facebook, come via facebook.com, and almost every site has non-login referral traffic from Facebook.

A common solution is to open the authorization in a new tab or window, which maintains continuity in the session on your site. However, ad blockers may interfere with this process, or you may prefer—for the sake of user experience—not to open a new window.

Another solution—much like Ahava’s strategy for thank you pages—is to override or ignore referrer information for the post-login page hosted on your site. Setting the referrer value to your own domain or “null” ensures that the source registers as Direct, thereby preserving the single session. The strategy works only if the post-login page has a unique URL.

Iframes

Iframes are a challenge for session stitching, in part, because iframe content typically loads before Analytics tags fire. That means traditional tracking solutions—like appending the Client ID to the URL—require adjustment, as Google’s Developer Guide details:

To solve this problem you can configure the page inside the iframe to delay creating its tracker until after it receives the client ID data from the parent page. And on the parent page you configure it to send the client ID to the iframe page using postMessage.
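A minimal sketch of that handshake, with the message-handling logic pulled into testable helpers (the origins, iframe id, and tracking ID are placeholder assumptions, not values from the guide):

```javascript
// Sketch of the parent-to-iframe client ID handshake. Origins, the
// iframe id, and the tracking ID are placeholder assumptions.

// Build the message the parent page posts to the iframe.
function buildClientIdMessage(clientId) {
  return { event: 'ga-client-id', clientId: clientId };
}

// In the iframe: decide whether a received message should initialize
// the tracker. Returns the client ID to use, or null to ignore it.
function acceptClientIdMessage(message, trustedOrigin) {
  if (message.origin !== trustedOrigin) return null; // verify the sender
  if (!message.data || message.data.event !== 'ga-client-id') return null;
  return message.data.clientId;
}

// In the browser, the parent page would run:
//   ga(function (tracker) {
//     document.getElementById('checkout-frame').contentWindow.postMessage(
//       buildClientIdMessage(tracker.get('clientId')),
//       'https://forms.example.com');
//   });
// and the iframe would delay tracker creation until the message arrives:
//   window.addEventListener('message', function (e) {
//     var cid = acceptClientIdMessage(e, 'https://www.example.com');
//     if (cid) {
//       ga('create', 'UA-XXXXX-Y', { clientId: cid });
//       ga('send', 'pageview');
//     }
//   });
```

Checking `event.origin` before trusting the message matters: any page can post messages to the iframe, so the origin check is what keeps a stray message from seeding the tracker with a bogus client ID.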

Painful as cross-domain tracking on iframes can be—Ahava refers to them as “untrackable little shit-monsters that exist in the void between websites”—they’re (too) often used in web forms by vendors who focus more on moving form data into a CRM than making those interactions trackable in Google Analytics.

Bounteous explains the process for using postMessage in cross-domain iframe tracking:

we can have our child iframe emit a message, which we can ‘listen’ for and use to notify GTM that an important interaction has occurred. This is great for tracking things like simple form submissions within iframes [. . .] We’ll need to take the following steps:

1.) Post a message from our child iframe
2.) Listen for the message in our parent frame
3.) When we catch the message, push an event into the GTM Data Layer

There is an important caveat: You must be able to add code to the iframe. If not, the process will not work.
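Assuming you can add code to the iframe, the three steps might be sketched like this (the event name and origins are illustrative assumptions):

```javascript
// Step 1: in the child iframe, post a message on form submit:
//   window.parent.postMessage({ event: 'iframeFormSubmit' },
//                             'https://www.example.com');

// Steps 2 and 3: in the parent frame, validate the message and turn
// it into a Data Layer event for a GTM trigger to pick up.
function toDataLayerEvent(message, trustedOrigin) {
  if (message.origin !== trustedOrigin) return null; // ignore other senders
  if (!message.data || message.data.event !== 'iframeFormSubmit') return null;
  return { event: 'iframeFormSubmit' };
}

// In the browser:
//   window.addEventListener('message', function (e) {
//     var dlEvent = toDataLayerEvent(e, 'https://forms.example.com');
//     if (dlEvent) {
//       window.dataLayer = window.dataLayer || [];
//       window.dataLayer.push(dlEvent);
//     }
//   });
```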

Ahava has authored two solutions for cross-domain tracking, the latest of which uses customTask:

The customTask API is a feature of the Universal Analytics library (used by Google Tag Manager’s tags, too). It lets you get and set values from and to the (Measurement Protocol) hit as it’s being generated.

For cross-domain iframe tracking, “customTask leverages a setInterval() script which polls the page periodically until the target iframe is found or a timeout is reached.”

iframe tracking code
The parameters for the iframeDecorator object in Ahava’s solution. (Image source)

When Google Analytics registers a hit to the parent page, Ahava’s solution prompts customTask to look for an iframe that matches a preset CSS selector, then to decorate the iframe URL with the Client ID of the initial hit.

Even that solution, however, is fragile, especially if the iframe includes redirects—”untrackable shit-monsters” indeed.

Conclusion

Session stitching aligns Google Analytics data with what we know to be true: Users, in one sitting, navigate between domains, complete purchases, or fill out forms that briefly transition them to and from another domain.

The critical nature of those interactions—pre- and post-login, anonymous visitor versus known lead, and potential customer versus past-purchaser—make session stitching well worth the effort.

Weaving together user interactions enhances attribution data and reduces blind spots at critical junctures in the user journey.

The post 4 Essential Methods of Session Stitching in Google Analytics appeared first on CXL.

Google Analytics 360: The Features Worth $150k a Year


For many, Google Analytics 360 is a black box. Marketing and sales collateral from Google is spartan, and common refrains about key features—like unsampled data—seem unworthy of a six-figure bill for most sites.

That disconnect exists because many, myself included, have understood Google Analytics 360 primarily as an expansion of the data caps we encounter with the free version. As it turns out, those caps represent a fraction of overall value—like differentiating a presidential suite from a standard room based on square footage.

Charles Farina of Analytics Pros, who has used Google Analytics 360 for years, gave me an hour-long walkthrough of the platform to highlight the most meaningful differences: those that drive ROI.

Google Analytics 360 overview

Google Analytics 360 (GA360) is one of seven components of the Google Marketing Platform. With paid access to GA360—$150,000 per year, billed monthly at $12,500 with an annual contract—users also get access to 360 versions of other products:

google marketing platform
The announcement of the Google Marketing Platform, in June 2018, combined paid ad platforms and the Google Analytics 360 Suite. (Image source)

The GA360 license is all-inclusive: There are no tiers or additional features to unlock. (Users get credits toward BigQuery; extensive querying of GA data in BigQuery could generate added costs.)

For some products, the differences in functionality between the 360 version and its free counterpart are limited. Tag Manager 360, for example, touts “enterprise level support” as the primary benefit.

This post focuses on Analytics 360 and the integrations with other platform products that occur within the Google Analytics UI.

Google Analytics 360 vs Google Analytics

Dry lists of feature comparisons are available in other posts, like this one from Blast Analytics or this one from Google. I won’t replicate those resources here, but a few oft-cited, quantitative differences are worth mentioning:

  • Sampling. The free version of GA begins sampling data for non-default reports that exceed 500,000 sessions. GA360 doesn’t begin sampling data until reports exceed 100 million sessions. The free version also stops recording data at 10 million hits per month compared to 2 billion for GA360.
  • Time lag. GA360 pushes all data into its reporting interface within four hours and often does so in a matter of minutes. That near-real-time data entry is faster than the free version, which usually takes a full day to process data.
  • Export size. GA360 allows 3 million rows; the free version offers 50,000.
  • Custom dimensions and metrics. GA360 offers 200 of each compared to the free version, which provides 20.

And yet, most businesses are not another custom dimension (or 50) away from actionable data; few make different decisions because their data is based on 87% of all sessions. (As Farina noted, “if you have 50% sampling, it’s still very likely that the data is directional.”)

Other well-known differences focus less on raw numbers and more on enterprise business needs:

  • Roll-up reporting. GA360 allows users to roll up reporting from multiple properties efficiently with capabilities not available in the free version—deduplicating users, stitching sessions, inheriting custom dimensions and metrics, etc.
  • Data-driven attribution modeling. GA360 moves beyond the standard attribution models available in the free version and—using machine learning—creates custom attribution models with data from GA and connected accounts, including TV ad buys.
attribution 360 modeling
A report showing weighted attribution in Attribution 360, which allows users to create custom attribution models for GA data.

Each of the above features enables the collection of more data, improves the quality of data, or increases the accuracy of calculations from it. Still, those differences only hint at the bottom-line benefits of GA360, which center on:

  1. Connections between Google Analytics data and personally identifiable information.
  2. Integrations with a wider range of ad networks.
  3. Granular, actionable data visualizations.

Farina walked me through each.

The Google Analytics 360 benefits that generate ROI

1. Connections between Google Analytics data and personally identifiable information

In a previous agency job, I’d seen clients switch from Google Analytics to Adobe Analytics for one reason—to connect anonymous analytics data to specific users. Google Analytics has unambiguous warnings about collecting personally identifiable information (PII), which chases some to Adobe:

google analytics pii
(Image source)

A platform change for that reason, it turns out, is unwarranted—if you take your GA data to the international waters of BigQuery.

BigQuery

BigQuery, part of the Google Cloud Platform, is a fully managed data warehouse. Integrating GA data with BigQuery is possible only with GA360. BigQuery starts with 13 months of historical GA data, collecting new data indefinitely moving forward.

“At the end of the day,” Farina explained, “if you can get data into BigQuery and you have a question you can write to that data, BigQuery will go out and answer that question without you having to worry about storage or compute or memory.”

big query ga data
BigQuery bridges the gap between anonymous GA IDs and CRM data.

It is possible to export data from the free version of GA into another platform, but the process is incomplete and doesn’t scale: It relies on the GA API—a source of report data but not raw data—or a plugin, like the one for Google Sheets.

Because it has different Terms of Service, BigQuery can join GA data with PII—from your CRM or anything else you choose to connect. Once data is in BigQuery, SQL scripts return a user-by-user table with the requested data:

bigquery table with analytics data
BigQuery can join data in GA to a CRM via, for example, a hidden field in a contact form that passes the anonymous GA ID into a field tied to an individual ID in a CRM.
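The hidden-field technique from the caption can be sketched as follows (the field id and the validation helper are assumptions for illustration; analytics.js must already be loaded):

```javascript
// Client IDs from analytics.js look like "1234567890.0987654321".
// Validating before storing keeps malformed values out of the CRM.
function isValidClientId(cid) {
  return typeof cid === 'string' && /^\d+\.\d+$/.test(cid);
}

// In the browser, a form page would copy the anonymous GA client ID
// into a hidden input (id 'ga_client_id' is an assumption) so the CRM
// stores it alongside the lead:
//   ga(function (tracker) {
//     var cid = tracker.get('clientId');
//     if (isValidClientId(cid)) {
//       document.getElementById('ga_client_id').value = cid;
//     }
//   });
```

Once the CRM holds that client ID, a BigQuery join on the same field ties on-site behavior to the individual record.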

As Farina detailed, some companies use BigQuery as their primary data warehouse; others treat BigQuery as a way station before passing data on (via export) to a preferred cloud storage system.

(BigQuery is HIPAA compliant, making it a viable repository for medical data and, by extension, other types of personal data.)

GA360 and Salesforce integration

In 2017, Google announced a partnership with Salesforce; the two companies deepened that partnership in 2018. (Salesforce is now a reseller of GA360.) The collaboration yielded several integrations. Those with access to both products can:

  • Move Salesforce data from Sales Cloud into GA360 for attribution reports, bid optimization, and audience creation.
  • Push GA360 data into the Salesforce Marketing Cloud reporting UI.
  • Connect GA360 audiences to Salesforce Marketing Cloud for inclusion in Salesforce campaigns (e.g. email, SMS).
  • Create audience lists from customer interactions in the Salesforce Marketing Cloud.
  • Import Salesforce Sales Cloud user attributes, Einstein Lead Scoring, and ecommerce metrics into GA360.

Those integrations enable the creation of funnels like the one below, which draws on data from both platforms:

google analytics salesforce funnel
The Salesforce integration lets users create funnels with GA and Salesforce data. (Image source)

In another use case Farina suggested, companies could customize email content based on browsing behavior. If you manage a daily digest for The Seattle Times, for example, you could include more sports stories for sports junkies and more political headlines for partisans.

I asked Farina if the Salesforce integration was likely the first of many GA–CRM connections. But the Google–Salesforce partnership, Farina speculated, is unique: “Google doesn’t have a strong martech—no email tool, CRM, CMS; Salesforce doesn’t have enterprise analytics.”

The decision to integrate Salesforce with GA360, he continued, arose from the ongoing consolidation of martech stacks by Adobe: Adobe purchased Marketo in 2018, pressuring Google and Salesforce to offer a competitive alternative.

2. Integrations with a wider range of ad networks

Ad spend, rather than total traffic, may be the easiest way to justify a GA360 investment. If you’re currently spending $100,000 per month in Google Ads, Farina postulated, how much more efficient could you be with GA360? A 10% increase in efficiency would nearly cover the cost of GA360.
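Farina's break-even arithmetic is easy to check, using the figures above and GA360's $12,500 monthly bill:

```javascript
// Fraction of monthly ad spend you'd need to recoup through improved
// efficiency to pay for GA360.
function efficiencyGainNeeded(monthlyAdSpend, monthlyGa360Cost) {
  return monthlyGa360Cost / monthlyAdSpend;
}

// At $100,000/month in ad spend and $12,500/month for GA360, a 12.5%
// efficiency gain breaks even, so a 10% gain "nearly" covers it.
```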

Not surprisingly, Google has case studies detailing strong improvements.

The free version of Google Analytics already includes robust (yet underused) integrations with Google Ads. As Farina highlighted, you can build segments based on a combination of conditions, then export that audience to Google Ads for remarketing.

GA360 extends the capabilities available for Google Ads to other platforms and networks, like Campaign Manager, as well as non-Google networks, like Index Exchange, in Display & Video 360.

Display & Video 360

display video 360 screenshot
Display & Video 360 extends GA functionality for Google Ads to more ad networks.

Display & Video 360 pulls click- and view-through data from display and video ads into Google Analytics. The ability to include display view-throughs in Multi-Channel Funnels strengthens attribution models. In the example below, the eye icon represents display impressions:

view-through impressions

View-through data sheds light on potential catalysts for conversions from direct or organic sessions. Adding a secondary dimension, like Campaign, identifies instances when display impressions for a particular campaign were not the first brand interaction (and, therefore, deserve less credit in any attribution model):

display view-through with campaign

Because the data exists in BigQuery as well, audiences transition fluidly between anonymous Google Analytics users and known leads or customers in a CRM.

Thus, the value of GA360 is not merely getting more granular data on ad impressions but attaching that data to real users for smarter retargeting or tailored email campaigns.

3. Granular, actionable data visualizations

Two high-value data visualizations are unique to GA360: Custom Funnels and Advanced Analysis.

Custom Funnels

Building a useful funnel in the free version of Google Analytics, Farina conceded, is nearly impossible. In GA360, it’s simple—agonizingly so for those who have labored through funnel creation or analysis in the standard version.

Farina demonstrated how GA360 translates any combination of variables into a funnel in seconds:

custom funnels in ga360
Custom Funnels are easy to create and simple to export as an audience or segment.

Like other GA360 features, the primary benefit of Custom Funnels is not only visualizing user behavior but translating that visualization into action through export to a marketing automation platform.

Advanced Analysis

Advanced Analysis, still in Google’s beta purgatory, “sits very closely between Data Studio and Analytics,” according to Farina.

Its drag-and-drop interface offers several report types, including a Segment Overlap that identifies users who share characteristics. That visualization, in turn, is available for export back into Analytics, where you can drill down from the audience level to the individual user:

advanced analysis segment overlap
Advanced Analysis combines elegant visualization with granular user information.

Product-market fit and alternative solutions

A common refrain from Farina was that GA360 is an enterprise product—most users fail to max out the capabilities of the free version and wrongly assume that an unsampled report or limited export holds back analysis and growth.

Companies that are a good product-market fit for GA360 likely fall into one of three categories:

  • Extremely high-traffic sites. According to Quantcast, about 600 U.S. sites generate more than 1 million monthly visitors—enough so that a month’s worth of data is sampled below the 50% threshold. For those sites (and many with less traffic), an enterprise analytics tool is essential.
  • Large B2B companies that already use Salesforce. The managed integration of Google Analytics and Salesforce data would likely cover the costs for any independent effort to bind analytics and CRM data.
  • Companies with high ad spend. As noted earlier, a $100,000 monthly ad spend requires a 12.5% increase in efficiency to cover the cost of GA360.

What about alternative analytics platforms like Heap, Segment, or similar options? Where do they fit into the analytics conversation?

To Farina, they’re good options for “advanced businesses with small data sets that don’t have the ad spend, volume, or are not yet at a level where $12,500 per month is something that they can allocate.”

A potential challenge of a patchwork system, Farina continued, is aligning all teams on the same data:

Even if you use Heap, it’s likely that Google Analytics is still a primary tool that marketing uses, where Heap might be something that the data science team starts to use more.

And the challenge that we’ve seen again and again is that, at that point, you have two different data sets and two different implementations and two different sets of metrics and conversions.

That can be a real challenge, especially when the data is not directional between platforms, and you get into this area where no one trusts it, no one is using it, and you’re not getting value out of either side.

I asked Farina a final question: If Google is worried about competition from Adobe, why not just give away other 360 features for free? Or charge $100 per month?

Some aspects of Google Analytics 360 are a clear drain on server resources, but others, like the ability to connect GA data to a CRM, could quickly undermine a primary selling point for Adobe.

“There’s a user journey,” Farina argued. “We already have great solutions for mid-market. You can use something like Google Analytics and add Segment or Heap if you’re not at the level of being able to benefit from a Google Analytics 360 or Adobe.”

Conclusion

If you continually bump up against the data caps of the free version of Google Analytics, a switch to Google Analytics 360 may be necessary—even though the business case might remain murky. You’ll get more complete data, but how will you drive more revenue with it?

The key benefits of GA360, then, are about putting data to work:

  • Using BigQuery to connect on-site behavior with individual users for targeting via marketing automation platforms.
  • Exporting tailored audiences in Google Analytics back into ad platforms for smarter remarketing.
  • Using integrated ad spend data to create more reliable attribution models that, in turn, dictate ad spend.

Ultimately, Farina’s reference to the “user journey” applies to more than the analytics platform. It also includes overall marketing maturity: Even user-specific data or actionable attribution modeling will fail to deliver ROI unless those insights direct marketing efforts beyond analytics.

The post Google Analytics 360: The Features Worth $150k a Year appeared first on CXL.

Five Strategies for Slaying the Data Puking Dragon.


If you bring sharp focus, you increase chances of attention being diverted to the right places. That in turn will drive smarter questions, which will elicit thoughtful answers from available data. The result will be data-influenced actions that result in a long-term strategic advantage.

It all starts with sharp focus.

Consider these three scenarios…

Your boss is waiting for you to present results on quarterly marketing performance, and you have 75 dense slides. In your heart you know this is crazy; she won’t understand a fraction of it. What do you do?

Your recent audit of the output of your analytics organization found that 160 analytics reports are delivered every month. You know this is way too many, way too often. How do you cull?

Your digital performance dashboard has 16 metrics along 9 dimensions, and you know that the font-size-6 text and sparkline-sized charts make them incomprehensible. What's the way forward?

If you find yourself in any of these scenarios, and your inner analysis ninja feels more like a reporting squirrel, it is ok. The first step is realizing that data is being used only to resolve the fear that not enough data is available. It’s not being selected strategically for the most meaningful and actionable insights.

As you accumulate more experience in your career, you’ll discover a cluster of simple strategies you can follow to pretty ruthlessly eliminate the riffraff and focus on the critical view. Here are five that I tend to use a lot. They are easy to internalize and take sustained passion to execute, but they always yield delightful results…

1. Focus only on KPIs, eliminate metrics.

Here are the definitions you'll find in my books:

Metric: A metric is a number.

KPI: A key performance indicator (KPI) is a metric most closely tied to overall business success.

Time on Page is a metric. As is Impressions. So are Followers and Footsteps, Reach and Awareness, and Clicks and Gross Ratings Points.

Each hits the bar of being “interesting,” in a tactical oh that’s what’s happening in that silo sort of way. None passes the simple closely tied to overall business success standard. In fact, hold on to your hats: a movement up or down of 25% in any of those metrics may or may not have any impact on your core business outcomes.

Profit is obviously a KPI, as is Likelihood to Recommend. So too are Installs and Monthly Active Users, Orders and Loyalty, Assisted Conversions and Call Center Revenue.

Each KPI is of value in a strategic oh so that is why we are not making money or oh so that is why we had a fabulous quarter sort of way. A 25% movement in any of those KPIs could be the difference between everyone up and down getting a bonus or a part of the company facing layoffs. Often, even a 5% movement might be immensely material. What metric can say that?

When you find yourself experiencing data overload, don an assassin's garb, identify the metrics and kill them. They are not tied to business success, and no senior leader will miss them. On the ground, people will use metrics as micro diagnostic instruments, but they already do that.

A sharp focus on KPIs requires concentrating on what matters most. Every business will have approximately six KPIs for a CEO. Those six will tie to another six supplied to the CMO.

After you go through the assassin’s garb process above, if it turns out that you have 28 KPIs… You need help. Hire a super-smart consultant immediately!

2. Focus only on KPIs that have pre-assigned targets.

This is a clever strategy, I think you are going to love it.

Targets are numerical values you have pre-determined as indicators of success or failure.

Turns out, creating targets is insanely hard.

You have to be great at forecasting, competitive intelligence, investment planning, understanding past performance, organization changes and magic pixie dust (trust me on that one).

Hence, most companies will establish targets only for the KPIs deemed worthy of that hard work.

Guess what you should do with your time? Focus on analysis that is worth your hard work!

Start by looking at your slides/report/dashboard and identify the KPIs with established targets. Kill the rest.

Sure, there will be howls of protest. It'll be John. Tell him that without targets you can’t identify if the performance is good or bad, a view every CEO deserves.

John will go away and do one of two things:

1. He will agree with you and focus on the KPIs that matter.

2. He will figure out how to get targets for all 32 metrics along all 18 dimensions.

You win either way. :)

An added benefit will be that with this sharp focus on targets, your company will get better at forecasting, competitive intelligence, investment planning, org changes, magic pixie dust and all the other things that over time become key assets. Oh, your Finance team will love you!

Special caution: Don't ever forget your common sense, and strive for the Global Maxima. It is not uncommon for people to sandbag targets to ensure they earn a higher bonus. If your common sense suggests that the targets are far too low, show industry benchmarks. For example, the quarterly target may be 400,000 units sold. Common sense (and company love) tell you this seems low, so you check actuals to find that in the second month, units sold are already 380,000. Suspicion confirmed. You then check industry benchmarks: It is 1,800,000. WTH! In your CMO dashboard, report Actuals, Target and Benchmark. Let him or her reach an independent, more informed, conclusion about the company’s performance.

3. Focus on the outliers.

Turns out, you are the analyst for a multi-billion dollar corporation, with 98 truly justifiable KPIs (you are right: I'm struggling to breathe on hearing that justification, but let's keep going). How do you focus on what matters most?

Focus your dashboards only on the KPIs where performance for that time period is three standard deviations away from the mean.

A small statistics detour.

If a data distribution is approximately normal then about 68 percent of the data values are within one standard deviation of the mean, about 95 percent are within two standard deviations, and about 99.7 percent lie within three standard deviations. [Wikipedia]

By saying focus on only reporting on KPIs whose performance is three standard deviations from the mean, I’m saying ignore the normal and the expected. Instead, focus on the non-normal and the unexpected.

If your performance does not vary much, consider two standard deviations away from the mean. If the variation is quite significant, use six (only partly kidding!).

The point is, if performance is in the territory you expect, how important is it to tell our leaders: The performance is as it always is.

Look for the outliers, deeply analyze the causal factors that lead to them, and take that to the executives. They will give you a giant hug (and more importantly, a raise).

There are many ways to approach this. Take this image from my January 2007 post: Analytics Tip #9: Leverage Statistical Control Limits.

Having an upper control limit and a lower control limit makes it easy to identify when performance is worth digging deeper into. When you should freak out, and when you should chill.
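As a sketch, the control limits and the outlier test reduce to a few lines (the KPI history below is illustrative):

```javascript
// Mean and (population) standard deviation of a KPI's history.
function mean(xs) {
  return xs.reduce(function (a, b) { return a + b; }, 0) / xs.length;
}
function stdDev(xs) {
  var m = mean(xs);
  var variance = xs.reduce(function (a, x) {
    return a + (x - m) * (x - m);
  }, 0) / xs.length;
  return Math.sqrt(variance);
}

// Upper and lower control limits at k standard deviations from the mean.
function controlLimits(history, k) {
  var m = mean(history);
  var sd = stdDev(history);
  return { lower: m - k * sd, upper: m + k * sd };
}

// A KPI earns a place on the dashboard only when its latest value
// escapes the control limits (k = 3 for the three-sigma rule).
function isOutlier(history, latest, k) {
  var limits = controlLimits(history, k);
  return latest < limits.lower || latest > limits.upper;
}
```

With a steady history like `[100, 102, 98, 101, 99]`, a reading of 110 breaches the three-sigma limits and merits a deep dive, while 101 is business as usual and stays off the executive dashboard.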

Look for outliers. If you find them, dig deeper. If not, move on permanently, or at least for the current reporting cycle.

Use whichever statistical strategies you prefer to find your outliers. Focus sharply.

4. Cascade the analysis and responsibility for data.

In some instances you won't be able to convince the senior leader to allow you to narrow your focus. He or she will still want tons of data, perhaps because you are new or you are still earning credibility. Maybe it is just who they are. Or they lack trust in their own organization. No problem.

Take the 32 metrics and KPIs that are going to the CMO. Pick six critical KPIs for the senior leader.

Cluster the remaining 26 metrics.

You'll ask this question:

Which of these remaining 26 metrics have a direct line of sight to the CMO’s six, and might be KPIs for the VPs who report to the CMO?

You might end up with eight for the VPs. Great.

Now ask this question:

Which of these remaining 18 metrics have a direct line of sight to the eight being reported to the VPs, and might be KPIs for the directors who report to the VPs?

You might end up with 14 for the directors.

Awesome.

Repeat it for managers, then marketers.

Typically, you'll have none remaining for the Marketers.

Here's your accomplishment: You've taken the 32 metrics that were being puked on the CMO and distributed them across the organization by level of responsibility. Furthermore, you've ensured everyone's rowing in the same direction by creating a direct line of sight to the CMO’s six KPIs.

Pat yourself on the back. This is hard to do. Mom is proud!

Print the cascading map (CMO: 6 > VPs: 8 > Directors: 14 > Managers: 4), show it to the CMO to earn her or his confidence that you are not throwing away any data. You've simply ensured that each layer reporting to the CMO is focused on its most appropriate best sub-set, thus facilitating optimal accountability (and data snacking).

I’ll admit, this is hard to do.

You have to be deeply analytically savvy. You have to have acquired a rich understanding of the layers of the organization and what makes them tick. You have to be a persuasive communicator. And, be able to execute this in a way that demonstrates to the company that there’s real value in this cascade, that you are freeing up strategic thinking time.

You’ll recognize the overlap between the qualities I mention above and skills that drive fantastic data careers. That’s not a coincidence.

Carpe diem!

5. Get them hooked on text (out-of-sights).

If everything else fails, try this one. It is the hardest one because it'll demand that you are truly an analysis ninja.

No senior executive wants data. It hurts me to write that, but it is true.

Every senior executive wants to be influenced by data and focus on solving problems that advance the business forward. The latter also happens to be their core competence, not the former.

Therefore, in the next iteration of the dashboard, add two more pieces of text for each metric:

1. Why did the metric perform this way?

Explain causal factors that influenced shifts. Basically, the out-of-sights (see TMAI #66 if you are a subscriber to my newsletter). Identifying the four attributes of an out-of-sight will require you to be an analysis ninja.

2. What actions should be taken?

Explain, based on causal factors, the recommended next step (or steps). This will require you to have deep relationships with the organization, and a solid understanding of its business strategy.

When you do this, you'll begin to showcase multiple factors.

For the pointless metrics, neither the Why nor the What will have impact. The CMO will kill these in the first meeting.

For the decent metrics, it might take a meeting or three, but she'll eventually acknowledge their lack of value and ask you to cascade them or kill them.

From those remaining, a handful will come to dominate the discussion, causing loads of arguments, and resulting in productive action. You'll have known these are your KPIs, but it might take the CMO and her team a little while to get there.

After a few months, you'll see that the data pukes have vanished. If you've done a really good job with the out-of-sights and actions, you'll notice that the focus has shifted from the numbers to the text.

Massive. Yuge. Victory.

If more examples will be of value, I have two posts with illuminating examples that dive deeper into this strategy…

Strategic Dashboards: Best Practices, Tips, Examples | Smart Dashboard Modules: Insightful Dimensions And Best Metrics

You don't want to be a reporting squirrel, because over time, that job will sap your soul.

If you find yourself in that spot, try one of the strategies above. If you are desperate, try them all. Some will be easier in your situation, while others might be a bit harder. Regardless, if you give them a shot, you'll turn the tide slowly. Even one month in, you’ll feel the warm glow in your heart that analysis ninjas feel all the time.

Oh, and your company will be data-influenced — and a lot more successful. Let's consider that a nice side effect. :)

Knock 'em dead!

As always, it is your turn now.

Have you used any of the above mentioned strategies in your analytics practice? What other strategies have been effective in your company? What is the hardest metric to get rid of, and the hardest KPI to compute for your clients? Why do you think companies keep hanging on to 28 metric dashboards?

Please share your ideas, wild theories, practical tips and examples via comments.

Thank you.

The post Five Strategies for Slaying the Data Puking Dragon. appeared first on Occam's Razor by Avinash Kaushik.