Here’s an alternative to cookies for user tracking

Instead of having your analytics toolset read a cookie, pass a unique identifier associated with the user ID. Learn how to do it and keep it privacy compliant.

The post Here’s an alternative to cookies for user tracking appeared first on Marketing Land.

For over 20 years, website analytics has relied on persistent cookies to track users. This benign piece of technology was a massive improvement over using a user's IP address, or even the combination of IP address and browser. Since it was first introduced, however, the cookie has become the focus of privacy legislation and paranoia. So what alternative is there?

If your website or mobile application requires the creation of user accounts and logins, it’s time to plan to transition away from cookie-based tracking to user ID tracking. In simple terms, instead of having your analytics toolset read a cookie, you pass a unique identifier associated with the user ID and then track the user via this identifier. Typically the identifier is the login ID.

Preparing for advanced tracking

Step 1

Ensure that the user ID you've deployed doesn't contain Personally Identifiable Information (PII). Too often, sites require users to use their personal email address as a login ID, or even their account number. Both are PII. If this is the case with your organization, the trick is to assign a random, unique client identifier to all existing accounts, as well as to any future accounts as they are created.
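
One way to mint such identifiers is to assign a random UUID per account. Here's a minimal sketch (the function and field names are illustrative, not from any particular platform):

```python
import uuid

def assign_client_ids(accounts):
    """Assign a random, non-PII client identifier to each account.

    `accounts` maps login emails (PII) to account records. The opaque
    client_id written to each record is what gets sent to analytics
    instead of the email or account number.
    """
    id_map = {}
    for email, record in accounts.items():
        client_id = uuid.uuid4().hex  # opaque, carries no PII
        record["client_id"] = client_id
        id_map[email] = client_id
    return id_map
```

The same function can run once as a backfill over existing accounts and again inside the account-creation flow for new ones.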

Step 2

Have your developers start pushing the user ID to the data layer. This way, the variable will be there waiting for your analytics software to read once you're ready to implement the new tracking method. Check your analytics software's documentation for the variable name for this element, as it varies from package to package.
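
As a sketch of what that looks like, here is a server-side helper that renders the inline data-layer push for a logged-in page. The `userId` key is an assumption; substitute whatever variable name your analytics vendor expects:

```python
import json

def data_layer_snippet(client_id):
    """Render the inline script that pushes the non-PII user ID to the
    data layer before the analytics tag loads.

    The 'userId' key is illustrative -- check your analytics vendor's
    documentation for the expected variable name.
    """
    payload = json.dumps({"userId": client_id})
    return (
        "<script>window.dataLayer = window.dataLayer || [];"
        f"window.dataLayer.push({payload});</script>"
    )
```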

Step 3

Create a new view/workspace within your analytics software and configure it to track users by their user ID. Most analytics packages will still set a temporary cookie to track user behavior prior to login and then connect the sessions. This way you can see what a user does on your site even before logging in, as well as what visitors who never log in do.

Benefits of tracking users by user ID

Improved accuracy

The use of cookies is flawed in many ways. If users jump between devices (from desktop to mobile to tablet, or from an office computer to a home computer), you can't tell it was the same user, which inflates unique user counts.

What if a user clears their cookies (perhaps their antivirus software purges all cookies every time the browser is closed)? Once again, this inflates user counts.

By tracking a user via their user ID, you’ll obtain a more accurate count of unique users on your site.

Cross Device Tracking

This is perhaps the greatest benefit of tracking users by their user ID. You can now see how users interact with your site and/or mobile app across devices: How many use a combination of devices? Is one type of device used mainly to add items to a shopping cart, with the order then processed on another device?

Greater Analytics Insight

Armed with enhanced analytics data, new and potentially powerful insights can be harvested. With this new knowledge, you can better direct internal resources to focus and enhance the user experience and optimize the user flow for greater profits.

Real life examples

The following examples demonstrate the power of tracking users by their user ID. 

Overview – Device Overlap

The following image shows what percentage of accounts use which type of device and the percentage that use a combination of devices. For example, while 66.6% use only a desktop, 15.8% use a combination of Mobile and Desktop.

User Behavior – Device Flow

Reviewing the device flow leading up to a transaction can provide some of the greatest insights from this enhanced analytics tracking methodology.

While it might not be surprising that the two most common device paths (by number of users) were desktop only and mobile only, what surprised both me and the client was number three: while the desktop -> mobile -> desktop path is followed by only about 3% of users, it accounts for approximately 8% of all transactions and over 9% of all revenue generated.

The minimal overall use of tablets was also a bit surprising. Of course the mix of devices does vary from client to client.

Assisted conversions

By dropping the use of cookies, the quality of assisted-conversion data increases significantly. For example, how many people read a promotional email on a mobile device (opens are easily tracked and attributed to a user ID), click through to the site, browse and review the promoted items (maybe adding them to the cart), then think it over before logging in later via desktop to complete the transaction?

For example, from the above report, one can objectively assign a more accurate value to SEO efforts by examining the role organic search traffic played in generating sales. While organic search was the immediate source of only 1.3% of total revenue (in this case), as an assist in the sales cycle it played a role in over 10.4% of generated revenue.

Enhanced user insights

In this example, the client allows its customers to have multiple logins per account. Essentially, a user ID represents a customer/client rather than a single user. The client operates in the B2B world, where multiple people within its clients' organizations may require unique logins and rights (who can order, who can only view product details, who can view or add to the cart but not place an order, etc.). By tracking by user ID while also recording a unique login ID within their analytics, these additional insights can be obtained.


The above report not only breaks down revenue by division, but also demonstrates how users in different divisions use the site differently. In Division 1, there is almost a 1:1 relationship between user IDs and login IDs. Yet in Division 3, the ratio is over 4:1, meaning that for every customer, an average of more than four logins is in use.
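
The division-level ratios described above can be computed from a raw analytics export of (division, user ID, login ID) rows; a minimal sketch:

```python
from collections import defaultdict

def logins_per_customer(rows):
    """rows: iterable of (division, user_id, login_id) tuples from an
    analytics export. Returns {division: unique login IDs per unique
    user ID} -- e.g. a 4:1 ratio comes back as 4.0.
    """
    users, logins = defaultdict(set), defaultdict(set)
    for division, user_id, login_id in rows:
        users[division].add(user_id)
        logins[division].add(login_id)
    return {d: len(logins[d]) / len(users[d]) for d in users}
```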

How can they leverage this data for more effective marketing? By understanding that there are differences between divisions, carefully crafted email marketing can target multiple-login customers differently from single-account/login customers.

A further dive into the data could also reveal which login IDs are only product recommenders (they only view products), which make specific product requests (add to the cart but never place the order), which only process orders, and which do it all. Each needs different messaging to optimize the effectiveness of the marketing effort. It's through detailed analytics that this audience definition can be obtained.

Is tracking by user ID right for me?

Making the decision to change how you track your users is a difficult one. First, does your site/mobile app require users to log in reasonably early in their journey? This approach is ideal for e-commerce sites and sites where the vast majority of user interaction takes place after the user logs into the site/application.

If you’re running a general website with the goal to merely share information and generate “contact us” type leads, the answer to making this switch is no.

If you have a combination of a general information site plus a registered user section, then yes, you might want to consider making this change, perhaps just for the registered user section.

If you do make this change, don't stop running your other analytics views/workspaces that use cookies. Keep them running. By operating two different views, you'll eventually be able to reconcile the differences between the two. It also makes it easier to explain, to those you report to, why you'll be reporting a dramatic drop in the number of users. Of course, when you first make the switch, all users will be first-time users, so expect a major increase in new visitor traffic.

If you decide to make this change, don’t forget to review the impact of the change with your legal department. They will tell you if you need to update your privacy policy.


The importance of valuing latent orders to successful Amazon Sponsored Products management

Advertisers must consider the lag time between ad click and conversion as well as historic performance around key days to estimate shift.

The post The importance of valuing latent orders to successful Amazon Sponsored Products management appeared first on Marketing Land.

Sponsored Products is the most widely adopted Amazon search ad format, and typically accounts for more than six times as much ad spend as Sponsored Brands ads for the average Tinuiti (my employer) advertiser. As such, it’s incredibly important for advertisers to understand the full value that these ads drive.

Part of this is understanding the click-to-order period between when a user clicks on an ad and when that user ends up converting. Given how Amazon attributes orders and sales, it’s crucial that advertisers have an idea of how quickly users convert in order to value traffic effectively in real time.

Amazon attributes conversions and sales to the date of the last ad click

When assessing performance reports for Sponsored Products, advertisers should know that the orders and sales attributed to a particular day are those that are tied to an ad click that happened on that day. This is to say, the orders and sales reported are not just those that occurred on a particular day.

Advertisers viewing Sponsored Products conversions and sales in the UI are limited to only seeing those orders and sales attributed to the seven days following an ad click. However, marketers pulling performance through the API have greater flexibility and can choose different conversion windows from one to thirty days, which is how the data included in this post was assembled.

In the case of Sponsored Display and Sponsored Brands campaigns, performance can only be viewed using a 14-day conversion window, regardless of whether it is being viewed through the UI or through an API connection.

For marketers who wish to use a thirty-day conversion window in measuring Sponsored Products sales and conversions attributed to advertising, this means that it would take thirty days after the day in question in order to get a full picture of all conversions. Taking a look across Tinuiti advertisers, the first 24 hours after an ad click accounted for 77% of conversions and 78% of sales of all those that occurred within 30 days of the ad click in Q2 2020.

Unsurprisingly, the share of same-SKU conversions that happen in the first 24 hours is even higher, as shoppers are more likely to consider other products the further removed they become from an ad click.

For the average Amazon advertiser, we find that more than 20% of the value that might be attributed to ads happens more than one day after the ad click, meaning advertisers must bake the expected value of latent orders and sales into evaluating the most recent campaign performance. The math of what that latent value looks like varies from advertiser to advertiser.
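
One simple way to bake in that latent value is to project a day's final 30-day totals from what is already visible, using your own advertiser-level day-one share. A sketch using the 77% Q2 2020 average cited above as the default:

```python
def project_30day_orders(day1_orders, day1_share=0.77):
    """Project total 30-day attributed orders from the orders observed
    in the first 24 hours after the ad click.

    The default 77% day-one share is the Q2 2020 Tinuiti average cited
    above; substitute your own advertiser-level share, which can vary
    with price point and seasonality.
    """
    return day1_orders / day1_share
```

For example, 77 orders reported on the day after the clicks would project to roughly 100 orders once the full 30-day window closes.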

Factors like price impact the length of consideration cycles

The time it takes for consumers to consider a purchase is naturally tied to the type of product being considered, and price is a huge factor. Taking a look at the share of 30-day conversions that occur more than one day after the click by the average order value (AOV) of the advertiser, this share goes up as AOV goes up. Advertisers with AOV over $50 saw 25% of orders occur more than 24 hours after the ad click in Q2 2020, whereas advertisers with AOV less than $50 saw 22% of orders occur more than 24 hours after the ad click.

Put simply, consumers usually take longer to consider pricier products before purchasing than cheaper ones. Other factors can also affect how long the average click-to-order cycle is for a particular advertiser.

In addition to latent order value varying by advertiser, there can also be meaningful swings in what latent order value looks like during seasonal shifts in consumer behavior, such as during the winter holiday season and around Prime Day.

Key shopping days speed up conversion process

The chart below depicts the daily share of all conversions attributed within seven days of an ad click that occurred during the first 24 hours. As you can see, one-day order share rose significantly on Black Friday and Cyber Monday as users launched into holiday shopping (and dropped in the days leading into Black Friday).

After these key days, one-day share returned to normal levels, then rose in the weeks leading up to Christmas, peaking on December 21 at a level surpassing even Cyber Monday. December 21 was the last day many shoppers could feel confident placing an order in time to receive it for Christmas, and it showed in how quick the click-to-purchase path was for many advertisers.

Of course, Amazon created its own July version of Cyber Monday in the form of Prime Day, and we see a similar trend around one-day conversion share around the summer event as well.

This year’s Prime Day has been postponed, but reports indicate that the new event might take place in October.

As we head into Q4, advertisers should look at how the click-to-order window shifts throughout key times of the year in order to identify periods in which latent order value might meaningfully differ from the average.


As with any platform, advertisers are often interested in recent Amazon Ads performance to understand how profitable specific days are. This is certainly important in determining shifts and situations in which budgets should be rearranged or optimization efforts undertaken, and that's even more true now, given how quickly performance and life are changing for many advertisers as well as the population at large.

However, in order to do so effectively, advertisers must take into consideration the lag that often occurs between ad click and conversion. Even for a platform widely regarded as the final stop for shoppers, such as Amazon, more than 20% of 30-day conversions occur after the first 24 hours following the click, and this share can be much higher for advertisers that sell products with longer consideration cycles.

Further, advertisers should look to historic performance around key days like Cyber Monday and Prime Day to understand how these estimates might shift. Depending on product category, other holidays like Valentine’s Day or Mother’s Day might also cause shifts in latent order value.

Not all advertisers necessarily want to value all orders attributed to an ad over a month-long (or even week-long) attribution window equally, and particularly for products with very quick purchase cycles, it might make sense to use a shorter window. That said, many advertisers do find incremental value from orders that occur days or weeks removed from ad clicks, and putting thought into how these sales should be valued will help ensure your Amazon program is being optimized using the most meaningful performance metrics.


What’s behind the hype about Customer Data Platforms?

How CDPs aim to solve some of marketers’ most vexing challenges.

The post What’s behind the hype about Customer Data Platforms? appeared first on Marketing Land.

The global market for customer data platforms is expected to rise dramatically over the next few years. The CDP Institute pegged industry revenue for 2019 at $1 billion and it expects the sector to reach at least $1.3 billion in 2020. Meanwhile, ResearchandMarkets predicts the industry will grow from $2.4 billion in 2020 to $10.3 billion by 2025, expanding at an astounding compound annual growth rate (CAGR) of 34.0% during the forecast period.

This growth is being driven by the proliferation of devices and customer touchpoints, higher expectations for marketers to orchestrate real-time personalized experiences across channels and the need to navigate complex privacy regulations. Let’s explore each of these factors in greater detail.

More devices, fragmented interactions and high expectations

Gartner predicted that the average U.S. adult would own more than six smart devices by 2020, and Cisco forecasts that the number of devices connected to IP networks globally will expand to more than three times the global population by 2023. There will be 3.6 networked devices per capita (29.3 billion overall) by 2023, says Cisco, up from 2.4 networked devices per capita (18.4 billion overall) in 2018.

Customers and potential customers are using all of these devices — several in a day, often — to interact with the companies they do business with, and they expect these brands to recognize them no matter what device they’re using at any given time.

According to a Salesforce State of the Connected Customer survey conducted April 2019, 78% of respondents prefer to use different channels to communicate with brands depending on context, but 6% expect companies’ engagements with them to be tailored based on past interactions.

This challenge isn’t going to go away anytime soon. Segmenting Salesforce’s customer data by generations reveals that younger cohorts switch devices more than older, and they’re also more likely to be adding IoT-type connected devices to their repertoire.

Meanwhile, customer data security and governance have leapt to the forefront of marketer concerns, as the alphabet soup of data regulations — from HIPAA (Health Insurance Portability and Accountability Act) to HITECH (Health Information Technology for Economic and Clinical Health) to GDPR (General Data Protection Regulation), CCPA (California Consumer Privacy Act) and CASL (Canada Anti-Spam Legislation) — continues to grow.

Enter the Customer Data Platform, a system designed for non-IT use to streamline the flow of customer data throughout the martech stack and create a single view of the customer. High expectations, along with the proliferation of possible customer touchpoints, make cross-device IDs and identity resolution — the ability to consolidate and normalize disparate sets of data collected across multiple touchpoints into an individual profile that represents the customer or prospect — critical for helping marketers, sales and service professionals deliver the ideal total customer experience. CDPs offer this consolidation and normalization and also make the data profiles freely available to other systems.

Additionally, CDP vendors seek to help marketers address the privacy challenge by providing strong data governance protocols that are certified by third-party organizations to ensure compliance with these types of regulations, as well as other data security standards. For example, many CDP vendors are SOC (Service Organization Control), SSAE (Statement on Standards for Attestation Engagements) and/or ISO (International Organization for Standardization) certified. These audits confirm best practices around internal processes, data management, data privacy and security.

For more on Customer Data Platforms, including more analysis of the category and profiles of 23 different vendors, download our newly-updated buyer’s guide today!


Pitching a Data Strategy? Here’s How to Ensure the C-Suite Says “Yes.”

Are you a CMO who thinks accurate attribution is a pipe dream? Or a customer experience director who has to hack together data to create something resembling a customer journey? This article is for you. We’ll show you how to: Assess your company’s current approach to data. Map your data strategy against overall business goals. […]

The post Pitching a Data Strategy? Here’s How to Ensure the C-Suite Says “Yes.” appeared first on CXL.

Are you a CMO who thinks accurate attribution is a pipe dream? Or a customer experience director who has to hack together data to create something resembling a customer journey?

This article is for you. We’ll show you how to:

  1. Assess your company’s current approach to data.
  2. Map your data strategy against overall business goals.
  3. Build a data roadmap to deliver on the strategy.

Knowing what you’re up against

If I had a dollar for every time I heard a business say they were "data driven," I'd be rich.

While most companies are drowning in data, few have the strategy or cultural maturity to use it for accurate measurement, let alone to make decisions or drive business goals. And while two-thirds of executives think they need a data strategy, only a third are getting it done.

Despite these shortcomings, executives in your company might be wary of signing off on a data strategy. From where they’re standing, they’ve already spent a ton of money buying analytics and dashboarding tools, each of which promised to be the holy grail. So why do they need a data strategy? 

Here’s the problem. They’ve spent money on tactical elements that serve only isolated parts of the business. You need to think beyond the data point you need right now and create a strategy that answers wider business goals.

It’s tempting to develop a strategy that works just for your department. But your customers don’t interact just with your department. Your sales team has data in their CRM. Customer service reps have data on customer conversations. You need all that data to make better decisions. 

If, by contrast, you’re in a business where short-termism runs rampant, these scenarios should sound familiar:

  1. You have massive amounts of data, but different teams have access to different bits, copying and retaining the same data in their own ways.
  2. You can’t get an overview of a customer’s journey because the data you need exists in 10 different tools (that don’t speak to one another).
  3. People ask for specific data points to support the idea they’ve already had, rather than investigating the data first to inform a decision.

Those who cobble things together may get some early wins. But under the hood, things are getting messy, and data gaps will expose the fact that you haven’t thought strategically about your needs. 

If you’re ready to move beyond those short-term tactics, here are the three steps to do it. 

Step 1: Assess your company’s current approach to data.

We’ve created a set of 19 questions to kick off this process, which will help you evaluate the current data maturity within your business. We consider four critical areas for planning your data strategy:

  1. Strategy and culture;
  2. People and skills;
  3. Technology and tools;
  4. Methodology and process.

The scores gathered at this stage inform the roadmap in Step 3. You can also use these scores as a benchmark to reassess your business throughout the delivery of your strategy to quantify progress.
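
Keeping those benchmark scores comparable across reassessments is easier if they are aggregated consistently. A minimal sketch, assuming each audit answer is a 1-to-5 score tagged with one of the four areas above (the data structure is illustrative):

```python
from collections import defaultdict

def maturity_scores(answers):
    """answers: {question_id: (area, score_1_to_5)} from the audit.

    Returns the average score per area -- the benchmark you re-run
    later in the strategy's delivery to quantify progress.
    """
    totals, counts = defaultdict(float), defaultdict(int)
    for area, score in answers.values():
        totals[area] += score
        counts[area] += 1
    return {area: totals[area] / counts[area] for area in totals}
```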

Step 2: Map your data strategy against overall business goals. 

If your company has fuzzy goals like “become customer-centric,” you need to pin them down to something quantifiable. 

My go-to approach for setting goals is OKRs (Objectives and Key Results) because they marry vague objectives to measurable results. OKRs also use a hierarchical structure, with company, team, and personal levels.

This goal-setting structure really helps when it comes to writing the data requirements in the next step. 

Example OKRs:

  • Company objective: Customers love our product.
  • Key results: NPS score increases to 30 by the end of Q3. Customer lifetime value (LTV) increases by 15% by the end of the year. 
  • Product team objective: Build features our customers want.
  • Key result: 65% of customers use new product features at least once a week by the end of Q3. 
  • Product manager objective: Evaluate new feature ideas.
  • Key result: Test four new feature ideas and identify which ones customers use most.

[Image: sample OKRs at company, team, and individual level]

Creating a goal tree with data requirements 

The goal tree is a document that shows the C-suite how data ties to the business value they want to achieve.

Map the cascading OKRs at a company, team, and personal level. You may choose to work down only to a team level if your company has more than 30 employees. (The goal tree becomes hard to manage after that point.)

To ensure you don’t miss any requirements, talk to individual team members and summarize their requirements at the team level.

[Image: example of a goal tree]

Adding the data requirements

Work through each objective and key result with the relevant team:

  • What data do they need to achieve and measure what’s written?
  • Which KPIs will measure their performance? 

Avoid writing a technical solution at this point (e.g., “we need an API to join website data with Intercom into a dashboard”) and instead state the need (e.g. “we need to see customer website behavior data alongside Intercom data”).

The how should be led by your tech team.  

Questions to consider when writing requirements

1. What data do you need to inspire ideas and hypotheses? Say your objective is to increase sales revenue from your existing customer base, and your key result is to generate an additional $500k from these customers by the end of Q2.  

One of your requirements might be the ability to see the behavior of returning customers throughout the customer journey. From this data, you might identify that customers abandon their carts at a high rate, despite buying previously.

This data has helped identify a problem. To inspire a hypothesis, you may need the capacity to record and codify qualitative customer exit-poll responses (i.e., another requirement). This data might show that customers abandon their carts because they're looking for promo codes and then get distracted.

[Image: example of a promo code field]

Your hypothesis might be that if you remove the promo code field at the checkout, cart abandonment will go down, and revenue from existing customers will increase.

2. What data do you need to validate ideas? Which tactics might the team use to achieve their objectives? Continuing with our example, you might need the ability to test your hypothesis by showing half of existing customers the current experience, and half a test version (without the promo code).

Or, maybe you’ll try a cart abandonment–triggered email that requires measuring open rates, click-through rates, and conversions. 

3. What data do you need to report on key results? In this example, you’ll need statistically significant results to understand if you can rely on test data to make a decision. You also need metrics on cart abandonment and purchase rates of those who saw your test idea and those who didn’t.

Plus, you’ll want to measure the amount of additional revenue generated from the test.
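
A back-of-the-envelope significance check for such a test (e.g., purchase rate with vs. without the promo-code field) can be sketched as a two-proportion z-test. This is a simplified sketch, not a substitute for your experimentation platform's stats engine:

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test.

    conv_a/n_a: conversions and visitors in the control (current
    checkout); conv_b/n_b: the same for the variant (e.g., no promo
    code field). Returns (z, p_value); a small p_value suggests the
    difference is unlikely to be chance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```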

Data-informed business decisions in practice

We recently worked with an ecommerce business that realized their subscription customers were worth more than one-off buyers. But they couldn’t be sure because they couldn’t track LTV and average order value consistently.

Even if they did have that data, they couldn’t act on it—they were unable to segment customers by LTV. The ability to segment would mean they could test and target valuable customers and provide marketing messages relevant to their stage in the journey. 
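
The segmentation capability they lacked is conceptually simple once LTV is tracked consistently. A minimal sketch (the thresholds are illustrative, not from the case study):

```python
def segment_by_ltv(customers, high=500.0, low=100.0):
    """customers: {customer_id: lifetime_value}.

    Buckets customers into high/mid/low-value segments so each can be
    targeted with stage-appropriate marketing messages. Thresholds are
    hypothetical -- set them from your own LTV distribution.
    """
    segments = {"high": [], "mid": [], "low": []}
    for customer_id, ltv in customers.items():
        if ltv >= high:
            segments["high"].append(customer_id)
        elif ltv >= low:
            segments["mid"].append(customer_id)
        else:
            segments["low"].append(customer_id)
    return segments
```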

This led us to remodel their entire tracking setup. We started with offline attribution for recurring orders. Now, we're wrapping up a data lake that stores data across all customer touchpoints on most channels, from acquisition to re-engagement.

This has not only helped inform hypotheses for experimentation—and considerably increased their conversion rates—but it’s also guiding wider business decisions, such as focusing on a subscription model vs. one-off sales, their product offering, and their sales approach and messaging.  

Step 3: Build a data roadmap to deliver on the strategy. 

To build your roadmap, you’ll need to work with your tech team to understand what and how things need to be done to deliver the requirements you’ve set out.

Ideally, you want to deliver these data solutions in sprints. This allows the business to:

  1. Start seeing the value of a holistic data strategy early on;
  2. Iron out any problems in your plans or working methods.

The roadmap should contain fairly detailed elements such as “implement a data warehouse” in the near term (e.g., a three-month window), with activities getting broader/fuzzier the further out in time they go.

As your team delivers elements of the roadmap, you can assess the success of what’s been delivered and add more detail on a rolling basis to the next three months of planned activity.

Define your starting point  

Use the answers from the maturity audit in Step 1. Working alongside your tech team, catalog what your current technical capabilities, tool stack, skills, and culture can achieve from your list of requirements in the goal tree. 

Work out what goes where in the roadmap

Now that you understand your starting point, you'll need to prioritize the rest of the requirements. Here are five factors to rate by importance:

  1. Data security and governance activities should occur throughout the roadmap. Specific activities to comply with security or governance should be the highest priority. 
  2. Front-load activities that are easiest to implement and tied to the biggest wins. Weigh any costs involved against the business value associated with the requirements. Review and mine your existing analytics setup—often there's valuable data going unused. Early wins can help gain support for your work and get the team familiar with new practices (e.g., agile). 
  3. Dependencies. Understand what has to happen first for something else to happen later (e.g., check the reliability of the data before you start personalization). 
  4. Factor in staff availability and consider other internal projects that might conflict with your roadmap. 
  5. Consider the budget process at your company. Do you have to wait until next April when budgets get renewed?
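
The factors above can be folded into a simple ordering rule. A sketch, assuming each roadmap item has been given illustrative 1-to-5 value and effort ratings (dependencies, staffing, and budget timing would still be applied by hand):

```python
def prioritize(items):
    """items: list of dicts with 'name', 'value' (1-5), 'effort' (1-5),
    and 'security' (bool).

    Security/governance work floats to the top, per factor 1; the rest
    sorts by value-to-effort ratio so the biggest easy wins come first.
    """
    return sorted(
        items,
        key=lambda i: (not i["security"], -(i["value"] / i["effort"])),
    )
```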

[Image: hindsight, insight, foresight diagram]

A strategy is more than just a plan

Gartner has a model to help illustrate how activities in your roadmap move your business toward data maturity:

  1. Hindsight: What happened?
  2. Insight: Why did it happen?
  3. Foresight: How can we make it happen?

[Image: CRO with hindsight, insight, foresight analytics]

Access to the right data helps businesses move through these levels, but so too does cultural change and education. 

Get your CEO and managers to ask teams to present data-based insights rather than gut feelings to reinforce the correct use of data. Help others learn from your own work by modeling how to make decisions using data. You could even share insights on your business and customers in weekly company newsletters. 

The goal tree also helps in this area—teams rely on data to inform and measure their key objectives. These may be part of their performance review; if not, consider suggesting this to your HR team to incentivize the desired behaviors around data usage. 

[Image: from information to optimization]

People and skills

Educating individuals on how to use data to make decisions is crucial for a strategy to do more than just tick the “data” box. Airbnb created an internal “data university” with a curriculum tailored to their tool stack and business cases.

As a result, they saw weekly active users of internal data science tools rise from 30% to 45%, and 500 employees had taken at least one class. You don’t need to go to this extreme, but you do need to add training and key hires into your roadmap to ensure people know how to use data.

Process and methodology 

Plans are great, but requirements will change. Someone will need to track additional metrics. A new social media network will get popular, and suddenly you have a new source of data to factor in. 

Within the strategy you’ve created, set out internal processes for handling new requests and questions going forward. For example, when a team shops for a new tool, create a checklist of things the tech team needs, such as an open API or other way to support automated data export/import. 


The above steps, exercises, and questions will help you develop a data strategy that’s clearly linked to wider business goals and avoids the all-too-common, sporadic approach of most businesses.

To increase your leverage within the C-suite, involve multiple teams from across your business—more departments will see and advocate for your proposal.

If you need a partner to help define and implement your data strategy, get in touch.

The post Pitching a Data Strategy? Here’s How to Ensure the C-Suite Says “Yes.” appeared first on CXL.

Analytics Audit 101: Identify issues and correct them to ensure the integrity of your data

Marketing and analytics professionals need to work together to not only increase the accuracy of our data, but to educate people about how to leverage it.

The post Analytics Audit 101: Identify issues and correct them to ensure the integrity of your data appeared first on Marketing Land.

One thing online businesses need to use as a cornerstone of their decision-making process is their digital analytics data (analytics data from a variety of sources, i.e., web analytics, search console, paid search, paid social, social media, etc.). Yet, according to an MIT Sloan Management Review survey of more than 2,400 business people, only 15% trust their data. While no analytics method can guarantee 100% accuracy of your digital analytics data, by auditing your data you can ensure it is as accurate as possible. That accuracy gives you the confidence not only to trust your data but to leverage it to make objective business decisions instead of subjective ones. It is that lack of trust that explains why a mere 43% (according to the same survey) say "they frequently can leverage the data they need to make decisions." This low confidence in one's data equals failure.

As marketing and analytics professionals, we need to work together not only to increase the accuracy of our data, but to educate people about the data and how to leverage it. The first step in this process is auditing your analytics configuration to identify issues and correct them, ensuring the integrity of the data. 

The analytics audit process

Step 1: Acknowledge analytics data isn’t perfect

When you start your analytics audit, gather together everyone who has a stake in the outcome and find out why they don't trust the data. Most likely they have good reasons. Don't claim that your goal is to make the data 100% accurate, because that is impossible. Use the opportunity to explain, at a high level, that analytics data captures a sampling of user activity and that, for various technical reasons, no system will be perfect; that's why they are likely seeing differences between things like their AdWords account and their web analytics data. Use the example of taking a poll. Pollsters take a sample of 1,000-2,000 people out of a US population of over 300 million and then state their data is accurate within a few percentage points 4 out of 5 times. In other words, they are way off 20% of the time. Yet businesses, politicians and the general public respond to and trust this data. Web analytics data, even at the low end of accuracy, is likely still capturing an 80% sample, which is far more complete than the data presented by pollsters, yet it is less trusted. Let the stakeholders know that, as a result of the audit and the fixes that follow, you could be improving data capture accuracy to 90% or even 95%, and that is data you can trust 100%.

Step 2: Identify what needs to be measured

One of the biggest issues with analytics data is that the analytics software isn't configured to collect only the correct data; it becomes a general catch-all. While on the surface it sounds ideal to just capture everything you can, when you cast a huge net you also capture a lot of garbage. The best way to ensure the right data is being captured and reported on is to review the current marketing and measurement plans. Sadly, too few organizations have these, so during your stakeholder meeting, make sure to ask which items they most want measured.

Identify and gather all the key performance indicators (KPIs) currently being reported on. You'll need this before you start the audit. Verify that all KPIs are still valuable to your organization and not just legacy bits of data that have been reported for years. Sadly, many organizations still report on KPIs that hold little to no value to anyone within the organization.

Step 3: Review the current analytics configuration

Now is the time to roll up those sleeves and get dirty. You'll need admin-level access (where possible) to everything, or at a minimum full view rights. Next you'll need a spreadsheet that lists the specific items to review and verify are configured correctly, with a place to note what is wrong and a column to set a priority for getting them fixed.

The spreadsheet I've developed over the years has over 100 standard individual items to review, grouped into specific aspects of a digital analytics implementation; depending on the specific client, additional items may be added. The following eight are some of the most critical items that need to be addressed.

  1. Overall Analytics Integrity: Is your analytics code (or tag manager code) correctly installed on all pages of your website? Check and make sure your analytics code is correctly deployed. Far too often the JavaScript (or the code snippet) is located in the wrong place on the web page, or perhaps it is missing some custom configuration. Simply placing the code in the wrong place can cause some data not to be captured.

    Verify the code is on all pages/screens. Too often, either sections of a site are missed or the code doesn't work the same on all pages, resulting in lost data or potential double counting.

    If you run both a website and an app, is the analytics data from each properly synced for integration, or is it best to run them independently?
  2. Security: Review who has access to the analytics configuration and determine when each individual's access and rights were last reviewed. You'd be surprised how many times it has been discovered that former employees still have admin access. Access should be reviewed regularly, and a system has to be in place to notify the analytics manager when an employee departs the organization so their access can be terminated. You may think that because the login is the former employee's email address, all is fine once HR cancels that address; however, they may still have access. Many analytics systems are cloud-based and do not operate within your corporate environment. As long as that former employee remembers their email address and the password to that specific analytics account, they'll have access.
  3. Analytics Data Views: This is an especially critical feature when it comes to web analytics (i.e., Google Analytics, Adobe Analytics, etc.). Is your analytics system configured to segregate your data into at least three different views? At a minimum, you need "All Data" (no filtering), "Test" (only analytics or website testing data) and "Production" (only customer-generated data). Many organizations also segment their data further into "Internal Traffic" (staff using the website) and "External Traffic" (primarily external users).

    If these don't exist, then it is likely you are collecting and reporting on test traffic and internal users. How your employees use a website is completely different from how customers use it, and their activity should at a minimum be excluded or segmented into its own data set.
  4. Review Filters: Filters are a common analytics tool used to exclude or include specific types of activity. Most filters don't need frequent review, but some do. The ones that need to be reviewed most often are those that include or exclude data based on a user's IP address. IP addresses have a nasty habit of changing over time; for example, a branch location switches ISPs and receives a new IP address. IP-based filters should be reviewed every six months, or at least once per year if that isn't possible. As a tip, after a filter has been reviewed and verified, rename it by adding the date it was last reviewed.

    Don't forget to ensure that exclude filters are in place for search engine bots and any third-party utilities used to monitor the website. This machine-generated traffic has a nasty habit of getting picked up and reported on, which skews all the data.
  5. Personally Identifiable Information (PII): Too many developers have a habit of passing PII via the data layer, especially on ecommerce sites, unaware that this data can end up in the company's analytics database. If you store PII on a third-party system, you need to disclose this in your privacy policy, and even then you might be in breach of various privacy laws. One of the most common errors is passing a user's email address as a URI parameter. The easiest way to check for this is to run a pages report for any URI containing an "@" symbol. Over the years, I've seen customers' names captured and much more.

    If this happens, ideally your developers should fix the root cause, but at a minimum you'll need a filter to strip this type of information from the URI before it is stored in your analytics database.
  6. E-commerce Data: This is the most common issue we hear from organizations: "The sales figures reported in the analytics don't match our e-commerce system!" As stated above, analytics isn't perfect, nor should it be treated as a replacement for an e-commerce/accounting backend. However, if you are capturing 85-95% (and possibly more) of the transactional data, then you can effectively leverage this data to evaluate marketing efforts, sales programs, A/B tests, etc.

    From an e-commerce perspective, the easiest way to audit this is to compare the reported data for a given time period to what the backend system reports. If the match is near 90%, don't worry about it. If it is below 80%, you have an issue. If it is somewhere in between, it is a minor issue that should be looked into but is not a high priority.
  7. Is everything that needs to be tracked being tracked: What does your organization deem important? If your goal is to make the phones ring, then you need to be tracking clicks on embedded phone numbers. If your goal is form submissions, are you tracking them correctly? If you're trying to direct people to local locations, are you capturing clicks on location listings, on embedded maps, etc.?

    What about all those social media icons scattered on your website to drive people to your corporate Twitter, LinkedIn, Facebook accounts? Are you tracking clicks on those?
  8. Campaigns: Is there a formal process in place to ensure links on digital campaigns are created in a consistent manner? As part of this are your marketing channels correctly configured within your analytics system?
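The PII check in item 5 can be automated with a quick scan over URIs exported from a pages report. A minimal sketch in Python; the regex and the sample paths are illustrative, not taken from any specific analytics tool:

```python
import re

# Loose pattern for email-like strings embedded in a URI path or query.
# "%40" is the URL-encoded form of "@".
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+(?:@|%40)[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def find_pii_uris(uris):
    # Return the URIs that appear to leak an email address.
    return [u for u in uris if EMAIL_RE.search(u)]

pages = [
    "/checkout/thanks?order=1234",
    "/account?email=jane.doe%40example.com",  # leaked via query parameter
    "/profile/john@example.com/settings",     # leaked via URI path
]
print(find_pii_uris(pages))
# ['/account?email=jane.doe%40example.com', '/profile/john@example.com/settings']
```

Any URI this flags is a candidate for a developer fix upstream and a strip filter in the analytics configuration.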

You now have an outline for where to start your analytics audit. Think of your organization's analytics data and reporting systems like a car: it always seems to be working fine until it stops working. You need to take your car in from time to time for a tune-up, and that is what an analytics audit is. The audit will identify things that need to be fixed immediately (some small, some big) plus other items that can be fixed over time. If you don't fix the items discovered during the audit, your analytics system won't operate optimally and people won't want to use it. How frequently should an analytics audit be conducted after everything has been fixed? Unlike a car, there is no recommended interval between audits. However, every time your digital properties undergo a major update, or a series of minor updates that together amount to one, it is time to repeat the audit process.

The post Analytics Audit 101: Identify issues and correct them to ensure the integrity of your data appeared first on Marketing Land.

Soapbox: Is my data telling me the truth?

How email security programs affected my perception of data integrity.

The post Soapbox: Is my data telling me the truth? appeared first on Marketing Land.

As marketers, we face the overwhelming challenge of demonstrating proof that our tactics are effective. But how can we convince management if we are not convinced of our own data?

Here’s the reality, which I recently learned for myself: If you’re running email marketing, it’s very likely that your performance reports are not disclosing the full truth… inflated CTRs (click-through rates) and open rates being the main culprits. 

Email security programs – loved by recipients, hated by senders

Barracuda. SpamTitan. Mimecast. Email security bots that serve a single purpose: to protect users from unsafe content. These programs scan inbound emails and attachments for possible threats, including viruses, malware, and spammy content, often by clicking links to test whether they are safe.

For email marketers, this creates several challenges:

  • Inflated CTRs and open rates due to artificial clicks and opens 
  • Disrupted sales-team lead follow-up caused by false signals
  • Lost confidence in data quality (quantity ≠ quality)

Real or artificial clicks?

In reviewing recent email marketing performance reports, I noticed an unusual pattern: some leads were clicking on every link in the email (header, main body, footer, even the subscription preferences link), yet they were not unsubscribing. Not only that, but this suspicious click activity was happening almost immediately after the email was deployed. I speculated that these clicks were not "human" but rather "artificial" clicks generated by email filters. 

Hidden pixels are your frenemy

To test my hypothesis, I implemented hidden 1×1 pixels in the header, main body, and footer sections of the next email. The pixels were linked and tagged with UTM tracking, and visible only to bots.

Sure enough, several email addresses were flagged as clicking on the hidden pixels.

All that brings me back to the question of whether or not marketing data can be trusted. It’s critical to “trust, but verify” all data points before jumping to conclusions. Scrutinizing performance reports and flagging unusual activity or patterns helps. Don’t do an injustice to yourself and your company by sharing results that they want (or think they want) to hear. Troubleshoot artificial activity and decide on a plan of action:

  • Use common sense and always verify key data points
  • Within your email programs, identify and exclude bots from future mailings
  • Share results with management, sales, and other stakeholders
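The pattern described above, every tracked link (including hidden pixels no human can see) clicked within seconds of the send, can be sketched as a filter over a click log. This is a minimal illustration; the recipient addresses, link IDs, and 60-second window are assumptions, not industry thresholds:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def flag_bot_clicks(send_time, clicks, total_links, window=timedelta(seconds=60)):
    # clicks: iterable of (recipient, link_id, clicked_at) tuples.
    # A recipient is flagged when they hit every tracked link and their
    # last click happened within `window` of deployment.
    links, latest = defaultdict(set), {}
    for recipient, link_id, clicked_at in clicks:
        links[recipient].add(link_id)
        latest[recipient] = max(latest.get(recipient, clicked_at), clicked_at)
    return sorted(
        r for r in links
        if len(links[r]) == total_links and latest[r] - send_time <= window
    )

sent = datetime(2020, 3, 1, 9, 0, 0)
log = [
    ("bot@corp.com", link, sent + timedelta(seconds=5))
    for link in ("header", "body", "footer", "prefs")
] + [
    ("human@corp.com", "body", sent + timedelta(hours=3)),
]
print(flag_bot_clicks(sent, log, total_links=4))  # ['bot@corp.com']
```

Flagged addresses can then be excluded from future mailings and from reported CTRs.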

A word of caution… 

Tread carefully before you start implementing hidden pixels across your email templates. Hiding links might appear to email security programs as an attempt to conceal bad links. You could be flagged as a bad sender, so be sure to run your email through deliverability tools to check that your sender score isn’t affected.

As the saying goes, “There are three kinds of lies: lies, damned lies, and statistics.” Sigh.

With different solutions circulating within the email marketing community, this is likely the “best solution out of the bad ones”. It all depends on what works best with your scenario and business model. 

Godspeed, marketer! 

The post Soapbox: Is my data telling me the truth? appeared first on Marketing Land.

The state of tracking and data privacy in 2020

Here’s where search marketers find themselves in the current entanglement of data and privacy and where we can expect it to go from here.

The post The state of tracking and data privacy in 2020 appeared first on Marketing Land.

January 2020 felt like a turning point. CCPA went into effect, Google Chrome became the latest browser to commit to a cookie-less future and, after months of analytics folks sounding the alarm, digital marketers sobered to a vision of the future that looks quite different than today.

This article is not a complete history of consumer privacy nor a technical thesis on web tracking, although I link to a few good ones in the following paragraphs.

Instead, this is the state of affairs in our industry, an assessment of where search marketers find themselves in the current entanglement of data and privacy and where we can expect it to go from here.

This is also a call to action. It’s far from hyperbole to suggest that the future of digital and search marketing will be greatly defined by the actions and inactions of this current calendar year.

Why is 2020 so important? Let’s assume with some confidence that your company or clients find the following elements valuable, and review how they could be affected as the associated trends unfold this year.

  1. Channel attribution will stumble as tracking limitations break measurability and show artificial performance fluctuations.
  2. Campaign efficiency will lose clarity as retargeting efficacy diminishes and audience alignment blurs.
  3. Customer experience will falter as marketers lose control of frequency capping and creative sequencing. 

Despite the setbacks, it is not my intention to imply that improved regulation is a misstep for the consumers or companies we serve. Marketing is at its best when all of its stakeholders benefit and at its worst when an imbalance erodes mutual value and trust. But the inevitable path ahead, regardless of the destination, promises to be long and uncomfortable unless marketers are educated and contribute to the conversation.

That means the first step is understanding the basics.

A brief technical history of web tracking (for the generalist)

Search marketers know more than most about web tracking. We know enough to set people straight at dinner parties — “No, your Wear OS watch is not spying on you” — and follow along at conferences like SMX when a speaker references the potentially morbid future of data management platforms. Yet most of us would not feel confident in front of a whiteboard explaining how cookies store data or advising our board of directors on CCPA compliance. 

That’s okay. We’ve got other superpowers, nice shiny ones that have their own merit. Yet the events unfolding in 2020 will define our role as marketers and our value to consumers. We find ourselves in the middle of a privacy debate, and we should feel equipped to participate in it with a grasp of the key concepts. 

What is the cookie? 

A cookie stores information that is passed between browser and server to provide consistency as users navigate pages and sites. Consistency is the operative word. That consistency can benefit consumers: take the common shopping cart example. 

Online shoppers add a product to the cart and, as they navigate the site, the product stays in the shopping cart. They can even jump to a competitor's site to compare prices and, when they return, the product is still in the shopping cart. That consistency makes it easier for them to shop, navigate an authenticated portion of a site, and exist in a modern multi-browser, multi-device digital world.

Consistency can also benefit marketers. Can you imagine what would happen to conversion rates if users had to authenticate several times per visit? The pace of online shopping would grind to a crawl, Amazon would self-combust, and Blockbuster Video would rise like a phoenix.

But that consistency can violate trust. 

Some cookies are removed when you close your browser. Others can accrue data over months or years, aggregating information across many sites, sessions, purchases and content consumption. The differences between cookie types can be subtle while the implications are substantial.

Comparing first- and third-party cookies

It is important for marketers to understand that first- and third-party cookies are written, read and stored in the same way. Simo Ahava does a superb job expanding on this concept in his open-source project that is absolutely recommended reading. Here’s a snippet.

It’s common in the parlance of the web to talk about first-party cookies and third-party cookies. This is a bit of a misnomer. Cookies are pieces of information that are stored on the user’s computer. There is no distinction between first-party and third-party in how these cookies are classified and stored on the computer. What matters is the context of the access.

The difference is the top-level domain that the cookie references. A first-party cookie references and interacts with one domain and its subdomains. 


A third-party cookie references and interacts with multiple domains. 


Marketing Land has a helpful explainer, aptly called WTF is a cookie, anyway? If you're more of a visual learner, here is a super simplistic explanation of cookies from The Guardian. Both are from 2014, so not current, but the basics are still the basics.

Other important web tracking concepts

Persistent cookies and session cookies refer to duration. Session cookies expire at the end of the session when the browser closes. Persistent cookies do not. Data duration will prove to be an important concept in the regulation sections. 
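The duration distinction can be seen directly in the Set-Cookie header: a cookie is persistent only if the server attached a Max-Age (or Expires) attribute. A minimal sketch using Python's standard http.cookies module; the header strings are illustrative:

```python
from http.cookies import SimpleCookie

def is_persistent(set_cookie_header: str) -> bool:
    # Persistent cookies carry a lifetime via Max-Age or Expires;
    # without one, the cookie dies when the browser session ends.
    cookie = SimpleCookie()
    cookie.load(set_cookie_header)
    morsel = next(iter(cookie.values()))
    return bool(morsel["max-age"] or morsel["expires"])

print(is_persistent("_ga=GA1.2.123; Max-Age=63072000"))  # True
print(is_persistent("sessionid=abc123"))                 # False
```

Browser policies like ITP work by shortening exactly this lifetime for certain classes of cookies.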

Cookies are not the only way to track consumers online. Fingerprinting, which uses the dozens of browser and device settings as unique identifiers, has gotten a lot of attention from platform providers, including a foreshadowed assault in Google’s Privacy Sandbox announcement.

Privacy Sandbox is Google’s attempt at setting a new standard for targeted advertising with an emphasis on user privacy. In other words, Google’s ad products and Chrome browser hope to maintain agreeable levels of privacy without the aggressive first-party cookie limitations displayed by other leading browsers like Safari and Firefox.

Storage is a broad concept. Often it applies to cookie storage and how browsers can restrict it, but there are other ways to store information. localStorage uses JavaScript to store information in the browser. Alternate storage approaches appeared to offer hope for web analysts and marketers affected by cookie loss, until recent browser updates made those tactics instantly antiquated.   

Drivers: How we got here

It would be convenient if we could start this story with one event, like a first domino to fall, that changed the course of modern data privacy and contributed to the world we see in 2020. For example, if you ask a historian about WWI, many would point to a day in Sarajevo. One minute Ol’ Archduke Ferdinand was enjoying some sun in his convertible, the next minute his day took a turn for the worse. It is hard to find that with tracking and data privacy. 

Facebook’s path to monetization certainly played a part. In the face of market skepticism about the social media business model, Facebook found a path to payday by opening the data floodgates.

While unfair to give Facebook all the credit or blame, the company certainly supported the narrative that data became the new oil. An iconic Economist article drew several parallels to oil, including the consolidated, oligopolistic tendencies of former oil giants.

“The giants’ surveillance systems span the entire economy: Google can see what people search for, Facebook what they share, Amazon what they buy,” the Economist wrote. “They own app stores and operating systems, and rent out computing power…”

That consolidation of data contributed to an increase in the frequency and impact of data leaks and breaches. Like fish in a bucket, nefarious actors knew where to look to reap the biggest rewards on their hacking efforts.

It was a matter of time until corporate entities attempted to walk the blurring line of legality, introducing a new weaponization of data that occurred outside of the deepest, darkest bowels of the internet.

Enter Cambridge Analytica. Two words that changed the way every web analyst introduced themselves to strangers. “I do analytics but, you know, not in, like, a creepy way.”

Cambridge Analytica, the defunct data-mining firm entwined in political scandal, shed a frightening light on the granularity and unchecked accessibility of platform data. Investigative reporting revealed to citizens around the world that their information could not only be used by advertising campaigns to sell widgets, but also by political campaigns to sell elections. For the first time in many homes, the effects of modern data privacy became tangible and personal.  

Outcomes: Where we are today

The state of data privacy in 2020 can perhaps best be understood by framing it in terms of drivers and destinations. Consumer drivers, like those mentioned in the previous section, created reactions from stakeholders. Some micro-level outcomes, like actions taken by individual consumers, were predictable. 

For example, the #deletefacebook hashtag first trended after the Cambridge Analytica story broke and surveys found that three-quarters of Americans tightened their Facebook privacy settings or deleted the app on their phone. 

The largest outcomes are arguably happening at macro levels, where one (re-)action affects millions or hundreds of millions of people. We have seen some of that from consumers with the adoption of ad blockers. For publishers and companies that live and die with the ad impression, losing a quarter of your ad inventory due to ad blockers was, and still is, devastating. 

Political Outcomes

Only weeks after Cambridge Analytica found its infamy in the headlines, the European Union adopted GDPR to enhance and defend privacy standards for its citizens, forcing digital privacy discussions into both living rooms and board rooms around the world.  

Let’s use the following Google Trends chart for “data privacy” in the United States to dive deeper into five key outcomes.

General Data Protection Regulation (GDPR) has handed out more than €114 million in fines to companies doing business in the EU since becoming enforceable in May 2018. It's been called "Protection + Teeth" in that the law provides a variety of data protection and privacy rights to EU citizens while allowing fines of up to €20 million or 4 percent of revenue, whichever hurts violators the most.

Months later, the United States welcomed the California Consumer Privacy Act (CCPA), which went into effect in January 2020 — becoming enforceable in July. Similar to GDPR, a central theme is transparency, in that Californians have the right to understand which data is collected and how that data is shared or sold to third parties.

CCPA is interesting for a few reasons. California is material: the state represents a double-digit share of both the US population and gross domestic product. It is also not the first time California's novel digital privacy legislation has influenced a nationwide model. The state introduced the first data breach notification law in 2003, and other states quickly followed.

California is not alone with CCPA, either. Two dozen US state governments have introduced bills around digital tracking and data privacy, at least a dozen of which are still pending. That includes Nevada's SB220, which was enacted and became enforceable within a matter of months in 2019.

Corporate Outcomes

Corporate responses have come in many forms, from the ad blockers I mentioned to platform privacy updates to the dissolution of ad-tech providers. I will address some of these stories and trends in the following section but, for now, let's focus on the actions of one technology that promises to trigger exponential effects on search marketing: web browsers.

The Safari browser introduced Intelligent Tracking Prevention (ITP) in 2017 to algorithmically limit cross-site tracking. Let’s pause to dissect the last few words in that sentence.

  • Algorithmically = automated decisions that prioritize scale over discernment
  • Limit = block immediately or after a short duration
  • Cross-site tracking = first- and third-party cookies

ITP 1.0 was only the beginning. From there, the following iterations tightened cookie duration, storage, and the role of first-party cookies for web analytics. Abigail Matchett explains the implications for users of Google Analytics.

“All client-side cookies (including first-party trusted cookies such as Google Analytics) were capped to seven days of storage. This may seem like a brief window as many users do not visit a website each week. However, with ITP 2.2 and ITP 2.3… all client-side cookies are now capped to 24-hours of storage for Safari users… This means that if a user visits your site on Monday, and returns on Wednesday, they will be granted a new _ga cookie by default.”

You are beginning to see why this is a big deal. Whether intended or not, these actions reinforce the use of quantitative metrics rather than quality measures by obstructing attribution. There is far more that can be said on ITP, so if you are ready for a weekend read, I recommend this thorough technical assessment of the ITP 2.1 effects on analytics.

If ITP got marketers' attention, Google reinforced it by announcing that Chrome would stop supporting third-party cookies within two years, codifying for marketers that cookie loss was not a can to be kicked down the road. 

“Cookies have always been unreliable,” Simo Ahava told me. “To be blind-sided by the recent changes in web browsers means you haven’t been looking at data critically before. We are entering a post-cookie world of web analytics.”

Where it goes from here

The state of tracking and data privacy can take several paths from here. I outline a few of the most plausible then ask others in the analytics and digital space to offer their insights and recommendations. 

2020 Path A: Lack of clarity leads to little change from search marketers

This outcome seemed like a real possibility in the first week of January as California enacted CCPA while enforcement deadlines got delayed. It was not yet clear what enforcement would look like later in the year and it appeared, despite big promises, that tomorrow would look a lot like today. 

This path looked less likely after the second week of January. That leads us to the next section.

2020 Path B: Compounding tracking limitations keep marketers on their heels

Already in 2020 we have seen CCPA take effect, Chrome put cookies on notice, stocks for companies that rely on third-party cookies tumble, and the sacrifice of data providers that threatened consumer trust.

And that’s just January.

2020 Path C: Correction as consumer fear eases in response to industry action

The backlash to tracking and privacy is a reaction to imbalance. Consumers are protecting their data, politicians are protecting their constituents, and platforms are protecting their profits. As difficult as it is to see from our vantage point today, it’s most likely that these imbalances will normalize as stakeholders feel safe. The question is how long it will take and how many counter adjustments are required in the wake of over or under correcting.

As digital marketers, we in some ways represent both the consumers with whom we identify and the platforms on which we depend, and we are in a unique position to expedite the correction and return to balance.

The post The state of tracking and data privacy in 2020 appeared first on Marketing Land.

Who’s Hiring in January 2019?

The post Who’s Hiring in January 2019? appeared first on Brooks Bell.

Here are our picks:

Website Optimization Specialist – In Atlanta, SunTrust is looking for a specialist to be responsible for “developing and executing business strategies, processes and policies to enhance the sales and service experiences intrinsic to SunTrust’s digital spaces.”

A/B Testing & Personalization Analyst – Join Barnes & Noble’s Optimization team in New York to “improve content, design, and usability for customers and to create unique experiences based on customers’ preferences and behaviors.”

Director-Digital Product Analytics & Testing – Join the Enterprise Digital and Analytics team at American Express in New York. They are looking for a leader to “provide value to the online card shopping experiences within the Global Consumer and Commercial businesses through customer data and measurement, insights through analytics techniques and experimentation.”

Marketing Manager, International Conversion – Ancestry is looking for a candidate to join their Conversion Marketing team in San Francisco. This person is “responsible for improving and optimizing the user experience at each step in the conversion funnel with the end goal of maximizing revenue from visitors in each of Ancestry’s key global markets.”

Marketing Manager, A/B Testing & Optimization – Join Auth0’s Growth Team in “driving improvement in key engagement metrics and customer experience throughout the customer lifecycle.”

Director of B2B Marketing, Demand Generation – Join Vimeo’s B2B marketing team in New York to “scale qualified lead acquisition, build and continuously optimize digital marketing, account-based marketing (ABM), email automation, social, and event-based marketing channels.”

Sr. Analyst, eCommerce Direct to Consumer Analytics – Newell Brands is looking for a senior analyst in Hoboken, New Jersey, to drive “sustainable growth online through the best-in-class use of data and analytics.”

Digital Marketing Leader – Website Optimization – Join GE Healthcare in Wauwatosa, Wisconsin, to “develop a rigorous testing and experimentation framework, and conceive, scope and implement experimentation initiatives to improve the website user experience and drive conversion rate optimization.”

Manager, Marketing Planning, Test & Analysis – Express is looking for an individual to lead the testing and optimization program in Columbus, Ohio, “starting with A/B & multivariate testing taking us into experience optimization and eventually personalization.”


Looking for a job or to fill a position? Give us a shout and we’ll help spread the word in our next careers blog post.



Free Guide: How to Strategize & Execute Profitable Personalization Campaigns

The post Free Guide: How to Strategize & Execute Profitable Personalization Campaigns appeared first on Brooks Bell.

When I speak with our clients, it often strikes me how many of them feel overwhelmed by the very idea of personalization.

Our imagination, often fueled by the marketing teams of various software companies, creates a perfect world where personalization enables every interaction to be completely custom for every individual. In this dreamland, artificial intelligence and machine learning solve all our problems. All you have to do is buy a new piece of software, turn it on, and…BOOM: 1:1 personalization.

As a data scientist, I’ll let you in on a little secret: that software only provides the technological capability for personalization. What’s more, the algorithms found within these tools simply assign a probability to each potential experience that maximizes the desired outcome, given the data they have access to. Suffice it to say, they’re not as intelligent as you are led to believe.
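
To make that less abstract, here’s a minimal sketch of what such an algorithm is doing under the hood. This is an illustration, not any vendor’s actual implementation: the segment names, experience names, and the simple conversion-rate estimate are all assumptions for the example.

```python
from collections import defaultdict

class ExperienceSelector:
    """Toy model of algorithmic personalization: for each candidate
    experience, estimate the probability of the desired outcome
    (e.g. conversion) given what is known about the user, then
    serve the experience with the highest estimate."""

    def __init__(self, experiences):
        self.experiences = experiences
        # (segment, experience) -> [conversions, impressions]
        # Start impressions at 1 so unseen pairs estimate to 0, not divide by zero.
        self.stats = defaultdict(lambda: [0, 1])

    def record(self, segment, experience, converted):
        """Log one impression and whether it converted."""
        s = self.stats[(segment, experience)]
        s[1] += 1
        if converted:
            s[0] += 1

    def estimate(self, segment, experience):
        """Estimated conversion probability for this pairing."""
        conversions, impressions = self.stats[(segment, experience)]
        return conversions / impressions

    def choose(self, segment):
        """Serve the experience with the highest estimated probability."""
        return max(self.experiences, key=lambda e: self.estimate(segment, e))
```

A real tool layers far more sophisticated modeling on top, but the core loop is the same: observe outcomes, update probability estimates, and pick the experience those estimates favor — it is only as smart as the data it has seen.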

If you caught our first post in this series, you already know that we define personalization a bit more broadly, as any differentiated experience that is delivered to a user based on known data about that user. This means personalization exists on a spectrum: it can be one-to-many, one-to-few, or one-to-one.

And while there are many tools that enable you to do personalization from a technical standpoint, they don’t solve for one of the main sources of anxiety around personalization: strategy.

Most personalization campaigns fail because of a lack of a strategy that defines who, where and how to personalize. So I’ve put together a free downloadable guide to help you do just that. This seven-page guide is packed full of guidelines, templates and best practices to strategize and launch a successful personalization campaign, including:

  • Major considerations and things to keep in mind when developing your personalization strategy.
  • More than 30 data-driven questions about your customers to identify campaign opportunities.
  • A template for organizing and planning your personalization campaigns.
  • Guidelines for determining whether to deliver your campaigns via rule-based targeting or algorithmic targeting.
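
To make that last distinction concrete, here’s a hypothetical sketch of rule-based targeting (the attribute names and experience names are invented for the example, not taken from the guide). With rules, a marketer hard-codes an auditable decision table; algorithmic targeting instead learns the mapping from data:

```python
def rule_based_experience(user):
    """Rule-based targeting: a marketer-defined decision table.
    Each branch is an explicit, human-readable business rule
    mapping known user data to a differentiated experience."""
    if user.get("loyalty_tier") == "gold":
        return "vip_offer"
    if user.get("visits", 0) > 3:
        return "returning_visitor_banner"
    return "default_experience"
```

Rules are transparent and easy to QA but scale poorly past one-to-few; algorithmic targeting scales toward one-to-one but gives up that transparency — which is exactly the trade-off the guide helps you reason through.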

Free Download: Plan & Launch Profitable Personalization Campaigns.


Thank You + Brooks Bell’s Best of 2018

The post Thank You + Brooks Bell’s Best of 2018 appeared first on Brooks Bell.

It’s January 3, and if you’re like us, you’re already heads down at your desk and neck deep in emails. But we’d be remiss if we didn’t take a minute to reflect on the previous year.

In November of 2018, we quietly celebrated 15 years of being in business. When Brooks Bell was founded, experimentation was in its infancy. But despite all the changes we’ve experienced since then, one thing remains true: it is the opportunity to connect with so many interesting people that are solving big problems for their business that makes our work worthwhile. Thanks for walking with us.

A look back at some of our big moments from 2018

Winning like Winona

In January, our Founder & CEO, Brooks Bell, was recognized as one of 25 women who rocked digital marketing in 2017. Later in the year, she was also announced as a Southeastern Finalist for EY’s Entrepreneur of the Year award. 

We also celebrated 2017’s record-breaking growth, were recognized as Optimizely’s North American Partner of the Year, and we garnered our local business journal’s Best Places to Work award.

Getting Lit with Illuminate

Fun fact: We originally built Illuminate to help us better manage and iterate upon our clients’ tests. Over time, we got so much great feedback, that we decided to make it available to everyone this year.

Now, with a successful beta launch under our belt and even more new features being added to the software, we’re excited to see where this new endeavor takes us in 2019.

F is for Friends, Fun and…Fear?

In October, things got a little spooky around the office and it had everything to do with Scott, our Director of Sales, who decided to channel his inner Ellen DeGeneres for the day (much to our colleagues’ horror). Watch the video if you dare.

Making Bacon for our Clients

Back in 2014, we set a Big Hairy Audacious Goal to achieve $1 billion in projected revenue for our clients. By the end of 2017, we’d reached $500 million. And this past December, we hit $1 billion. (cue ::gong::)

But we’re not resting on our laurels. We’ve set some aggressive goals for 2019, with a focus on personalization, and we’re pumped to get to work.

Brooks Bell takes the Bay Area 

In September, we officially opened the doors to our San Francisco office. This decision came after years of working with clients on the West Coast and our desire to work even more closely with them. And with the Bay Area’s rich history of innovation, we can’t think of a better place to help more companies push their boundaries through experimentation.

Still Clickin’ 

Last May, we hosted our annual Click Summit conference. We might be biased but this remains one of our favorite events as it’s filled with meaningful connections and seriously impactful takeaways. 2019 marks our 10th Click Summit, and we’ve got big plans. Request your invite today.
