Back to Basics: How every marketer can tame the analytics beast

Incorporating analytics into your work isn’t as challenging as you might think once you find the right resources (and many are free).

For most marketers, analytics exists in a magic Pandora’s box, encompassing everything from CPCs to CTRs, from algorithms to artificial intelligence, from machine learning to quantum computing — with a bit of blockchain sprinkled in for good measure.

Buzzwords aside, the barriers to incorporating analytics into your life aren’t as high as analytics behemoths may make it seem. To the contrary, once you clarify a few misconceptions, you can make this seemingly enigmatic field not only relevant but also remarkably useful.

You don’t need an Excalibur

Cost is an often-cited obstacle to starting a data journey. Despite the shiny advertisements you may see for Adobe’s Marketing Cloud (which costs upwards of $100,000 a year) and the dozens of LinkedIn messages you get from martech salespeople, you don’t need Fortune 500 money to take a stab at unlocking analytics. Google Analytics, Google Search Trends, Hotjar and HubSpot are just a few examples of industry-standard platforms that can dramatically improve your decision-making capabilities for free.

Even better, these platforms are made for data amateurs. Their interfaces are straightforward, and if you get lost, there are countless tutorials, help forums, boot camps and even classes to help you. Google also offers a certification program for Google Analytics, complete with videos and walkthroughs. It’s perfect for anyone who needs a place to start.

Don’t let the tool guide the craftsman

Marketers often forget that data is merely a tool. Expecting a Google Analytics tag to fix your website is like throwing a hammer at your newly opened IKEA purchase and expecting a sofa to emerge.

In other words: Collecting data is the easy part. Understanding what to do with all this info is where the magic happens.

So, spend a few weeks studying how to interpret data. Boot camps and classes are always helpful, but the secret that every engineer already knows is that YouTube and Google are your best friends. Dig out your notes from that statistics class in college and learn how to run a simple correlation in Excel. An investment of your time today learning how to interpret data will pay dividends for the rest of your career.
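
If you’d rather script that correlation than point and click, here’s a minimal Python sketch of the same check, assuming a hypothetical CSV export with daily “cost” and “conversions” columns:

```python
import pandas as pd

# Hypothetical file name; assumes a daily export with "cost" and "conversions" columns.
df = pd.read_csv("campaign_daily.csv")

# Pearson correlation, the same statistic Excel's CORREL() returns.
r = df["cost"].corr(df["conversions"])
print(f"Correlation between cost and conversions: {r:.2f}")
```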

Keep perspective

There are no sure things in marketing. Even scientists (and yes, I mean the ones in lab coats) often need years of data collection, rigorous modeling and endless testing to prove a hypothesis. And that’s in a lab. Imagine what happens in the real world, where things are constantly changing and driven by deadlines.

In this chaos, it’s no surprise that data rarely provides a bullet-proof answer. Sure, you can add more expensive technology, but it’s important to remember that, as marketers, we’re dealing in the realm of probability, not exact certitude.

What’s more, it’s okay to be wrong. Take every failure as a badge of honor; minimizing risk does not mean avoiding it entirely. A 95 percent chance of sunshine tomorrow still means rain is possible, and if it does rain, your decision not to bring an umbrella wasn’t necessarily incorrect. Make peace with the risk as long as you separate logic from emotion. In the long run, your data-driven approach will result in far more wins than losses.

You’re a solver of problems, not a creator of reports

All too often, people associate analytics with reporting. While reporting is critical, it is merely a means to an end. No business has ever been transformed by a single report.

Data is meant to be used as an unbiased means to test something. Nowhere in that definition does it stipulate that you must create daily, weekly or even monthly reports.

As we’ve seen, data takes time to collect. And while you should consistently check your data, it’s up to you to find the reporting cadence that works best for your team.

Then, instead of focusing on frequency, you can focus on presentation quality. Data is like a foreign language; it’s only useful if someone else understands what you’re saying. So, make sure your reports are thoroughly readable. Be concise, use visuals and err on the side of plain language. Above all, always return to the core business problem you’re trying to solve.

Next steps in your journey

Contrary to conventional wisdom, analytics isn’t shorthand for building sophisticated statistical models. Properly understood, analytics is a philosophy that embodies something much simpler: applying the scientific method to test your educated guesses. Whether you’re running a simple paid Facebook campaign or trying to get into shape for that Bahamas cruise this summer, you can leverage data to make more targeted, meaningful choices.

The reason you’ve read this far is that we agree on a key point: every marketer needs to integrate analytics to succeed in this digital world. In an age where it’s hard to keep up with the jargon, I fully empathize with those who view “analytics” as some enormous, mystical beast. On the contrary, understand that analytics is much more like a puppy; managing your data may be a little unruly at first, but with enough consistent training and respect, the lessons you learn will last you a lifetime.

A data journey can start tomorrow with nothing but a problem to solve or a hypothesis to prove (and a laptop with an internet connection).

So tell me, what are you waiting for?

Regression analysis to improve Google Ads performance

Using advanced techniques to make better predictions can help you stand out. Here’s a step-by-step guide to learning how to do a regression analysis.

Advanced digital marketing requires us to go beyond what everyone else is doing and approach problems from new angles. One of the ways to stand out in your SEM analysis and performance is through advanced techniques like regression analysis. Regression is actually a form of basic machine learning (ML) and a relatively simple mathematical application. This type of analysis can help you make better predictions from your data, beyond educated guessing.

Regression might sound scary, but it’s not that advanced in the world of mathematics. If you passed year 10 maths, you have probably already worked with a regression formula. We’re going to look at using regression in your Google Ads account to predict the conversion volume you can achieve by adjusting campaign spends. Building the model and applying it is far easier than you might think!

What is regression?

A regression model is an algorithm that tries to fit itself to the presented data best. In essence, it is a line of best fit. It can be linear, as a straight line through the data, or non-linear, like an exponential curve, which curves upwards. By fitting a curve to the data, you can then make predictions to explain the relationship between one dependent variable and one or more independent variables.

The plot below shows a simple linear regression between an independent variable “cost” (daily spend on Google Ads) on the x-axis and a dependent variable “conversions” (daily conversion volume on Google Ads) on the y-axis. We have fit a linear regression line (blue). We can now say that at $3k on the x-axis, the corresponding point on the regression line matches up to 35 conversions. So, based on the regression model fitted to the data, if we spend $3k, we are predicted to receive 35 conversions.
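
To make the mechanics concrete, here’s a minimal sketch of that fit in Python; the data points are illustrative, not the account data behind the chart:

```python
import numpy as np

# Illustrative daily data: spend in dollars and conversions per day.
cost = np.array([500, 1000, 1500, 2000, 2500, 3500, 4000])
conversions = np.array([8, 13, 19, 24, 30, 41, 46])

# Fit a straight line (degree-1 polynomial): conversions = slope * cost + intercept.
slope, intercept = np.polyfit(cost, conversions, 1)

# Predict daily conversions at a $3,000 daily spend (roughly 35 with these numbers).
print(slope * 3000 + intercept)
```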

Headstart on feature selection

I’ve been running many of these regression models, and I’ll share what I’ve found to be true, which will give you a headstart on where to look.

Multiple regression is where several independent variables are used (rather than just one, as in the example above) to predict one dependent variable. With Google Ads, I’ve found that there is always one independent variable that is the strongest predictor of conversions. You could probably have guessed which one it is already.

When running ML models on daily labeled training data to predict whether certain features would lead to a conversion, we continually found that, all other things being equal, campaign spend is the strongest predictor of conversion volume.

The following table shows the “Root Mean Squared Error” (RMSE) for different ML models.

RMSE is a measure of error: it shows how far off the fitted model is from the training data. The lower the error, the better – it means the model is more accurately fitted to the data. All features include: day of week, keyword, CTR, CPC, device, final URL (landing page), ad position and cost.

We ran five different machine learning algorithms: Decision Tree, K Nearest Neighbours, Linear Regression, Random Forest and Support Vector Regression. In most cases, removing “cost” as a feature in the data set increased the error value by more than removing any other feature. This means that the model became less accurate at predicting the correct outcome.
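
As a rough illustration of this ablation check (not our exact pipeline), here’s how you might compare RMSE with and without a feature in scikit-learn; the DataFrame and column names are assumptions:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

def rmse_without(df: pd.DataFrame, target: str, dropped: str) -> float:
    """Train a random forest with one feature removed and return its test RMSE."""
    X = df.drop(columns=[target, dropped])
    y = df[target]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    return float(np.sqrt(mean_squared_error(y_te, model.predict(X_te))))

# The feature whose removal raises RMSE the most is the strongest predictor.
# Given a numeric daily DataFrame `df` with a "conversions" column:
# for feature in ["cost", "ctr", "cpc", "day_of_week"]:
#     print(feature, rmse_without(df, "conversions", feature))
# (RandomForestRegressor also exposes feature_importances_ for the ranking below.)
```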

We can also analyze the feature importance used by the random forest (the best model). It’s clear that cost is the key feature the algorithm is using to determine its results.

This shouldn’t come as too much of a surprise – the more you spend, the more likely you will receive sales. Using cost as a predictor for sales is a great place to start your regression analysis.

Building a regression from scratch with Google Ads data

Here we’ll show you how to build a regression model with “daily cost” as the independent variable and “daily conversions” as the dependent variable. We’re going to do this in 5 easy steps.

Note: This will only work with a Google Ads account that has conversion data in it.

Step 1 – Create report:

Within Google Ads, navigate to Reports >> Predefined Reports >> Time >> Day

Step 2 – Prepare report and download:

Once in the report (screenshot below), select the “columns” button (red box), then remove all columns except “Cost” and “Conversions.” Then select a date range going back one year from today (blue box). Lastly, download the report as an Excel .csv file (green box).

Step 3 – Generate scatter graph in Excel:

Open the Excel file and select the columns that contain only the “cost” and “conversions” data. In the example below, cells C3:D17. Then, in the menu bar, select “Insert” >> “Scatter graph.”

Step 4 – Generate regression line on scatter graph:

We’ve now got a beautiful scatter graph portraying “cost” and “conversions.” Generate a regression line by right-clicking on any of the data points and selecting “add trendline.”

Step 5 – Choose best regression line using r-squared:

In the menu on the right-hand side, you are now able to select different regression options (red box). Select the checkbox “Display R-squared value on chart” (pink box). In a general sense, the higher the r-squared, the better the fit of the line. As you cycle through different regression lines, you can view which has the highest r-squared value. You can also decide visually which appears to fit best. Next, display the regression formula for the fit you have chosen (green box). We will use this formula to make predictions.
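
If you want to run the same comparison outside of Excel, here’s a sketch that fits a linear and a logarithmic trendline and compares their r-squared values; the data is illustrative:

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination for a fitted curve."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1 - ss_res / ss_tot

# Illustrative daily data from the downloaded report.
cost = np.array([500, 1000, 2000, 3000, 4500, 6000])
conv = np.array([5, 18, 34, 45, 52, 58])

# Candidate 1: straight line, y = a*x + b.
a, b = np.polyfit(cost, conv, 1)
print("linear R^2:", r_squared(conv, a * cost + b))

# Candidate 2: logarithmic curve, y = a*ln(x) + b (Excel's "Logarithmic" trendline).
a, b = np.polyfit(np.log(cost), conv, 1)
print("log R^2:", r_squared(conv, a * np.log(cost) + b))
```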

Making extended predictions using the regression equation

The regression line that we have just created is extremely useful. Even from a purely visual perspective, you can now read off what your expected daily conversions will be at any level of daily cost.

Although this can be done visually, using the regression formula is more accurate and you can also extend the predictions off the graph. In the example below that I have plotted (with a larger account), the regression equation is given as y = 28.782*ln(x) – 190.36.

In the equation, y represents conversions and x represents cost. To predict y for any given x, we replace x with a real number. Let’s assume a cost of $5,000. We say y = 28.782*ln(5,000) – 190.36. Using a calculator, it comes out to roughly 55 conversions per day.

Now the real power here comes when we extend this calculation beyond the graph to where spend has not been before. The data points on the graph show the highest spend ever performed per day was under $7,000. If we replace x with 10k (a predicted spend of $10,000 per day), the formula gives an estimate of 74.7 conversions per day.
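
The arithmetic is easy to script. A few lines applying the equation above make both predictions explicit (the $5,000 figure works out to about 54.8):

```python
import math

def predicted_conversions(daily_cost):
    """The fitted equation from the chart: y = 28.782*ln(x) - 190.36."""
    return 28.782 * math.log(daily_cost) - 190.36

print(predicted_conversions(5_000))   # ~54.8 conversions per day
print(predicted_conversions(10_000))  # ~74.7 conversions per day
```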

Bonus: Finding optimal points or diminishing returns with CPA

Graphing “cost” and “conversions” together is extremely powerful for predicting conversions at different spends. But in reality, we’re often more interested in minimizing CPA or predicting conversions at a specific CPA. We can similarly graph CPA against conversions to better understand this.

From the CPA chart on the right, we identify a minimal point where CPA is lowest on the cost dimension; this is the bottom of the ‘U’ shape. This point also corresponds to the green line on the left graph (cost vs. conversions).

Using this methodology, we can now identify the lowest potential CPA, the cost at which it occurs, and the number of conversions we would receive at that point. The same can be done for any point on the CPA line.
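
Rather than eyeballing the bottom of the ‘U,’ you can scan the modeled CPA curve numerically. A sketch using the same illustrative equation as above:

```python
import math

def conversions(daily_cost):
    """Same illustrative model as above: y = 28.782*ln(x) - 190.36."""
    return 28.782 * math.log(daily_cost) - 190.36

# Scan daily spend levels and find where the modeled CPA bottoms out (the 'U').
candidates = [
    (cost, cost / conversions(cost))
    for cost in range(1_000, 10_001, 50)
    if conversions(cost) > 0
]
best_cost, best_cpa = min(candidates, key=lambda pair: pair[1])
print(f"Lowest modeled CPA of ${best_cpa:.2f} at a daily spend of ${best_cost}")
```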

Disclaimers

It’s critical to mention that regression uses historical data only. All of the cost and conversion data is based on what has happened in the past. Therefore, if you expect your performance to improve and conversions to increase in the future, this will not be taken into account by these models. To adjust for this, using only more recent data, such as the past three or six months, could be a better option. Similarly, you can remove or include days during sales periods that may or may not be relevant, in order not to skew the data.

Case studies and application

Using this methodology, we have been able to achieve three key outcomes with clients:

  1. We have helped existing clients estimate what will happen if they increase their monthly spends by $10,000. This is a very common client question and this method is better than educated guesses since it is modeled with data.
  2. We have been able to show existing clients where the optimal CPA lies and how much potential exists in the account. For a major client of ours in the competitive legal space, this has allowed them to decrease CPAs by over 20 percent and keep conversion volume steady.
  3. This has made new account audits faster and more accurate for us. Without knowing too much about a new client, we have plugged historical “cost” and “conversion” data into a regression model to visualise whether they are spending the optimal amount and to discover the potential down the road.

Further exploration

Consider that many businesses are interested in revenue and ROI, rather than conversions and CPA. The same techniques can be used to predict revenue as well as options to maximize ROI (we look for maximal points rather than minimal). I’m currently building a PPC optimization tool to automate this graphing and prediction process.

Martech can deliver personalized consumer experiences, but humans are still required

The human component of marketing remains as important as ever to make technology more effective over the lifetime of a brand-consumer relationship.

In the ever-evolving age of digital marketing, marketers are competing against many things: time constraints, a noisy marketplace, endless channels and formats for content, just to name a few. But the most challenging is delivering contextually relevant content. Brand marketers are beginning to see help come from artificial intelligence. The talk of AI and martech isn’t new, but real use cases are now becoming a reality. Specifically, AI lends itself well to analytics and predictive data that marketers are now using to deliver what people care about most at any given moment. By taking into account all of the detailed data points across every digital channel over the lifetime of a brand-and-customer relationship, marketing is becoming increasingly relevant and personalized.

Data is the fuel for AI

AI has proven most useful in analyzing data and putting meaningful optimizations to work in real-time. It has the potential to greatly collapse the time-to-market for optimization as compared to traditional operational processes that take weeks or even months to deploy, analyze and improve based on the results. It helps brands understand what, when and how to serve customers to drive the dream state marketers have been aspiring to reach for a long time: real-time contextual relevance.

To get to contextual relevance, AI finds its fuel in data. There are a couple of different types of data that brands can lean on to make AI more intelligent:

  • Implicit data – This is data collected from behaviors such as clicks, purchases, engagement with certain types of content/products, etc. It’s the breadcrumb trail of data exhaust that is naturally left behind by digital engagements without having to ask consumers for it. It’s the largest data set that brands have, and it’s also the hardest to manage and put to use in meaningful ways.
  • Explicit data – This is data collected by soliciting and receiving direct feedback from customers. This can be profile data points, NPS score or other survey response data, feedback during customer service interactions, etc.

Implicit data is the most powerful data set because it’s massive, and the more data AI is fed, the smarter it becomes. Explicit data is incredibly important for AI as well, but you don’t get a full picture of that data set without merging digital and customer service inputs.

The human channel and AI

The gap between customer service and digital marketing is still wide, but closing that gap is more important now than ever. Customer expectations are rising, as are the chances that people will abandon brands due to poor service and marketing. Customer service is the place where one-to-one personalization is king, and arguably where the most impactful interactions between brands and customers happen. When it comes to salvaging and strengthening relationships by creating unique, personalized experiences, service can be the definer.

While real human interaction will become the differentiator, AI can optimize these interactions. Just as a marketer would orchestrate email, SMS, push, app and web experiences, brands can add the “human channel” to the mix. AI can help determine when a human should be the channel of choice and suggest what the message from that human could be. As interactions become more digitally focused, including the emergence of chatbots, speaking with a knowledgeable human at a brand will anchor any brand-and-consumer relationship.

And regarding the explicit data that is needed to fuel AI? The information flow needs to be bi-directional to ensure that data can be captured by the human channel and fed back into the brand’s data ecosystem, ultimately to be used by other channels. While the human interaction is a cost center for a brand, the value is unmatched, bringing dual benefit to both the customer and the brand; valuable data is collected to make the next engagement even better. It’s a win-win.

Things to consider

Three things to consider when implementing AI solutions:

  • Scale: Unfortunately, many brands that fine-tune their data and marketing strategies do so without the ability to truly scale, so they stop at ‘walk,’ so to speak, and can’t get to run. If that applies to you, reconfigure your scaling strategy and work AI components in from the start, with the clear goal that it is there to boost your ability to operate exponentially.
  • Keep it personal: Leaning too much on machines and AI makes brand-to-consumer relationships impersonal. Clearly define your engagement strategies and prioritize the touchpoints that benefit from the human touch. Sincerity in building the relationship is paramount.
  • Lean on your customer data: Unless they opt out, every customer is leaving a breadcrumb trail of feedback, and those data points are clues to how you can better serve them. AI can be instrumental in understanding a customer and strengthening those relationships by providing a contextually relevant view of the customer and generating recommendations on how best to move forward.

As an industry, we get excited about AI and how it can change the way we interact and understand our customers. But at the end of the day, the human component of marketing remains most important and is required to make our technology, including AI, more effective. Customer service reps, data scientists and digital strategists with an eye for emerging tech will be valuable players that will ensure the entire ecosystem is operating to its fullest potential.

Confirm the integrity of your data

Work with partners who can help you build an early warning system for any data issues that arise because your investment is at stake.

Never before has there been a greater need for a reliable, holistic marketing measurement tool. In a world of fractured media and consumer interest, intense competitive pressure, and lightning-speed product innovation, the sheer volume of data that must be analyzed and the decisions that must be made demand a more evolved approach to attribution and decision making. This need for speed has brought into bright focus a mandate for reliable, consistent and valid data, and the potential for challenges when there are errors.

The attribution category has been evolving quickly over the past decade, and there are myriad options from which marketers can choose. Recent research conducted by Forrester suggests that leading marketers are adopting the newest and most advanced approach: Unified Measurement or Total Marketing Measurement models. This analysis combines the attributes of person-level measurement with the ability to measure traditional channels such as TV. Marketers who upgrade to and invest in novel solutions – financially and organizationally – can find a competitive advantage from smarter attribution.

The greatest of these instruments answer questions such as the optimal frequency and reach within and between channels, and determine which messages and creative are best for which audiences. New advances in these products are providing even more granular insights concerning message sequencing and next-best-message decisioning based on specific audiences and the multiple stages of their buying processes. The best of these solutions incorporate external and environmental circumstances such as weather, travel patterns and more. Furthermore, today’s solutions produce insights in such a timely fashion that agile marketers can incorporate those insights into active campaigns to drive massive performance gains, rather than waiting for weeks or months to see returns.

However, while these attribution models have come a long way in recent years, there is one challenge that all must tackle: the need for reliable, consistent and valid data. Even the most advanced and powerful of these systems are dependent on the quality of the information they ingest. Incorrect or sub-par inputs will always produce the wrong outputs. Data quality and reliability have become a primary focus of marketing teams and the forward-thinking CMOs who lead them.

If the data are not accurate, it doesn’t matter what statistical methods or algorithms we apply, nor how much experience we have in interpreting data. If we start with imperfect data, we’ll end up with erroneous results. Basing decisions on a conclusion derived from flawed data can have costly consequences for marketers and their companies. Inaccurate data may inflate or give undue credit to a specific tactic. For example, a model may indicate that, based on a media buy, a television advertisement – usually one of the most expensive of our marketing efforts – was responsible for driving an increase in visitors to our website. But if this ad failed to air and there is inaccurate data in a media log, the team may wrongly reallocate budget to their television buy. This would be a costly mistake.

In fact, inaccurate data may be one of the leading causes of waste in advertising. These inaccuracies have become an epidemic that negatively impacts both advertisers and the consumers they are trying to reach. Google recently found that, due in large part to bad data, more than 56 percent of ad impressions never actually reach consumers, and Proxima estimates $37 billion of worldwide marketing budgets go to waste on poor digital performance. And that’s just digital. The loss for major players who market online and offline can be extensive, and it’s calling for a revolutionary new approach to data quality and reliability.

So, how accurate is your data? Do you know if there are gaps? Are there inconsistencies that may skew your results? Many of us put so much trust in our data systems that we forget to ask these critical questions. You can’t just assume you have accurate data – now more than ever, you must know you do. That may require some work up front, but the time you invest in ensuring accurate data will pay off in better decisions and other significant improvements. Putting steps and checks in place early in the process to ensure the timely and accurate reporting of data is key to avoiding costly mistakes down the road. Solving these problems early in your attribution efforts helps build confidence in the optimization decisions you’re making to drive higher return on investment and, perhaps more importantly, will help teams avoid costly missteps.

When it comes to attribution, it is especially critical to make sure the system you are relying on has a process for analyzing and ensuring that the data coming in is accurate.

Below are four key considerations you can use, when working with your internal analytics staff, agencies, marketing team and attribution vendor, to improve data input and validation and ensure accurate conclusions.

1. Develop a data delivery timetable

The entire team should have a clear understanding of when data will be available and, more importantly, by what date and/or time every data set will arrive. Missing or unreported data may be the single most significant threat to drawing accurate conclusions. Like an assembly line, if data fails to show up on time, it will stop production for the entire factory. Fortunately, this may also be one of the easiest challenges to overcome. Step one is to conduct an audit of all the information you are currently using to make decisions. Map the agreed-upon or expected delivery date for every source. If you receive a weekly feed of website visitors, on what day does it typically arrive? If your media agency sends a monthly reconciliation of ad spend and impressions, what is the deadline for its delivery?

Share these sources of information and the schedule of delivery with your attribution vendor. The vendor, in turn, should develop a dashboard and tiered system of response for data flow and reporting. For example, if data is flowing as expected, the dashboard may include a green light to indicate all is well. If the information is a little late, even just past the scheduled date but within a predefined window of time, the system should generate a reminder to the data provider or member of the team who is responsible for the data letting them know that there may be a problem. However, if data is still missing past a certain point, threatening the system’s ability to generate optimizations, for example, an alert should be sent to let the team know that action is needed.
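
A minimal sketch of that tiered logic follows; the feed names, dates and grace windows are placeholder assumptions, not a vendor’s actual dashboard:

```python
from datetime import datetime, timedelta

# Hypothetical schedule: feed name -> (expected arrival, grace window).
EXPECTED = {
    "site_visitors_weekly": (datetime(2019, 6, 3, 9, 0), timedelta(hours=6)),
    "agency_spend_monthly": (datetime(2019, 6, 5, 12, 0), timedelta(days=1)),
}

def delivery_status(feed, arrived_at=None, now=None):
    """Green if on time, a reminder inside the grace window, an alert beyond it."""
    now = now or datetime.now()
    expected, grace = EXPECTED[feed]
    if arrived_at is not None and arrived_at <= expected + grace:
        return "green: data flowing as expected"
    if now <= expected + grace:
        return "reminder: nudge the data owner, delivery is running late"
    return "alert: data missing past the window, optimizations are at risk"
```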

2. Create standard templates for routinely reported data

You, members of your team, and your attribution partner need a clear understanding of what specific data is included in which report and in what formats. It would be a shame to go through the hard work of making sure your information is arriving on time only to find out that the data is incomplete or reported inconsistently. To use the assembly line analogy again, what good is it to make sure a part arrives on time if it’s the wrong part that’s delivered?

Like quality control or a modern-day retinal scan, the system should check to see if the report matches expected parameters. Do the record counts match the number of records you expected to receive? If data from May was expected, do the dates make sense? And, is all the information that should be in the report included? Are there missing data?

With this system in place, a well-configured attribution solution or analytics tool should be able to test incoming data for both its completeness and compliance with expected norms. If there are significant gaps in the data or if data deviates overmuch from an acceptable standard, the system can again automatically alert the team that there may be a problem.
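
Here’s a sketch of what such a “retinal scan” might look like in code; the expected columns, month and row counts are placeholder parameters:

```python
import pandas as pd

def validate_report(df, expected_columns, expected_month, min_rows):
    """Return a list of problems; an empty list means the report passes the scan."""
    problems = []
    missing = set(expected_columns) - set(df.columns)
    if missing:
        problems.append(f"missing columns: {sorted(missing)}")
    if len(df) < min_rows:
        problems.append(f"only {len(df)} rows, expected at least {min_rows}")
    if "date" in df.columns:
        months = pd.to_datetime(df["date"]).dt.month
        if not (months == expected_month).all():
            problems.append("dates fall outside the expected month")
    if df.isna().any().any():
        problems.append("report contains missing values")
    return problems
```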

3. Use previous data from the source to confirm new data

Your attribution provider should be able to use data previously reported from a source to help identify any errors or gaps in the system. For example, you can include in your data feed multiple weeks or months of previously reported data; a four-week feed will then contain one new set of data and three previous, overlapping sets. If the overlapping data does not match, that should trigger an alert.

Now you’ll want to determine if the data makes sense. You want to see if new data is rational and consistent with that which was previously reported. This check is a crucial step in using previously published data to confirm the logic of more recent data reported.
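
A sketch of that overlap comparison, assuming each new feed restates prior periods and shares key columns with the previous one:

```python
import pandas as pd

def overlap_mismatches(previous, latest, keys=("date", "campaign"), metric="spend"):
    """Join the prior feed to the new one on shared rows and flag metric changes."""
    merged = previous.merge(latest, on=list(keys), suffixes=("_prev", "_new"))
    mask = merged[f"{metric}_prev"] != merged[f"{metric}_new"]
    return merged[mask]  # a non-empty result should trigger an alert

# Any row whose restated spend differs from what was reported last week comes
# back for investigation before it can confound downstream conclusions.
```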

Here, too, you can check for trends over time to see if data is consistent or if there are outliers. Depending on the specific types of media or performance being measured, a set of particular logic tests should be developed. For example, is the price of media purchased within the range of what is typically paid? Are the reach and frequency per dollar of the media what was expected?

Leading providers of marketing attribution solutions are continually performing these checks to ensure data accuracy and consistent decision making. With these checks in place, the marketing attribution partner can diagnose any problems, and the team can act together to fix them. This technique has the added benefit of continuously updating information to make sure errors, or suspicious data, don’t linger to confound ultimate conclusions.

One note that should be taken into account: outliers are not necessarily pieces of bad data. Consider outliers as pieces of information that have not yet been confirmed or refuted. It is a best practice to investigate outliers to understand their source, or to hold them in your system to see whether they are the beginning of a new trend.

4. The benefit of getting information from multiple sources

Finally, there are tangible benefits to confirming data from multiple data sets. For example, does the information about a customer contained in your CRM conform with the information you may be getting from a source like Experian? Does data you’re receiving about media buys and air dates match the information you may be receiving from Sigma encoded monitoring?

Even companies that are analytics early adopters find themselves challenged to ensure the data upon which they rely is consistent, reliable and accurate. Marketers understand that they have to be gurus of data-driven decision making, but they can’t just blindly accept the data they are given.

Remember, as we have mentioned, despite the potential benefits of a modern attribution solution, erroneous data ensures its undoing. To be certain your process is working precisely, create a clear understanding of the data and work with a partner who can build an early warning system for any issues that arise. Ultimately, this upfront work ensures more accurate analysis and will help achieve the goal of improving your company’s marketing ROI.

As a very first step, since data may come from multiple departments inside the company and the various agencies that support the team, develop a cross-functional steering committee consisting of representatives from analytics, marketing, finance, and your digital and traditional media agencies. The steering committee should include one member responsible for overall data quality and flow. As a team, work together to set benchmarks for quality and meet regularly to discuss areas for improvement.

In this atmosphere of fragmented media and consumer (in)attentiveness, those who rely on data-driven decision-making will gain a real competitive advantage in the marketplace. The capabilities of today’s solutions produce insights in such a timely fashion that the nimblest marketers can incorporate those insights into active campaigns to drive massive performance improvements, rather than waiting for weeks or months to see results. But the Achilles heel of any measurement system is the data on which it relies to generate insight. All other things being equal, the better the data going in, the better the optimization recommendations coming out.

4 simple ways small businesses can use data to build better customer relationships

An effective CRM strategy needs to include a way to collect, organize and take action with your data.

In a world where customers are bombarded across every possible channel with brand messages, targeting is more important than ever before. Small businesses need to be able to make their campaigns feel relevant and personal in order to keep up, but the processes involved – collecting, organizing and interpreting customer data to make it actionable – are often intimidating to small businesses and solo entrepreneurs with limited time and resources.

Collecting, organizing and learning from your customer data is critical no matter how large your team is or what stage of growth you’re in. In fact, there’s no better time to consider your processes for data than when you’re just starting out. And getting started with basic strategies for building customer relationships doesn’t have to be difficult – there are some simple steps you can take to save yourself a lot of time as your business grows and scales.

From the moment you start your business and establish an online presence, you should be laying the groundwork for effective CRM strategies. This includes establishing a single source of truth for your customer data, being thoughtful and organized about how you collect information, and setting up the right processes to interpret that data and put it to work for your marketing. Here are some actionable steps (with examples) to take now:

  • Collect: Make sure you’re set up to onboard people who want to be marketed to. Whether you’re interacting online or in person, you should be collecting as many insights as possible (for example, adding a pop-up form to your website to capture visitors, or asking people about their specific interests when they sign up for your email list in store) and consolidating them so you can use them to market.
  • Organize: Once you have this data, make sure you’re organizing it in a way that will give you a complete picture of your customer, and make it easy to access the insights that are most important for your business to know. Creating a system where you can easily sort your contacts based on shared traits – such as geography, purchase behaviors or engagement levels – will make it much easier to target the right people with the right message.
  • Find insights: Find patterns in data that can spark new ideas for your marketing. For example, the realization that your most actively engaged customers are in the Pacific Northwest could lead to a themed campaign targeting this audience, a plan for a pop-up shop in that location or even just help you plan your email sends based on that time zone.
  • Take action: Turn insights into action, and automate to save time. As you learn more about your audience and what works for engaging them, make sure you’re making these insights scalable by setting up automations to trigger personalized messages based on different demographic or behavioral data.
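
To make the last step concrete, here’s a toy sketch of a trigger rule; the segment names and thresholds are invented for illustration:

```python
def pick_campaign(contact):
    """Route a contact to a message based on demographic and behavioral traits."""
    if contact.get("region") == "Pacific Northwest" and contact.get("engaged"):
        return "pnw_popup_shop_invite"
    if contact.get("days_since_purchase", 0) > 90:
        return "winback_offer"
    if contact.get("signup_source") == "in_store":
        return "interests_survey_followup"
    return "monthly_newsletter"

# A highly engaged Pacific Northwest customer gets the pop-up shop invitation.
print(pick_campaign({"region": "Pacific Northwest", "engaged": True}))
```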

Doing this right won’t just result in more personalized marketing campaigns and stronger, more loyal customer relationships – it will also help you be smart about where you focus your budget and resources as you continue to grow.

Is attribution dead? The answer is yes and no

Multi-touch attribution is only going to get harder due to platform changes and a focus on privacy but there are a few approaches used within MTA that can produce actionable insights.

Attribution is an analytical method that takes a lot of user-level data and tries to measure the impact of specific tactics on a positive outcome, such as a sale. In its algorithmic form, it is supposed to be an improvement on quaint methods like last-click-takes-all, which are obviously wrong, but very convenient. The purpose of attribution is to give fair credit to the tactics – placements, creative ideas, formats – that work.

The term “attribution” refers to several types of models: Sales Attribution, Location Attribution and Multi-Touch Attribution (MTA). When people say “attribution is dead,” they’re usually referring to MTA and not the other two types of models. Sales attribution and location attribution are continuing to gain adoption within the industry as more media is executed through addressable channels, consumers increase mobile engagement and retailers seek to monetize their sales data.

Multi-Touch Attribution isn’t dead, it’s just hard

Multi-Touch Attribution (MTA) is not “dead,” but it has always been hard to accomplish. True MTA was always an aspirational goal, as no single approach or vendor captured all the touchpoints in the consumer journey. Vendors like Adometry (now Google 360) had specific limitations in mobile exposure due to the inability to tag on Safari or iOS. Thus, some brands were analyzing data on a sample of one percent of site traffic.

MTA is only going to get harder due to platform changes and a focus on privacy. Platforms like Google, Amazon and Facebook have restricted cross-platform tagging in favor of their proprietary solutions, while third-party vendors (like C3 Metrics, Nielsen, Neustar/Marketshare and Visual IQ) are pixel-based solutions with limits on where their pixels catch consumer signals.

First, Google removed third-party tracking from YouTube, and then Facebook, always restrictive in tagging, sunset the ability to DCM-tag on its site. In addition, as a reaction to GDPR, these platforms closed many other linkages to their sites. One major example was Google’s announcement that it was eliminating the Google ID from DCM records and log files, forcing advertisers who wish to track within Google’s ecosystem into its Ads Data Hub product. Apple rolled out ITP 2.0, and Mozilla followed suit in Firefox, dropping third-party tracking pixels for privacy and speed purposes.

But it’s going to get even harder. Emerging high-growth media channels like OTT, ATV and podcasts still lack a consistent measurement solution. California passed its interpretation of the EU’s GDPR, called the CCPA, which comes into effect in January 2020, so we anticipate more platform reactions that close off more tracking abilities.

Some MTA models are still viable

But it’s not all bad news. A few approaches used within MTA can produce actionable insights. One example is reimagining Media Mix Modelling (MMM) by applying a channel/partner-based approach. Instead of modeling broad digital, social and mobile channels, this approach goes deeper, comparing the likes of Google-owned and operated properties, individual publishers, Facebook and Twitter, and calculating their media elasticity at that level. Another approach is to leverage experimental design and conduct incrementality tests using Ghost Ads or randomized controls.
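
The arithmetic behind such an incrementality test is simple once you have exposed and control conversion counts; a sketch with illustrative numbers:

```python
def incremental_lift(exposed_conv, exposed_n, control_conv, control_n):
    """Relative lift of the exposed group's conversion rate over the control's."""
    exposed_rate = exposed_conv / exposed_n
    control_rate = control_conv / control_n
    return (exposed_rate - control_rate) / control_rate

# Illustrative: 1.2% conversion among users shown the ad vs. 1.0% in the
# control (ghost ad) group implies roughly 20 percent incremental lift.
print(incremental_lift(1200, 100_000, 1000, 100_000))  # 0.2
```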

In the vein of utility, the native platforms that offer their proprietary attribution, such as Facebook, Google and Amazon, do provide value. However, expectations should be set on the tracking limitations of each solution.

In summary, as George Box said, “All models are wrong, but some are useful.” While attribution has never achieved the promise it was supposed to deliver, sales attribution and location attribution models continue to be adopted because they deterministically connect digital media activity to a business outcome. While MTA will continue to be challenged, keeping lowered expectations of what insights MTA can provide, balanced with an understanding of the data limitations of platform solutions, can still yield insights.

Unlocking metrics that matter

Technology that gives consumers a voice to tell brands what they think helps marketers measure the more valuable nuances below the top-line metrics noise.

The defining feature of digital marketing is the ability to measure just about anything. That’s a wonderful development, but it can also be a double-edged sword. To borrow loosely from the title of a popular book, the ability to measure everything can cause us to amplify the noise and bury the signal. Here’s how marketers can tune out the noise of top-line metrics and unlock metrics that truly matter for their business.

What are top-line metrics?

Top-line metrics are common digital media measurements like impressions and views. In a recent eMarketer report, marketers cited a lack of consistent metrics as their number one concern. Perhaps that’s one reason why top-line metrics remain the common currency of industry discussions. After all, a view is a metric that a CPG marketer can easily compare with a TV impression. However, that comparison point has little, if any, value in marketing.

Sophisticated brands succeed by measuring the nuances that live below the top-line. To do that, marketers must first embrace two ideas. First, technology gives consumers a voice to tell brands what they think. Second, brands must be prepared to hear those voices and act.

It all begins with a clear objective

Top-line metrics don’t do much in terms of driving business impact, but as the name suggests, they are a good starting point on the journey toward metrics that matter. That journey, however, isn’t about marketing for its own sake — it’s about delivering on specific business goals, and those goals are the products of a clear objective.

Are you looking for aided or unaided brand awareness? For getting your product into a consumer’s consideration set? For changing perception of your brand? For establishing the values of your company in the mind of consumers? For establishing particular product attributes? These are the metrics that truly matter — metrics that measurably move a consumer down the funnel toward purchase.

Homing in on the metrics that matter isn’t about taking the brand’s business goals and plugging them into the marketing stack. The key is putting a laser focus on the brand’s values to identify and scale enthusiastic audiences that progress through a sales funnel and become customers.

Consider a toothpaste marketer. Historically, those marketers were in a war for shelf space, but in a world that’s increasingly about direct-to-consumer, they’re fighting for mindshare. What attributes bring a consumer to your brand? Quality? Natural ingredients? Sustainability? Whitening? Taste?

As marketers move away from the top-line impression-based metrics, the relevant metrics become more refined and more relevant. Marketing is shifting to a discipline that’s about making one-to-one connections between consumers and brands, and then developing strategies to scale those connections. Top-line metrics are too much noise and too little signal to yield discernible business value. The metrics that matter most are those that draw a measurable line between the values brands share with their customers and the actions consumers take.

What does online lead generation look like in 2019?

A good campaign uses optimized forms (the result of user testing), well-designed chatbots and a data processing strategy that provides useful insights.

If you ignore the noise and trends that arise each year (and the predictions, so many predictions), the simple truth is that most marketing activities are fundamentally focused on gaining and keeping additional customers. For many businesses, particularly in B2B, this means increasing the quantity and quality of your leads.

At HubSpot (my employer), this is obviously a big focus, so we conducted research recently to see how marketers are looking at lead generation and lead capture in the modern era. With all the talk about conversational marketing, the increasing prevalence of analytics, data enrichment and experimentation we wondered, “what has changed, and what is still the same?”

Here’s a summary of what we learned as well as what it means for your marketing efforts moving forward.

Forms still matter

While everyone in the martech space keeps talking about the emergence of chatbots, and about forms being boring and frustrating, most marketers are still using forms. And most consumers still have no problem filling them out (so long as they are well designed, which we’ll get into).

Seventy-four percent of marketers use web forms for lead generation, and 49.7 percent of marketers say that web forms are their highest-converting lead generation tool.

Of course, chatbots, live chat, quizzes and all other forms of lead capture have their place. But I wouldn’t count out forms in 2019.

Measurement and optimization are underutilized

We asked respondents about three data-driven research methodologies:

  • Form analytics – setting up measurement to see drop-off rates and calculate the conversion rate of your forms.
  • User testing – a form of usability testing where you ask a user to perform a task on your site and watch how they complete it.
  • A/B testing – putting a new variation against the original to see which one performs better, using statistical tests.
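
To make the A/B testing item above concrete, here’s a minimal two-proportion z-test you could run on form conversion counts; the numbers are illustrative:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two form conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative: the original form converts 300/10,000 visitors, the variant 360/10,000.
z = two_proportion_z(300, 10_000, 360, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at the 95 percent level
```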

It’s best practice to make decisions using data, do user research, and run experiments, right?

Sure, but many marketers still aren’t doing any of those things, at least when it comes to their forms. For instance, 36 percent of respondents said they run zero user tests on their forms.

A quarter of marketers said they aren’t using form analytics, and about 45 percent of marketers don’t do any A/B testing on their forms.

But those who are using analytics, user tests, and A/B tests all report better results (unsurprisingly).

For instance, marketers who run A/B tests on their forms tend to be more satisfied than those who don’t. They also report roughly 10 percent higher conversion rates than those who don’t run A/B tests.

Those who use form analytics report 15 percent higher satisfaction with their lead generation efforts and 19 percent higher conversion rates.

Finally, we found that people who run user tests are more satisfied with their lead generation programs than those who don’t, and that the satisfaction rating increases as the number of user tests rises.

Our survey backs common intuition: data-driven decision making and design are more effective.

Interactive forms and conversational marketing

One thing that inspires a lot of discussion is “conversational marketing.” Where a traditional web form is a static experience, “conversational marketing” tools hope to mimic the one-to-one nature of a human conversation.

Tactically, this usually means chatbots, but there are other cool iterations, such as the way Typeform designs its forms one step at a time (with conditional logic) or how Conversational Form works.

So far, most of the hype and excitement marketers feel for the technology has not been matched by any corresponding sign of happiness on the part of customers.

There has been mixed research on the topic. For instance, Salesforce published a study that suggests 69 percent of consumers prefer chatbots for quick communication. But if you read closely, the question was actually only comparing chatbots to application-based support across a series of customer service benefits like “convenience.”

Most research suggests the opposite: customers mostly prefer human interaction at this point.

One such study [.pdf], commissioned by PwC, found the following:

Today, 64 percent of U.S. consumers and 59 percent of all consumers feel companies have lost touch with the human element of customer experience. 71 percent of Americans would rather interact with a human than a chatbot or some other automated process.

It’s not difficult to understand why consumers feel that way when they’re greeted with broken experiences time after time (for a catalog of examples, see chatbot.fail).

People love the convenience, though, so it’s not impossible to make chatbots and automation work.

But you have to strike a balance between economic efficiency for the marketer, convenience for the customer, and the human touch that makes the experience feel delightful.

Most of the time, this involves an intelligent balance of automation and personalization. As a rule of thumb, you can try to automate 80 percent of the experience, but leave the final 20 percent open for human interpretation and personalized touchpoints.

This way, you can achieve the efficiency benefits of automation and chatbots, but you’ll keep your customers happy as well.

Automate, enrich and optimize your campaigns

There are a series of technologies that are making it easier to:

  • Collect leads
  • Predict lead quality
  • Enrich data
  • Centralize data and utilize it in campaigns

Tools like Clearbit are making it easier to cut needless form fields and, instead, enrich your data with known information about a given email address or domain.

Once we have a good mixture of contact property data and behavioral data, tools like Madkudu can help you predictively score leads to better assess their sales readiness.
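
As a rough sketch of what predictive scoring looks like under the hood (not any vendor’s actual API), here’s a logistic regression over a few invented behavioral features:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative training data: [pages_viewed, emails_opened, is_business_domain],
# labeled with whether each historical lead became sales-qualified.
X = np.array([[2, 0, 0], [10, 3, 1], [5, 1, 0], [12, 5, 1], [1, 0, 0], [8, 4, 1]])
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Score a new lead: a probability of sales readiness between 0 and 1.
new_lead = np.array([[7, 2, 1]])
print(model.predict_proba(new_lead)[0, 1])
```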

Finally, the world of CRMs and Customer Data Platforms is making it easier to collect data from disparate sources and centralize it in a place that you can utilize it in real time. This can enable cool personalization campaigns at scale.

All of this is emerging technology, and while most companies aren’t at this level of complexity yet, we see some companies beginning to go beneath the surface. If you’re looking to be inspired, the Hull.io blog is a great place to find examples of this in the wild.

Conclusion

In 2019, most aspects of lead generation are remarkably similar to years past. Marketers are still using forms and customers are still happy to fill them out (so long as they are well optimized).

Emerging trends like chatbots and automation are still trying to bridge the gap between customer expectation and reality. With intelligent design, I believe chatbots can be a powerful part of your lead generation strategy (just don’t rely on 100 percent automated solutions yet).

Finally, data processing and pipeline tools like Clearbit, Hull.io, Madkudu and more are enabling new campaign capabilities by making all the data we’re collecting meaningful and usable.

It’s a good time to be a marketer.

Identity management investment can pay off, here’s how

Marketers must examine how people-based IDs differ and how quality impacts identity through activation. Learn how to evaluate your program.

The marketing industry has been awash with articles and papers talking about marketing technology and the importance of linking identity management across an enterprise’s investments. And rightfully so. Brands should be laser focused on these topics because, simply put, they are the fundamental building blocks for establishing a meaningful, direct relationship with customers and, in turn, gaining competitive advantage.

The challenge, like many past inflection points in our industry, is how to capitalize on this. What is needed, beyond the actual physical technology and people? In my experience, the “how to activate” is often the last consideration, but really, it should be the first place to start. Let’s take a deeper look at how this impacts the need for a tactical, ground-up data plan for identity management.

Identity as a whole is impacted by the level of fidelity of your data and how it’s able to paint a clear picture of your customers, their brand interactions and the end-to-end customer journey.

Let’s use the analogy of music to help bring some clarity. I’ve always appreciated sound quality and the impact it has on my listening experience. There are multiple areas that impact the sound quality, from the environment you’re in (e.g., subway vs. home) to the device with which you’re listening (e.g., Apple EarPods vs. home speakers). Most important, though, is the source. If the source file (e.g., MP3 vs. FLAC) is not high quality, your listening experience can suffer.

It’s the same with identity. Identity necessitates the highest fidelity source of data. In this case, moving from a cookie-based to a people-based world is like moving from music on cassette tapes (remember those?) to high-quality digital music files.

Today’s world of marketing is complex, with multiple ways to link customer data. These range from cookies to offline transaction IDs, all the way to people-based, one-to-one linkage. As marketers progress toward 100 percent people-based marketing, they must think about why all people-based IDs are not equal and how fidelity (i.e., the cleansed one-to-one view of a customer) impacts identity all the way through activation.

As you continue on your journey, there are many identity-related considerations, including the four key areas listed below. They illustrate the impact identity has on your people-based marketing activation, using as an example a group of customers who are top-tier loyalty members:

1. People-based platforms must be connected to activation. If an ID is not linked directly to activation, drop-off and de-duplication can occur, impacting one-to-one marketing and marketing ROI results.

Example: You want to cross-sell into this group with a new premium product by leveraging an integrated campaign with paid display and measuring the incremental impact of display on sales. To enable activation, you’ll need to turn the loyalty-based PII into anonymous IDs, such as cookies, and activate them via platforms like demand-side platforms (DSPs) for paid display targeting.

This process of turning a known loyalty audience into cookies needs to be seamless, and it is the point where media marketing ROI can be impacted. Industry challenges like cookie deletion and changes in devices (e.g., a new tablet) necessitate that your PII data be linked and refreshed continuously with your customers’ cookies; otherwise, breakdown can occur.

If cookies are lost, it will adversely affect your ability to measure downstream engagement and the incremental effect of paid display ads on sales.

2. People-based platforms need to bring higher fidelity audience profiling capabilities from rich third-party data, leading to better insights and more precise models.

Example: Let’s say you want to use third-party data to get a deeper understanding of your audience’s interests in your new product segment. What happens if a high percentage of individuals just got a new mobile device, and they don’t authenticate for several weeks? Audience-based platforms need to continually link between known and unknown IDs; otherwise, customer insights will not be precise.

3. People-based platforms should be connected directly to offline martech PII data, enabling one-to-one resolution at the anonymous ID level.

Example: Relating to our first key area, connecting your offline PII to anonymous IDs is critical. If you have a high-value group of known customers you want to activate and cross sell, the need to speak to them one-to-one in any channel is critical. If you’re speaking to someone in a display ad and you can’t be certain it is the person you are targeting, then your ability to extend your conversion is highly limited to known channels, such as email.

4. People-based platforms should be able to easily interact/activate with offline segmentation models that incorporate a mixed set of martech data, from DMPs to loyalty programs, enabling seamless activation and optimization of marketing ROI insights.

Example: The adage “what’s old is new again” is a key theme in the way CRM principles are being extended to today’s ecosystem of digital marketing. Many organizations have invested a lot of time and effort into “offline” models. Whether they are credit risk models or customer segmentation across product offerings, the ability to take offline PII based-models and bring them into a digital ecosystem is critical.

While these considerations are just a starting place, I hope they help bring some food for thought in our exciting and rapidly changing marketing ecosystem. Here’s to continued success in 2019 and beyond.

The in-housing trend is all about data

In 2019, we’ll see more brands increasingly turn to digital ad agencies and tech consultants nimble enough to act as an extension to their internal teams.

As the advertising industry evolves and brands look to have more ownership of their data and measurement, big media holding companies have struggled – Omnicom, IPG and WPP included. The roles of agencies, brands and tech platforms continue to shift, and consultants have edged in on the agency business. “Tech consultants are the new mad men,” declared the Wall Street Journal in November 2018.

But there’s certainly a middle ground here. Most, if not all, in-house teams simply won’t be able to be the best at everything. The speed and rate of industry change are just too high, and top talent is too difficult to retain. The result — in 2019, we’ll see more brands increasingly turn to digital advertising agencies and tech consultants nimble enough to act as an extension to their internal teams.

As Electronic Arts’ Global Head of Media Belinda Smith asserts, “Brands taking marketing, media or content in-house does not mean the apocalypse for agencies — quite the contrary.”

Advent of transparency

We believe that the conversation about in-housing is actually about transparency and ownership of data. Marketers are finding data to be more accessible, but data without a strategy is useless. In 2018, consumer privacy, fake news and brand safety were areas of concern for most CMOs. By owning data and data sources, brands can not only better understand the customer journey but can also establish more trust. Good first-party data that is collected correctly is the only way to capture clear insights.

Understanding customer interactions across all touchpoints is the number one challenge for today’s marketers. As WPP CEO Mark Read put it, “It’s clear that scale has moved from buying power to the power of intelligence, and the heart of that is data.” Clean data provides insights into the kind of content that works, where it works and which audiences are responding, but it often requires a data specialist to make sense of it. With data teams working alongside creatives, brands and agencies can create compelling and engaging content with better results and can deliver personalized experiences to specific audiences.

New model, new day

We agree that the agency model is transforming, but at the end of the day, data can’t replace creative. The truth is, brands need to take steps toward owning their data while not losing sight of the yin and yang that make up marketing teams: the creative and data strategists. Yes, marketers are taking a progressive leap into the transformation that is happening in digital – but that doesn’t mean all or nothing. Data supplies the WHAT and creative delivers the WHY. As more brands announce in-house moves, agencies shouldn’t feel threatened but should recognize that it is time to evolve their service offerings and help marketers streamline data from a variety of sources. If data is not informing a buy, marketers are not data-driven, they are data-responsive. Therefore, the better teams can collaborate and leverage the “art and science” disciplines, the more effective they will be.

At the end of the day, it all comes down to visibility. Bringing data in-house just means brands own the keys to their platforms – but working with agencies and tech platform partners is still critical to delivering value and expertise toward true business outcomes. With data at the center, decision making will become easier, campaigns more predictive, and return on investment will no longer be in question.

Only time will tell how these market shifts will impact the way all parties work together, but one thing is for sure: the days of data obfuscation are over. It’s time to open up possibilities across the entire marketing stack and give tech providers, agencies and brands the visibility and transparency for both media buying and measurement so that everyone is working from the same view. Those who do – internal marketers, agencies, consultants – will reap the rewards.
