Google Analytics can be an intimidating tool for many marketers. It contains tons of information about your website visitors, but unlocking insights and coming up with action items is often a challenge when you’re faced with so much data. To avoid data paralysis, we recommend using Google Analytics to uncover your most popular webpages (let’s say your Top 5 to begin with), and then using Crazy Egg’s user behavior reports (Heatmaps, Scrollmaps, Confetti, Overlay and List) and A/B testing tools to optimize them. That way, every positive design change you make has the biggest impact on a visitor’s experience of...
The secluded, deserted island setting has been used as the stage for many hypothetical explanations in economics and philosophy, with the scarcity of things that can be developed as resources being a central feature. Scarcity and the need to keep risk low while aiming to improve one’s situation is what makes it a […]
Dayparting, also called ad scheduling, refers to the tactic of adjusting ad exposure for certain hours of the day and/or days of the week.
The post Amazon Dayparting: Bid Smarter Using New CAPx Heat Mapping Tool appeared first on Retail Perfor…
In 2016, online spending topped in-store shopping for the first time ever. That trend continued in 2017, with Adobe Digital Insights reporting that 2017 holiday sales surpassed $91.7 billion, marking 11% YoY growth.
Peak season offers peak opportunities for experimentation programs. Increased traffic and conversion rates open the door for higher velocity, shorter durations, and lower minimum detectable lifts without compromising statistical significance.
If you haven’t already created your experimentation strategy, the time is now. But here are some essential factors to consider while creating your holiday testing game plan.
Maximize Your Holiday Window
Thanksgiving Day kicks off the peak holiday season, which continues through December 23. If you know your holiday window and your website’s traffic patterns and expectations, you’re well positioned to take full advantage of this opportunity.
It can get complicated, but here’s a simple way to start:
Define your holiday window. Consult past data to determine when traffic and conversion increases start and stop.
Layer in the changes your organization is forecasting over last year. For example, one of our clients is expecting a five percent increase in traffic over last year’s holiday season. That intel is reflected in our traffic assumptions.
Start your roadmap with the most valuable pages so that early wins can positively impact the rest of the holiday season. Create a punch list of pages with this in mind.
Use traffic assumptions, desired statistical significance, and minimum detectable lift to determine the sample size and duration of tests.
Continue this process to fill the window of time. Use these dates to mobilize your team, communicating key dates of test strategy kickoff, when tests will move into development, when they will launch and end and when results will be shared.
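The sizing step above can be sketched as a standard two-proportion power calculation. The numbers below (a 3% baseline conversion rate, a 10% relative lift, 20,000 daily visitors, and the five percent traffic uplift from the client example) are hypothetical; your testing tool’s own calculator may use different corrections.

```python
import math
from statistics import NormalDist

def sample_size_per_variant(baseline_cr, relative_mde, alpha=0.05, power=0.80):
    """Visitors needed per variant to detect a relative lift at the given
    significance level and power (normal-approximation two-proportion test)."""
    p1 = baseline_cr
    p2 = baseline_cr * (1 + relative_mde)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

def duration_days(n_per_variant, daily_visitors, variants=2, traffic_uplift=0.05):
    """Days to reach the sample size, layering in a forecast traffic increase."""
    per_variant_daily = daily_visitors * (1 + traffic_uplift) / variants
    return math.ceil(n_per_variant / per_variant_daily)

# hypothetical inputs: 3% baseline conversion, 10% minimum detectable lift
n = sample_size_per_variant(baseline_cr=0.03, relative_mde=0.10)
days = duration_days(n, daily_visitors=20000)
```

Note how holiday traffic works in your favor: the higher the daily visitor count, the shorter each test, which is what makes smaller minimum detectable lifts feasible during the peak window.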
Communication is Critical
Since the holiday season represents a large portion of annual revenue, stress and emotions run high. As a result, it’s important to create your communication plan in advance. Determine who your stakeholders are, the optimal frequency of updates and what information needs to be shared. This isn’t the time for surprises or big reveals, so plan to devote a chunk of time to telling the story of your program and communicating its value.
The Weather Outside May Not Be the Only Freeze You’re Experiencing
Some organizations implement a freeze on development code updates and changes during the holiday season to avoid the risk of broken digital experiences or performance disruptions. Get acclimated with your company’s approach so you can have a plan for implementing winning test programs.
The ideal scenario is to push winners immediately into production. Based on years of experience with enterprise clients, Brooks Bell strongly advocates this approach so you can maximize the impact of that winning test.
If production updates aren’t on the table because of a code freeze, don’t immediately jump to pushing the winner to 100 percent through your testing tool. Though it sounds like the best way to manage through a code freeze, it could cause delays and create an undesirable experience. Before you decide how to handle it, get your organization’s development experts involved to help you evaluate the risks and rewards.
Holiday Shoppers are Different
Think about your own shopping behaviors during the holidays compared to the rest of the year.
When I’m shopping during the holidays, I find myself on a mission to knock out my shopping list. As the countdown clock ticks away in my brain (and often literally on websites), I have a very real and intense sense of urgency. For me, customer confidence indicators, obvious savings and a clear and easy path to checkout are the ticket.
During the rest of the year, shopping is more leisurely for me and allows time for more browsing and consideration. I may even visit a website a few times before making a purchase. I zoom in on product details. I read customer reviews. I have more time, and the only restrictions are my own.
Keep it simple. As illustrated in my example above, successful holiday strategies are frequently based on a streamlined path to purchase, removing any friction and creating a sense of urgency and scarcity.
Test your hypotheses again after the holiday season. Do these experiences still produce a conversion lift when the holiday rush isn’t in full effect? If not, it’s okay! It’s an important learning you can use to build your Holiday/Non-Holiday playbook to make each holiday season better than the last.
Know your “Out of Stock” strategy. Regardless of what changes you make to your Product Detail page, nothing zaps excitement out of a customer experience faster than something being Out of Stock. Understand how your site handles Out of Stock messages, such as using red copy or suggesting alternate options. If it’s less than optimal, do some early testing to determine the most effective messaging. If your site includes a lot of Out of Stock products, it’s even more important to make sure that messaging has been optimized.
Document your findings. Carve out time to tell the story of your testing through the chaos of increased velocity. The data and insights will be helpful after the rush and can greatly influence your future program success. Be sure to look at new, returning and loyal segments, and evaluate the differences in their holiday and non-holiday shopping behaviors.
With the number of testing and personalization tools available, it can be difficult to choose one to invest in. But once you’ve already selected a software, making the decision to transition to a new tool altogether can feel overwhelming.
But this happens quite often. For many clients, cost is the deciding factor in switching testing tools: a few tools offer similar capabilities at a lower price point. On the flip side, if you’ve increased your program budget and capabilities, it may be time for an upgrade.
And although all testing tools offer similar functions, each has unique features that are important to consider. Personalization, for example, has become a point of focus for many testing programs – perhaps you’re interested in transitioning to a tool such as Evergage or Dynamic Yield that puts personalization at the forefront. Or your testing program has enough velocity to run multiple experiments simultaneously, and you feel you’d make good use of Optimizely’s built-in mutually exclusive experiments feature. Maybe your company uses other Adobe products, like Adobe Experience Manager, so you feel Adobe Target is a good fit.
Whichever tool you select, the next major obstacle is implementing it. Here are our tips for the process:
First, examine your testing roadmap.
Take inventory of the tests that will be running close to the date when you plan to stop using your previous tool. Make sure they will have reached significance and be ready to be turned off before you lose access.
If your budget allows for it, we recommend giving your team a period when both tools are available. This ensures your testing cadence isn’t affected while your team gets up to speed on the new tool, and it allows you to transition more seamlessly: you’ll be able to let current tests run their course in the old tool while launching new ones in the new tool.
Then, test your testing software.
While you might be excited to dive in and start launching tests left and right, it’s important to take the time to ensure your new tool is implemented correctly.
Run a QA test that visually changes the page to check that the code is being delivered and that any flicker looks reasonable. If there’s significant flicker, you may need to move the testing tool’s tag higher in the head of your HTML.
We also recommend running a live test without visual changes, just for the purpose of checking metrics. This enables your analyst to see that metrics are being tracked correctly within the testing tool, or if you’re using an outside analytics tool, that those metrics are being passed accurately to it.
Once you’ve confirmed that visual changes are showing up as expected and metrics are tracking accurately, you’re ready to start using your new tool!
Switching testing software comes with its challenges. However, in the right circumstances, switching can offer substantial benefits to your testing program. Taking the time to pinpoint your reasons for switching, planning your testing roadmap carefully around the transition, and being patient as the new tool is implemented will ensure your tool transition goes smoothly.
Brooks Bell has over 15 years of experience working with enterprise brands to establish and scale their experimentation programs. We take a holistic approach to our technical diagnostics and analytics services, providing technology and data recommendations based on your business, your goals, your team, and your unique challenges.
What can Brooks Bell do for you?
✓ Clean, organize and centralize your customer data.
✓ Help you select the right A/B testing and personalization tools.
✓ Ensure your tools and systems integrate with one another.
✓ Train your developers and analysts.
Want to skyrocket your business’s sales leads? They might be the metric that saves your business. Why? Because you can track metrics all day. Traffic, statistics, social followers, and email subscribers are great to track, but nothing is a better predictor of your sales than leads. And the more you can increase leads, the more money you’ll make and the more successful your business will become. In fact, according to HubSpot, 63% of marketers say that generating traffic and leads is one of their top marketing challenges. Today, you’ll learn exactly what a sales lead is, what makes a great lead,...
As advertising and other brand communications get increasingly personalized, the demand for multiple iterations of content is growing at a rate that marketers are struggling to keep up with. And personalization without, well, personalized content just doesn’t work.
The report says that creative teams working on personalized campaigns are producing an average of 28 pieces of content a week and can take up to 12 days to get a single piece to market. But even with all that output, most marketers and advertisers feel like they could be doing better.
Why you should care
Content has always been king. But in this age of hyper-personalization, demand has gone into overdrive, creating more need than many companies can afford to meet, both financially and operationally.
To catch up with demand, businesses may need to invest more resources in content creation: 33 percent of those surveyed said their biggest barrier to personalization is the time investment, and 20 percent said cost. Survey data reinforces this: half of companies with revenues of more than $50 million (those with more money to spend) reported higher satisfaction with their content strategy and creation and felt they were very well coordinated, compared with an average of 34 percent across all businesses.
There is good news. Advertisers and marketers have added considerable staff in content creation (a 63 percent increase for advertisers, 60 percent for marketers) and the vast majority still prize content quality over quantity.
And there’s good news operationally with a majority of marketers, advertisers, creatives and IT pros communicating with each other at least once a week, and 71 percent wanting to get creatives involved in the pre-planning process earlier.
More about the numbers
Companies need to step up. Only 26 percent of brand creatives, 21 percent of marketers, and 24 percent of advertising professionals believe their companies do enough to personalize digital advertising.
Scale continues to be an issue. Fifty-nine percent of marketers, 52 percent of advertisers, 41 percent of agency creatives and 53 percent of brand creatives find it difficult to personalize content at scale.
More “digitally advanced” businesses are more likely (59 percent) to say they outperform the competition, that their content creation and delivery is very well coordinated, and that their personalization is extensive, compared with other companies (35 percent).
Earlier this week YouTube rolled out new extensions and measurement capabilities for advertisers. The ad extensions will enable a variety of new actions (e.g., app downloads, booking, movie showtimes). There will also be new brand lift metrics, including offline sales lift.
Brand lift and offline sales data. IRI is one of the Google Measurement Partners that’s supporting the company’s new brand lift metrics. Brand lift studies are typically survey based. In this case they will show marketers the following types of information:
Positive response rate
Absolute brand lift
Number of lifted users
Control positive response rate
Relative brand lift
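These metrics relate to one another through straightforward arithmetic. Here’s a sketch using hypothetical survey counts for an exposed group and a control group; IRI’s and Google’s exact methodology may differ:

```python
def brand_lift(exposed_positive, exposed_total, control_positive, control_total):
    """Derive the brand lift metrics listed above from raw survey counts."""
    exposed_rate = exposed_positive / exposed_total    # positive response rate
    control_rate = control_positive / control_total    # control positive response rate
    absolute = exposed_rate - control_rate             # absolute brand lift
    relative = absolute / control_rate                 # relative brand lift
    lifted_users = round(absolute * exposed_total)     # estimated number of lifted users
    return {
        "positive_response_rate": exposed_rate,
        "control_positive_response_rate": control_rate,
        "absolute_brand_lift": absolute,
        "relative_brand_lift": relative,
        "lifted_users": lifted_users,
    }

# hypothetical survey: 600 of 1,000 exposed users responded positively,
# versus 500 of 1,000 in the control group
lift = brand_lift(600, 1000, 500, 1000)
```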
Brand lift studies offer valuable information. Arguably more compelling than self-reported attitudinal data is the sales impact data that IRI will provide for YouTube ads. IRI will supply e-commerce and in-store sales data to Google to show if YouTube ads are having an actual impact on sales (avoiding a last click attribution problem is a separate discussion).
The IRI measurement capabilities come from IRI’s loyalty card-based “vast point-of-sale, frequent shopper, causal and media exposure data.”
Why it matters. Last year Google announced that YouTube ads could be tied to store visitation and sales. That was being done chiefly through Google partnerships with credit card companies, which provided anonymous, aggregated sales data to Google. So the IRI lift measurement capabilities are not entirely new.
Regardless, it’s another important source of offline data that will help marketers better understand the real-world impact of their ads and which ones are truly performing.
In 2009, Netflix offered $1 million to anyone who could improve the quality of its recommendation engine by 10%. It took two years, but a team finally won. Netflix paid the bounty—then ignored the code.
As it turned out, the enhanced algorithms “did not seem to justify the engineering effort needed to bring them into a production environment.”
Not only did the winning prediction engine fail to scale economically, it also addressed an outdated problem: The shift from mail to streaming during that same two-year window gave Netflix all the data it needed to develop newer, better algorithms.
Predictive analytics, in other words, wasn’t a panacea. Nor, in the decade since, has it become one. But, in 2018, incremental gains no longer cost $1 million either:
You have more data;
Storage is cheap; and
Cloud computing is almost infinitely scalable.
This post details those changes and shows how several businesses—and not just behemoths—have cultivated the predictive analytics landscape.
What’s changed in the last decade?
1. More data, more storage, more computing power
Massive, cloud-based repositories of customer interactions, often called data lakes, are the raw source material for predictive analytics applications.
Many companies have taken advantage of cheap cloud storage to stow away data for years—without even considering its potential use. (How many neglected data points do you have in Google Analytics, Google Ads, MailChimp, Marchex, Stripe, and similar services?)
That dual growth in scale, in data collected and in accessibility to it, has solved two of the three primary challenges of predictive analytics implementation.
Historically, raw computing power has been the other. As Andrew Pearson of Intelligencia notes, “Without significant hardware investments, predictive analytics programs either weren’t possible or too slow to be useful.”
That, Pearson continued, has also changed: “Cloud-based analytics systems have added massive computer power into the mix.” Increasingly powerful systems cracked open the door for real-time predictive analytics.
2. A world of real-time predictions
For some, the age of “real-time” predictive analytics is here. Judah Phillips, the co-founder and CTO of Vizadata and founder of SmartCurrent, explained:
We already live in a world of “real-time” predictive analytics. A simple predictive analysis is your arrival time in Waze. A more complex real-time prediction occurs billions of times worldwide every millisecond in matching certain types of digital advertising.
Further, companies like Mintigo and Versium now offer real-time solutions for lead scoring, showing that the transition is technically possible. Possible, however, doesn’t mean perfect. Sam Underwood, a vice president at Futurety, acknowledged the complexity of necessary integrations:
Especially in the mid-market world, the tools that gather data to turn into predictive modeling—CRM systems, social media aggregators, logistics, and purchasing systems—often do not have friendly APIs or other easy mechanisms with which to quickly gather and interpret data.
That disconnect still thwarts even the most fundamental business cases for real-time predictive analytics. David Longstreet, the chief data scientist at FanThreeSixty, offered an example:
In our world of sports and entertainment, for example, most sports teams do not know how many people are in a stadium for a game. Teams know how many tickets were distributed; however, they do not know in “real time” how many people are in the venue or stadium during the event.
That knowledge gap hampers efforts to staff and stock the stadium appropriately. It’s also why interest in predictive analytics is almost universal, even if it vastly outpaces adoption.
Interest, however, far exceeds implementation: while research suggests that 90% of businesses “attach, at minimum, some importance to advanced and predictive analytics,” only about 23% have actually put it to use.
So which questions are those 23% answering with predictive analytics? Let’s take a look.
Which questions can marketers answer with predictive analytics?
“They want to predict everything,” according to Underwood. And who wouldn’t want to know the exact foot (or web) traffic by month, day, and hour to streamline staffing (or allocate server resources)?
But, Underwood continued, he tries to focus clients on “the one thing that, if we could predict it for you, would revolutionize your business.”
In digital marketing, Phillips outlined myriad use cases for predictive analytics, including the capability to predict:
which advertising will be most effective—however you define effective.
which marketing campaigns, channels, touches, behaviors, and demographics are contributing to a business outcome, a form of “machine learning–based attribution.”
which segment, test, or personalization a user is most likely to respond to.
the probability that users will click on an ad, download a whitepaper, respond to an email, respond to an offer, or complete any other response you define.
which leads will convert—however you define conversion.
which customers will buy one or more products for a cross-sell or upsell.
the number of purchases or revenue that will occur in the future.
which customers will have high/medium/low lifetime value.
The novel opportunity of predictive analytics, then, is not what you can predict but the fact that you can predict. The historical data you currently analyze can probably become a prediction.
Just make sure you have the data.
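To make the lead-conversion use case concrete, here is a minimal, dependency-free sketch of a propensity model: a tiny logistic regression trained by gradient descent on invented lead records (pages viewed and whether a whitepaper was downloaded). A production model would use far richer features and an established library rather than this hand-rolled version.

```python
import math

def train_logistic(features, labels, lr=0.1, epochs=2000):
    """Fit logistic-regression weights by plain gradient descent."""
    w = [0.0] * (len(features[0]) + 1)  # bias + one weight per feature
    for _ in range(epochs):
        for x, y in zip(features, labels):
            z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            p = 1 / (1 + math.exp(-z))  # predicted conversion probability
            err = p - y
            w[0] -= lr * err
            for i, xi in enumerate(x):
                w[i + 1] -= lr * err * xi
    return w

def score(w, x):
    """Probability that a new lead converts, given trained weights."""
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1 / (1 + math.exp(-z))

# invented training data: [pages_viewed, downloaded_whitepaper] -> converted?
leads = [[1, 0], [2, 0], [3, 0], [6, 1], [7, 1], [8, 1]]
converted = [0, 0, 0, 1, 1, 1]
weights = train_logistic(leads, converted)
```

The same shape, training on labeled history and scoring new records, underlies most of the use cases in the list above; only the features and the outcome change.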
What do you need to get started with predictive analytics?
Data, data, and data. “Priority 1A and 1B are data sources,” stated Underwood. That’s true whether you plan to license software or hire an outside organization. (Both options are detailed later.)
All uses require training data. That training data, in turn, is used to build a predictive model to apply to current data. “The only limitation we’ve run into,” Phillips noted, “is a company’s available data for training.”
How much data is enough? According to Phillips:
A few thousand records with a sufficient amount of positive and negative outcomes can be sufficient for marketing, sales, and product prediction.
Not all data is created (or stored) equally
“You have to understand—I grew up tearing tickets.”
FanThreeSixty’s Longstreet has heard that same explanation from venue managers who have spent countless hours counting stacks of stubs after games. It’s a reason why vital data sources may not be easily accessible, or accessible at all.
In stadiums, Longstreet explained, point-of-sale machines and ticket scanners exist for a single purpose—to complete transactions quickly and keep lines moving. Those systems do not store data efficiently for extraction, nor can they handle incessant server requests (unless hungry fans don’t mind waiting).
For Underwood, clients tend to fall into one of two buckets, with half in each:
“The ideal client has an internal database set up and ready to go. We pull in the data, build the model, and are off and running.”
The other half have a mix of data sources, which inevitably include an offshore SQL database (or ten) managed by an external vendor whom no one can track down.
Stitching data sources together is a major development project that may require creating custom connectors, setting up third-party FTP drops, and other complex but thankless tasks. That work, however, is necessary: Models and their predictions are only as accurate as the data they’re built upon.
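Conceptually, stitching reduces to joining records on a shared customer key. Here’s a toy sketch, with the field names and sources invented for illustration (real pipelines contend with messy keys, deduplication, and vendor APIs):

```python
# hypothetical exports from a CRM and an email platform, keyed by email address
crm = [{"email": "a@example.com", "lifetime_value": 420.0},
       {"email": "b@example.com", "lifetime_value": 95.0}]
email_stats = [{"email": "a@example.com", "opens_90d": 14},
               {"email": "c@example.com", "opens_90d": 2}]

def stitch(*sources, key="email"):
    """Merge any number of record lists into one profile per key value."""
    merged = {}
    for source in sources:
        for row in source:
            merged.setdefault(row[key], {}).update(row)
    return list(merged.values())

profiles = stitch(crm, email_stats)
```

Records missing from one source simply carry fewer fields, which is exactly the gap a model-training step has to account for downstream.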
Don’t forget external data sources
Not all data comes from internal sources, either. External data sources, like weather reports, are often a critical addition to data lakes, especially for small businesses. As Underwood explained:
Restaurants may use analytics to trigger email sends; for example, we can set up the email platform to sync with National Weather Service data to send an email about iced tea when the temperature in a given metro area is above 90 degrees.
Likewise, we can trigger an email to send to customers in a given city if the system detects wind gusts of 40+ MPH. Both of these use cases reach consumers in a key moment of need, negating downstream ad spend and beating competitors to the punch.
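The trigger logic itself is simple arithmetic on a forecast; the real work is the data plumbing. Here’s a sketch using an invented forecast schema and campaign names (an actual integration would poll the National Weather Service API and call your email platform):

```python
def weather_triggers(forecast, heat_threshold_f=90, gust_threshold_mph=40):
    """Map a metro-area forecast to the campaigns it should fire."""
    campaigns = []
    if forecast["high_temp_f"] >= heat_threshold_f:
        campaigns.append("iced-tea-promo")    # the 90-degree iced tea email
    if forecast["max_wind_gust_mph"] >= gust_threshold_mph:
        campaigns.append("high-wind-promo")   # the 40+ MPH gust email
    return campaigns

# e.g. a hot, calm day in a given metro area
today = weather_triggers({"high_temp_f": 95, "max_wind_gust_mph": 12})
```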
So you have a large, well-organized dataset. What do you do with it?
Companies require either a dedicated team of data scientists to parse through these sets or a software suite powerful enough to do so rapidly. For most small and medium-sized businesses, this usually means settling for subpar software or forgoing it entirely.
For businesses of all sizes, solutions branch into two options:
Purchase software and create predictions in-house.
Pay an outside vendor to develop models and visualizations for you.
1. Predictive analytics software
The marketplace for predictive analytics software has ballooned: G2Crowd records 92 results in the category. Pricing varies substantially based on the number of users and, in some cases, amount of data, but generally starts around $1,000 per year, though it can easily scale into six figures.
G2Crowd lists both IBM’s SPSS Statistics and SAS’s Advanced Analytics as market leaders at the enterprise level. Along with RStudio, the pair are also tagged as leaders for mid-market companies; only IBM retains a place in the “Leaders” quadrant for small businesses.
Historically, however, even industry-leading predictive analytics software hasn’t been a simple, jump-right-in experience. Take these two examples from IBM’s SPSS Statistics and RapidMiner:
While these platforms are powerful, users must format data files, link nodes, and develop visualizations. Learning how to do this—and having the time to do it—is a specialized, full-time job. (To believe otherwise is to expect a Microsoft Word license to write your Great American Novel.)
Not surprisingly, the market is shifting. RapidMiner has rolled out a SaaS beta that, with a bit of manual adjustment, translates an Excel sheet full of, say, employment data to a prediction of employee retention:
Some companies, like Vizadata’s Phillips, see the user-friendly SaaS model as the future:
We are democratizing data science, so that people with limited or no data science or engineering skills can predict. You simply upload your data and click next. We do all the heavy lifting.
Our intelligence determines your dependent and independent variables and the type of analysis to run. You can go with our selections or override them—from regression, where we can do forecasting and optimization, to both binary and multiclass classification, where we can predict the probability of outcomes.
Like Vizadata, MIT’s Endor pursues this path. The platform uses a query-builder to allow anyone to ask questions like “Where should we open our next store?” or “Who is likely to try product X?” It then mines targeted datasets to provide answers, often in a matter of minutes.
The inclusion of tangential datasets that fall outside consideration—or feasibility—for human observers is a recurring advantage of predictive analytics. Endor’s creators offer an example:
A marketing department for a bank asks, “Who is going to get a mortgage in the next six months?” Machine-learning engines may detect a pool of, say, 5,000 customers who have a bank credit card and a high credit score, and are married—many of which may be false positives.
Endor detects more specific clusters of, say, couples about to get married or going through a divorce, founders who recently sold their startups to Facebook, or customers who recently graduated from a local real-estate course.
Of course, if you want to outsource the process entirely, outside vendors can organize your data, build models, and visualize predictions for you.
2. Outside vendors
Agencies offering bespoke solutions
For most clients, Futurety starts by identifying the key business question—not a specific metric or visualization. Clients may come in for one-off projects, annual re-runs of their data, or ongoing work.
“The end result is not always clear at the beginning,” Underwood explained. “When we’re delivering to someone close to the outcome, like a marketing manager, they’re typically happy with the model, the finding, and the math behind it.”
The “end result” could be several things:
Integration with a third-party platform, like an email client, to automate predictive messaging.
Plain-text predictive answers to guide practitioners.
Robust visualizations to demonstrate the process and value to the C-Suite.
At the end of each engagement, Futurety delivers the model back to the client for management and maintenance.
Predictive analytics at work
Futurety has a small business client that helps aspiring performing arts majors gain admission to their dream college. But few high schoolers have broad knowledge of good programs. More often than not, they know only one name: Juilliard.
Futurety trained its model on three years of placement data. Then, using new student data entered into a common portal, the model predicted where students would get accepted and succeed academically.
The predictive analytics model, which Futurety updates annually, delivers a simple list of recommended schools for students based on factors like grades and exposure to different musical or artistic styles.
The model takes into account whether past placements graduated or won awards.
All-in-one niche providers
FanThreeSixty serves a narrow market: sporting venues. Because they work with a comparatively consistent dataset—season ticket, concession, and souvenir sales—they know the range of business questions, data outcomes, and relevant visualizations.
This consistency incentivizes niche vendors like FanThreeSixty to develop proprietary dashboards to roll out to all clients.
The interface allows Longstreet’s team to keep data science in the background: “The secret of machine learning is when you’re being prompted behind the scenes.”
Distilled fully, FanThreeSixty’s goal (and Longstreet’s explanation of his role at dinner parties) is to “help teams sell more tickets and hot dogs.”
Predictive analytics at work
FanThreeSixty mines historical data to see which concessions are most commonly purchased with a hot dog at a Major League Soccer venue.
If a customer purchases a hot dog, concession staff are prompted to ask whether a customer would like to add the most popular accompaniment. That recommendation—a prediction of fan desire—changes based on other variables.
Predictions consider more than 20 datasets—everything from the home location of season ticket holders to the weather—to tailor messaging before, during, and after matches.
During cold-weather games, for example, FanThreeSixty can automate push notifications with tailored coupons, like buy-three-get-one-free hot chocolate for a family of four.
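At its core, the accompaniment recommendation is co-purchase counting: look at every basket containing the anchor item and surface its most frequent companion. A minimal sketch over an invented transaction log (FanThreeSixty’s production system layers 20+ datasets on top of this idea):

```python
from collections import Counter

# hypothetical transaction log: each row lists items bought together
transactions = [
    ["hot dog", "soda"],
    ["hot dog", "nachos"],
    ["hot dog", "soda", "pretzel"],
    ["beer", "pretzel"],
    ["hot dog", "soda"],
]

def top_accompaniment(transactions, anchor="hot dog"):
    """Most frequent item co-purchased with the anchor item, or None."""
    co = Counter()
    for basket in transactions:
        if anchor in basket:
            co.update(item for item in basket if item != anchor)
    return co.most_common(1)[0][0] if co else None

suggestion = top_accompaniment(transactions)
```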
Whether solutions are internally or externally managed, they’ve long been common in enterprise businesses.
Predictive analytics use cases at the enterprise level
AutoTrader. AutoTrader uses data from its 40 million monthly visitors to better understand the sometimes lengthy customer journey. They built propensity models based on search behavior and created high-value lookalike audiences.
Editialis. The French publisher uses predictive analytics in its email campaigns to “anticipate engagement at an individual level.” As a result, they’ve seen click-through rates increase “dramatically.”
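Neither company has disclosed its models, but a propensity model of the kind AutoTrader describes is typically a classifier that scores each visitor's likelihood to convert from behavioral features; the top scorers then seed lookalike audiences. A stripped-down logistic-regression sketch, with invented features and training data:

```python
import math

# Hypothetical visitor data: (searches_last_week, price_pages_viewed) -> converted?
X = [(1, 0), (3, 1), (8, 4), (10, 6), (2, 0), (9, 5)]
y = [0, 0, 1, 1, 0, 1]

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Fit a tiny logistic model with stochastic gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(2000):
    for (x1, x2), label in zip(X, y):
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - label
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def propensity(x1, x2):
    """Score a visitor's conversion likelihood; high scorers seed lookalikes."""
    return sigmoid(w[0] * x1 + w[1] * x2 + b)

print(round(propensity(9, 5), 2))  # a high-propensity visitor scores near 1
```

At AutoTrader's scale the feature set would be far richer, but the shape of the pipeline—behavioral features in, a conversion probability out—is the same.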
Predictive analytics can also coordinate offline and online interactions, with two clear use cases for marketers whose companies have physical products or storefronts:
Improved targeting. Smartphone data registers in-store browsing habits to improve online or offline marketing targeting, approximating the advantages enjoyed by ecommerce companies.
Inventory management. Full warehouses cost money; empty shelves cost money. Folding online data, such as search patterns, into sales data can better manage inventory, especially at a regional and local level.
In addition to external marketing campaigns, predictive analytics also supports internal project management. Large marketing campaigns have many moving parts—a new ad campaign needs new creative, new copywriting, new landing pages, etc.
Coordinating the involvement of those teams and accurately estimating the time-to-launch is complex. Many fail to get it right, sometimes at great expense.
Predictive algorithms, as McKinsey notes, use a wider lens that captures historical patterns and unique project elements in a single frame:
While every development project is unique, the underlying complexity drivers across projects are similar and can be quantified. If companies understand the complexity involved in a new project, they can estimate the effort and resources required to complete it.
Predictive analytics models “take into account not only the complexity of the project (both the functional and implementation aspects) but also the complexity of the team environment.”
Predictive analytics at work
More accurate internal project management, in an example McKinsey offers, can have a major impact:
A company initially planned a product update to take roughly 300 person-weeks of effort, an estimate based on the limited number of changes between the current product and a new design.
However, that estimate failed to take into account the fact that planned updates would affect many different teams. Predictive analytics models did take it into account and estimated that the project would take three to four times as long.
As a result, the company limited the work to the original product team, enabling them to deliver the update on time.
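The estimation McKinsey describes reduces to a regression from quantified complexity drivers to effort. A hedged sketch on invented historical projects, where the drivers are the number of component changes and the number of teams affected:

```python
# Hypothetical completed projects: (component_changes, teams_affected) -> person-weeks.
history = [
    ((10, 1), 110),
    ((20, 2), 220),
    ((15, 4), 240),
    ((5, 1), 70),
    ((25, 3), 290),
]

# Fit effort ≈ a*changes + b*teams with stochastic gradient descent
# (no intercept, for brevity).
a = b = 0.0
lr = 0.0005
for _ in range(20000):
    for (changes, teams), effort in history:
        err = a * changes + b * teams - effort
        a -= lr * err * changes
        b -= lr * err * teams

def estimate(changes, teams):
    """Estimated person-weeks, weighting team-environment complexity."""
    return a * changes + b * teams

# The same change count, touching six teams instead of one, more than
# doubles the estimate—the effect the naive plan missed.
print(round(estimate(10, 1)), round(estimate(10, 6)))
# → 110 260
```

The lesson of the example above falls out of the fitted weights: on this data, each additional team costs far more than each additional component change.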
In addition to helping companies solve internal and external challenges, predictive analytics is also the foundation for some businesses.
Building a business on predictive analytics
Ken Lazarus, CEO of the recruiting platform Scout Exchange, has an advantage: the company is only five years old, so its data sources were structured from the start for extraction into the predictive models that pair companies with the right recruiter.
The single best predictor of job placement, Lazarus and his team have found, is the track record of job recruiters. In contrast, pairing the right job description with the right resume remains exceedingly difficult.
“Job specs are horrible,” he lamented. “The data isn’t on the paper. CVs are pretty horrible, too.” (Data augmentation, such as skills testing and video interview decoding, Lazarus noted, offers potential improvements.)
Nonetheless, holes remain. Candidates will never disclose negatives on their resume, and important information might forever remain “non-data,” such as whether a candidate is a good “culture fit.”
Scaling data gathering
Scout Exchange has honed its predictions by focusing on enterprise customers—its algorithms feast on hundreds or thousands of openings from Fortune 500 clients.
As a result, the platform takes in roughly 1 million data points monthly, with each new job posting yielding an additional 50 data points.
Still, human assessment by a recruiter—and their client—is necessary. Lazarus drew a parallel: “Would you let machine learning pick your wife? No. But would you let it pick the right matchmaker to help you find a spouse? Yes.”
Those who are trying to solve the most complex human issues aren’t even in the business world.
Predictive analytics with life or death consequences
The greatest challenges for predictive analytics are those that deal with complex, individualized human behavior, such as the likelihood that a patient or crisis-line texter will commit suicide.
Because success or failure is measured in human lives, these challenges are also the most urgent. And while these projects operate beyond the scope of marketing and business, they suggest the potential for predictive analytics as it evolves.
“REACH VET is not about trying to find the veteran who’s sitting in the car in a parking lot with a gun in his lap,” Aaron Eagan, Veterans Affairs deputy director for innovation, told a Washington conference.
“What we found,” Eagan continued, “is that veterans at highest risk of suicide [also have] significantly increased rates of all-cause mortality, accident mortality, overdoses, violence, [and] opioids.” Proactive alerts that trigger physician check-ins have improved primary-care appointment attendance and reduced hospital admissions for mental health issues.
A crisis text line took a similar approach. Leaning on natural language processing and predictive analytics, the program analyzed conversations, forecasted trends, and trained more than 13,000 volunteers. The results?
Wait times decreased to less than 5 minutes, an operational goal.
Capacity increased by 10% during peak periods.
Responses were prioritized based on machine-identified urgency.
Endor’s technology has taken on similarly serious challenges. Using 15 million data points from 50 known ISIS supporters, Endor identified 80 lookalike accounts in less than half an hour, with only 35 false positives—few enough that expert investigation of each account remained necessary but feasible.
In a collaborative project with the U.S. Defense Advanced Research Projects Agency, the platform also analyzed mobile data to identify patterns that predict future riots.
Predictive analytics is not immune to criticism: GDPR rebuffs some of the same collection methods that swell data lakes. And not all predictions, even the most accurate, are well-received. (Famously, Target unwittingly informed a father of his teenage daughter’s pregnancy based on seemingly benign shopping habits.)
Predictive analytics experts point out that their algorithms search for patterns among values, not the values themselves. Regardless, insufficient data is unlikely to hold back the expansion of the industry—the IoT, wearables, and other data collectors already supplement traditional web and app analytics.
User-friendly SaaS platforms are still an emerging opportunity. For most businesses, creating models and predictions from historical data still requires a dedicated employee to navigate complex software solutions or the outsourcing of that work to a vendor.
If you're postponing predictive analytics projects until the SaaS options mature, you would be wise to keep filling your data lake in the meantime.