Case Study: How to Evaluate Google Automated Bidding vs Manual Bidding to Improve ROAS & Profitability

If there’s one thing you should know about our team here at Inflow it’s this: We love running tests and using data to back-up our insights. We hate navigating blindly and hoping for the best. That’s why for us it was exciting to have the opportunity to test manual PPC bidding vs automated bidding.

The current pay-per-click climate is all about platforms’ rolling out automation. Ad platforms are getting more and more data, and the machines are getting better and better at predicting who is going to convert from your ads. Some advertising account managers are quick to automate entire accounts; others, like me, are more skeptical.

Personally, I’m willing to test anything, but when you start taking away the data I need to do my job (search queries) and the levers I use to increase performance (placements, bids, negative keywords), I get concerned. Ultimately, my job is to improve performance for my client any way I can, and if that means putting pieces of the account on “smart,” so be it. I just want the best performance possible.

Here’s a quick snapshot of the results we saw in this case study: 

Manual bidding beat automated bidding in all major metrics measured across desktop, tablet and mobile. We saw:  

  • Higher Revenue
  • More Transactions
  • Higher Ecommerce conversion rate
  • Lower Cost
  • Higher ROAS

Now, let’s dig into this case study more to see how the way we evaluate metrics influences the outcome of tests.

Starting Work for a New PPC Client

This client was using a different service we offer, conversion rate optimization, and was starting to get busy managing his business. So he asked us to look at his Google Ads account for improvement opportunities.

Although the client was getting great results, we noticed some areas we could help him expand while keeping the account return on ad spend (ROAS) within acceptable ranges. So we struck a deal.
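(Quick refresher if you’re newer to PPC: ROAS is just revenue divided by ad spend. A tiny Python snippet makes it concrete; the numbers below are made up for illustration, not this client’s.)

    # Illustrative only -- made-up numbers, not this client's.
    def roas(revenue: float, ad_spend: float) -> float:
        """Return on ad spend: revenue generated per dollar of ad spend."""
        return revenue / ad_spend

    print(roas(revenue=20_000, ad_spend=1_000))  # 20.0, i.e. a "20x" return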

The client said he would allow us 3 months to implement and optimize our new strategies, but we had to increase profit enough by the 3rd month to pay for ourselves. So we started working on the account in July 2019 and quickly implemented a tiered shopping structure (the routing logic is sketched in code just after the list below):

  • Tier 1 (Campaign 1): Contains every product available on the site, but with many negative search phrases applied.
  • Tier 2: This is where the average- to medium-performing search terms live.
  • Tier 3: This is where the best-converting search terms live.
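If you want to picture how a query flows through those tiers, here’s a rough sketch in plain Python (not the Google Ads API; the search terms, bids, and negative phrases are hypothetical). The mechanic is that a higher-priority shopping campaign serves a query unless that query is negated there, which pushes it down to the next tier, where we’re willing to bid more.

    # Illustrative simulation of tiered shopping query routing -- hypothetical
    # terms, bids, and negatives, not the Google Ads API. Higher-priority
    # campaigns serve first unless the search term is negated there, which
    # pushes the query down to the next (higher-bid) tier.
    from dataclasses import dataclass, field

    @dataclass
    class Tier:
        name: str
        priority: int      # shopping campaign priority: 2 = high, 1 = medium, 0 = low
        max_cpc: float     # what we're willing to pay for the terms this tier keeps
        negatives: set = field(default_factory=set)

    tiers = [
        Tier("Tier 1 - catch-all", priority=2, max_cpc=0.30,
             negatives={"widget pro", "red widget pro"}),   # push valuable terms down
        Tier("Tier 2 - average terms", priority=1, max_cpc=0.60,
             negatives={"red widget pro"}),                 # push the best terms down again
        Tier("Tier 3 - best converters", priority=0, max_cpc=1.20),
    ]

    def serving_tier(search_term: str) -> Tier:
        """Return the highest-priority tier that has NOT negated this search term."""
        for tier in sorted(tiers, key=lambda t: t.priority, reverse=True):
            if not any(neg in search_term for neg in tier.negatives):
                return tier
        return tiers[-1]  # fallback: lowest tier catches anything negated elsewhere

    for term in ["cheap widgets", "widget pro case", "red widget pro"]:
        tier = serving_tier(term)
        print(f"{term!r:20} -> {tier.name} (max CPC ${tier.max_cpc:.2f})")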

We also reorganized search campaigns by product type and narrowed the account to push spending to the areas driving the most revenue. In month one, we were able to increase profit by $11k, and by month 3 (our deadline), we increased profit by $27k. So we hit our goal. The client was happy, and we were off to the races to continue optimizing the account.

Start of Our Manual vs Automated Google Ads Experiment

Around late September, an advertising representative from Google who used to work directly with the client contacted us and wanted to meet to discuss opportunities and strategies. I know how these meetings go (“You aren’t automating the account enough. Look at your optimization score and the opportunities in the account to automate.”), and I try to avoid them. However, we wanted to keep an open mind, so we scheduled a meeting for mid-October.

The rep suggested switching our tiered campaigns to smart shopping, but we were getting a 20x return on our shopping campaigns at this point, so I really did not want to mess with that, especially right before the holidays. I told him, “We can test some of your automated strategies as experiments on the search campaigns we are running.”

Fortunately, disputes in our industry can be quickly resolved with an experiment and data analysis. So we agreed to run an experiment against the recommendations of the Google Ads representative.

Parameters of the Experiment

  • A Target ROAS bid strategy test
  • A 50/50 traffic split with the control (manual bidding)
  • The attribution model in Google Ads switched to data-driven
  • 3 weeks to run the test

We wanted to make sure the algorithm had enough time to get “dialed in.” We also scheduled it to end before Black Friday and Cyber Monday, so the winning strategy could be at 100% for those 2 big days.
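For context, here’s roughly how we compare the two arms of a split like this once it wraps up. It’s a minimal sketch with placeholder numbers (not the client’s actual data): pull the same headline metrics for each arm and line them up side by side.

    # Illustrative comparison of a 50/50 bidding experiment -- placeholder
    # numbers, not the client's data. Compute the headline metrics per arm.
    arms = {
        "Control (manual CPC)": {"cost": 1_000.0, "revenue": 9_000.0, "clicks": 2_500, "transactions": 90},
        "Test (Target ROAS)":   {"cost": 1_050.0, "revenue": 7_800.0, "clicks": 2_400, "transactions": 85},
    }

    for name, m in arms.items():
        roas = m["revenue"] / m["cost"]
        conv_rate = m["transactions"] / m["clicks"]
        aov = m["revenue"] / m["transactions"]
        print(f"{name:22} ROAS {roas:5.2f} | conv. rate {conv_rate:6.2%} | AOV ${aov:,.2f}")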

Now, let’s take a look at how the experiment progressed.

Week 1 Results

When looking at Google Ads, the Target ROAS campaign jumped out to a surprising lead. However, when looking at Google Analytics, the first week seemed to be a wash. This discrepancy made sense because we switched the attribution before starting the test, and the machine was optimizing toward the new attribution model. But would it translate into more revenue in Google Analytics? Would it continue to get smarter and better? Could our manual adjustments make up the ground?
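If it seems strange that two reports can disagree about the same campaigns, here’s a simplified illustration of why attribution matters. This is not Google’s actual data-driven model (that one is proprietary); it’s just a toy position-based split next to last non-direct click, showing how the same conversion gets credited to different touchpoints depending on the model.

    # Simplified illustration -- NOT Google's proprietary data-driven model.
    # The same $100 conversion is credited differently under different
    # attribution models, which is why two reports can tell different stories.
    path = ["Shopping ad", "Email", "Search ad", "Direct"]   # hypothetical click path
    conversion_value = 100.0

    def last_non_direct_click(path):
        # All credit to the final non-direct touchpoint (Google Analytics' classic default).
        non_direct = [touch for touch in path if touch != "Direct"]
        return {non_direct[-1]: conversion_value}

    def position_based(path):
        # Toy multi-touch model: 40% to the first touch, 40% to the last, 20% spread between.
        credit = {touch: 0.0 for touch in path}
        credit[path[0]] += 0.4 * conversion_value
        credit[path[-1]] += 0.4 * conversion_value
        for touch in path[1:-1]:
            credit[touch] += 0.2 * conversion_value / max(len(path) - 2, 1)
        return credit

    print("Last non-direct click:", last_non_direct_click(path))
    print("Position-based (toy):", position_based(path))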

Week 2 Results

We made some bidding adjustments, and our manual cost per click (CPC) campaign (the control) bounced back in Google Ads and slightly beat out the Target ROAS campaign in terms of conversion value and ROAS. Once again, the Google Analytics results were really close. We had a slight edge in revenue and spend, but the Target ROAS campaign had 5 more transactions. Got to love that average order value swing!

All right, as we were heading into week 3, each bidding strategy had “won” a week in Google Ads, and the two were nearly neck-and-neck in Google Analytics. It looked as if it were going to be a photo finish (which, for the record, I was already considering a win). The rep was certain his algorithms were going to crush our manual adjustments, but our campaigns were getting better performance.

Week 3 (Final Week)

We were coming down to the wire. It was Man and Woman (shout out to Rachel) vs. Machine…

It wasn’t even close.

In Google Ads, the manual campaign had more than double the conversion value. In Google Analytics, the manual campaign had nearly triple the revenue and triple the transactions. Coming around the last corner, our campaign took off, while the automated campaign sputtered. 

This was surprising to me. I would have thought that as the automated campaign got more data each week, it would also get more efficient. That wasn’t the case in this experiment, although I’m not saying it’s always that way.

Evaluating the Results in Google Analytics

Let’s first get the Google Analytics data out of the way so we can look closer at the Google Ads data that I showed you at the beginning. In Google Analytics, the control beat the test in:

  • Revenue
  • Transactions
  • eCommerce conversion rate
  • Cost
  • ROAS

Manual bidding was the clear winner here.

Evaluating the Results in Google Ads

However, armed with data-driven attribution, Google Ads was spinning a different tale, which was why the rep was immediately declaring victory. 

Inside Google Ads, the Target ROAS test had:

  • More conversions
  • A higher conversion rate
  • A higher ROAS

Based on this data, it makes sense the rep was claiming victory. Because we had agreed at the beginning that Google Ads data would determine the winner, I couldn’t just ignore it and send him the Analytics data. But now, after all this writing, we are getting to the point of this case study.

Don’t just take results at face value.

Why did we lose in Google Ads? What did the Target ROAS do better? Even if automated campaigns actually won and outperformed the manual bidding, wouldn’t you want to know why or how, so you can improve what you are doing?

You have to dig into those questions. So that’s what I did. I dug. And I didn’t have to dig very far to find some answers.

The manual bidding campaign outperformed the Target ROAS in two areas:

  • Desktop
  • Tablet 

But mobile phones performed much worse, understandably so, because up to this point, we had not really been targeting mobile for the search campaigns, based on historically low performance and information from the client.

The table shows the average CPC on mobile phones was about $0.16, much lower than desktop and tablet bids. Thus, our strategy going in was to maximize desktop traffic within the budget and then expand to mobile. This test showed us we should move up our plans for expanding into mobile.
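That device-level dig looks roughly like this, with made-up numbers standing in for the client’s table: segment the same campaign metrics by device, and the mobile gap jumps right out.

    # Illustrative device breakdown -- made-up numbers, not the client's table.
    # Segmenting the same campaign metrics by device exposes the mobile gap.
    by_device = {
        "Desktop": {"cost": 600.0, "clicks": 500, "revenue": 7_000.0},
        "Tablet":  {"cost": 150.0, "clicks": 140, "revenue": 1_400.0},
        "Mobile":  {"cost": 120.0, "clicks": 750, "revenue":   500.0},  # ~$0.16 avg CPC
    }

    for device, m in by_device.items():
        avg_cpc = m["cost"] / m["clicks"]
        roas = m["revenue"] / m["cost"]
        print(f"{device:8} avg CPC ${avg_cpc:.2f} | ROAS {roas:5.2f}")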

After realizing our campaign had won on desktop and tablet (the two areas we were focusing on) and had only lost overall because we hadn’t been targeting mobile, I chalked it up as a win, and the rep conceded.

Our New Experiment: Segmenting Mobile Traffic

We ended up breaking out mobile traffic from the control campaign into its own campaign, where we could have more control over bids, budgets, and keywords, as these tend to behave differently depending on the device. (I am a big fan of this type of campaign segmentation over using the bid modifiers we had in place before.)

That mobile breakout showed great results in December:

  • 46% increase in revenue over the previous month. 
  • Mobile transactions doubled month over month. 

It was also a brand-new campaign, so we continued to adjust bids and keywords.
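(If you want to check the month-over-month math behind those bullets, it’s simple percentage-change arithmetic; the figures below are placeholders, not the client’s actuals.)

    # Illustrative month-over-month comparison -- placeholder figures, not the
    # client's. A 46% revenue lift and doubled transactions look like this.
    def pct_change(previous: float, current: float) -> float:
        return (current - previous) / previous

    november = {"revenue": 10_000.0, "transactions": 50}
    december = {"revenue": 14_600.0, "transactions": 100}

    print(f"Revenue change: {pct_change(november['revenue'], december['revenue']):.0%}")                 # 46%
    print(f"Transaction change: {pct_change(november['transactions'], december['transactions']):.0%}")   # 100% (doubled)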

Results of the Automated vs Manual Bidding After Segmenting Mobile

The control campaign (manual bidding), now focused on the higher-converting desktop traffic, saw the following results:

  • Revenue increased by over $6,000, with only $300 more in ad spend than the test campaign.
  • The eCommerce conversion rate increased by nearly 40%.

After months of adjustments and optimizations, the mobile campaign’s ROAS in Google Analytics was over 5 by June 2020, compared to the 2.21 ROAS captured in the screenshot above.

It would be well worth another test now to see how the mobile campaign does with a Target ROAS strategy, since it did well with that type of bidding before, but I will definitely be rooting for the manual bids and adjustments over the machines’.

Conclusion

Obviously, Google wants you to use automated bidding; their goal is to reduce the friction between you and the ad platform. They want to save you time managing your ad account, since there are so many levers to keep track of that users become overwhelmed.

It’s important to understand there’s a time and place for both strategies. Manual bidding will save you money in the long run, because you’ll limit your bids on specific campaigns that turn out to be unprofitable, while automated bidding is a great short-term strategy to jump-start a large ad campaign if your time is limited.

At the moment, humans are better at monitoring the nuances of campaign management. So if you need help running your own PPC campaigns, our team is here to help. You can contact us here.

How Denver Increased Engagement for Ad Visitors

With the launch of their “always on” regional “Reclaim the Weekend” ad campaign, VISIT DENVER faced the challenge of how to keep their main landing page relevant. The regional effort, which promotes visiting Denver for a long weekend, targets a wide variety of personas that change monthly. Instead of creating multiple new landing pages every month, VISIT DENVER used personalization with Bound to match the hero slideshow content to the appropriate persona.

VISIT DENVER developed and rolled out three waves of ad personalization within their first year with Bound:

Wave 1

The first step was to personalize the slideshow for visitors coming to the landing page directly from the ad. This involved not only showing the appropriate group of slides but also starting the slideshow with the content targeted to that persona. While these visitors had only a 4% increase in clicks specifically on their persona-targeted slides, overall page engagement increased significantly. Compared to other visitors, the ad persona segments had a 53% increase in visit duration and a 45% decrease in bounce rate when entering the site through the Reclaim the Weekend landing page.

Wave 2

The second step was to use Bound’s Media Optimizer tool to personalize the slideshow for visitors who were exposed to the ad. The pixeling capabilities of Media Optimizer allowed Denver to target Reclaim page visitors who had seen, but hadn’t clicked on the ad, as well as visitors who came back to the site after their specific persona campaign ended. Not only did these pixeled visitors have great page engagement, but they also had a 100% increase in clickthrough rates on the slideshow and were 28% more likely to click specifically on the persona-targeted slides. With this information, Denver had the data needed to show that visitors were still interested in persona-specific content even if they had not clicked on the ad. 

Wave 3

The third step was to build on the learnings from the first two phases of personalization and launch a fly-in campaign. The fly-in targeted visitors exposed to the persona who had never clicked on the ad or otherwise reached the Reclaim page. Using the fly-in, Denver was able to successfully direct 2% of these visitors to the page and continued to increase website engagement. Visitors exposed to the persona fly-in had a further 23% increase in visit duration and 18% decrease in bounce rate.

By identifying visitor interests based on ads, even if those visitors never directly engaged with the ad, Denver has been able to increase views on their key ad landing page and continually increase their landing page engagement. This has increased overall site performance and has allowed Denver to optimize the experience for these high-value website visitors. 

Want to learn more about personalizing for your targeted ad visitors? 
