Core Web Vitals scores improving for top-ranked sites

New research looks at the top rankings for 500 keywords in Education, B2B Tech, Finance, and Retail.

Organizations have been investing resources into improving their Core Web Vitals scores, according to new research. 

Enterprise SEO platform BrightEdge has compared the top rankings for 500 keywords, from this year to last, in four markets: education, B2B technology, finance and retail. 

Core Web Vitals, one year later. It’s been just over a year since Google’s page experience update started rolling out on June 15, 2021. The rollout was completed Sept. 2, 2021.

The three Core Web Vitals website performance metrics are a subset of the overall Page Experience Update. Those metrics (a brief field-measurement sketch follows the list) are:

  • Largest Contentful Paint (LCP) – loading, target 2.5 seconds.
  • First Input Delay (FID) – interactivity, target 100 milliseconds. (Note: Google announced in May that it might replace FID with a new metric called INP – Interaction to Next Paint).
  • Cumulative Layout Shift (CLS) – visual stability, target 0.1.
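For teams that want to track these thresholds from real users rather than lab runs, here is a minimal field-measurement sketch. It assumes the open-source web-vitals library (v3+ function names) and a hypothetical /analytics/web-vitals collection endpoint; your own analytics pipeline will differ.

```typescript
// Minimal Core Web Vitals field measurement sketch (assumes the web-vitals
// library, v3+ function names, and a hypothetical collection endpoint).
import { onCLS, onFID, onLCP, type Metric } from 'web-vitals';

// Targets from the article: LCP 2.5s (2500ms), FID 100ms, CLS 0.1.
const TARGETS: Record<string, number> = { LCP: 2500, FID: 100, CLS: 0.1 };

function report(metric: Metric): void {
  const payload = {
    name: metric.name,
    value: metric.value,
    withinTarget: metric.value <= TARGETS[metric.name],
  };
  // sendBeacon survives page unload, which matters for CLS and LCP.
  navigator.sendBeacon('/analytics/web-vitals', JSON.stringify(payload));
}

onLCP(report);
onFID(report);
onCLS(report);
```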

Page experiences are improving. Here’s the data from BrightEdge:

Source: BrightEdge

On average:

  • Retail scores improved by 58%. BrightEdge noted that retail giants have made significant strides in improving poor experiences – last year, top-ranked retail sites failed to meet Core Web Vitals metrics.
  • B2B tech scores improved by 30%. Publishers and resources from SaaS platforms have made good gains.
  • Education scores improved by 28%. More .gov sites are ranking and have good Core Web Vitals scores.
  • Finance scores improved by 27%. This sector was best prepared last year and perhaps due to that saw the least amount of performance gains.

Why we care. SEOs weren’t sold on Core Web Vitals. However, this research shows that companies have been taking Core Web Vitals seriously. It is impossible to point to Core Web Vitals as the sole reason for any ranking boosts or declines. But smart SEO requires driving incremental gains. And every possible positive signal you can give Google (such as good Core Web Vitals scores) is a potential way to positively influence your ranking and visibility.

Google Analytics 4: A guide to the Traffic Acquisition Report

Learn how to access traffic acquisition data in GA4 in a few easy clicks, then how to go deeper by slicing and dicing that data.

After nearly a decade of using Universal Analytics for website traffic analysis, you probably have a pretty good idea of where you need to go to get the data you need. Maybe you’d even consider yourself a GA reporting master.

Want to know how much traffic came from organic search? No problem. You can pull that report in two seconds flat! Or, even better, you’ve got it bookmarked.

But when it comes to Google Analytics 4, things look and feel different. You may have no idea how to get started.

Have no fear!

In this guide, you’ll learn

  • How to access Acquisition reports in GA4.
  • What the differences are between User acquisition and Traffic acquisition.
  • How to build some of the most useful acquisition reports that you know and love from Universal Analytics in Google Analytics 4. 

Where to find the GA4 Acquisition Reports

Acquisition reports live under Reports > Lifecycle > Acquisition

You’ll immediately notice that there are actually two acquisition reports.

The User acquisition report:

And the Traffic acquisition report:

Important differences to keep in mind:

  • User acquisition: the first campaign/source/medium observed for the user (or more accurately, for the browser/cookie)
  • Traffic acquisition: the campaign/source/medium of the session – this will be most similar to what you are used to in Universal Analytics

Which one should you use?

That’s up to you and the type of analysis you’re trying to do (whether you want to know what brought the user to the site in the first place, or campaign information for the latest session that brought someone to your site).

In this guide, we’ll examine the Traffic Acquisition report.

Using the Traffic Acquisition Report

When you select the Traffic acquisition report, you’ll notice that the report defaults to the Default Channel Grouping view of data.

In UA, you could select a Channels report to get this view, or you could choose to look at the “source / medium” report directly. In GA4, you can get all of these views from one single table. 

To do so, you can change the primary dimension in the dropdown of the table to switch the dimension view. 

If you want to rebuild the source / medium report that we all love from UA, it’s easy.

First, you’ll change the primary dimension to source and then you’ll add a secondary dimension (click the + icon) of medium

*Note that as of this writing, there is not a single primary dimension for source / medium, but there is a secondary dimension for it. Hopefully, this will soon be resolved in the GA4 reporting interface.

If you’d like to get a little fancier with your data analysis, you can leave the primary dimension set to Default channel grouping and add a secondary dimension of source / medium.

It’s a nice view to see high-level Default channel grouping and the more granular source / medium breakdown together, giving you the ability to easily scan and classify traffic into channels and distinct sources.
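If you prefer to pull this same channel group plus source / medium view into your own spreadsheets or dashboards, a small sketch using the GA4 Data API’s Node client follows. The property ID and date range are placeholders and credentials are assumed to come from the environment; treat it as a starting point rather than a finished report.

```typescript
// Sketch: traffic acquisition by Default channel group + source / medium
// via the GA4 Data API (Node client). Property ID is a placeholder.
import { BetaAnalyticsDataClient } from '@google-analytics/data';

const client = new BetaAnalyticsDataClient();

async function trafficAcquisition(propertyId: string): Promise<void> {
  const [response] = await client.runReport({
    property: `properties/${propertyId}`,
    dateRanges: [{ startDate: '28daysAgo', endDate: 'today' }],
    dimensions: [
      { name: 'sessionDefaultChannelGroup' }, // primary dimension
      { name: 'sessionSourceMedium' },        // secondary dimension
    ],
    metrics: [{ name: 'sessions' }, { name: 'engagedSessions' }],
  });

  for (const row of response.rows ?? []) {
    const [channel, sourceMedium] = (row.dimensionValues ?? []).map((d) => d.value);
    const [sessions, engaged] = (row.metricValues ?? []).map((m) => m.value);
    console.log(`${channel} | ${sourceMedium}: ${sessions} sessions (${engaged} engaged)`);
  }
}

trafficAcquisition('123456789'); // hypothetical GA4 property ID
```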

Similar to Universal Analytics, we can also filter a table report in GA4.

For example, if you’d like to filter by Referral, you can use the filter box to narrow down your data results. 

One important difference between GA4 and UA is that the filter you add here will apply to both the primary and the secondary dimensions in the table.

So if the word “referral” shows up in either column, the row will be shown. In UA, the simple filter box only impacted the primary dimension, and you could add an advanced filter option for the secondary dimension specifically. 

If you want to know acquisition data by landing page, you can apply this same methodology of adding a secondary dimension of source / medium to your fancy new landing page report we recently built.

If you haven’t done it already, here’s how you can easily build your own landing page report in GA4. It’s a handy tool for search marketers.

Now you’re a GA4 acquisition master

That’s it! And it isn’t all that different from UA.

With a little practice, you’ll be pulling those acquisition reports with your eyes closed. 

Top 4 data challenges among agency clients

Be mindful of these potential pitfalls in a marketing landscape where data reigns supreme.

Marketing agencies recognize the need to be at the top of their data game to provide the best return for clients.

But it takes two to tango.

If you don’t have clients’ buy-in for embracing a data-driven approach toward their marketing efforts, untapped potential and money get left on the table.

These are the most frequent obstacles our clients face, and how to overcome them.

1. Tracking is an afterthought

UTM parameters are a marketer’s best friend for down-funnel performance measurement. They give us incredible visibility into exactly what drives performance across all digital channels.

However, the reality is that many organizations simply don’t have a UTM structure to properly attribute data in their marketing campaigns. 

Some of the common critical pitfalls are:

  • Inconsistent templates from channel to channel.
  • Missing or duplicated parameters.
  • Mismatched templates at different levels, such as having one for account level and another for campaign level.

A consistent, cross-channel UTM template can be as simple as an Excel spreadsheet.

Ensuring that it is adopted across the entire marketing operations team can immediately improve attribution and measurement insights. 
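To make that adoption easier, some teams codify the template so links can only be built one way. Here is a small sketch of that idea; the allowed mediums and parameter rules are illustrative, not a recommended taxonomy.

```typescript
// Sketch: build campaign URLs from one shared UTM template so naming
// stays consistent across channels. Allowed mediums are illustrative.
const ALLOWED_MEDIUMS = ['cpc', 'email', 'social', 'display', 'referral'];

interface UtmParams {
  source: string;   // e.g. "google", "linkedin"
  medium: string;   // must be one of ALLOWED_MEDIUMS
  campaign: string; // e.g. "2022-q3-product-launch"
  content?: string;
  term?: string;
}

function buildTrackedUrl(baseUrl: string, utm: UtmParams): string {
  if (!ALLOWED_MEDIUMS.includes(utm.medium.toLowerCase())) {
    throw new Error(`Unknown utm_medium "${utm.medium}" - update the shared template first`);
  }
  const url = new URL(baseUrl);
  // Lowercase everything so "Email" and "email" never become two channels.
  url.searchParams.set('utm_source', utm.source.toLowerCase());
  url.searchParams.set('utm_medium', utm.medium.toLowerCase());
  url.searchParams.set('utm_campaign', utm.campaign.toLowerCase());
  if (utm.content) url.searchParams.set('utm_content', utm.content.toLowerCase());
  if (utm.term) url.searchParams.set('utm_term', utm.term.toLowerCase());
  return url.toString();
}

// Example:
// buildTrackedUrl('https://example.com/pricing',
//   { source: 'google', medium: 'cpc', campaign: '2022-q3-product-launch' });
```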

Furthermore, as we move more toward machine learning and automation, clients must capture click-level IDs to measure performance, provide feedback to the platforms that offer offline conversion tracking and further optimize campaigns and bidding strategies toward down-funnel goals.

While Google and Facebook are currently the only platforms that offer offline conversion tracking, we have to anticipate that this will become more widely adopted sooner rather than later and eventually become the best practice.

To take full advantage of offline conversion tracking, the client has to do the legwork in setting up their martech stack to capture and pass these IDs through. They also need to create internal reporting and dataset schemas to export this information back into platform APIs.
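As a concrete illustration of the “capture click-level IDs” step, here is a minimal sketch that stores gclid and fbclid from the landing page URL and attaches them to a lead record. The storage keys, field names and CRM handoff are assumptions; your martech stack will dictate the real implementation.

```typescript
// Sketch: capture click-level IDs on landing pages so they can later be
// matched to offline conversions. Field names and storage are assumptions.
function captureClickIds(): void {
  const params = new URLSearchParams(window.location.search);
  for (const key of ['gclid', 'fbclid']) {
    const value = params.get(key);
    if (value) {
      // Persist for the session so the ID survives navigation to the form page.
      window.sessionStorage.setItem(key, value);
    }
  }
}

// On form submit, attach the stored IDs to the lead record before it goes to the CRM.
function attachClickIdsToLead(lead: Record<string, string>): Record<string, string> {
  return {
    ...lead,
    gclid: window.sessionStorage.getItem('gclid') ?? '',
    fbclid: window.sessionStorage.getItem('fbclid') ?? '',
  };
}

captureClickIds();
```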

2. No centralized data management strategy

Often, greener companies lack cohesive data infrastructure, and their data is siloed and disorganized. It lives all over the place in different formats (Google Sheets, CSV files stored in a folder with no access controls, and lead data that lives only within their CRM).

When we see these kinds of issues, we also tend to see that the organization’s various departments handle their data differently. 

Marketing operations should be considered integral and aligned with organizational goals, and data management practices should reflect that.

Having a centralized data warehouse solution and a data operations team that transcends individual departments forces the entire organization to align with its data storage practices and definitions.

Getting everyone onboard with a more modern approach to data can seem daunting, but it pays dividends in the long run.

3. Analysis paralysis

The volume and granularity of data available to us as marketers are almost limitless and will only continue to grow.

It is easy for an organization to fall into the trap of spending too much time analyzing every piece of data instead of zeroing in on what’s important and actionable. 

When an ad manager or client comes to our BI department with a new dataset or visualization request, marketers should always ask:

  • “What is the result you hope to achieve with this request?” 
  • “Will the data drive actionable insights and facilitate decision-making?” 
  • “Is the request a nice-to-have?” 

Actionable is the key word here.

Because of vast data availability, it can seem daunting if an organization doesn’t have someone asking these kinds of questions to steer the ship toward a thoughtful and focused approach. 

Data analysis typically falls into three categories:

  • exploratory
  • descriptive
  • prescriptive 

As marketers, we want to focus our efforts on the last two. In other words, what is currently happening, what do we want to happen and what do we need to change to get us there?

While there is a time and place for more exploratory analysis, it’s essential to not take our eyes off the prize and the insights that truly matter for a client’s bottom-line goals. 

4. Lack of data culture in the organization

We hear the term “data culture” thrown around quite a bit, but the phrase can come across as a bit nebulous and sound like a substanceless buzzword.

Ultimately, all of the plights above can be encapsulated in one overarching challenge: a lack of decisive, holistic data management direction.

Data culture has to be embraced at the executive level and implemented top-down. If marketing operations speak a different data language and define important organizational goals and KPIs differently than financial operations, that’s a problem. 

When we see a lack of data culture and a disorganized approach to handling and storing data, most likely, a company hasn’t put the right people and tools in the right places.

A company must be willing to invest the time and resources into finding data leaders who can guide:

  • Philosophy at an organizational level.
  • Implementation at a departmental level. 

We can do our part as marketing data experts to guide our clients toward fixing some of the low-hanging fruit in the short term, like improving tracking and measurement. Still, it ultimately falls on the shoulders of their organizational leaders to foster a data culture that is forward-thinking and open to change to set them up for long-term success. 

How to measure the value of SEO with CTR

At SMX Advanced, Jessica Bowman and Avinash Conda shared how CTR is a simple, effective way to prove the worth of their SEO efforts.

“I see clients, again and again, having to evaluate the value of making this [SEO] change over that change at the ticket level,” said Jessica Bowman, enterprise SEO consultant and author of Executive SEO Playbook, in her presentation at SMX Advanced.

While all SEO tasks are important for the success of your site, there’s a key distinction that should be drawn: those that have the potential to increase revenue, and those that are designed to prevent revenue decreases. SEOs who fail to differentiate these types of tasks often find themselves burnt out and unable to show how their work impacts the bottom line.

“What usually happens is SEO managers fall into this whirlpool of trying to build a positive revenue case for all of the tickets they have,” said Avinash Conda, director of organic growth at Williams Sonoma Inc., in the same presentation. “But that’s not something you need to be applying to all of the tickets.”

Marketers need to prioritize those tickets/tasks that will prove the most SEO value. But they must also choose which metrics should be reported.

According to Bowman and Conda, detailing click-through-rate (CTR) metrics for your brand’s target keywords is a simple yet effective way to prove your SEO efforts are worthwhile.



Data needed to measure SEO’s effect on CTR

To set up the template for measuring SEO’s impact on CTR, Conda suggested marketers first collect a list of specific URLs.

“This could be a subsection of the site, a folder on the site, or just a list of URLs which are impacting,” he said. “The second [step] is to get the list of keywords mapped to these URLs.”

This basic information will serve as the foundation for the formulas on the spreadsheet (see the template below).

[Image: spreadsheet showing CTR’s impact on keyword and URL rankings. Source: Jessica Bowman and Avinash Conda]

To ensure your team gets the CTR metrics it needs, SEOs should also consider adding the following data from Google Search Console (past 12 months) for the URL/keyword combinations:

  • Total impressions.
  • Average rank.
  • Total clicks.
  • Current CTR.

Once this data is added to the template, the formulas will help you calculate the increases in CTR and forecast fiscal-year growth from search. However, there are a few additional manual inputs marketers will need to include to ensure it functions correctly:

  • The list of URLs the SEO task will impact.
  • The list of keyword(s) each URL is targeting.
  • Assumed CTR increases based on high- and low-ranking keywords.
  • The average revenue per visit.

Using the CTR template

Marketers can use a template like this to calculate assumed CTR, clicks and revenue increases. This is a great way to connect URLs/keywords to your organization’s bottom line, helping teams better predict the impact of these efforts.
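The arithmetic behind a template like this is straightforward. The sketch below shows one way to express it, using made-up numbers: the assumed CTR is applied to the past year’s impressions, and the incremental clicks are multiplied by average revenue per visit.

```typescript
// Sketch of the CTR forecast arithmetic (all numbers are illustrative).
interface KeywordRow {
  keyword: string;
  impressions: number;      // past 12 months, from Search Console
  currentCtr: number;       // e.g. 0.02 = 2%
  assumedCtr: number;       // expected CTR after the SEO change ships
  revenuePerVisit: number;  // average revenue per visit
}

function forecast(row: KeywordRow) {
  const currentClicks = row.impressions * row.currentCtr;
  const projectedClicks = row.impressions * row.assumedCtr;
  const incrementalClicks = projectedClicks - currentClicks;
  const incrementalRevenue = incrementalClicks * row.revenuePerVisit;
  return { currentClicks, projectedClicks, incrementalClicks, incrementalRevenue };
}

// Example: 120,000 impressions, CTR moving from 2% to 3%, $4.50 per visit
// -> 1,200 incremental clicks and $5,400 in forecast incremental revenue.
console.log(forecast({
  keyword: 'example keyword',
  impressions: 120_000,
  currentCtr: 0.02,
  assumedCtr: 0.03,
  revenuePerVisit: 4.5,
}));
```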

[Image: spreadsheet showing assumed CTR and revenue increases. Source: Jessica Bowman and Avinash Conda]

However, as most search marketers probably know, not every aspect of SEO value can be measured in such a clear-cut fashion.

“There are a few tasks which cannot be quantified on a traffic level,” Conda said. “Site speed [tasks] are a good example … We know they are going to have a positive impact on traffic because site speed is a big ranking factor, but I don’t think we’re there yet in terms of coming up with [quantifiable] methodologies.”

Trying to forecast traffic using CTR in such a scenario might not be the most effective plan. In these cases, Conda recommends estimating SEO gains with conversion rates.

For instance, a Portent study found sites with 1-second load times had a conversion rate that was three times higher than sites that loaded in five seconds. Tying less quantifiable SEO tasks to important metrics like this can go a long way in proving their value.

Still, CTR is a solid metric SEOs can use to show the value of their work, especially if the data is made accessible using templates like Conda and Bowman’s.

“This model is a straightforward way to see the updates we are making, which we know for a fact will have a direct impact on rankings,” Conda said.

Watch Jessica Bowman and Avinash Conda’s full SMX Advanced presentation

How paid marketers are fighting back against the Fake Web

Bots and fake users steal ad clicks, pollute audience targeting, and skew metrics. Paid marketers are learning to protect themselves.

Anyone who manages paid marketing channels knows the importance of being able to accurately report on metrics and KPIs. Typically, marketers are looking to see if their campaigns are driving traffic, conversions, and ultimately pipeline for their go-to-market team. However, today nearly 40% of the internet is made up of fake traffic, which directly impacts marketers’ ability to do their jobs. When bots and fake users interact with paid marketing campaigns, they can decrease the effectiveness of nearly every aspect of advertising. 

First, when bots click on ads, the obvious downside is that they take away that portion of the cost-per-click budget. But the damage does not stop there, as they also consequently divert ad spend away from potential customers. Additionally, if audience segments and smart campaigns become infected with bots, they can inadvertently encourage ads to be remarketed to additional fake users until they are completely unusable. Optimizations also become skewed as pixels fire when fake users interact with campaigns, which ultimately delegitimizes all performance metrics. 

Fortunately, many paid marketers are noticing these issues, staying diligent, and fighting back against the Fake Web. Throughout this article, we describe the ways they are identifying threats and combating them in order to increase the effectiveness of their campaigns and make the most of their ad spend. 

Checking for time zone mismatches 

When it comes to mobile and desktop devices, users are able to select a Declared Time Zone in their settings. Typically, if the user is a legitimate person going about their daily life, they select the time zone that they are most frequently living and working in, so that their Declared Time Zone reflects reality. However, some malicious users may choose to declare a different time zone from the one they are actually in, so that they appear to be in a time zone that a given business typically works within. The reason for this deception is to trick that business into thinking they are a legitimate customer. One way that smart paid marketers are snooping out this type of suspicious activity is by checking the Declared Time Zone of a device against the actual Device Time Zone. If there is a mismatch, the user could be masking their identity for malicious purposes. 
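As a rough illustration of that check, the sketch below compares the device’s resolved time zone against a declared time zone stored on the lead or account record. The declaredTimeZone value is an assumption about where that information lives in your own data.

```typescript
// Sketch: flag sessions where the device time zone contradicts the
// time zone declared on the lead/account record (field name is assumed).
function timeZoneMismatch(declaredTimeZone: string): boolean {
  // What the browser/device actually resolves to, e.g. "America/New_York".
  const deviceTimeZone = Intl.DateTimeFormat().resolvedOptions().timeZone;
  return deviceTimeZone !== declaredTimeZone;
}

// Example: flag the session for review if the two zones differ.
if (timeZoneMismatch('America/New_York')) {
  console.warn('Possible masked identity: declared and device time zones differ');
}
```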

Watching out for repetitive behaviors 

Bots are programmed to perform the same actions over and over. Similarly, malicious human users typically perform hacks and fraudulent activities at a high volume. Furthermore, and perhaps most concerningly, botnets attempt to make a whole network of bots look like a single user. For this reason, to protect their campaigns from planned attacks, paid marketers are looking for repeated behaviors coming from the same IP address or from the same cookied user. Identifying repeated malicious behaviors can help these marketers stop attacks in their tracks. 
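A simple version of that repetition check counts identical actions from the same IP or cookie ID inside a short window, as in the sketch below. The threshold and window length are illustrative; real anti-fraud tooling uses far more signals.

```typescript
// Sketch: flag repetitive behavior from one IP/cookie (illustrative limits).
const WINDOW_MS = 60_000;  // 1 minute
const MAX_REPEATS = 20;

const recentEvents = new Map<string, number[]>(); // "ip:action" -> timestamps

function isRepetitive(ip: string, action: string, now = Date.now()): boolean {
  const key = `${ip}:${action}`;
  const timestamps = (recentEvents.get(key) ?? []).filter((t) => now - t < WINDOW_MS);
  timestamps.push(now);
  recentEvents.set(key, timestamps);
  return timestamps.length > MAX_REPEATS;
}

// Example: isRepetitive('203.0.113.7', 'ad_click') returns true once that
// IP has clicked the ad more than 20 times within a minute.
```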

Analyzing traffic metric anomalies

Website traffic metrics from paid marketing campaigns can vary based on many naturally occurring factors such as time of day, keyword strength, and current designated campaign budget. For this reason it can be tempting to overlook unusual spikes in traffic from advertising campaigns, and brush them off as a non-issue. But savvy paid marketers know better. Unusual spikes in website traffic on specific days, from areas outside targeted geographies, and atypically high bounce rates, can all be indications of a bot attack. By carefully analyzing all website traffic, these marketers are able to quickly identify malicious activity impacting their campaigns. 

Looking for user agent inconsistencies

User agents are the devices and mechanisms that someone uses to access the internet. For example, someone’s user agent string could identify them as a tablet user who is operating on a Windows operating system, and accessing the internet via Google Chrome. All internet users have a string of information about themselves like this, and most user agents have unalarming qualities. But malicious users may try to manipulate their user agent in order to hide their true characteristics, so that they can more easily commit fraudulent activities while going undetected. However, marketers who pay close attention to user agents in their analytics platforms are looking out for inconsistencies, and identifying potential threats. For example, using an Apple device with Android software is nearly impossible, so if something like that appears in a company’s analytics platform, there is a good chance that the user is manipulating their user agent for malicious purposes. 
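The impossible combination called out above translates into a very small rule. The sketch below only checks that one contradiction; production bot detection would layer many such rules together.

```typescript
// Sketch: flag a user agent string that claims Apple hardware and Android
// at the same time - an effectively impossible combination in the wild.
function suspiciousUserAgent(ua: string): boolean {
  const claimsApple = /iPhone|iPad|Macintosh/i.test(ua);
  const claimsAndroid = /Android/i.test(ua);
  return claimsApple && claimsAndroid;
}

// Example: a spoofed string like "Mozilla/5.0 (iPhone; Android 12) ..."
// would be flagged for manual review.
console.log(suspiciousUserAgent('Mozilla/5.0 (iPhone; Android 12) AppleWebKit/537.36'));
```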

Reevaluating traffic sources 

Paid marketers take stock of the sources that are driving the most traffic to their site to see if they align with the channels they are investing in most. However, if one paid channel is driving a lot of traffic, but that traffic is leading to unusually low conversion rates, something could be awry. In order to identify whether affiliate programs, content syndication programs, and other paid platforms are sending bot traffic to their sites, paid marketers are looking closely at this traffic and checking whether traffic from different channels drives the same behaviors throughout the funnel. They are essentially looking for oddities and inconsistencies throughout the buyer journey, and diving in deeper to see if any inconsistencies could be caused by fake traffic.

Deploying go-to-market security 

As one can imagine, analyzing all of this data on top of running paid marketing campaigns can quickly become overwhelming. Fortunately, there are go-to-market security platforms that can step in and automate many of these processes. GTMSec is one of the fastest-growing categories in cybersecurity, largely because it is designed to address the problems that the Fake Web causes for marketers and analysts specifically. Rather than creating solutions for the IT department to protect against fraud, these GTMSec platforms speak the marketer’s language and can help block fraudulent activity from infecting their campaigns. Since bots and fake users stand in the way of marketing objectives, it makes sense for paid marketers to prioritize cybersecurity in order to meet their goals and KPIs.

17 content optimization mistakes affecting ROI

Hint: proper content optimization is more than just adding in a few keywords and creating a FAQ section.

Content optimization directly impacts ROI. The better the content for users and search engines, the more it drives results such as visibility, traffic, conversions, and loyalty.

Even Ross Hudgens confirms it to be a mature content marketing strategy.

 That’s why we now see people adding content optimization to their budget requests.

I’m not asking you to spot the mistake in the above tweet but to see a significant amount dedicated to content optimization.

Now that SEOs have the budget for content optimization and know how to optimize content (even for featured snippets), let me showcase the mistakes that happen directly or indirectly when planning and implementing content optimization. 

I was able to find 17 of them. Let’s get straight to it.

1. Missing out on the audience research

The biggest mistake while optimizing content is not considering whom the content is for. If the audience reading your content is not the right one, how can you expect ROI?

Every business and industry/segment has different audiences that read the content. 

For example:

The publishing industry

A publishing site like Search Engine Land would target an audience such as:

  • SEOs at all levels (entry, intermediate, experienced, managerial) to learn and inspire
  • PPC professionals at all levels (entry, intermediate, experienced, managerial) to learn and inspire
  • Business owners looking to market their SEO or PPC related products

Service industry

A marketing agency like Missive Digital would target an audience such as:

  • Business Owners of SaaS, IT, and B2B companies
  • Marketers at all levels (entry, intermediate, experienced, managerial) to learn and inspire about marketing

Product segment

A talent-hiring platform like Codemonk would target an audience such as:

  • Developers of all technologies looking for jobs
  • Project heads of global enterprises looking to hire remote talent

When I first thought about writing content on “top fashion eCommerce brands,” I found a lot of competing blogs that were talking about which fashion brands to buy from.

But, the target audience of eComKeeda is the eCommerce business owners. So, it was tricky to target that keyword.

Vatsal Shah and I concluded that rising fashion ecommerce business owners would be keen to know how the top online fashion brands became successful.

We decided to write the content showcasing the behind-the-scenes story of every top online fashion brand and their life-changing decisions. 

The result is in front of you. We still own that featured snippet.

You can optimize your content for a specific audience and get a much higher ROI. It doesn’t matter what keywords you use, as long as they’re relevant to the people who will be reading them!

2. Disdaining the user’s reading intent

As an SEO, you care a lot about user search intent; what often goes missing is their reading intent – the “why” behind a person reading the content.

For example, this week, we optimized content for “ReactJS developer skills.” When we received the blog for optimization, we saw it had the section, “What is ReactJS and its benefits.”

My team member quickly messaged me about whether we should have that section considering the topic and the users’ reading intent.

We removed that section before the blog went live, since people coming to read about ReactJS developer skills are already well aware of the basics of ReactJS. You don’t need to waste their time and effort there.

Before optimizing, think twice about why a person would continue reading your content – that is what improves content metrics.

3. Not analyzing the right data in Search Console

Today, every SEO is looking for quick hacks. Here is one I found recently,

Aisha Preece has suggested a great hack, but we need to choose those high-impression queries carefully.

Here is a list of the top 10 queries on Google Search Console (GSC) to optimize the content, “X Benefits of Full Stack Development.”

The highlighted user search queries have the highest impressions. If we choose to optimize the content for them, we end up making a mistake that wastes not just our content creation effort but also our optimization and link-building effort, leading to negative ROI.

Why? Because the user’s searching and reading intents behind those keywords won’t match the topic that we are optimizing. 

The people searching for the query “hire full stack developer” are not looking to understand the benefits of full stack development; they’re looking to hire a developer who does it.

4. Forgetting about analyzing user behaviors

Most content optimization sticks to looking at GSC and optimizing keywords in any way possible.

But ROI doesn’t stick to keyword rankings, and it goes beyond traffic and conversions. 

While conversion is still the most significant ROI metric, most SEOs skip looking at conversion optimization tools to see how people behave on the content they’re looking to optimize.

If conversions feel like too high a bar, think of content metrics such as engagement rate, engaged sessions per user, and average engagement time. The content you optimize should be improving these metrics too.

You need to look at what makes your audience read the content more, whether they find images interactive, whether that highly creative pop-up is annoying and much more.

Unless you know these things, you only optimize for keywords, not conversions. And honestly, better keyword ranks don’t guarantee business. Avoid making such a content optimization mistake.

5. Not analyzing your competitors thoroughly

I often see SEOs considering the following things while doing a competitor analysis for content optimization.

  • How many words they have written
  • What headings they have used
  • The keywords they rank for
  • The backlinks (but least important)
  • The media they have added

The biggest mistake we make here is looking at what our competitors have added. We should be looking at what they haven’t. After all, outperforming them would drive excellent results.

For example, when we were doing competitor research to optimize the content on “top fintech apps,” every competitor talked about each fintech app, but none of them wrote about which type of fintech app it is.

We added that line for every app in the list, and we got the featured snippet for the most competitive and high-impression keyword.

You should be looking at the right things in the right places.

6. Not defining a content optimization structure in advance

This is a mistake we commonly see across agencies, publishers, and sites with thousands of blog posts to optimize.

Every site is different, and so is its content optimization strategy. But, what can be the same is the structure you use to optimize your content.

Recently, we came up with a structure to use across different projects and teams. 

Note: Content optimization structure example on Google Docs.

Like any other task, having a structure for content optimization eliminates any chances of missing out on an important aspect and improves how efficiently it’s implemented.

7. Skipping the tech content audit 

What if the optimized content has a poor user experience on the mobile site? What if its media takes ages to load, or navigating from one page to another is a puzzle for the user?

Do you think such issues will keep the user on the site for more time? Of course not.

How can you expect that content to improve performance metrics?

Get the technical audit done. If you have done it already, that saves time. 

But in a case like ours, where the project covers only content optimization, I would suggest you follow the tech content audit steps below from Tory Gray and Tyler Tafelsky.

  1. Ensure your JavaScript content is fully accessed and rendered using SEO Spider Software.
  2. Audit core web vitals (as you already have access to GSC) and optimize for page speed using the Page Speed Insights tool or the Chrome Web Vitals Extension (to save some time).
  3. Audit index bloat and keyword redundancies, and prune mindfully using sitemaps.
  4. Determine (and improve) pages where you’re losing users using Google Analytics.
  5. Leverage content gaps inspired by competitors and keyword data using an SEO tool
  6. Consider non-SEO segments and overall conversion value using the SEO Spider tool integrated with Google Analytics API.

If the doctor doesn’t know your problem, it would be difficult to suggest the medication. And, you know the implications of incorrect medication, right?

8. Mapping the irrelevant keywords

This is one of the most common content optimization mistakes I have seen, experienced, and rectified.

Even at Missive Digital, we invest a considerable amount of time training new joiners not to choose money keywords when optimizing a blog post (and vice versa).

If you go back to your past and current optimization docs, you will see a mix of both blog and money keywords to be used in the blog.

With this, you’re confusing search engines about which page should be given importance for ranking the target queries and also inviting keyword cannibalization issues.

With such issues, the blog page won’t be able to educate the audience, and the money page won’t convert. Ultimately, your content optimization effort would not result in a positive ROI.

9. Choosing a non-user-friendly content flow

How do you tell if the content has a non-user-friendly flow? By comparing the topic and its users’ reading intent with the content flow.

For example, let’s take the blog example, “The best smart TVs to buy in 2022.”

The content is good, so it has most queries on page 1 or around it.

Following is the content flow where the buying guidelines are on the top of the list of best smart TVs to buy in 2022:

We don’t mind putting the buying guidelines at the top if they’re short, but this buying guide runs almost two full scrolls on desktop, and would be twice that on mobile screens. It might distract people from getting to the point of why the presented smart TVs are recommended.

We recommended moving the list above the buying guidelines to win users’ hearts and even the featured snippets.

You need to understand why a person is landing on your page and what you should do to avoid distracting them. Otherwise, such a mistake can make you stay away from page 1.

10. Missing out on contextually adding internal links 

Another huge mistake happens with internal links, which directly impact your SEO ROI. This mistake happens in two ways:

  1. When you link to a page on an irrelevant anchor text
  2. When you put “Read more:” links instead of putting them on the anchor text

For example, while optimizing the content on “6 Commonly Referenced Data Governance Frameworks in 2022,” we found that the internal link to the data governance definition is given at the end, asking people to go and check out that blog to understand in detail.

Because this is a definition, a well-utilized anchor text would be the best to drive significant value for the linked blog. We recommended changing the link placement where the definition was just starting.

In another case, we recommended removing the read more section on another blog for the same client because they already gave that link in its respective section on the relevant anchor text.

Now you might wonder how to decide if we should add a Read More internal link or a keyword-focused anchor text.

In my opinion:

You need a “Read More” section when you think the linked content would help the user move to the next customer journey stage. 

But, if you’ve used a target keyword of another page in a sentence for the first time, you can choose to make it an anchor text. If you use that keyword to ask them to go and check out the content, you’re inviting them to stop reading the current page and move on to the next.

Contextualize the links you put on your pages to drive the most value. 

11. Updating only dates, years, and keywords

Call it a myth or a mistake: many SEOs think optimizing content means updating only dates, years, and keywords.

Changing the title from 2021 to 2022 is not content optimization; it’s only title optimization.

Just putting some keywords in the content doesn’t make it content optimization. 

If you think in this way, you’re making a huge mistake because you won’t see any ROI even after optimizing content.

Content optimization also includes,

Adding missing parts

The introduction of a blog helps the user get the context of the blog, and the conclusion gives them clarity on what they learned and what they should be doing about it.

And, if such necessary information is missing, you need to add them.

Adding a new section

You can add a new section if you feel the content is incomplete for a user to get enough value. Here is how you can suggest them.

Writing a new blog post and cross-linking the two

Sometimes, you create guide-like content with different sections that could each be explained in detail, but not within that guide. In such cases, you need to write a separate blog post and use its summary in the guide-like content.

With this kind of content optimization, you get the opportunity to rank for another blog and pass the internal link juice to the guide-like content and vice versa.

We often call it a hub-and-spoke model, which Andy Chadwick has explained in detail, including how you can use it the right way.

Removing unnecessary or stale content

Content optimization is certainly not only about adding new things but also about removing the things that can hamper the actual ROI. In our experience, removing zero-performing content can drive great results.

Dana DiTomaso says in a blog post by Andy Crestodina,

“Sometimes you’ll find several blog posts on the same topic but they’re all mediocre, so none of them rank. If the content is still something you want to keep, then combine them into a much better post and redirect the old posts to the new one.”

Following are the scenarios where removing the content makes more sense:

  • When your content has fewer than 10 or 20 impressions a quarter.
  • When most of your content consists of definitions and benefits sections – not every piece of content needs those.
  • When trends, best practices, and quick hacks have changed over time.

Changing the length of a section

As content curators, we don’t believe in overly long introductions or overly brief explanations in a how-to guide. In such cases, we suggest changing the length of the section based on the content metrics.

Adding visuals

Add them if you feel your audience would benefit from looking into videos, product GIFs, infographics, and more.

Most content that we have featured snippets for has some visuals for sure.

12. Missing out on image optimization

Adding images is one thing, but optimizing images during content optimization means ensuring those images are,

  • Added contextually (and not just the stock photographs)
  • Named appropriately (and not just logo-1.jpg)
  • Described properly with proper ALT attribute (and not just logo-1)

I shared more on my strategy for optimizing image ALT attributes in this presentation.

13. Being strict on external links

Either people don’t add external links, or when they do, they always make them “nofollow.”

Here is what the Google guidelines have on the “nofollow” attribute,

“Use this attribute for cases where you want to link to a page but don’t want to imply any type of endorsement, including passing along ranking credit to another page.”

For example, in one of my blogs on eCommerce FAQs, we made all the brand links “nofollow.” The blog has been on featured snippets for over three years now. 

But, for a blog on content-driven commerce written by Vatsal Shah, we didn’t make all the links “nofollow,” and it’s still at rank #1 for over two years now.

Let’s look at another blog on eCommerce entertainment. It’s on rank 1 for over two years with a mix of “dofollow” and “nofollow” external links.

You should not hesitate to use external links if you think they can add value to your content. 

You can choose whether to nofollow them or not, based on your experience.

14. Optimizing for SEO plugins

Ah! This one is amazing. 

When interviewing candidates for the SEO roles, I ask them, “I see you’ve done content optimization. Can you please share how you do it?”

Candidates are like: “We look at the SEO plugin, whether Yoast or Rank Math, and optimize the content to achieve the green color in the scorecard.”

I’m clueless when I hear that.

If we shouldn’t optimize our content just for the search engines, we certainly shouldn’t do it for plugins either.

15. Over optimizing content

Thankfully, we haven’t seen so many blogs with such a mistake. 

And that’s why we didn’t even have it on our checklist earlier.

But while I was writing this blog, one of my team members asked if we could tell our client that they had stuffed a keyword.

That’s when I decided to add it to this comprehensive list of content optimization mistakes.

There is no keyword density to focus on today, but the keyword should be added naturally and not in every sentence.

Readers are smart enough to identify if you’re faking what you’re saying when you stuff keywords. If users leave your site with such an experience, you lose them forever.

So, just stop it if you’re even thinking about it.

16. Not thinking about building links

I have experienced and heard a lot of case studies where a site is ranking without building links.

But what about conversions, thought leadership, and brand authority? That comes with building links.

In any business (even my agency), most leads convert from repeat visitors.

They have visited your website, researched your business on different platforms, and then come back to your website to submit an inquiry.

For every content you create and optimize, you should think of distributing it on the right platforms to make the most out of it.

As I said earlier, ranks don’t guarantee conversions, but the authority does.

17. Skipping the performance monitoring

Last but not least, a common content optimization mistake is not monitoring the performance of the content you optimized. 

How will you know if the optimization worked – whether it drives more engagement or conversions? This is where content metrics come into the picture. You can use various SEO tools, Google Analytics, Search Console, SEO spider software, and more to monitor content performance.

But, what’s more important is how often you track it.

We have a tracker where we monitor performance every week after the content is updated per the suggested optimizations.

Use my MOM (Monitor -> Optimize -> Monitor) approach to improve results. I coined this approach during my talk on boosting organic traffic using featured snippets at Whitespark Local Search Summit 2021.

Optimize your content wisely.

Avoiding the above 17 content optimization mistakes helps you stick to what is suitable for your audience and brand more than for the search engines. After all, publishing content is only 20% of the task. The remaining 80% is optimizing it to own featured snippets.

Go, and download this infographic to circulate among your team and friends for them to keep handy.

WebPageTest’s new Opportunities and Experiments: test practically anything

Learn what you can do with the new and incredibly useful test tools from WebPageTest – with no coding required.

There’s never been a better time than now for a web developer’s approach to SEO.

The pace at which tools and resources – both new tools and familiar ones – innovate and open up options for us also demands that we keep up. Recently, that has meant more requirements for performance optimization as Google releases algorithm updates and changes to metric calculations.

One tool that you should be familiar with is WebPageTest. They recently released some incredibly useful new fully integrated test tools.

WebPageTest now proxies real-time user-specified HTML modifications through Netlify to run comparison tests right inside their user interface. No coding is required.

Genius makes sense

Smart application logic in three huge areas of concern bubbles findings up for you – not just as text blurbs, but with re-test options prepped for you to run combinations of variations for comparison. The array of tests now available in WebPageTest means no more setting up tests using third-party proxy tools that duplicate what you can test directly.

This was all technically possible before and the original approach continues to have importance.

Although the new tools are impressively comprehensive, there will always be tests you need to run using a proxy host of your choice. That route, however, requires handling JavaScript and Cloudflare.

With WebPageTest you get to point and click.

Pesky lab data

Always keep in mind the best possible combination of numbers from lab tests may not yield the same numbers in the field. It can actually result in broken website features.

Scripts and styles have developer-defined load order where any change can mean a breaking change that is not suitable for production. A proxy host can provide access for QA as part of the optimization process.

With that warning out of the way, let us tell you how great it is having a testbed for demonstrating HTML optimizations. It’s been the basis of our workshops and conference sessions now for well over a year.

Our Search Engine Land guide articles can help you set up a testbed. We’ll be using an updated version at SMX Advanced. Join us live if you can make it.

Opportunities

WebPageTest’s Opportunities findings are available to everyone in reports.

You won’t need JavaScript skills to run HTML variation comparisons anymore. Instead, you will need a paid account to run built-in proxy tests labeled Experiments.

The free account gives you better access to reports and history, but not running Experiments. You can still write JavaScript and proxy your own tests for free.

It’s just nowhere near as handy and it takes up way too much time.

Free opportunities. Paid account for experiments.

Experiments

Select the Opportunities & Experiments menu item in a WebPageTest report and you will be presented with a comprehensive list of findings.

Opportunities here are derived from real-world test conditions (simulated with hardware where possible). Our test indicated opportunities to re-test with render-blocking resource variations (typically JavaScript and CSS), lazy-loaded images, self-hosted third-party scripts and much more.

Pro Account Required for Variations

Test async, defer, or even inline scripts and stylesheets using the interface. We’ve been writing Cloudflare Worker JavaScript to proxy these tests and we also added inline style rules to defer loading content towards the bottom of the page, including the footer. The initial array of WebPageTest integrations can handle most, but not all, of our original tests.
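For context, here is roughly what that hand-rolled approach looks like: a minimal Cloudflare Worker sketch (module syntax) that proxies the page and adds defer to every external script via HTMLRewriter. Routing, caching and edge cases are omitted, so treat it as an illustration of the old workflow rather than a production proxy.

```typescript
// Sketch: Cloudflare Worker that defers external scripts on a proxied page
// so render-blocking behavior can be compared against the original.
export default {
  async fetch(request: Request): Promise<Response> {
    const originResponse = await fetch(request);
    return new HTMLRewriter()
      .on('script[src]', {
        element(el: Element) {
          // Only external scripts; inline scripts have no src to defer.
          el.setAttribute('defer', '');
        },
      })
      .transform(originResponse);
  },
};
```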

It’s a snap to put tests together now.

Modify test settings and start running variations to home in on the holy grail of green Core Web Vitals across the board. The offering is amazingly comprehensive and covers far more than what affects a webpage’s performance.

You’ll find three categories which group opportunities to experiment by the following questions:

  • Is it quick? Quickness categorizes and groups performance optimization experiments.
  • Is it usable? Usability groups HTML validation errors that can mess with screen readers and things that affect layout shifts.
  • Is it resilient? Resiliency covers security concerns, including mixed protocols. Modify test settings with a checkbox interface and start running variations; you will get refined options in the comparison report.

Dashboard for a test suite

WebPageTest now resembles a dashboard for a test suite, and it manages to do that inside reporting that provides more detail than Lighthouse, with far better waterfall chart representation than Chrome DevTools.

Although it’s true that with point and click you can run HTML experiments in a “no-code” environment, the detail provided and the navigation require experience – and coding experience is best.

A new built-in Experiment replicates another Cloudflare Worker task by removing all JavaScript. Having something like that so accessible is far handier than writing a script for test variations.

Advanced Experiments allow us to insert HTML in key locations and test tactics that change load order, fail resources on purpose, or modify (including minify) resources.

There’s literally nothing technically stopping us from testing practically anything on any page.

Fall into the pit of success

Comparison reports themselves serve to funnel you into selecting and re-testing more variations. The result metrics banner includes color-coded improving and worsening scores between control and experiment.

A remaining opportunities section with a subset of experiment switches appears below. You can click your way through to significant improvement.

We’ve done the hard work to write tests for demonstration at SMX Advanced and when we’re live you can expect us to cover this major update to the very tools we used. It’s going to be so much easier.

We will see if the rapid test cycle of WebPageTest experiment integrations catches up with what we were already preparing to deliver. Let’s see if we can get to green across the board.

How your PPC conversions will be impacted without privacy-first measurement

What happens if you don’t build new measurement frameworks? Decreased PPC performance, for starters. Let’s avoid this.

Within the next 12-15 months, third-party cookies will retire across digital marketing channels.

Savvy advertisers know they need to begin developing a game plan for the cookieless future, but what will happen to those who don’t adapt to these changes?

Above all, marketers will suffer from signal loss, which will negatively impact how we measure campaign performance, optimize campaigns over time, create audiences for ad distribution and drive growth within our digital channels. 

The industry sea change with the lion’s share of attention is the retirement of third-party cookies in Google Chrome.

Sure, other browsers, including Microsoft Edge, Apple Safari and Mozilla Firefox, have previously restricted third-party cookies. Chrome is more monumental simply because of its market share.

SimilarWeb recently released a study that showed Chrome was the world’s most popular browser with 62% of web traffic. 

To recap from my previous article, Google Chrome will retire third-party tracking cookies around Q3 2023. That is an approximate timeframe for this monumental change, but it gives us a target to make sure that our digital marketing campaigns will be ready.

This might sound like the distant future, but many of the measurement solutions needed to replace the functionality of third-party cookies could require significant time and effort from development teams.

This type of support usually requires a few cycles to be prioritized on project roadmaps.

Getting started in the next couple of months will be beneficial in the long run.

Look at it this way: your future self will thank you for being thoughtful and proactive!



What happens when marketers do not build new measurement frameworks?

For over two decades, marketers have utilized third-party publisher cookies to track their media performance. This method isn’t perfect, but it’s been a standard practice that’s set to evolve in a major way during the next 12-15 months.

From a digital marketing perspective, one of the most significant impacts is the loss of conversion measurement. This loss of performance data includes sales, sign-ups, purchases, revenue and other engagement metrics since those actions are likely to be restricted.

If marketers do not evolve their measurement practices, their accounts will rely on algorithmically-driven modeled conversions. 

Successfully enabling automation within PPC is critical to driving positive results.

One of the most potent algorithmic elements is smart bidding. Algorithms that drive cost-per-acquisition (CPA) and return-on-ad-spend (ROAS) bidding need strong data signals to optimize performance.

The data that feeds these algorithms must be reliable so that accounts are optimized toward the most valuable actions and this conversion data needs to have enough volume to drive machine learning.

Data loss means bid algorithms will not function properly, which will result in decreased PPC performance. Let’s try to avoid this!

More conversions will be algorithmically modeled as a result of signal loss

There is too much at stake (i.e., money) for ad platforms such as Google and Microsoft to leave marketers without another option to gain back lost data.

When marketers forge new measurement frameworks via Enhanced Conversions (EC), Google Analytics 4 or Offline Conversion Tracking, those are considered Observed Conversions.

This mix of first-party data and user-matched data (EC) is generated by registered actions taken by our website visitors.

Try to collect as much observed conversion data as possible.
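As one illustration of an observed-conversion tactic, here is a sketch of enhanced conversions via the Google tag’s user_data field, which lets Google match a conversion using first-party data a visitor already provided. The form field ID and conversion label are placeholders; check Google’s enhanced conversions documentation for the setup your account actually requires.

```typescript
// Sketch: pass first-party user data alongside a conversion event
// (enhanced conversions). IDs below are placeholders.
declare function gtag(...args: unknown[]): void; // provided by the Google tag snippet

function sendEnhancedConversion(conversionLabel: string): void {
  const email = (document.getElementById('email') as HTMLInputElement | null)?.value;
  if (email) {
    // Google hashes and matches this value on its side.
    gtag('set', 'user_data', { email });
  }
  gtag('event', 'conversion', { send_to: conversionLabel });
}

// Example: sendEnhancedConversion('AW-123456789/AbCdEfGh'); // placeholder IDs
```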

The alternative is Modeled Conversions in Google and Smart Goals in Microsoft Ads. According to Google:

“When Google surfaces modeled conversions in Google Ads, we are predicting attributed conversions. In most cases, Google will receive ad interactions and online conversions but is missing the linkage between the two. The modeling we perform is modeling whether a Google ad interaction led to the online conversion, not whether a conversion happened or not.” 

Even after these large-scale privacy shifts, Google will continue to acquire mountains of data per user: search history, browsing history, and any other online activity when someone is logged into their Google Account, especially when those signed-in users are on a Google property.

Google will not be able to install a tracking pixel for that user specifically, but they should have enough data to algorithmically predict which media interactions lead to a conversion for an advertiser. 

Microsoft Ads is working on a version of conversion modeling. This product is called Smart Goals.

According to Microsoft:

“Smart Goals use Microsoft Advertising machine learning models to identify the best sessions on your website. If you have the UET tag set up correctly, the smart goal will examine all your website sessions and determine which of those sessions can be considered a ‘conversion.’ Smart goals use multiple signals to identify conversions. Some of the signals that are used include session duration, pages per session, location, device and browser.”

In essence, they are similar to Google’s modeled conversions. They both rely on machine learning at scale to understand user behavior and potential reactions to paid media exposure.

Marketers need to provide numerous additional signals to make any modeled conversions as accurate as possible.

With the loss of user-level data, modeled conversions will be part of the measurement landscape going into 2023.

This brings us back to creating a strong framework for supplying as much Observed Conversion data within the platforms, which will help inform the Modeled Conversion algorithms. 

Marketers have time and tactics to forge new measurement frameworks

The prospect of rebuilding your measurement framework can feel daunting, but you have the next couple of quarters to determine which solutions work best for you and your business.

Now is the time to start evaluating your current processes, review the new measurement tactics that are currently available and begin building a plan. 

In my last article, we laid the groundwork for what this metamorphosis means for the digital marketing landscape and approximately when it should occur. This article has addressed why adapting to these changes needs to be a strategic priority.

Next time, we can begin drafting a plan on how you can build a privacy-centric measurement and audience framework for 2023. 

The post How your PPC conversions will be impacted without privacy-first measurement appeared first on Search Engine Land.

SEO reporting to impress: How to successfully report your SEO process, efforts and results

SEO reporting is critical for SEO success, and you should prioritize it accordingly.

The post SEO reporting to impress: How to successfully report your SEO process, efforts and results appeared first on Search Engine Land.

Few of us likely became SEOs for the love of reporting. In fact, it's among the least favorite activities for many SEOs, based on a poll I ran a while ago.  

However, decision-makers care a lot about reporting: it's how we communicate, and how they assess, the SEO process investment and overall success. In fact, the effectiveness of your SEO reports can be the difference between getting fired and earning more SEO support or a raise from decision-makers. 

Despite this, many SEO reports are broken because they're just a compilation of tool-automated dashboards featuring SEO metrics. When I asked on Twitter, 41% of the SEOs who answered said they only use a data dashboard for SEO reporting. 

Data from our SEO dashboards can be included in reports, but dashboards can't replace reports as a whole: an SEO dashboard is a visualization resource showing the latest status of the most important metrics we want to follow from our SEO process, so we can easily monitor its progress at any time. 

An SEO report, on the other hand, is a document featuring key performance indicators from a certain time period, along with analysis and conclusions, used to periodically assess the SEO process's progress toward its goals.

Using only automated SEO dashboards as reports can end up harming more than helping. They are filled with information that the audience – often non-technical stakeholders or decision-makers – won't understand or care about, with no prioritization, insights, analysis, or outcome actions. This generates more questions than it answers. 


Even personalized SEO dashboards can't achieve all SEO reporting goals – especially considering that a high share of SEOs don't always present their reports themselves. Those goals are the following: 

  • Communicate SEO results: The SEO process evolution towards the established goals (what has been achieved vs. what was expected?)
  • Explain the cause of SEO results: Why the different areas are or aren’t evolving as expected. 
  • Drive actions to achieve SEO results: Establish SEO-related activities and request support for the next steps to achieve goals. 

The biggest challenge in developing personalized SEO reports is time: we feel pressure to produce reports fast so we can get back to "SEO execution." In most cases, though, SEO reporting is only a monthly effort. 

Ready to effectively tackle your SEO reporting goals while accelerating the process? Here are three principles to follow.

1. Use only meaningful KPIs that communicate your results 

Cut the noise and minimize doubts with the data you include in SEO reports.

Avoid using confusing proprietary metrics, as they’re unreliable and difficult to connect with your actual SEO goals.

Don’t add everything you monitor to reports, either – include only key performance indicators (KPIs) that show progress toward the SEO goals the audience is actually interested in.  

This is why the KPIs to include should be personalized to each audience's profile and interests. The SEO-related goals the CEO and CMO care about (e.g., SEO ROI, revenue and organic search market share) will differ from those the head of SEO follows (e.g., the same business KPIs plus more technical ones, such as non-branded commercial search traffic growth, top-ranked targeted queries, and key pages' crawlability and indexability).

Because of this, the KPIs used in reports targeted to the former will differ from those for the latter, as will the metrics used to calculate them. 

Here are a few steps and criteria to help you select relevant KPIs to include in your SEO reports: 

  • Start by establishing your SEO reports' audience: Who will you report to? Each audience will want to answer different questions about the SEO process's progress. Ask each stakeholder which SEO goals they want to be kept informed about, and make sure these are actual goals set for the SEO process, with actions being executed that are connected to their achievement.
  • Agree on which goal-progress questions the SEO reports should answer: These should be "SMARTER" SEO goals (specific, measurable, attainable, relevant, time-bound, evaluated, reviewed) that connect SEO efforts with business objectives; depending on the stakeholder's role, they can be operational or business-related. Once you have these questions, it will be easier to establish the KPIs to report, as well as the metrics to obtain and measure in order to calculate them. If you can't establish meaningful metrics to calculate a KPI and answer a goal-progress question, the goal might not be a SMARTER one.  
  • Ensure metrics data sources are reliable and trusted by stakeholders, and establish a couple of methods to gather the same data as a consistency check. If it's difficult to ensure accuracy for some KPIs, ensure precision (consistency over time) instead.  
  • Confirm the scope, frequency and format of the SEO report so you use a medium that facilitates its consumption (Google Slides, Google Docs, etc.). Set expectations about timing to avoid unnecessarily frequent reporting (e.g., there's no point in weekly reports if there won't be meaningful changes in that period, given the nature of SEO and the frequency of releases). 

You now have the input needed to start collecting data and putting together SEO reports with only the KPIs relevant to each audience and the metrics they understand. Here's a Google Sheet version of the SEO Report Planner for using meaningful KPIs, to facilitate this process further: 
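The linked planner is a spreadsheet, but if it helps to see the mapping in code form, here is a minimal, hypothetical sketch of the same idea: each audience gets the goal-progress questions they care about, the KPI that answers each question, and the metrics and data sources behind it. The audiences, KPIs and sources below are purely illustrative.

```python
# Hypothetical report-planner entries: audience -> question -> KPI -> metrics -> sources.
# All names and values below are illustrative, not a prescribed KPI set.
seo_report_plan = {
    "CMO": [
        {
            "question": "Is organic search growing revenue as planned?",
            "kpi": "Organic revenue vs. target",
            "metrics": ["organic transactions", "organic revenue"],
            "sources": ["GA4", "CRM"],
        },
    ],
    "Head of SEO": [
        {
            "question": "Are key pages crawlable and indexable?",
            "kpi": "Share of key pages indexed",
            "metrics": ["indexed key pages", "total key pages"],
            "sources": ["Google Search Console", "log files"],
        },
    ],
}

for audience, entries in seo_report_plan.items():
    for entry in entries:
        print(f"{audience}: {entry['kpi']} ({', '.join(entry['sources'])})")
```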

2. Ensure clear KPI presentation to facilitate progress understanding 

Your SEO reports' KPI presentation shouldn't be about "creating a pretty document with beautiful charts" but about making the featured data easy to understand and achieving your SEO reporting communication goals.

Sometimes a simple scorecard will communicate goal achievement better than a fancy time series. 

This is why it’s fundamental to follow certain data presentation and visualization best practices when selecting how to feature your KPIs:  

  • Identify the best data visualization format for each KPI by asking a few questions, as described here and here, the most important being: 
    • What’s the story your data is trying to deliver?
    • Who will you present your results to?
    • How many data categories and points do you have?
    • Should you display values over time or among groups?
  • Test with real data to see if each KPI goal progress question can be answered.
  • Communicate one major KPI in each chart to avoid confusing the audience.
  • Remove pointless decorations and chart information that won’t help to answer the relevant KPI goal question.
  • Add the relevant data source to each chart to establish trust and avoid potential doubts.
  • Always label chart elements clearly and directly to facilitate fast understanding.
  • Add the question to be answered with each KPI as a chart headline to facilitate storytelling.
  • Use color with intent to facilitate KPIs progression understanding.

Here’s a Google Sheet checklist for clear KPI data presentation that you can use to facilitate your decision-making process: 

3. Leverage data storytelling to explain and drive action with your SEO reports

Data storytelling creates compelling narratives to help audiences understand and drive action from your data analysis.

As explained by PPCexpo, stories attract and maintain people’s attention for longer, numbers without stories can quickly become boring, and stories communicate insights with higher clarity. As a consequence, storytelling should help to communicate the value of the data you’re showing. 

However, when leveraging storytelling, it's fundamental to avoid misrepresenting the data or leading the audience to the wrong conclusions.

For this, it’s recommended to avoid cherry-picking data or manipulating scale. Always show the whole picture, giving full visual context and keeping visuals and language consistent across the report. 

SEO reporting storytelling should explain and drive action from the data without misleading – even if the results are not positive. Otherwise, you will lose trust. 

For this, craft a compelling narrative for each KPI using the three-act structure, asking the following questions: 

  • Setup: What happened? Describe “what happened” with each KPI result vs. expected goal progress, taking the audience into account. 
  • Conflict: Why did it happen? Explain the reason behind the result, whether positive or negative, and describe its cause.
  • Resolution: How to proceed? Summarize the top recommended actions to take next to achieve the expected goal, given the current results.

Then to effectively structure your SEO report: 

  • Include a page or slide per KPI, organizing the pages to begin with the KPIs most important to the audience.
  • Add a data appendix at the end with additional evidence to refer to from the KPIs pages.
  • Include an executive summary at the start, highlighting the main KPI results and actions: It should be concise but include enough to stand by itself as a report overview. 

It’s also important to remember that there’s nothing like presenting the SEO report yourself to facilitate understanding and get feedback to improve. 

SEO reporting is critical for SEO success, and you should prioritize it accordingly. I hope these principles, guidelines and templates can help you with it as they’ve helped me.  

The post SEO reporting to impress: How to successfully report your SEO process, efforts and results appeared first on Search Engine Land.

Webinar: Benchmark your social media performance for a competitive edge

Get the trends and real-life examples of brands outperforming on social.

The post Webinar: Benchmark your social media performance for a competitive edge appeared first on Search Engine Land.

Social media benchmarking involves comparing your metrics and processes against the industry standards. Learn how you can get a clear idea of how you stack up against the competition.

Hear from Rival IQ and NetBase Quid experts about the metrics and benchmarks you can use to measure your social media performance.

Register today for “Benchmark Your Social Media Performance For a Competitive Edge” presented by NetBase Quid.

The post Webinar: Benchmark your social media performance for a competitive edge appeared first on Search Engine Land.