Bye Bye JavaScript! Auto Event Tracking with Google Tag Manager

Implementing analytics, or any type of conversion tracking, is a big pain in the ass. There, I said it! But it’s been getting easier and easier with adoption of Tag Management tools. Google Tag Manager is going to make it even easier with the introduction of a new feature called Auto Event Tracking.

Auto Event Tracking lets you track almost any user action without any additional JavaScript. It automatically captures user actions like clicks and form submissions.

TL;DR: watch this video.

For all you Google Analytics users, this means that it is no longer necessary to add JavaScript to track PDF downloads, outbound links or other user clicks. Those tasks, and many others, can be automated with Google Tag Manager.

I know – it’s exciting! Less coding = faster data collection = more reliable data quality = better insights.

There are a number of new additions to GTM that make auto-event tracking possible. Let’s take a look at how the system has changed.

How Auto-Event tracking works

Here’s a brief overview of how the new auto-event tracking works.

Listen, Capture, Collect. How auto-event tracking works for Google Tag Manager.

1. Listen: A new type of tag, called an Event Listener tag, will listen for different types of user actions, like clicks or form submissions.

2. Capture: When the Event Listener tag detects an action it identifies it and captures it (technically it pushes a Google Tag Manager event onto the data layer).

3. Collect: You can then automatically collect the action using additional tags, like an analytics tag.

Remember, this all happens without any additional coding. All you need to do is add the necessary settings in GTM.

There are three new pieces of functionality that make this possible:

1. The new Google Tag Manager Event Listener tag.

2. New events that indicate a user action has occurred.

3. New macros that collect information about the user’s interaction with the content.

The Event Listener Tag & Automatic Events

Let’s start with the new tag, called the Event Listener tag. This is a special tag that – wait for it – listens for a user action on a page :)

When the tag detects an action it automatically collects the action and identifies it. From a technical perspective, it pushes a Google Tag Manager event to the data layer.
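Under the hood that push is nothing exotic; it's the same kind of dataLayer call you could write by hand, the listener just does it for you. A rough sketch (the exact payload varies by listener type):

```javascript
// Roughly what the Click Listener pushes when it detects a click
// (illustrative; the real payload also includes details about the clicked element).
dataLayer.push({
  'event': 'gtm.click'
});
```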

There are four different types of user actions that the tag can detect. Again, each action results in a Google Tag Manager event.

Click listener: this tag will listen for clicks on a page. This includes button clicks, link clicks, image clicks, etc. When a click occurs, the Google Tag Manager event gtm.click is automatically generated.

Form listener: this tag will listen for any form submissions. When a form submission occurs the Google Tag Manager event gtm.formSubmit is automatically generated.

Link click listener: same as the click listener, except it only captures clicks on links. When a link is clicked, the Google Tag Manager event gtm.linkClick is automatically generated.

Timer listener: the timer listener will collect data at some regular interval that you specify. For example, if you specify an interval of 10,000 milliseconds, GTM will fire an event every 10 seconds.

Obviously, if you want to automatically listen for user actions you must include one of the above tags on the page where you would like to capture the user action.

For example, let’s say you want to capture clicks on outbound links (this means links to other websites). Chances are you have outbound links on all of your pages. So you should add the Link Click listener tag to all pages of your site.

Remember, to add a tag you need to specify a rule that governs when the tag is added to a page. Here’s the default rule to add a tag to all the pages on your site.

Use the GTM All Pages rule to add a common event listener to every page on your site.

But let’s say you want to capture a form submission, like a contact form. There really isn’t any need to include that tag on all of your site pages. So you can create a rule to add the tag to just your form page, like this:

To control the form listener tag, restrict the placement with a rule.

The new events are important because they identify that an action has happened. I’ve got some examples below.

Understanding the New Auto Event Macros

In addition to the new tags & events there are also a number of new macros that help collect the action that occurred.

A macro is a piece of data that you can use in your tags. Some macros are automatically populated, like the url macro (which is the url of the page), the hostname macro (which is the hostname of the site), or the referrer macro (which is the HTTP referrer).

With the Auto Event Tracking macros you can automatically add data about the element the user interacted with to your analytics tag (or any other tag).

There are five new macros that provide element information (a sketch of the resulting data layer push follows the list):

Element url: This macro stores the value of the href or action attribute of the element that triggered the event. For example, a click on the link <a href="http://www.cutroni.com">Analytics Talk</a> would result in a value of http://www.cutroni.com.

Element target: This macro stores the value of the target attribute of the element that triggered the event. Nerd Bonus: The value is stored in the gtm.elementTarget variable in the data layer.

Element id: This macro is the value of the id attribute of the element that triggered the event. For example, a click on the link <a href="http://www.cutroni.com" id="outbound_link">Analytics Talk</a> would result in an element id value of outbound_link. Nerd Bonus: The value is stored in the gtm.elementId variable in the data layer.

Element classes: This macro is the value of the class attribute of the element that triggered the event. Nerd Bonus: The value is stored in the gtm.elementClasses variable in the data layer.

Element: This macro holds the element that triggered the event itself; used as text it shows up as the element’s generic name, such as [object HTMLInputElement] (see the click-tracking example below).
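Putting the macros together, here is a rough sketch of what lands on the data layer when someone clicks the example link from above. The key names for id, classes and target are the ones called out in the Nerd Bonus notes; the gtm.element and gtm.elementUrl keys are my assumption, since the post doesn't name them, and the class attribute is made up for the example.

```javascript
// Illustrative push for a click on:
// <a href="http://www.cutroni.com" id="outbound_link" class="external">Analytics Talk</a>
dataLayer.push({
  'event': 'gtm.linkClick',
  'gtm.element': document.getElementById('outbound_link'), // the DOM element itself (feeds the element macro)
  'gtm.elementId': 'outbound_link',                        // feeds the element id macro
  'gtm.elementClasses': 'external',                        // feeds the element classes macro
  'gtm.elementTarget': '',                                 // feeds the element target macro
  'gtm.elementUrl': 'http://www.cutroni.com'               // feeds the element url macro
});
```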

Let’s put this all together and look at some of the common analytics tracking tasks you can implement with the data layer.

Tracking Clicks

Sometimes we need to track user clicks – a click on a button, image or link. Before Auto Event Tracking we would need to add extra JavaScript to the site in order to fire analytics code. Now we just use the Click Listener tag to detect a click.
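For a sense of what gets retired, here's the kind of hand-wired snippet that used to be required (a sketch assuming classic ga.js and a hypothetical button id; your old code probably looked different):

```javascript
// The old way: a hand-coded handler per element, pushing a classic GA event.
// Assumes the ga.js tracking code (_gaq) is already loaded on the page.
document.getElementById('signup-button').onclick = function () {
  _gaq.push(['_trackEvent', 'click', 'signup-button']);
};
```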

Let’s walk through how to track ALL clicks on a page and capture them with a Google Analytics event.

First, add the Click Listener tag to the necessary pages. You can add it to all pages, or just a select few. It depends on what you need to track.

The Click Listener tag will listen for user clicks and execute when a click is detected.

Next, we add our Google Analytics tag to execute, and thus collect, when the click happens. Notice that I am hard-coding the Event Category to be click but the Action and Value will be dynamically populated with data from the HTML element that the user clicked on.

We can use a GTM macro to automatically capture the HTML element that the user clicked on.

The value of the Event Action captures the generic name of the HTML element. This might be [object HTMLInputElement] for a form element or [object HTMLBodyElement] for the body of the page. These are fairly descriptive and can help you understand what happened.

But a better strategy would be to capture the element class or element id. These are usually more descriptive.

Here’s the rule that determines when to actually collect the click. Basically it will collect EVERY click on the page using a Google Analytics event. We’ll look at a few examples later that restrict the collection to only certain elements.

The gtm.click event indicates that a user clicked on something. This causes the Google Analytics tag to fire.

I should note that this approach will NOT work for content that is in an iFrame. For example, if you embed a YouTube video in your page, you can not capture clicks on the buttons, etc.

Using this general approach can generate a lot of data – crappy data! Let’s look at reducing the amount of data by tracking certain types of clicks.

Tracking Outbound Links

We all want to know where people go after they visit our site. Did they leave using a link in an article or did they just navigate away?

To track a click on an outbound link we follow the same general process we outlined above. The big difference is we need to make sure we only track clicks on links that go to another site.

First, we add the Link Click Listener tag to the necessary pages. Because there are usually outbound links on every page, I apply the Link Click Listener tag to every page on the site.

The Link Click Listener tag will listen for user clicks on links.

Now we need to add an analytics tag to collect data when a click happens. Let’s use Google Analytics and collect the data in an event! Notice that I am hard-coding the Event Category value to outbound-link.

The Event Action will be dynamically filled with the destination URL. That’s the URL of the page the user will land on. This is all made possible thanks to the element url macro.

The element url macro will automatically add the destination url to the Google Analytics event.

Here’s the important part – the tag rule. Notice that there are two parts to the rule. First I need to check for clicks on links. But I also added an additional condition that stipulates the link must not match cutroni, which is the domain of this blog. Now the Google Analytics tag will only fire and collect the click if the link is to a different domain.

Add a rule to specify what counts as an outbound link click on your site.

Tracking file downloads

File downloads are very similar to outbound link clicks. I just use a different Listener tag.

Let’s just skip to the analytics tag that will collect the data.

I’m using a Google Analytics event again. The category is hard coded as file-download. The event action will be the URL of the file and it will be dynamically populated using the element url macro.

The element url macro will automatically add the PDF url to the Google Analytics event data.

Just like I did with the outbound link tracking, I need to modify the rule to include a condition. The condition specifies that the user clicked on a link that contains .pdf.

To track a PDF link click add a condition to your tag firing rule.

Hopefully you can use this example and track clicks on any type of file that you want.

Tracking Form Submissions

Now let’s move on to forms. You could track a form using the Click listener tag. Basically you would track all of the clicks on the Submit button. But the Form listener tag is purpose-built for this, so that’s what we’ll use.

We start with the Form Submission listener tag. Rather than add this tag to every page on the site, I like to only add it to pages where there is a form.

The form listener tag can be configured to delay the form submission while data is collected.

Also notice that you can configure the form listener tag to delay the form submission to ensure that the data is collected.

The tag will delay the form submission by two seconds at most. Anything longer than that would create a bad user experience. GTM is smart like that :)

Just like the click tracking, a gtm.formSubmit event is generated when a user submits the form. We use this event to set up our analytics tag with a rule to control the execution.

This rule will only fire the Google Analytics event tag when a form is submitted.

I can actually pull some of the data from the form elements directly into my analytics tag using a macro.

For example, let’s say I have a form element named Gender. I can use a macro to capture the data, then use that macro when I define my Google Analytics Event, like this:

You can collect data from a form element using a macro and send the data to Google Analytics.
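One way to wire that up is a Custom JavaScript macro that reads the field at submit time. A minimal sketch, assuming the field's name attribute is gender (a hypothetical name):

```javascript
// Custom JavaScript macro: returns the current value of the "gender" form field
// so it can be referenced in the Google Analytics event tag.
function () {
  var field = document.getElementsByName('gender')[0];
  return field ? field.value : undefined;
}
```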

Remember: it’s not cool to collect personally identifiable information.

Here’s a bit more information on creating and using macros.

But overall, tracking a form submission is fairly straightforward, very much like the other scenarios above.

There you have it, some of the common ways to use the new Auto Event Tracking feature.

That was a really looooong post. Hopefully it gave you a good understanding of how this feature works and how you can use it to make data collection easier to implement and maintain.

Give auto-event tracking a shot and be sure to share your experience in the comments below.

The Best Way to Handle 404 Pages for SEO and Users

GitHub 404

There are a few common misconceptions related to 404 pages and SEO that I'll attempt to clarify.

While it's common (and technically correct) to call 404 pages "errors," I am not a fan of that terminology. 404s are the expected result when a website can't fulfill a request because there is no resource at the requested URL. If we wanted to be picky about it, we'd say 404 pages are not errors and can even be (blasphemy!) an acceptable user experience.

Calm Down. 404s Are Ok.

404s are fine. There's no need to panic. There's no need to "redirect every 404 to the home page" (eek!), the category parent, or your shopping cart.

All too often I hear SEOs recommending that every 404'd URL be redirected somewhere - anywhere - just so long as "the juice is captured." Nonsense.

I think people tend to freak out about 404s because they sense that Google will negatively score a site if they exist. Or, they worry that link equity is lost when quality links are pointing to a 404 page (a valid concern, which I'll elaborate on below).

Generally speaking (though there are exceptions), 404 pages are a very normal part of running a website. There's no reason to fret over them and there's certainly no proof (at least that we've seen) of Google or Bing penalizing a site for 404 pages.

When 404s Can Be "Bad"

There are cases, however, where a site could be scored negatively as a result of an abnormally high occurrence of 404 pages. Does your site have 10,000 unique pages but 45,000 404s this month? That could be an indicator of a bigger problem with crawling and technical SEO. It's also probably not a very good user experience, which can show up as poor engagement and falling traffic, and which a search engine may in turn reflect by lowering the visibility of your URLs in search results.

A high occurrence of 404s - when they are spiking and continuous - is not a good thing. But forget about SEO for a moment. Is that a good thing for users? Obviously not, and remember: search engines follow users.

Another potentially negative consequence of 404 pages is when a URL has valuable links. I'm not so rabid about "link equity" that I'm going to recommend every URL with links needs to be redirected. If there's not a good page match for a 404, don't redirect it. That said, if the links are precious and difficult to secure, think about contacting the site and having them update their link, creating a new piece of content that's relevant for the existing link, or finding a relevant page that still ensures a good user experience and 301ing the old URL to it.

Last but not least, 404s can be bad news when they act like a 200. A soft 404 error page is not a good thing, because search engines will continue to index these. Ensure your server is configured to return the proper status code for each type of page.
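How you enforce that depends entirely on your stack. As one illustration only (assuming a Node/Express app, which is not something this post specifies), the catch-all handler should send a real 404 status along with the friendly page:

```javascript
// Express sketch (assumed stack): the last middleware catches unmatched URLs and
// returns an actual 404 status code instead of a 200 with a "not found" template.
app.use(function (req, res) {
  res.status(404).send('Sorry, that page does not exist.');
});
```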

There's an interesting technical SEO problem with regards to inventory and expiring products. But I'll save that for another day.

Best Practices for Redirecting 404 Pages

Here is a quick list of best practices for redirecting 404 pages:

  • 404s should not always be redirected.
  • 404s should not be redirected globally to the home page.
  • 404s should only be redirected to a category or parent page if that's the most relevant user experience available.
  • It's okay to serve a 404 when the page doesn't exist anymore (crazy, I know).
  • If you have valuable links pointing to 404 pages, use one of the tactics outlined in the section above.
  • Don't panic. 404s are normal.

Optimizing Social Link Snippets for Facebook, Twitter & Google+

So, you’ve put time and effort into publishing quality content across your social media profiles, or maybe you just published the content to your site and want users to share it. Either way, Facebook, Google+ and Twitter can help make your content more shareable with rich snippets of content that can be shared across platforms. These snippets are platform-specific and include detailed information about the content being shared, displayed in a user-friendly way. For purposes of this blog post, we are going to focus strictly on sharing links to content on your site. However, Facebook, Twitter and Google+ all offer similar types of snippets for additional content or media such as images or videos.

Facebook Best Practices

Facebook designs a social snippet for a link that includes 4 pieces:

  • Link image
  • Link title
  • Link domain
  • Link description

The link snippet is in addition to your profile image thumbnail, your page title, and any content you wish to publish in addition to the link. When sharing a link, your post content can include an introduction to the content or commentary about that content.

Facebook determines what information to pull by looking for schema built into the code of the page being linked to. Facebook uses Open Graph Schema to let content creators control what gets pulled into these snippets. Ideally, there are 5 Open Graph tags for pages shared as links (a combined example follows the list):

  • “og:title” will be the title that Facebook populates in the snippet, which doubles as anchor text for the URL that is being shared.
  • “og:type” will be “website” for content shared as a link.
  • “og:image” lets you specify the URL of the image you wish Facebook to use in the link’s rich snippet. Facebook recommends a “thumbnail size” image of at least 120x90 pixels. If an image is too large, Facebook will shrink it to fit.
  • “og:description” will be the description that will be displayed in the snippet.
  • “og:url” is the canonical URL for the content being shared.
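Put together in the page's head, those tags look something like this (values are placeholders):

```html
<!-- Illustrative Open Graph tags; values are placeholders -->
<meta property="og:title" content="Your Article Title" />
<meta property="og:type" content="website" />
<meta property="og:image" content="http://example.com/images/thumb-120x90.png" />
<meta property="og:description" content="A short description Facebook can show in the link snippet." />
<meta property="og:url" content="http://example.com/your-article/" />
```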

It is important to keep in mind that Facebook does not have hard character limits for titles and descriptions. However, the snippet will include a hard limit of 5 lines of text including the link title, the link domain, and the link description. As the domain almost always takes only one line, in addition to the example above, you can choose to have Facebook display 1 line of title, 1 line of domain, and 3 lines of description (which will buy you about 150 characters of description), or 3 lines of title, 1 line of domain, and 1 line of description (where Facebook will limit your title to 100 characters). You control the title and description ratio by the number of characters used in the Open Graph title tag. The shorter the title, the more room for a description.

Facebook Alternatives

What happens if you don’t implement Open Graph Schema? Facebook will search your metadata and try to put together its best guess of what should be there, which may not match what you think should appear in your snippet. Facebook will choose an image for you, which may be an image you don’t want to be associated with your content.

Twitter Best Practices

Twitter has recently released rich snippets called “Twitter cards”. Twitter’s “summary card” is the rich snippet generated for links shared on the platform. While cards are still being rolled out, Twitter is currently asking content creators to apply for the program after implementing the schema. Whether or not you apply now, we still recommend getting in the habit of using this schema, as cards will continue to roll out in the near future. Cards can be accessed by a link generated at the bottom of a Tweet:

Twitter’s summary card includes 3 pieces:

  • Link image
  • Link title
  • Link description

A summary card also includes information that is not link-specific including your profile image thumbnail, your Twitter name and handle, and any content you wish to publish in addition to the link. It also displays dynamic Tweet information below the link content including your Twitter name, your Twitter handle, number of retweets, number of favorites, and thumbnails for users who have retweeted or favorited the Tweet.

Twitter pulls this information from the page using a similar schema markup we’re calling Twitter Schema. Ideally, there are 5 Twitter Schema tags for Twitter to pull information for the summary card (a combined example follows the list):

  • “twitter:card” tells Twitter what type of card to generate, in this case “summary”. It will default to summary if not included in the metadata.
  • “twitter:url” is the canonical URL for the content being shared.
  • “twitter:title” is the title of your content as it should appear on the card, which doubles as anchor text for the URL being shared. Twitter will only display the first 70 characters.
  • “twitter:description” is the description of the content, with a maximum of 200 characters.
  • “twitter:image” lets you specify the URL of the image you wish Twitter to use in the card. Twitter recommends an image greater than 60x60 and smaller than 120x120 pixels. If it is greater than 120x120, it will be shrunk to fit, if it is smaller than 60x60, it will not be shown.
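A combined example of the tags (placeholder values):

```html
<!-- Illustrative Twitter card tags; values are placeholders -->
<meta name="twitter:card" content="summary" />
<meta name="twitter:url" content="http://example.com/your-article/" />
<meta name="twitter:title" content="Your Article Title" />
<meta name="twitter:description" content="A description of up to 200 characters for the summary card." />
<meta name="twitter:image" content="http://example.com/images/thumb-100x100.png" />
```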

Twitter Alternatives

What happens if you don’t implement Twitter Schema? If Twitter wants to create a card with your content and it does not find Twitter Schema, it will fall back on Open Graph Schema. It is important to keep in mind that Twitter and Facebook support different size images and different character limits for title and description.

Google+ Best Practices

Google+ designs a social snippet for a link that includes 3 pieces:

  • Link image
  • Link title
  • Link description

The link snippet is in addition to your profile image thumbnail, your profile title, and any content you wish to publish in addition to the link. Again, the post content is a great place for a summary of the content.

Google+ finds this information in schema tags, just like Facebook and Twitter, but Google+ relies on markup from Schema.org. Google+ will use the name, image, and description properties of any Schema.org type. Just make sure that you have tags that are optimized for Google+. A Google+ link snippet will display a title of about 140 characters, and it appears that there is no character limit for link description. However, we recommend keeping it around 185 characters to ensure the information is user friendly. Images used in Google+ snippets should be 150x150 pixels.
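As an illustration, marking up the shared page with Schema.org microdata could look like this (the Article type and values are placeholders; any Schema.org type with name, image, and description properties will do):

```html
<!-- Illustrative Schema.org microdata that Google+ can read for its snippet -->
<div itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="name">Your Article Title</h1>
  <img itemprop="image" src="http://example.com/images/thumb-150x150.png" alt="Thumbnail" />
  <p itemprop="description">A description of roughly 185 characters for the Google+ link snippet.</p>
</div>
```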

Google Alternatives

If Google+ does not find markup from Schema.org, it will pull data from Open Graph tags you may have set up for Facebook. Again, keep in mind that Google+ allows for a longer title, longer description, and recommends a larger image. If your image is optimized for Facebook or Twitter, it may not be displayed correctly or at all on Google+.

News agencies such as ABC News or the Weather Channel offer examples of how utilizing schema for social can pay off not just on social platforms, but also by providing images and text for Google’s SERP when users search for current events.

We recommend implementing all three types of schema (Open Graph, Twitter, and Schema.org) to ensure that you are taking advantage of the different opportunities across social platforms and to ensure that your snippets show the way you want them to across social platforms. It may be easier to rely on Open Graph for all three platforms, but just be sure you understand how each platform will use those tags individually to ensure that your content is being represented the way it should be.

Ecommerce SEO: Product Variations, Colors, and Sizes

I recently asked twitter for some writing ideas, and Will Scott replied with the following:

@audette I’d love to see you unpack your use of canonical for product color / size pages 🙂 — Will Scott (@w2scott) July 30, 2012

This can be a tricky scenario to get right.

This article will outline the appropriate SEO method for handling product versions, colors, sizes and variations on ecommerce sites. You might also find this video on product variations and SEO useful.

SEO for Product Variations: Questions to Ask

Before starting out, ask yourself the following questions to better understand the problem:

  • Are product variations competing in the SERPs, causing duplication and/or cannibalization?
  • Are product variations causing crawl problems or inefficiencies?
  • Has the "wrong" product type been selected as the canonical?
  • Are there business goals tied to long-tail performance where color variants matter?
  • Are color variants, sizes, and models of particular importance (or not) in your industry/niche?

It's common for product variations to cause problems in SEO with duplication, cannibalization, and crawl inefficiencies. This happens most frequently because of complicated URLs generated with many parameters and IDs. That said, it's always a good exercise to do a little digging in analytics. Let the data inform your overall strategy.

There are several ways to handle this, and I'll run through them below.

Tools to Use

The web has evolved to the point where often product variations (such as color) are best presented in the interface. This has the advantage of maintaining a canonical URL by default.

While this is true, there are still a myriad of common problems and poor implementations out there. Your best tool is going to be the rel canonical tag. Ideally, product variations such as color and size can be annotated with rel canonical tags to a 'parent' URL that is the ranking candidate for that product set.
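Mechanically, that just means each variation URL carries a canonical link element pointing at the parent product URL (the URLs below are made up):

```html
<!-- On a hypothetical variant URL such as http://example.com/shoes/trail-runner?color=green -->
<link rel="canonical" href="http://example.com/shoes/trail-runner" />
```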

Oftentimes certain types of product variations - especially color - have associated search volume. Users are looking for 'mens green adidas originals', and pages relevant to that query (possibly on a dedicated URL) will perform best. Some product categories show strong search trends towards color, or even size, while others do not. Investigate.

The Ideal Approach

The best way today to handle this problem is via the interface. Product colors can be displayed via a hover or click event, with the URL remaining unchanged. Then, a drop-down selector can be displayed for visitors to choose which item to place in their cart.

This preserves a single, canonical URL at all times for ranking purposes. However, the opportunity to rank for color-specific queries may be lost (or at least sacrificed).

A great example of this technique implemented well is on REI.

Excellent implementation of color/size selection on REI.com

Note how color selection doesn't impact the URL, which makes it easy to view different colors of the product without re-loading the page. This is a solid approach, with the aforementioned downside: REI may not be eligible to rank for color-specific queries of this product. That may (or may not) be a big deal to your business, and that's why you need to check analytics before deciding which way to go.

Other Approaches

Another method for attacking this problem can be demonstrated by OnlineShoes. Color-specific URLs are created for variations, as in the example below.

OnlineShoes.com creates URL-specific color variations

Each of the URLs for a specific color variation has a self-referencing rel canonical tag.

While this allows OnlineShoes to compete for color-specific variations in SEO, the downside is that these pages are not highly differentiated and are therefore likely to compete with each other. Additionally, there may not be search volume for every type of product, leading to a situation where URL duplication and inflation is a concern.

Zappos has a different approach. Each color variation is given its own URL, as was the case with OnlineShoes...

Zappos uses unique URLs for colors.

However, each color-specific URL is "collapsed" to a parent, canonical URL (somewhat defeating the purpose of having the specific URLs).

Color variants are "collapsed" to a canonical URL.

There is an elegance to this approach. However, I would re-visit it today in light of changes in the SEO world.

What About Sizes, Models, etc?

We typically view sizes, models, and different types of the same product to be of secondary concern in SEO. Therefore, these are best displayed as information on the page and in drop-down selectors for "add to cart" functionality. If unique URLs are being generated for many types of product variations (as is often the case with faceted or guided navigations), use rel canonical annotations along with parameter handling tools (depending on the specific scenario) to improve crawling and canonical signals.

Conclusions

First, go back and read the beginning of this article: All decisions need to be based on an analysis of the specific situation, the site's strengths, weaknesses, and obstacles, as well as your primary business objectives and your SEO channel's goals.

Now that we have that out of the way, read on.

  1. Unless your analytics and business goals preclude it, place color, size and product type variations in the interface (ideally appearing in source code as plain text, not just in images or with javascript / ajax). Don't create unique URLs and pages for these.
  2. If you already have unique URLs for color variants, add rel canonical annotations to a primary ranking URL on all variants. If the colors are specified via parameters in the URL query string, then make use of Google and Bing webmaster consoles to have these ignored in the crawl as an additional step.
  3. Exception to the above advice: If you already have unique URLs for color variants, and the URLs are clean and devoid of excessive parameters and IDs, it is perfectly acceptable to leave the strategy in place. In these cases, add self-referencing rel canonical tags to the color variants to ensure they remain canonical.

There are certainly unique scenarios where our recommendations would change. Let me know in the comments if you have specific problems to address.

Thanks for reading!

6 Differences Between Facebook’s Promoted Posts and Page Post Ads

So, if you haven’t realized it yet, Facebook updates its ad types and new products almost constantly. The newest product they've launched is the “Promoted Post.”

Let's take a look at Promoted Posts and try to sort out some of the confusion one might have around the differences between them and Page Post Ads, which are similar.  After all, both are paid advertisements generated by posts published on your Facebook page.  And, of course, a prerequisite for either type of ad is frequent posts to your wall with quality content.

What is a “Promoted Post”?

Promoted Posts provide the ability to push out content that you are posting to your Facebook wall to more users than it would naturally reach. Facebook reports that posts on your wall show organically to between 10% and 20% of your current fans’ news feeds. This is due to EdgeRank, the algorithmic filtering of stories that show in the news feed and the limited number of stories that Facebook shows on the page. Facebook recommends Promoted Posts for “important” high quality posts that need to reach as many fans as possible.

In order for your page to have the option of doing Promoted Posts, you need to have more than 400 fans and less than 100,000 fans. In order for you to promote it, your post needs to be less than 3 days old. You can also choose to promote the post as you are creating it. The option to promote your post is visible at the bottom of eligible posts, if you are an eligible page.

Once you decide to promote a post, you are given a handful of budget options from which to choose, along with estimates of the reach of the post based on that budget. This is a lifetime budget that the promoted post will have to use over the 3 day lifecycle the Promoted Posts currently have.

These budget options and reach estimates are calculated from a number of different factors, including: how large your fan base is, the relative cost per thousand impressions (CPM) for your fans, the probability of how many of those fans will log in to Facebook over the next 3 days, and other ads you are currently running for your Facebook page.

Once you’ve decided to promote a post, it will show as a sponsored post in your fans’ news feeds. Currently, it will show both on desktops and mobile apps. If a user interacts with your sponsored post, that story is eligible to be shown in the news feeds of friends of the user who took the action, further increasing your reach.

While your post will reach users who would not have organically seen it, Facebook will not show your sponsored post to fans that have “hidden” your posts from their news feed. Promoted Posts will also not show up in the ticker.

You can also access different performance metrics and analytics through the post on your wall, or through the Ads Manager. Every time you choose to promote a post, Facebook automatically generates an ad and places it in your Ads Manager account, thus simplifying the process of creating an ad on the front end. As such, from Ads Manager, you can also make changes to the Promoted Post, just like you would with any other ad.

Can you remind me what a Page Post Ad is again?

Page Post Ads are ad units that are automatically generated by Facebook using content that you have already published to your Fan page. When creating the ad, you tell Facebook which post you would like to feature and Facebook creates the content of the ad.

So, what is the difference between a Promoted Post and a Page Post Ad?

While the underlying goal is the same between the two types of ads – reaching more people with content you have published to your wall – they are two different animals. Both are automatically generated by using content that you have published to your wall, but there are a few traits that set them apart:

  1. Creation Flow: Page Post Ads are currently created by using either the Facebook Ads Manager or Power Editor. Promoted Posts are created on your Facebook wall by interacting with the post that you want to feature. If you have ever created an ad with the Facebook Ads Manager, you know that it is more difficult than the few steps I outlined above to create a Promoted Post. While it may be more complicated to create a Page Post Ad, you are given more options.
  2. Who you Reach: Promoted Posts are meant to help you push your content to more of your fans who would not organically see your content in the news feed. Page Post Ads, however, with their more complicated creation flow, allow you to target users based on demographics, location, interests, and Facebook connections. If you want, you can target Page Post Ads to friends of your fans or just users who aren’t already fans of yours.
  3. Ad Location: Once you create them, Page Post Ads take a spot on that well-known right rail. Promoted Posts are seen in the news feed as if the user was seeing the post organically. The only distinction that they are seeing a story that is not organic is the notation in the bottom right corner that the story is sponsored.
  4. Post Age Limitation: Currently you are only allowed to feature a post as a Promoted Post if it is less than 3 days old. While we don’t suggest you create a Page Post Ad with content you generated months ago, you can feature any content that you have published to your page with no time limitation.
  5. Ad Lifecycle: Because of the recency metric that feeds into Facebook’s EdgeRank, Promoted Posts are only able to run for a set time frame of 3 days. While you can pause and restart the ad as much as you want, the ad will not receive impressions after it has been live for 3 days. Similar to the age limitation, it might not be the best idea to show Page Post Ads for months at a time, but you do not have a restriction for how long your ad can show.
  6. No Bids: Promoted Posts only give you a budget and an estimated reach. The “budget” is calculated by looking at the average CPM for your fan base, but you do not set a bid for your ad. With Page Post Ads, you still have the option, based on what objective you choose through the Ads Manager, to pay on a cost per click (CPC) or CPM basis and set what bid you are willing to pay per click or thousand impressions.

What does this mean to my current Facebook advertising?

Promoted Posts are essentially just a new ad type. What really differentiates them from other ad types is the new creation flow and the fact that they are located in the news feed and not the right rail. While this doesn’t have an immediate impact on the other ad types Facebook currently offers, it brings up some interesting questions about where Facebook is moving with ads in the future.

Perhaps moving forward, Facebook will begin to streamline ad creation for certain types of ads, limiting the customization in the process. We also know that users pay more attention to the news feed on Facebook than the right rail, so it is possible that Facebook is considering placing more ads in the news feed when developing new ad types. No matter where Facebook goes from here, it is clear that they are always working to optimize what types of ads they offer and how they are created and managed.

iframe Test: Do Search Engines Follow Links in iframes?

Consider this scenario: you’re auditing a site for SEO issues, and come across wide use of iframe elements on key pages. You might think to yourself, “hmm, something about these just doesn’t smell right. But crawlers can’t follow links within iframe elements, so they’re probably just a brick wall.”

Still, I’m going to recommend they use caution here.

 example of iframe from PRWeb

That sequence of thinking is completely logical. The use of iframes is widespread and there is nothing inherently wrong with the practice; but they just don’t feel quite right from an SEO point of view, especially when there are lots of links and content in the iframe. One good example of this is with PRWeb.com, a site our team audited several years back.

Our recommendation at the time explained that “… press release pages are seen as linking to the customer URLs within the iframe and thus could be negatively impacting rankings for the individual releases. For any given iframe, there may be 20-100 links featured (or more, depending on the content of the iframe).”

Those words were written nearly 3 years ago. So you can imagine our intrigue when we read about a test that Michael Martinez had conducted over on SERoundtable.

Michael found that links within iframe elements were indeed crawled, also raising the question of whether or not equity passed through them (anchor text and PageRank). This got us sufficiently curious, so we conducted a test of our own.

The Setup: Two tests were conducted: one to determine if search engines grab information from the iframe tag itself, and the second to determine if the search engines will crawl the source of an iframe; follow links from that source URL and index the page with the attributed anchor text from the links found on the source URL.  Let’s see how it all went down…

Test 1

Page A – A trusted, frequently updated, and regularly crawled blog. This is the page where we added the HTML iframe tag: http://shoedigest.com/rants-raves/kushy-flats-to-go/

Page B – A small traffic, long established website page that has high-ranking terms for its subject matter.

Anchor text – “Sammy loves hiking south sister”

HTML Snippet on Page A

HTML on Page B – Legacy blog post about hiking south sister.
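The snippet itself was shown as a screenshot. Reconstructed from the description, the markup added to Page A looked roughly like this; the iframe src and the Page B URL are placeholders, since the post doesn't show them:

```html
<!-- Rough reconstruction of the Page A markup; URLs are placeholders -->
<iframe src="http://example.com/embedded-widget/">
  <a href="http://PAGE-B-URL/hiking-south-sister/">Sammy loves hiking south sister</a>
</iframe>
```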

Prior to implementation, we tested the anchor text phrase against three search engines: Google, Bing and Blekko.  No search engine ranked Page B for the phrase “Sammy loves hiking south sister” in the first 50 results.

Additionally, we used a small traffic site that has been up for at least 7 years and hasn’t had changes in the last 4 years.  We took a page that ranks relatively high for a particular local query.  We created anchor text that doesn’t actually show up in the HTML source of Page B.  This should make the phrase unique to the anchor text.

Our results were very conclusive.  The search engines easily picked up on this link and the anchor text ranked on Google, Bing and Blekko for Page B.  In Google, Page A (the blog with the iframe) also ranked for the phrase on the first page for the query.  It picked up on the new link within three days of posting the iframe information.

Google

iframe test serp result

Bing

Bing iframe test serp

Blekko

Blekko iframe test serp results

Google Webmaster Tools was able to pick up on the link:

gwt screenshot

The test basically showed that the search engines are able to crawl the text within the iframe tag, but this wasn't exactly what we set out to check.  It lets us know that they do crawl the HTML within the iframe tags.  Not only do they crawl the HTML within the iframe tag, they attribute the anchor text to the proper page.  However, these are not surprising results.

Test 2

The second test was to determine, through a steel-tight iframe call with no HTML text between the iframe tags, if the engines will crawl the iframe source links.  And, would a link from the iframe source be followed and pass link equity to the link's landing page?

Page A – the same frequently updated blog

http://www.shoedigest.com/rants-raves/bitten-deercows-love-affair-western-boots/

HTML Snippet on Page A

Page B – the same small traffic, long established website with a page newly created for this test (Page B).  It has no prior traffic nor existed before we posted the iframe HTML on Page A.  Prior to the test, this page did not exist on the server and would have returned a 404 page: http://www.gimpslice.com/touch.html

HTML Snippet on Page B

check out Cara rides 100 days at Bachelor

Anchor text – “Cara rides 100 days at bachelor”

Page C – The target page of the anchor text on Page B: http://www.bettylife.com/Contributors/cara.html
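The two snippets were shown as screenshots. Reconstructed from the description, they looked roughly like this:

```html
<!-- Page A (the shoedigest.com post): a bare iframe call with no fallback HTML between the tags -->
<iframe src="http://www.gimpslice.com/touch.html"></iframe>

<!-- Page B (http://www.gimpslice.com/touch.html): the only reference to Page C -->
check out <a href="http://www.bettylife.com/Contributors/cara.html">Cara rides 100 days at bachelor</a>
```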

Prior to beginning the test, we checked the anchor text phrase “Cara rides 100 days at bachelor” which does not show up anywhere in the source for Page B.  No search engine showed this phrase ranking in the first 50 results.

Results: After 14 days we could see Page B now showing up in the index for Google, but not Bing or Blekko.

Google iframe test results serp

The other search engines didn't have the page indexed:

Bing

Bing iframe test result

Blekko

Blekko iframe test result

Google did show the causal link between the anchor text, Page B and Page C. Remember, the only way that Page C could have been associated with the anchor text was through Page B which is only referenced through an iframe src= call from Page A.

Google passes anchor text through iframe

Furthermore, Google did not show the iframe source URL (Page B) unless the anchor text was put in quotes.

Google screenshot

The other search engines wouldn’t show the phrase even when looking for an exact match:

Bing

Bing screenshot

Blekko

Blekko screenshot

Conclusions: For Google, it does indeed appear that it is following the source calls in iframes.  Because of the links in the second test we can draw the conclusion that it crawls the source, follows links and passes link equity to the final destination page. However, we are not able to draw conclusions as to how much equity is passed, for example whether it is the full amount (similar to a 301) or only a portion.

This indeed backs up Michael’s test, but we would like to hear from you about your own tests and experiences!

Special thanks goes out to Ben Goodsell for help writing this post!

24 Top Paid Search Metrics Explained

We typically focus on more advanced marketing tactics and deeper analysis of the advertising business, but every now and then it's nice to take a step back and address more fundamental topics.  In our discussions of paid search, we're often quick to throw around abbreviations and terminology that may not be instantly discernible to all audiences, so we thought we'd offer a primer on paid search metrics.

While much of the jargon of paid search overlaps with that of other marketing channels and business in general, there are a few items here that are PPC-specific.  For those who are new to or have just a casual connection to any of these fields, there may be only a few items here that are familiar.  To those who are well acquainted with these metrics, hopefully we can offer a few tips here and there that will make this worth your while.

Without further ado...

Traffic Metrics:

Impression - An impression occurs when your paid search ad appears on the search engine results page (SERP).  Impression data is provided by the engines via their online interface or through API reports.

Click - An easy one: clicks are when the user clicks on your ad and visits your landing page.  Again, this data is provided by the engines, but it can also be determined by on-site analytics, often with greater detail about the user.

Click-through Rate (CTR) -  CTR is the ratio of the number of clicks your ad has to the number of impressions it received (Clicks/Impressions).  Click-through rate is strongly influenced by the position of your ad on the SERP and your company's name recognition, but compelling ad copy also provides a boost.  High level CTR trends can be deceiving so take caution in your analysis.

Average Position - This is the average position where your ad appeared on the SERP with the top position on the page being 1.  Because of the nature of the auction and increasingly personalized results pages, it can be difficult to interpret the average position metric in isolation.  For example, an ad with 10 impressions in position 1 and 1 impression in position 10 will have an average position of 1.8.  Also, it's possible that increasing an ad's bid will end up lowering its average position.  Google has recently provided a new data segmentation option that provides a bit more insight on ad position.

Cost Per Click (CPC) - CPC is the average amount the advertiser pays for a click.  This should be distinguished from the advertiser's bid or max CPC, as actual CPCs will typically come in lower than the bid due to the nature of the PPC ad auction.

Marginal/Incremental Cost Per Click - Since CPC is just an average and there are diminishing returns to additional ad spend, the advertiser may find they are spending a great deal more per click than their average CPC for their last dollars spent.  Viewing this marginal CPC in data from a tool like Google's Bid Simulator can reveal opportunities to shift spend across one's keyword portfolio.

Cost Per Mille (CPM) - CPM is the cost per thousand ad impressions.  Though not commonly associated with paid search, some advertisers may wish to compare their effective CPM for PPC to other channels where pricing is determined by impressions rather than click costs.

Impression Share (IS) - Impression Share is the ratio of the impressions your ad received to the number of possible impressions it could have received.  Budget restrictions and low ad rank will decrease your impression share, but having a high IS shouldn't be the goal in and of itself.  If your ads are of high quality, budget is not restricting their display and you are bidding what you can afford, IS speaks more to the level of competition you face than anything else.

Conversion Metrics:

Revenue/Sales - Used interchangeably, the terms revenue and sales can apply to the value of orders placed with or without discounts and shipping & handling included.

Margin - Margin, or Gross Profit, is expressed in dollar terms and is defined as (Revenue - Cost of Goods Sold).  Running a paid search program based on Margin figures can provide a more direct impact on profitability by taking into account variable margin percentages across products and product lines.

Leads - For some advertisers, the goal of an advertising campaign is to reach qualified individuals to pursue for a long-term commitment (insurance, bank account, etc.) or purchase farther down the road (B2B, high ticket items).  In these cases, rather than Orders, the key conversion measure is a Lead.  This can include an email signup, application completion or request for information among others.

Conversion Rate (CR) - CR is the ratio of the number of Orders or Leads to the number of ad Clicks.  Aggregate conversion rates in paid search depend heavily on the competitiveness of your product offering as well as your ability to determine value across your keyword portfolio.  Conversion rates do not vary significantly by ad position.

Revenue Per Click (RPC) - Or Sales Per Click (SPC), RPC is the average amount of revenue generated per click.  For advertisers looking to hit a revenue-based efficiency goal, predicting RPC accurately will determine what CPCs can be afforded and how to set bids.

Revenue Per Impression (RPI) - A useful measure for copy tests, Revenue Per Impression accounts for both the CTR and RPC differences one might see between two copy versions.

Average Order Value (AOV) - AOV is Revenue/Orders. It can be useful for determining how promotions should be set up to incentivize shoppers to spend more than average, and as a contrast to conversion rates in analyses (are shoppers purchasing at the same rates, but spending more or less per order over time or seasonally?).

Efficiency Metrics:

Ad Costs to Sales (A/S) - Ad Costs/Sales is a ratio used as an efficiency target for many paid search programs.  It is often used as a proxy for more direct measures of profitability, but it can be useful for those seeking to maximize top-line revenue over bottom line considerations.

Return on Ad Spend (ROAS) - Most commonly, ROAS is simply the inverse of A/S or Revenue/Ad Costs.

Return on Investment (ROI) - In the paid search world, ROI is frequently used synonymously with ROAS, but it is best tied as directly as possible with profit measures.  A typical ROI calculation is: (Gross Margin - Ad Costs - Variable Expense)/Ad Costs

Ad Costs to Margin (A/M) - The Ad Costs/Margin ratio is another common efficiency target metric that provides a more direct view of profitability than A/S.

Cost Per Lead/Order (CPL/CPO) - Leads may not ultimately pay off for months or even years, so Lead generating advertisers need an efficiency metric they can steer by today.  If the advertiser can estimate the value of a Lead they can aim for a Cost/Lead target that meets their desired profitability goals.
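For quick reference, the efficiency ratios above boil down to:

$$
A/S = \frac{\text{Ad Costs}}{\text{Sales}}, \qquad
\text{ROAS} = \frac{\text{Revenue}}{\text{Ad Costs}}, \qquad
A/M = \frac{\text{Ad Costs}}{\text{Margin}}, \qquad
\text{CPL} = \frac{\text{Ad Costs}}{\text{Leads}}
$$
$$
\text{ROI} = \frac{\text{Gross Margin} - \text{Ad Costs} - \text{Variable Expense}}{\text{Ad Costs}}
$$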

Other Considerations:

Quality Score (QS) - QS is a ranking the engines assign to your ad based on their view of its quality.  It is used along with your bid to determine where your ad will rank on the SERP.  Though all details of the QS assessment are not known, it is largely a function of historical CTR among other relevancy factors.  A view of QS is available for keywords via the engine interfaces and API.

Cost of Goods Sold (COGS) - COGS is the direct cost to the advertiser for the products they are selling.  Used for determining Margin, it does not include variable costs for labor, distribution, etc.

Revenue Per Search (RPS) - RPS is the amount the engines make for each search and reflects how well they are monetizing their traffic.  A higher RPS is not necessarily beneficial to advertisers, but a relatively low RPS, as seen since the Search Alliance, suggests valuable traffic is not being fully reached.

Lifetime Value (LTV) - Lifetime Value estimates the full value of a customer by forecasting future revenues they will generate.  Incorporating LTV into efficiency calculations allows the advertiser to be more aggressive with their bidding and reach a greater audience.

Well, there you have it.  I hope this is helpful for those new to paid search or those that just need a refresher.  Did we miss any important ones or personal favorites?

Detecting Significant Changes In Your Data

For statisticians, significance is an essential but often routine concept. For those who don’t remember the details of college statistics courses, significance is a nebulous concept that lends magical credence to whatever data it describes. Sometimes you make a change in your paid search program, watch the data come in, and want to claim that numbers are improving because of your initiative.

How can you support this claim?  Can you discredit the possibility that the apparent improvement is just noise? How can you apply that authoritative label of “significant”?

Here I’d like to walk you through a basic test of significance that you can use to de-mystify changes in your paid search data.

If you’d like to skip the math, click here.

Let’s start with a situational example… say you’ve added Google Site Links to your brand ads and you want to show that brand click-through rate (CTR) has improved as a result.

  1. First, you need to know what value brand CTR is potentially improving from.  Let’s call this value mu (pronounced myoo), and you can choose it in a variety of ways: the average or median CTR over the past month, the average or median CTR from this time of year last year, etc. It should really be whatever value you believe CTR to truly center around.
  2. Next, you need data points.That is, you need several days of CTR data since the Site Links have been running. How many days is up to you. Generally, more is better, but I’ll touch on that later. The number of days you have is n. Take the average of the CTRs from those days; this is called xbar. Lastly, take the standard deviation (excel function stdev) of these CTRs and call it s.
  3. Now we can compute a t-score, and with it, the probability that the change in CTR you’re seeing is or isn’t attributable to chance. Set t = |xbar - mu| / (s / squareroot(n)). Then use the function TDIST in Excel, and for the arguments, plug in t, n-1, and 1. The number that this function returns is the probability that the change in CTR is simply due to chance, aka noise. If this probability is very small, then we say CTR has changed significantly. (A quick Python version of the same calculation is sketched below.)
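
For anyone who would rather script this than run it in Excel, here is a minimal Python sketch of the same calculation. The helper name one_tailed_p is mine, and it leans on scipy for the t-distribution.

    from math import sqrt
    from statistics import mean, stdev
    from scipy import stats

    def one_tailed_p(data, mu):
        """Chance that data's shift away from mu is just noise
        (one-sample, one-tailed t-test, i.e. Excel's TDIST(t, n-1, 1))."""
        n = len(data)
        xbar = mean(data)               # average of the post-change observations
        s = stdev(data)                 # sample standard deviation (Excel's STDEV)
        t = abs(xbar - mu) / (s / sqrt(n))
        return stats.t.sf(t, n - 1)     # upper-tail probability, n-1 degrees of freedom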

Enough Math! Is The Change In My Data Significant?

I’ve prepared an Excel spreadsheet that handles the arithmetic. In this model, change the gray-shaded cells to reflect your data. Enter the data that you think has fundamentally changed in column C. Only include data points since the change began. Then, in cell G2, enter the value from which you believe the data to have changed. That is, the average value of the data before the change.

The value p, produced in cell G7, is the probability that the change you’re seeing is due only to chance, and thus meaningless. Typically, a p-level must be below 5% to be considered significant. (If you want to be super, super sure, you can use 1% or 0.1% instead.) In other words, if your p-value is 5% or less, you can confidently say that the change in your data is real and due to something other than statistical noise. It’s a pretty safe bet that whatever initiative you took – whether it was switching landing pages, altering ad copy, or refining your bidding – was the catalyst for the improvement.

Allow me to fill in the spreadsheet with an example. For an imaginary online retailer, brand CTR hovers around 4.4%, so I fill in cell G2 with the value 4.4. The retailer enables Google Site Links, and CTRs for the 3 days afterward are 4.3, 5.2, and 5. So I enter those three data points into column C. And voila… the p-level comes back as 12.66%. This says that there is a 12.66% chance that the rise in CTR was due only to noise.
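
If you run the same numbers through the Python sketch above, you get essentially the same answer (tiny rounding differences aside):

    p = one_tailed_p([4.3, 5.2, 5.0], mu=4.4)
    print("p = %.2f%%" % (p * 100))     # ~12.66%, matching the spreadsheet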

Not significant. Sorry, click-through rates haven’t really increased – or at least, we can't be very confident that the observed change is anything more than random noise.

But… three days is not much data. As smart analysts, we are cautious when examining trends over only a few days, and this significance test incorporates such wisdom. As the number of data points (n) you use increases, p-levels fall. For example, if all the numbers in the above example were the same except that you used 7 days instead of 3 (so n=7), the corresponding probability drops to 2.6%. In this instance, it’s very unlikely (2.6% unlikely) that the increase in CTR was due to noise, so here you can rather confidently say, “Yes, CTR has increased, and it wasn’t due to chance. It was probably due to the site links.”
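
You can check that seven-day figure directly from the summary statistics, reusing the example's mean and standard deviation (again just a sketch, with the hypothetical CTR numbers from above):

    from math import sqrt
    from scipy import stats

    xbar, mu, s, n = 4.8333, 4.4, 0.4726, 7       # same mean and spread as before, but 7 days
    t = abs(xbar - mu) / (s / sqrt(n))
    print(round(stats.t.sf(t, n - 1) * 100, 1))   # ~2.6%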

GreaseMonkey: Hacking Web Apps So They Work The Way YOU Want

GreaseMonkey is a Firefox extension that lets you run arbitrary JavaScript code against selected web pages.

What this means is that GreaseMonkey lets you change how other people's web pages look and how they function (but just in your own browser, of course).

I last played with GreaseMonkey (GM) about four years ago. Then, I didn't find the idea compelling. Today, with ever more applications going online, GM is worth a serious look.

GM can increase productivity by making web apps easier to use.

Even more interesting, GM also lets you add functionality to web pages. Here's a small example. I use Delicious to bookmark sites. I use Google Reader to read blogs, and I tag interesting posts with Reader's "gold star" button. Via a few lines of GM code, Google Reader now sends my starred items to Delicious automatically. This improvement keeps all my interesting links in one place.

GM works on intranets, too.

Suppose you're an online retailer and your merchants use an intranet app to enter product information for your site. Suppose that app has some annoying UI issues, like extra confirmation screens after entering each product ("Are you sure you want to add the following?"). If your vendor or your internal IT folks can't (or won't) change the app, you could use GM to skip the unnecessary page.

Or perhaps your call center staff uses an intranet app for order entry. If they're retyping or pasting telephone data from a phone screen pop into the order entry app on each call, perhaps the phone app could write its data to a local file, which GM could then use to prepopulate fields in the order app.

A GM script could even prepopulate web app fields from database lookups (you'd need to expose the necessary data via some simple RESTful URL, behind the firewall).

Certainly hacky, certainly not 'beautiful' engineering, but GM opens up interesting possibilities.

Here are the GM pros and cons as I see them.

Pros:

  • JavaScript. GM code is just JavaScript code. Any developer familiar with JavaScript and the DOM can write GM scripts.
  • User scripts. If you're seeking a common sense improvement to a popular site, someone probably has already written a GM script to do it. For example, here are popular scripts tweaking Google sites.
  • Cross-site scripting. Unlike the security model of ordinary AJAX, GM code can access the entire web: "Unlike the XMLHttpRequest object, GM_xmlhttpRequest is not restricted to the current domain; it can GET or POST data from any URL" (from DiveIntoGreaseMonkey). This is very powerful.

Cons:

  • Scraping stinks. Fundamentally, GM is screen-scraping. Yuck. If your target site changes their design, your GM script probably croaks.
  • Firefox preferred. GM runs best in Firefox. Some GM scripts run on Chrome, but some do not. (Specifically, Chrome does not support the GM_ functions, including GM_xmlhttpRequest.) Here's info on the "--enablegreasemonkey" flag in Chrome, and here's info on GreaseMetal. I've not used it, but IE has GM4IE.
  • Reality warp. If you forget you have GM turned on, or if you assume GM is on when it isn't, or if you switch to a computer without GM, you can get confused when a familiar web page behaves "strangely."
  • Debugging. Sometimes it is hard to understand why a GM script isn't working. Firebug is essential.
  • Local install. The GM extension and scripts are installed locally, not in the cloud. Installing them on your laptop doesn't put them on your desktop, etc. Script updates need to be maintained on each machine.

And here are some GM links I found useful.


Any readers out there using GreaseMonkey for business purposes?


Correcting the History of Search Engine Optimization

Does the history of search engine optimization (SEO) even matter? I believe it does. In an industry that suffers from a lack of structure and standardization, having a fuzzy history only turns the blinds another notch dimmer. Without a clear frame of reference for where we originated, it is more difficult to chart progress and locate the path to where we're going. SEO as an industry has a lot of growing up to do, surely. It's partly a reflection of the wild west mentality that permeated the early days of the Internet, and partly the result of a lot of geeks with keys to the castle. SEO can make companies a lot of money, and with no real law in effect (only rules as set by search engine guidelines), pretty much anything goes provided you're willing to take the risk. But this post isn't about SEO standards or the white hat vs. black hat debate. This post is about the history of search engine optimization; a history that will now need to be corrected.

The Current History

Who invented the term search engine optimization? I may have a definitive answer - or at least definitive proof that the term was being used prior to the Usenet spam posting that is now referenced as the earliest known web mention in the Wikipedia entry. Wikipedia says the following about search engine optimization:
According to industry analyst Danny Sullivan, the earliest known use of the phrase "search engine optimization" was a spam message posted on Usenet on July 26, 1997.
[Screenshot: Wikipedia references the earliest known reference to SEO]
The Wikipedia entry references a Danny Sullivan comment where he points out a Usenet spam post using the term search engine optimization. Here's a link to the Usenet posting with the term search engine optimization highlighted. Directly below is a screenshot of the post:
[Screenshot: Usenet spam mentioning SEO in July 1997]
Up until now this has been the earliest known record of the term search engine optimization being used online.

Did John Audette Invent the Term Search Engine Optimization?

The Wikipedia page needs to be corrected. John Audette was using the term search engine optimization at least 5 months prior to the earliest reference on Wikipedia. Not only that, it was being offered as a legitimate service, a huge departure from the intentions of the spam message on Usenet. This web page, published on February 15, 1997, proves John Audette was using the term search engine optimization at least 5 months prior to the Wikipedia reference: http://web.archive.org/web/19970801004204/www.mmgco.com/campaign.html
[Screenshot: February 1997 page from MMG on the Wayback Machine]
Multimedia Marketing Group (MMG) was John Audette's online agency, founded in 1995 and the starting place for some of the pre-eminent SEOs in the industry, such as Bill Hunt, Detlev Johnson, Marshall Simmonds, Derrick Wheeler, Jeremy Sanchez, Andre Jensen and Adam Sherk. Bend, Oregon - MMG's location prior to its purchase by Tempus Group in 1999 - is still home to many of the world's best SEO shops and consultants (including AudetteMedia). It truly is the birthplace of SEO.
[Screenshot: Search engine optimization listed as a service from MMG on 2/15/97]
Danny Sullivan visited MMG's Bend offices to train the fledgling team on the fundamental tactics of search engine optimization. John wanted to reach out to the premier expert in the field (Danny) to learn all he could. That was in the fall of 1997, months after MMG began using the term SEO on its site. Here's a services detail page that again mentions SEO and meta tags:
[Screenshot: MMG was offering SEO in early 1997]
I urge you to play around on the old MMG archive from early 1997 and review the several mentions of search engine optimization: http://web.archive.org/web/19970215062722/http://www.mmgco.com/index.html
It's time to update the Wikipedia entry on SEO!