7 Ways To Grow Your Customer Base With Emails

Your customers are the essence of your business. Most business owners instinctively know this.

What many don’t know is that you have to cultivate and nurture your customers to grow your base and retain your customers.

In this article, we look at seven ways to grow your business with emails. We’ll show you how to do it and why.

Let’s look at the transactional email and the power it has to grow your business.

The Transactional Email

Transactional emails are triggered emails: correspondence you send to someone based on an action they took on your website.

For example, your customer places an online order, and he expects a confirmation in his inbox shortly thereafter. This is a transactional email.

Oftentimes, if the customer doesn’t receive your email, he’ll call customer service to find out why.

How about when someone signs up for your email list? Do you send them a welcome email? Again, this is a transactional email.

If you aren’t leveraging the power of this type of advanced email marketing, you’re missing out on a big chance to grow your customer base.

Bottom line – a transactional email is an advanced way to send trigger-based, ultra-personalized, targeted and highly-specific emails.

You can expect your transactional emails to have high open and click-thru rates and low bounce rates. This is, after all, information that’s valuable to your customer.

How Transactional Emails Grow Your Base

First, your customers are usually happy to see these emails in their inboxes. Because they’ve done something on your website to trigger the info, they are usually expecting it.

Email marketing extends your digital reach well past your website. Your customers have invited you into the inner sanctum of their inbox, so you have a chance to deliver personal, valuable information.

Now let’s look at the seven ways to grow your customer base with emails – transactional emails.

#1: Welcome Email

Someone created an account, made a purchase, downloaded a product or signed up on your site. This is a good time to send them a welcome email.

Welcome emails usually provide a login and password and a welcome from your company.

Take this time to say thank you and let your customers know how much you appreciate them. Send this email within an hour of the action taken.

You can also include a call to action in the welcome email. How about asking your customers to tweet or share that they joined your company?

#2: Confirmation Email

You send a confirmation email to notify your email user when the action they took on your website is complete.

This email is fairly straightforward as it’s most often a receipt for a purchase, confirmation of a reservation or a link to a download.

Confirmations have all the information your customer needs in them. It’s up to you, though, to make sure you’re getting everything you can out of the email.

You can take the space to add some additional information in your email that provides value for your customer.

Offer them helpful tips and resources. For example, if they purchased a new camera from you, direct them to your website for blogs and whitepapers on how to use it.

If they booked a room at your hotel, send them their confirmation email with a list of restaurants and things to do in the area. Include insider tips only locals know about.

#3: Newsletter Sign Up Confirmation Email

Don’t be the business that neglects to send a newsletter sign-up confirmation. Many of your website visitors will be confused if you don’t.

Not only do these emails confirm their subscription, but this is your chance to let them know what to expect and to welcome them to your “family.”

Provide them a list of what you’ll send and when. Give them a brief overview of the types of information you’ll send.

If you offered them something special for joining your list, you want to provide it to them in this email.

#4: Cart Abandonment Email

Cart abandonment is one of the biggest challenges facing e-retailers today, as the average abandonment rate is nearly 70%.

The best way to encourage your customers to come back to your website and complete their purchase is with a transactional email.

Often, people who return to their carts spend more than they had originally planned.

The best time to send this email is within 24 hours of the abandoned cart. You can even include a special offer such as free shipping or 10% off to increase the odds they’ll finish their purchase.

Do put a time limit on the promo – a sense of urgency moves the process along.

#5: Birthday Email

These are easy emails to send, and they work to further customer loyalty and grow your customer base.

Provide a gift for your customer in this email. Think a video, a download or a special offer.

#6: Customer Feedback Email

This email solicits feedback from your customer. Ask for comments on your products and services.

It lets your customers know you value their opinion.

#7: Reactivation Email

Send this email to subscribers who used to interact with you, but haven’t in a while. This email is the perfect vehicle to grow your customer base.

These emails keep your company top of mind and remind subscribers why they signed up in the first place.

To Conclude

We’ve looked at seven ways to grow your customer base with emails.

These emails are transactional emails, and they are key to your advanced email marketing strategy.

Why? Transactional emails are expected, and they encourage your users to take another action.

While some transactional emails are sent because of someone’s inaction, many more are sent because of their action.

Another term for transactional emails that grow your customer base is relationship-based emails. These emails have much higher open and click-thru rates. They also have greater revenue potential than regular emails.

It’s in your best interest to harness the power of the transactional email today. Increase your engagement opportunities with relationship-based emails and watch your customer base grow.

We’d like to leave you with one last tip. Ensure that every transactional email you send is optimized for mobile. The email must render as well on a smartphone or tablet as it does on a desktop or laptop computer.

Are you ready to squeeze more profit out of your website by fine-tuning your landing pages to skyrocket growth among your email subscribers and current customers? That’s terrific! We’re here to help you optimize your website so it works fluidly for your website visitors. In fact, we promise you we’ll do just that.

With our guarantee, you can rest assured we will increase your profits through landing page optimization.

If you’re ready to work with the leader in landing pages and conversion rate optimization, contact us today.

We’ll provide you with our FREE site performance analysis so we can work on your landing page conversion rates.

Image: Alexandru Tudorache


Introducing Dynamic Testing: the revolutionary way to test ecommerce experiences

You might think most marketing automation solutions feel like a black box. You’re not exactly sure what the software is doing or how it’s deciding which content to put in front of your customers. You might even feel like…


The Universal Analytics Command Queue

For those of you who remember Google Analytics classic, the next four characters I write after this sentence should ring with nostalgia:

_gaq

_gaq is the name of the global variable that Google Analytics would install when it was executed on a page. The _gaq variable was defined (initially) as an Array. The default snippet had it right at the top:

<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);
  _gaq.push(['_trackPageview']);

  (function() {
      var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
      ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
      var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>

The _gaq array was a command queue for things we wanted Google Analytics to do. The default snippet passed it some stuff right away – specifically, it passed in the instructions to create a tracker and fire a pageview. This pattern was really cool – it allowed for commands to be registered in-line while the browser was busy fetching Google Analytics and was the backbone of GA’s transition to asynchronous loading. No messy callbacks, no need to check if _gaq was defined; once Google Analytics loaded, each command would be executed in the order it appeared in the queue.
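To make the mechanics concrete, here is a simplified sketch of how a library can drain a plain-array queue like _gaq and then swap in an object whose push executes commands immediately. This is an illustration only, not the actual ga.js internals:

(function() {
  // Whatever the page queued up before the library finished loading.
  var queued = window._gaq || [];

  function execute(command) {
    // The real library maps commands like ['_trackPageview'] to tracker methods;
    // here we just run functions and log everything else.
    if (typeof command === 'function') {
      command();
    } else {
      console.log('executing', command);
    }
  }

  // Drain the existing queue in order...
  for (var i = 0; i < queued.length; i++) {
    execute(queued[i]);
  }

  // ...then replace the array so later pushes execute immediately.
  window._gaq = { push: execute };
})();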

Another part of what made this exciting was that non-GA functions could also use the queue, and Google Analytics would call them for us. This allowed for clever folks to push in commands to run before Google Analytics did its tracker creation; for example, if I wanted to fire a callback function when Google Analytics loaded, I could do this:

var _gaq = window._gaq = window._gaq || [];

_gaq.push(function() {
  notifyGALoaded();
});

The first thing GA would do after it finished loading and bootstrapping itself would be to call my function, which would call notifyGALoaded in turn. I could be comfortable adding this code anywhere, because GA would inherit any existing _gaq values non-destructively. This meant the .push() method would always work.

Then Universal Analytics came along and ruined everything – the new interface dropped the familiar .push() calls and instead had commands passed in as arguments to the function ga.
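For comparison, the default Universal Analytics snippet issues its create and pageview commands like this – the same commands, but passed as arguments rather than pushed onto an array:

ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');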

Getting Back Our Queue with the Alternative Syntax

Deep in the bowels of the Universal Analytics documentation, there’s an article that describes an alternative syntax for loading Google Analytics. We can use the first part of this snippet to get back our command queue:

window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;

This snippet instantiates our ga global and configures the internal command queue for us. Now we can go right back to adding commands into our queue:

window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;

ga(function() {
  // Do some stuff
});

Just like with our old _gaq Array, we needn’t be concerned about detecting when Google Analytics loads or accidentally overwriting the existing global.

So…?

You might be thinking “So, what? When would I need that anyway?” Imagine you’re providing code to a third party, and they want to dictate whether Google Analytics loads or not. You know Google Analytics may load, but you don’t want to force it to load yourself, and you don’t want to poll every n milliseconds to see if it has loaded. Now you can just instantiate the queue, push in your commands, and carry on worry-free.

window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;
ga('create', 'UA-XXXXXX-YY', ...)
// etc

You also may want to have code execute only after Google Analytics has loaded. Using this syntax, you can instantiate ga, push in your command, and be guaranteed that it will execute only after Google Analytics has loaded.

window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;

ga(function() {
  // Some code I'd like to fire
});

This interface provides a really simple way to queue up commands before and after the Google Analytics library loads. How will you use it? Drop me a line or reach out on Twitter.

What is your conversion rate? Why is it important?

How many visitors does it take to make a sale or generate a lead on your website? If you know the answer to this question, you have everything you need to calculate your conversion rate.

The actual definition is the percentage of your website visitors who take a desired action or complete a goal. Here’s how it’s calculated: if 100 visitors come to your website and 2 take the desired action, you have a 2% website conversion rate (2/100=2%).
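If you want to run the numbers yourself, the arithmetic is trivial – a throwaway JavaScript illustration using the figures above:

function conversionRate(conversions, visitors) {
  return (conversions / visitors) * 100;
}

console.log(conversionRate(2, 100) + '%'); // 2%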

The goal or desired action includes not only outcomes that result in sales, but also goals like getting visitors to sign up for your newsletter or watch a video.

The key is visitor expectations and the persuasiveness of your site: for visitors to take a desired action, they must also be achieving their own goals.


WordPress Resources at SiteGround

WordPress is award-winning web software, used by millions of webmasters worldwide to build their websites and blogs. SiteGround is proud to host this particular WordPress installation and provide users with multiple resources to facilitate the management of their WP websites:

Expert WordPress Hosting

SiteGround provides superior WordPress hosting focused on speed, security and customer service. We take care of WordPress sites’ security with unique server-level customizations, WP auto-updates, and daily backups. We make them faster by regularly upgrading our hardware, offering free CDN with Railgun and developing our SuperCacher that speeds sites up to 100 times! And last but not least, we provide real WordPress help 24/7! Learn more about SiteGround WordPress hosting.

WordPress tutorial and knowledgebase articles

WordPress is considered easy software to work with. Yet, if you are a beginner you might need some help, or you might be looking for tweaks that do not come naturally even to more advanced users. The SiteGround WordPress tutorial includes installation and theme change instructions, management of WordPress plugins, manual upgrade and backup creation, and more. If you are looking for a less common setup or modification, you may visit the SiteGround Knowledgebase.

Free WordPress themes

SiteGround experts not only develop various solutions for WordPress sites, but also create unique designs that you could download for free. SiteGround WordPress themes are easy to customize for the particular use of the webmaster.


Magic Mirror Project

For Christmas this year, I was trying to come up with something unique for the long-term roommate. After some hemming and hawing, I decided to create my own Magic Mirror, in the style of Michael Teeuw (turns out I wasn't the only one who was inspired by Michael). One small twist - the giftee specifically requested a full-length mirror, of which we are in dire need.

Step 1: Roughing The Design & Finding The Screen

After some initial research, I sketched out a design based on the simple and clean aesthetic of this DIY walkthrough I found.

Once I had an idea of what I wanted it to look like, I determined that the size of the screen was what would ultimately shape the rest of the mirror. At first, I was considering using a lower-resolution TV we've had collecting dust in the basement, but I decided I really wanted a nice-looking screen to go into this. I settled on a 24" Acer S242HLbid LED panel I found on Craigslist for $100.00. Once I had the screen and the design, it was time to figure out how it was all going to come together.

Step 2: SketchUp

I didn't want to cut any corners, so I modeled everything out in SketchUp first. This was my first time using SketchUp, and I can't encourage you enough to use it for your next project; it was a lifesaver (and it's free!). I referenced my model constantly. It was great to be able to measure any piece of the design on-the-fly and be fully confident in the numbers I was using - I've been burned by smudged ink and guesswork in the past.

magic mirror sketchup screenshot

magic mirror sketchup back screenshot

I stayed pretty true to the design I laid out in SketchUp. I did ditch the legs, at least for the time being.

Step 3: Setting Up The Pi

Once the design was settled, I moved on to setting up the Pi. I already had a B+ lying around with a case and a WiFi dongle, so it was pretty short work getting things set up. Michael Teeuw has a great step-by-step to get you started, if you're following along at home. My initial setup was pretty vanilla - I just dropped the compliments and calendar.

My initial setup

Next, I configured the Pi to boot Chromium in kiosk mode, pointed to 127.0.0.1, a step which is also covered in Michael's tutorial. With the software in place, I set to work assembling the mirror.

Step 4: Building The Frame

One of the great things about DIY holiday gifts is they make great excuses to buy all the expensive tools you've never been able to justify purchasing. I'd long been in need of a miter saw, and the 45-degree cuts of the frame demanded nothing less. I decided to go with something on the entry-level side of the spectrum, and picked up a well-reviewed Hitachi saw from Lowe's for a little over $120.00.

Next, I picked up the lumber for the mirror. Originally I planned on using soft pine for the frame and the support structure in the back, with the ultimate goal of staining the wood in the spring. After some reading, I decided to use oak for the frame instead, as it looks better after staining compared to pine (so I've read, anyhow).

Then I built the frame. The frame itself is held together with wood glue and heavy-duty angle irons. I learned the hard way that oak is incredibly tough, and will happily let you torque the heads right off your screws if you're not careful. Lesson learned. Here's the frame, all assembled and standing up.

Step 5: Making The Mirror & First Assembly

With the frame, Pi, and monitor ready to go, it was time to make the mirror. Early in the project, I had read that it was simple enough to make a two-way mirror with a piece of acrylic or polycarbonate and the correct film. Armed with film I bought from Amazon and polycarbonate I bought from Home Depot, I went to work.

It was tough keeping dust and hair from getting trapped between the polycarbonate and the film as I was working. Several times I had to undo a section just to pull out a single stray cat hair, then stretch and squeegee it all over again. Unfortunately, my hard work was for nought. Once everything was installed and stood up, it was easy to see that the film had ended up getting scratched, and the polycarbonate itself was flexing and warping in the frame. It looked like a fun-house mirror.

I decided to do it right and order a sheet of glass.

Step 6: Second Assembly

The best photo I could come up with from the ride

By pure chance, I met one of the co-owners of a Pittsburgh-based glass repair shop called Glass Doctor on the annual Pittsburgh Icicle Bicycle Ride.

He hooked me up with the real deal - quarter-inch tempered glass with a professionally applied two-way film. The finished product speaks for itself:

On the back, I've got the monitor pressed in place with corner braces. It's also supported by the 1"x4" crossbrace in the back. Currently, the Pi sits out on its own; eventually, it will be mounted on the rear frame, too. The power buttons from the old bezel are hanging loose, but will eventually be strapped down as well.

Steps 7 to ?: Tweaks

Once everything was put together, it wasn't long before I started adding extras. The first addition I made was to incorporate the estimated arrival times of incoming buses from the Port Authority of Pittsburgh's realtime tracking API. The giftee takes the bus in the morning (when I can't be plied for a ride), and knowing when the next one is pulling in can be a lifesaver.

I also added a small map with traffic information using the Google Maps JavaScript API, so I could get an at-a-glance idea of what my commute will look like. I'm planning on nixing that, though, since my commute is mostly off the beaten path and traffic rarely factors into my morning. Instead, I'm planning on adding cycling-related data: did it rain or snow overnight, will it rain or snow during commuting hours, and the temperature differential between the morning and evening commutes. All of this is invaluable knowledge for the commuting cyclist.

As time goes on, I plan on extending the functionality of the mirror even further; I've already decided to add a motion sensor and relay for managing power to the screen more effectively, and I'm considering adding facial recognition capabilities. I'm also really looking forward to staining the frame!

If you're interested in the code for this project, I'd recommend starting with Michael Teeuw's original project. If you'd like to use my code, you can grab my fork here, but be warned, support will be minimal.

Allegheny County Real Estate Search API

Allegheny County makes available a searchable database of public records on parcels in the area. It's a handy resource for homebuyers interested in knowing a little more about the history of a home they're purchasing, or to do some comparison shopping when preparing to buy a home.

I've always been interested in examining pricing data on Pittsburgh neighborhoods for fun, so about a year ago I put together a NodeJS package that exposes a few methods to scrape the site and return the data in JSON, e.g.:

acreApi.parcel.ownerHistory('0084-N-00285-0000-00', function(err, parcel) {
    if(err) return console.log(err);
    console.log(parcel);
});

// Outputs
{
    parcelId: '0084-N-00285-0000-00',
    municipality: '107  PITTSBURGH - 7TH WARD',
    address: '5801 WALNUT ST PITTSBURGH, PA 15232',
    ownerName: 'BORETSKY ROBERT H & KAREN R (W)',
    deedBook: '10857',
    deedPage: '371',
    ownerHistory: [{
        owner: '',
        saleDate: '',
        salePrice: 1
    }, {
        owner: 'ANN MEDIS',
        saleDate: '2/8/1993',
        salePrice: 33000
    }, {
        owner: 'BORETSKY ROBERT H & KAREN R (W)',
        saleDate: '9/1/2000',
        salePrice: 125000
    }]
}

I built it to provide a programmatic way of querying that dataset, which can be found here. The API makes it super simple to aggregate data on swaths of the Pittsburgh region for analysis. Here's a chart I built using the street.street and parcel.ownerHistory methods.

Chart graphing sales and median price in Lower Lawrenceville

For folks who live in the area, the data suggests a fire sale in the neighborhood between 2005-2007 (about when the new Children's Hospital was announced and moved into the neighborhood), and steady price increases from then on as interest in the Lawrenceville area has stretched into the further reaches of the neighborhood.
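To give a sense of how such a chart's underlying data could be assembled with the package, here is a rough sketch. The require name, the shape of the street.street results, and the grouping logic are my assumptions for illustration, not part of the published API:

var async = require('async');
var acreApi = require('acre-api'); // hypothetical package name

// Assumption: street.street() calls back with a list of parcels, each carrying a parcelId.
async.map(['Howley', 'Cabinet', 'Mintwood'], function(street, callback) {
  acreApi.street.street(street, 'Pittsburgh - 6th Ward', callback);
}, function(err, streets) {
  if (err) return console.log(err);

  // Flatten to parcel IDs, then pull the owner history for each parcel.
  var parcelIds = [];
  streets.forEach(function(parcels) {
    parcels.forEach(function(p) { parcelIds.push(p.parcelId); });
  });

  async.map(parcelIds, function(id, callback) {
    acreApi.parcel.ownerHistory(id, callback);
  }, function(err, parcels) {
    if (err) return console.log(err);

    // Bucket sale prices by year - enough to chart sales volume and median price.
    var salesByYear = {};
    parcels.forEach(function(parcel) {
      parcel.ownerHistory.forEach(function(sale) {
        var year = new Date(sale.saleDate).getFullYear();
        if (!isNaN(year) && sale.salePrice > 1) {
          (salesByYear[year] = salesByYear[year] || []).push(sale.salePrice);
        }
      });
    });

    console.log(salesByYear);
  });
});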

This was an interesting project for me; there were many challenges to overcome. I learned a little about .NET applications and how they work, and a lot about managing multiple simultaneous requests in Node. Some more interesting problems I solved include:

  • Scraping data from the application, which uses stateless URLs
  • Queueing/managing requests responsibly

The Allegheny County Real Estate search tool doesn't store state information in the URL, meaning I couldn't just create a method that plopped in query parameter values and returned the data. Instead, the app takes data POST'd with each query or refinement and returns the results at the same URL.

To get around this, I had to spoof the View State data the .NET application expected in order to get back any meaningful results. As it stands, I'm spoofing View State information by GET'ing the /search.aspx page, extracting the information from the HTML (View State and other .NET data values are stored in hidden inputs on the page), and then proceeding. This View State information would then be cached and re-used or re-written as the scraper interacted with the site.
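For illustration, pulling those hidden fields out of the fetched HTML might look something like the sketch below. I'm using cheerio here for brevity; the package's actual Parser.stateData implementation may work differently:

var cheerio = require('cheerio');

// Assumption: the standard ASP.NET hidden field names; the real parser may grab more of them.
function extractViewState(html) {
  var $ = cheerio.load(html);
  return {
    viewState: $('input[name="__VIEWSTATE"]').attr('value'),
    viewStateGenerator: $('input[name="__VIEWSTATEGENERATOR"]').attr('value'),
    eventValidation: $('input[name="__EVENTVALIDATION"]').attr('value')
  };
}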

This introduced a challenge for kicking off multiple concurrent requests when no View State has been cached. For example, if I had some code like this:

async.map([
  'Howley', 'Cabinet', 'Mintwood', 
  '38th', '39th', 'Liberty', 'Penn'
].map(function(el) { 
  return [el, 'Pittsburgh - 6th Ward'];   
}), function(arr, callback) {

    acre.street.street(arr[0], arr[1], callback);

}, function(err, results) {

  ...

});

If no View State was cached, each street being hit by async.map would cause a GET request to be issued in order to establish an initial View State - that would mean 7 immediate requests, when 1 would do.

I solved that by adding an internal queuing mechanism to the State manager, found in /lib/State.js.

var _gettingState = false;
var _storedState;

...

_State.get = function StateGet(callback) {

  if (_storedState) {

    callback(null, _storedState);

  } else if (_gettingState) {

    _queue.push(function() {

      callback(null, _storedState);

    });

  } else {

    _gettingState = true;

    Api('get', 'search', {}, function(err, html) {

      if (err) {

        _gettingState = false;
        callback(err);

      } else {

        _storedState = Parser.stateData(html);

        if (_gettingState) {

          _clearQueue();
          _gettingState = false;

        }

        callback(null, _storedState);

      }

    });

  }

};

function _clearQueue() {

  _queue.forEach(function(el) {

    el();

  });

  _queue.length = 0;

}

When a part of the API asks for a View State, the State manager returns the cached View State or it builds a new one. If additional requests come in during the process of fetching and building the View State, the State Manager caches the callback in a private queue and defers them until the View State is constructed. Once the View State is in place, _clearQueue iterates through all cached callbacks and passes them the newly created View State. No unnecessary requests required!

When I need to get at pages further in the application, say, paginated results, I manually scrape links to those pages and request them individually. This is a bit of a slow and sloppy mechanism, and it's very fragile; any semi-serious changes to their naming conventions would break the whole thing. If I could do it again, I would be interested to try and manually build View States and POST those directly, instead of the more spider-like approach I'm taking now.

The other challenge I ran into was responsibly managing all of those requests; left to its own devices, the API could generate hundreds of requests to their server instantaneously. Rather than litter setTimeouts all over the place, I built a Connection manager to handle queuing, executing, and pacing the requests. That can be found in /lib/Connections.js. Since then, I've spent more time building similar mechanisms in NodeJS, so this experience was very valuable.

The basic gist of the Connections tool is to recursively clear an internal queue so that at most 20 connections are open at any given point. The whole thing is about 51 lines of code, so I'm including it here:

// Manage our requests to prevent overloading the server
var Connections = (function Connections(){

    var _Connections        = {};
    var _queue             = [];
    var currentConnections = 0; 
    var maxConnections     = 20;

    var check = function ConnectionsCheck() {

        if(currentConnections < maxConnections) {

            var availableConnections = maxConnections - currentConnections < _queue.length ? maxConnections - currentConnections : _queue.length; 
            process(availableConnections);

        }

    };

    var process = function ConnectionsProcess(num) {

        var i;

        for(i = 0; i < num; i++){

            _queue[0]();
            _queue.splice(0, 1);
            currentConnections++;

        }

    };

    _Connections.add = function ConnectionsAdd(item) {

        _queue.push(item);
        check();

    };

    _Connections.remove = function ConnectionsRemove() {

        currentConnections--;
        check();

    };

    return _Connections;

})();
module.exports = Connections;
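As a usage sketch – inside the package the Api module handles this wiring, so the host and callback shape here are placeholders – a request wrapped in the manager looks something like:

var http = require('http');
var Connections = require('./lib/Connections');

// At most 20 of these run at once; the rest wait in the queue until a slot frees up.
function fetchPage(path, callback) {
  Connections.add(function() {
    http.get({ host: 'example.com', path: path }, function(res) { // placeholder host
      var body = '';
      res.on('data', function(chunk) { body += chunk; });
      res.on('end', function() {
        Connections.remove(); // free the slot so the next queued request can run
        callback(null, body);
      });
    }).on('error', function(err) {
      Connections.remove();
      callback(err);
    });
  });
}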

Average Session Duration- What is it and Why Bloggers Should Care

There are a lot of stats to look at when viewing Google Analytics, and average session duration is one of them. This article will cover what average session duration is and why bloggers should care about it. Even if you’re not a blogger, you may want to read up on this.

Average Session Duration – What is it?

According to Google,

Average session duration is total duration of all sessions (in seconds) / number of sessions.

A single session is calculated from someone’s first page view to their last. So, if someone enters your site and visits a few pages, say 5, over the course of 10 minutes, then their session is 10 minutes, or 600 seconds. If their session is one page and only 30 seconds, then their total session is 30 seconds.

The average session duration is the total time of all sessions divided by the number of sessions during a specific date range.
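In code form, the calculation is just the following (a quick JavaScript illustration using the two example sessions above):

// Session durations in seconds for a given date range.
var sessions = [600, 30];

var averageSessionDuration = sessions.reduce(function(sum, s) {
  return sum + s;
}, 0) / sessions.length;

console.log(averageSessionDuration + ' seconds'); // 315 seconds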

Average Session Duration – Why Bloggers Should Care

Average session duration can be influenced by bounce rate, page views, and sessions, but for some, this could be an indicator of how much people like to stay on specific areas of your website. For bloggers, this allows them to know if an article has been well received.

Google loves long form content. This has been said over and over by many leaders in the SEO industry. However, Google also has suggested that content in a post be at least 300 words.

Well, 300 words doesn’t take long to read. If you’re a blogger who constantly publishes content that runs around 300 words, you’re not really beefing up the time your readers spend on your website. Often, the reader will skim through in under a minute, possibly comment, and then leave.

Rather than giving the reader a “wham bam thank you ma’am” experience, why not do some of the following to possibly increase the average session duration, and thus your readers’ interest in remaining on your website:

  • Create a series of posts and interlink them. People who have an interest for the topic will click to each topic and stay on the site longer.
  • Always find ways to link to other relevant posts in your website. Whether it’s a specific term that you explain or some other relevant content, this gives the reader a possible option to be curious enough to click that link and read more.
  • Have cornerstone content that is lengthier and filled with several methods by which the reader can digest your content. Aside from long form text, don’t forget that you can add images, video and audio to expand upon your point. Cornerstone content is usually quite lengthy (more than 1500 words), and sometimes may even seem like it should be in an ebook.
  • Don’t forget to link to your services, encourage visitors to comment, or ask readers to subscribe to your newsletter. It’s your website, don’t be shy. All of these encourage some type of positive action that brings them to another place on your website.

Most bloggers will probably look more at their page views, but seriously, if you’re setting goals on individual pages, you may want to also focus on whether people are staying on those pages or going to the places you want them to.

Have you taken the time to look at your site or individual article’s average session duration?


The Best Google Analytics Reports for Improving Websites


Google Analytics isn’t just for knowing how much traffic your website is getting, your top pages, and how your traffic sources and marketing efforts are performing. Nope. There is an even better use for it!

It’s also really important to use it to help improve your website – so it converts many more visitors into sales, leads or subscribers. But unfortunately Google Analytics can be a little daunting at times, particularly with seemingly endless reports to check out and analyze. Where should you start for best results?

To help you make sense of this, I’ve created a list of the best Google Analytics (GA) reports so you can quickly gain more insights into your website performance and what needs improving most. I have also recently included a video of me walking you through all these great reports. Let’s get started…

The best Google Analytics reports to improve your website

Update: Watch a video of me guiding you through all these key Google Analytics reports

Last year I created a premium video about these best Google Analytics reports. It was originally part of a paid membership but I have decided to now include it in this article for everyone to watch for free. In this video you will also learn how to create a Google Analytics dashboard for these reports. Enjoy!

Check the landing pages report for pages with high bounce rates and low conversion rates
Your top landing pages (entry pages) are crucial to optimize because they often get very high levels of traffic, and are the first pages your visitors see on your website. If visitors don’t find what they are looking for or are confused, they will often leave your website within just 10 seconds!

To improve your website with this report, pull up your landing pages report for the last 30 days (found under ‘Behavior > Site Content > Landing pages’). Then see which pages out of the top 10 have the highest bounce rates (over 50% is high) and which have a lower-than-website-average goal conversion rate (both indicated below in yellow) – these are indicators of poorly performing pages on your website.

Google Analytics landing pages report bounce rate and conversion rate

Then optimize these poor page performers first – improving headlines, benefits, imagery and call-to-action buttons are some of the best ways to do this. Optimizing these helps increase visitor engagement and increases the chances of them converting for your key website goals.

Use the In-Page Analytics feature to reveal exactly what visitors click on
Don’t presume you know what visitors are doing on your pages, and what they are clicking on – it can often be different from what you might expect. Use this great click map feature in GA (found under ‘Behavior > In-Page Analytics’) on your key pages to gain better insights into your visitors’ journey and flow around your website.

Then, based on the insights you find, improve your website by focusing your visitors’ experience on the most important links. This can be done by deemphasizing less useful links (or removing them) from key pages, and reorganizing your navigation menus to focus on major website goals.

Google Analytics In-page Analytics

Note: Ideally you should turn on the ‘enhanced link attribution’ option in your settings – this makes click reporting more accurate when you have multiple links on one page going to the same destination page.
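Enabling it typically means both ticking the option in your property settings and requiring the linkid plugin in your tracking snippet, along these lines (check Google’s current documentation for the exact call):

ga('create', 'UA-XXXXX-Y', 'auto');
ga('require', 'linkid'); // enhanced link attribution plugin
ga('send', 'pageview');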

Check the browser report for poor conversion rate performers
Your webpages can sometimes look slightly different or even break in some browsers (often due to small differences in how browsers show CSS code). This can unknowingly cause many lost sales or leads!

To make sure this isn’t negatively impacting your website, you need to regularly check the ‘Browser & OS’ report (found under ‘Audience > Technology’) and make sure your conversion rates aren’t much lower for any browsers. If you see ones on this report that are much lower, you should go ahead and check for technical problems like CSS rendering issues and fix them immediately.

Google Analytics browser report

Analyze your Funnel Visualization report for high drop-off rates and optimize
It doesn’t matter how good your website is if visitors struggle to get through your checkout or sign-up flow pages. To understand how well your visitors complete that process, it’s vital you check your Funnel Visualization report. On this report (found under ‘Conversions > Goals > Funnel Visualization’) you can see how many visitors get through each page of your funnel (like your billing page), which pages are most problematic, and even where visitors go if they leave for another page.

You need to pay great attention to any pages with a high drop-off rate (more than 40%) and optimize those first – adding security seals and risk reducers, reducing distractions like header navigation, and improving error handling often work well. Improving these pages will greatly increase your sales or signups.

Google Analytics funnel visualization report

Note: Obviously you will need to have made sure you have set up your goals for your website adequately, including adding key pages in your goal flows. Here is a great guide on setting up goals.

Check your traffic overview report for poor performing traffic sources
Improving the quality and quantity of your traffic has a huge impact on your website conversion rates, sales or leads, and it’s vital you gain insights into traffic performance and optimize the major sources.

To help you gain greater insights into this, pull up the ‘Channels’ report as Google calls it (found under ‘Acquisition > Channels’) and check which of your top 10 traffic sources (channels) have high bounce rates (over 50%), and also look for sources that seem low or missing from the top 10.

For example, you may find your email traffic isn’t as much as you had hoped for or isn’t converting well, so you should optimize your email marketing campaigns soon. Same goes for your paid search and SEO too.

Google Analytics acquisition overview report

Use the mobile overview report for tablet/mobile insights
Mobile traffic is bigger than ever before, often accounting for over 20% of total website traffic – and these visitors have very different needs due to smaller screen sizes, and often convert much lower than regular website traffic.

To understand your mobile traffic, and its performance, you need to check your ‘mobile overview’ report. Here you need to see just how high your traffic levels are for both mobile and tablet devices, and see what the conversion rate for each is. If the conversion rate is much lower for either, you need to check your website on that device for key issues and fix them immediately.

And if you haven’t already done so, to increase your conversion rates it’s critical to have a mobile optimized website as soon as possible (like using responsive design), particularly if your mobile traffic is over 20%.

Google Analytics mobile overview report

Check the exit pages report to find problematic pages
You also need to find out which pages are most often causing your visitors to leave (called an ‘exit’ page) – and improve and optimize those too.

To find these top exit pages, check your ‘exit pages’ report (found under ‘Behavior > Site Content > Exit Pages’). In particular, look for any pages that shouldn’t be in the top 10, and try to figure out why so many people exit your site on them. Also look for pages with an especially high exit rate (over 50%), as this often indicates problems.

A few ways to improve these exit pages are to use and optimize calls-to-action at the end of them, and to try exit-intent popups that show a great incentive (discounts/free guides etc).

Google Analytics exit pages report

Analyze the top pages report for key missing pages and high exit rates
Your top pages report can contain some real gems for insights – not just what your top 10 pages currently are (found under ‘Behavior > Site Content > All Pages’).

First you can see if any of your top pages have high exit rates (important to optimize those ASAP), and also check whether any pages relating to your key goals seem missing from this report or have low traffic. For example, perhaps few people are visiting your ‘why us’ or benefits page – making links to this page more prominent will hopefully increase sales/leads.

Google Analytics top pages report

These are the simpler reports; there are many advanced ones too

These are just some of the simpler Google Analytics reports that will help you improve your website. Here are a couple of the many more advanced ones to learn about:

  • Using the ‘Converters’ visitor segment to figure out the behavior of people who convert for your main website goals (sales/leads etc).
  • Using the ‘Site Search’ report to find pages causing the most internal searches (this indicates visitors aren’t finding what they need).

If you are interested in learning more about these advanced GA reports, simply comment and let me know.

No time to analyze Google Analytics reports or not good at it?

If you don’t have time or the skills to gain insights from your Google Analytics reports you should check out my ‘Google Analytics Insights’ service – I’m sure you will find it useful for improving your website.