
A New Year, A New Site Audit

With 2014 almost a month behind us, in-house SEOs, webmasters, and marketers may be taking a well-deserved breather. Hopefully, your 2014 marketing plan played out as well as you expected, and you’re sitting back to enjoy the high from holiday sales. Unfortunately, we all know winter sales can be as cold as the weather outside – but that doesn’t mean you should let your site slip into a deep freeze.

There is something you can do to keep the fire burning on your site even in the coldest of sales months – an in-depth site audit! When your site is performing at its best, your customers will have an amazing shopping experience, and the search engines will take notice of your stellar efforts. So, dust off your crawl tools and log in to Webmaster Tools as we explore the biggest technical SEO factors to revisit in the New Year! And don’t worry if you’re not a seasoned SEO pro; we’ll let you know where to look and what tools to use to make your site audit a breeze!

First Things First, Crawl It!

The very first thing you should do is plug your site into a crawl tool. These tools, many of them free, give you a backstage pass to what’s going on in the backend of your site. A site crawl lets you see what the search engines see – response code errors, incorrect or temporary redirects, improperly formatted title and meta elements, and so on. You don’t have to be a developer or a technical pro to read and interpret these crawl reports.

We highly recommend using Screaming Frog. This crawler is totally free, and as powerful as you need it to be. You can simply look at some high-level aspects of your site, or really dive into URIs, inlinks, outlinks, and more. With Screaming Frog, you’ll be able to see all of those important technical details of your site. As mentioned, do this step first – this crawl report will be your blueprint for the rest of our steps.
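
If you’re curious what a crawler actually does under the hood, here’s a minimal sketch in Python that walks internal links and records the same basics a tool like Screaming Frog reports – status codes, titles, and meta descriptions. It assumes the requests and beautifulsoup4 packages are installed, and the example.com URLs are placeholders for your own site.

```python
# Minimal same-domain crawl sketch: collects status codes, titles, and
# meta descriptions, roughly what a crawl tool reports.
# Assumes `requests` and `beautifulsoup4` are installed; the start URL is a placeholder.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"   # placeholder: your site's homepage
DOMAIN = urlparse(START_URL).netloc

to_visit, seen, report = [START_URL], set(), []

while to_visit and len(seen) < 200:       # cap the crawl for a quick spot check
    url = to_visit.pop()
    if url in seen:
        continue
    seen.add(url)

    resp = requests.get(url, timeout=10, allow_redirects=False)
    row = {"url": url, "status": resp.status_code, "title": "", "description": ""}

    if resp.status_code == 200 and "text/html" in resp.headers.get("Content-Type", ""):
        soup = BeautifulSoup(resp.text, "html.parser")
        if soup.title and soup.title.string:
            row["title"] = soup.title.string.strip()
        meta = soup.find("meta", attrs={"name": "description"})
        if meta:
            row["description"] = meta.get("content", "").strip()
        # queue internal links only, with URL fragments stripped
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == DOMAIN:
                to_visit.append(link)

    report.append(row)

for row in report:
    print(row["status"], row["url"], repr(row["title"]))
```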

Response Required – Response Codes

After you’ve downloaded Screaming Frog and plugged your site into the crawler, we recommend sorting the crawled pages by response codes. These codes are how your server communicates with browsers and search engine bots – whether a page exists, whether it has moved, and where to send visitors next. Having the response codes in order for every webpage is one of the most important technical aspects of your site. If you have a ton of 404 errors (pages coming up as “not found” on the server), you’ll end up interfering with users’ experiences – and that’s not something the search engines like to see happen.

Once you’ve sorted all of your site pages by their response codes, look for any outstanding issues. The most common problem codes are 404s (not found) and 302s (temporary redirects). These pages should be addressed – you want to make sure you are directing visitors to the best possible pages on your site to match their query, so pages with outstanding errors should be removed or redirected to the appropriate page.

Also, it’s always a good idea to make sure your 301s (permanent redirects) are working properly. Please note, if you are doing a site redesign or just sprucing up a few pages, the 302s may be there for a reason and you should consult your designer/webmaster before making any changes.
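
If you want to spot-check a handful of redirects yourself, here’s a quick sketch using Python’s requests library (assumed installed; the URLs are placeholders). It follows each redirect chain, flags any temporary 302 hops, and confirms the final destination actually resolves.

```python
# Sketch: verify that old URLs 301-redirect cleanly to a live page.
# Flags temporary (302) hops and chains that end in an error.
# `requests` is assumed installed; the URLs below are placeholders.
import requests

redirected_urls = [
    "https://www.example.com/old-category",
    "https://www.example.com/old-product",
]

for url in redirected_urls:
    resp = requests.get(url, timeout=10, allow_redirects=True)
    hops = [(r.status_code, r.url) for r in resp.history]
    for status, hop_url in hops:
        if status == 302:
            print(f"TEMPORARY redirect (302) at {hop_url} - should this be a 301?")
    final = "OK" if resp.status_code == 200 else f"PROBLEM ({resp.status_code})"
    print(f"{url} -> {resp.url} [{final}] via {len(hops)} hop(s)")
```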

Top Level Tech – XML Sitemaps and Robots.txt Files

XML sitemaps are an often overlooked, critical piece of many sites. These files are different from the “SEO” or “user” sitemaps that often appear in the footer of sites. Instead, XML sitemaps are files written in XML specifically for the search engines. They allow the search engines’ bots and spiders to crawl your site in a more intelligent way – the way you intend for it to be crawled. With the use of an XML sitemap, the bots and spiders will learn the architecture of your site, so future crawls will be quicker and more successful. These sitemaps are also important for proper indexing of your site.
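
To show just how simple these files are, here’s a sketch that builds a bare-bones sitemap with Python’s standard library. In practice, your platform or a plugin will usually generate this for you, and the page list below is just a placeholder.

```python
# Sketch: build a bare-bones XML sitemap with the standard library.
# Most platforms generate this for you; this just shows the structure.
# The URL list is a placeholder.
import xml.etree.ElementTree as ET

pages = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
    "https://www.example.com/product/blue-widget",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```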

Speaking of indexing, the next piece of the high-level tech puzzle is your robots.txt file. Unlike the XML sitemap, this is a plain text file that tells the bots and spiders which pages and content to crawl and which to ignore. These files are important if you have pages you’d rather not appear in the SERPs (search engine results pages) – things like login and account pages, pages with large amounts of duplicate content, or pages that aren’t meant to provide value to users. In this file, you can tell the search engines which sections to crawl (Allow) and which to skip (Disallow); for pages you want crawled but kept out of the index, a robots meta noindex tag does the job.
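
As a quick sanity check, you can test how your robots.txt rules apply to specific URLs with Python’s built-in robots.txt parser. The URLs below are placeholders – swap in pages you expect to be crawlable and pages you expect to be blocked.

```python
# Sketch: check whether specific URLs are blocked by your robots.txt,
# using Python's built-in robots.txt parser. URLs are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()   # fetches and parses the live file

test_urls = [
    "https://www.example.com/product/blue-widget",      # should be crawlable
    "https://www.example.com/customer/account/login",   # often disallowed
]

for url in test_urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}  {url}")
```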

To check the status of your XML sitemap and robots.txt file, you can simply type your URL into your browser’s address bar followed by /sitemap.xml or /robots.txt. Or, plug your site into this tool and choose your desired factor.

Be Unique – Page Titles and Meta Descriptions

Next, use your site crawl report to look at your page titles. In Screaming Frog, you can view pages with duplicate titles, missing titles, and titles that are over the recommended length. Some SEOs may argue that these page elements are largely irrelevant and provide no search value. Despite those theories, what matters in the world of SEO now is user experience and engagement.

Unique, properly sized title tags improve the user experience, leading to more click-throughs and stronger engagement signals. Title tags should be unique for every page of your site, and should be 512 pixels wide or less (roughly 55 characters or under).

The same rule applies to meta descriptions – these may be even more important than title tags, as they are the snippets users read to decide whether your site will answer their query. It’s very important to have a unique meta description for every page of your site, and each should be 155 characters or under to avoid being truncated on the SERPs.
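
If you’d rather script the length check than eyeball it, here’s a rough sketch that fetches a few pages and flags titles over 55 characters and descriptions over 155 characters (requests and beautifulsoup4 assumed installed; the URLs are placeholders). The same logic works just as well on a crawl export.

```python
# Sketch: flag page titles over ~55 characters and meta descriptions over
# ~155 characters, plus any that are missing entirely.
# `requests` and `beautifulsoup4` assumed installed; URLs are placeholders.
import requests
from bs4 import BeautifulSoup

pages = [
    "https://www.example.com/",
    "https://www.example.com/product/blue-widget",
]

for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "").strip() if meta else ""

    if not title:
        print(f"MISSING TITLE            {url}")
    elif len(title) > 55:
        print(f"TITLE TOO LONG ({len(title)})      {url}")

    if not description:
        print(f"MISSING DESCRIPTION      {url}")
    elif len(description) > 155:
        print(f"DESCRIPTION TOO LONG ({len(description)}) {url}")
```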

If your site has a large number of duplicate or missing page elements, writing and implementing these can be a daunting undertaking. We recommend using Webmaster Tools to pick out pages with the highest number of impressions per month; then, edit these first and work your way through the rest of the site in a way that makes the most sense for you. For a more detailed rundown, check out this post we wrote about writing these page elements.
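
One way to build that priority list is to export your top pages report and sort it by impressions. The sketch below assumes a CSV export with “Page” and “Impressions” columns – adjust the file name and column names to match whatever your export actually looks like.

```python
# Sketch: prioritize which pages to rewrite first by sorting a Webmaster
# Tools export by impressions. The CSV path and column names ("Page",
# "Impressions") are assumptions about your export; adjust to match.
import csv

with open("search_queries_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

rows.sort(key=lambda r: int(r["Impressions"].replace(",", "")), reverse=True)

print("Rewrite titles/descriptions for these pages first:")
for row in rows[:25]:
    print(f'{row["Impressions"]:>10}  {row["Page"]}')
```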

Finally, Explore and Edit

Once you’ve addressed the top-level technical issues on your site, go exploring! Browse your site like a user would – click through to category pages and multiple product pages to make sure everything is running smoothly. Take note of your navigation – does it make sense for users? Is it easy and elegant? Do certain pages take a long time to load? Are your menus too bulky and uninteresting?
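
For the load-time question, a quick scripted spot check can surface obviously slow pages. Keep in mind this only measures server response time, not the fully rendered page, so a proper speed tool will tell you more. The URLs below are placeholders.

```python
# Sketch: a rough load-time spot check. Measures server response time only,
# not full page rendering, but it will surface obviously slow pages.
# `requests` assumed installed; URLs are placeholders.
import requests

pages = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
    "https://www.example.com/product/blue-widget",
]

for url in pages:
    resp = requests.get(url, timeout=30)
    seconds = resp.elapsed.total_seconds()
    flag = "  <-- slow?" if seconds > 2 else ""
    print(f"{seconds:5.2f}s  {url}{flag}")
```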

While you’re on your expedition, take note of your site content and product descriptions. Is this on-page content optimized? When we say “optimized”, we’re not talking about the old-school practice of stuffing as many money keywords as possible into pages and product descriptions to manipulate rankings. Instead, make sure the content on your site is user friendly – does it provide value and satisfy searchers’ queries? Only then is it appropriate to add some keywords to the content – remember, these need to appear sparingly and read naturally.

We hope you enjoyed our tips for performing a high-level site audit. On most sites, there’s always something that can be tweaked to make them that much better. Whether that’s creating and submitting an XML sitemap to Webmaster Tools or writing some new, unique page elements, these things can really add to your ability to be found by the search engines and, most importantly, users. It’s clear winter doesn’t have to be a stale time of year for your site and sales – performing regular site audits can vastly improve your impact.
