Let’s face it: every website is different, and there is no perfect SEO audit checklist that applies to everyone. There are, however, some core areas that are almost universally applicable. Although you will most likely have to consider more factors than those presented in this article, these core areas are a great place to start to ensure your i’s are dotted and your t’s are crossed. Consequently, this article is divided into three sections covering the three core areas of any SEO audit. Although each area can’t be discussed exhaustively here, links to additional resources are provided. The three sections are:
- Technical SEO
- On-site SEO
- Off-site SEO
Technical SEO Auditing
This section examines the technical issues that you must look for regardless of your site’s structure and purpose.
Indexation
First and foremost, you must ensure that your web pages are getting indexed. After all, nothing else really matters if Google isn’t seeing your pages. What we’re looking for here is a scenario in which the number of pages on your site is significantly higher than the number of pages Google is actually indexing. To check for this, I recommend running Screaming Frog over your website. Screaming Frog serves a variety of functions, but in this case it can be used to report the total number of pages it finds on your site.
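If you want a rough page count without a desktop crawler, the idea Screaming Frog implements can be sketched with the standard library alone: collect the same-domain links from each page and count the unique URLs. This is a minimal sketch, not a replacement for a real crawler; `domain.example` and the inline HTML are placeholders.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects same-domain links from one HTML document."""
    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        url = urljoin(self.base, href)
        # Keep only internal links (same host as the base URL).
        if urlparse(url).netloc == urlparse(self.base).netloc:
            self.links.add(url.split("#")[0])

# Example with inline HTML; in practice you would fetch each page
# with urllib.request and feed the response body in, page by page.
html = '<a href="/about">About</a> <a href="https://other.example/x">Ext</a>'
collector = LinkCollector("https://domain.example/")
collector.feed(html)
print(sorted(collector.links))  # internal links only
```

Run over every page of a small site, the union of these sets approximates the page total a crawler would report.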
After running the tool, enter a simple site:domain.com query on Google, which returns the pages Google has indexed along with an approximate count.
If the number found by Screaming Frog is close to the site search results, then your pages are being indexed. If there is a significant number missing from the index, you’ll need to figure out which pages are missing and then answer the question, “Why?”
Robots.txt
Your first stop will likely be the robots.txt file, which you’ll find at domain.com/robots.txt (simply enter this URL in your browser). This file is used to block bots from sections of your site, so, as you can imagine, a mistake here can do real damage.
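You can also test robots.txt rules programmatically with Python’s standard `urllib.robotparser`. The rule set and domain below are hypothetical examples for illustration, not recommendations:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt; for a real audit, use the live contents
# of https://domain.example/robots.txt instead.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether Googlebot may fetch specific pages.
for path in ("/products/widget", "/admin/login"):
    url = "https://domain.example" + path
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(path, "->", verdict)
```

Running your key URLs through a check like this quickly reveals whether an overly broad Disallow rule is keeping important pages out of the index.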
Sitemaps
Until recently, I would have said that the only sitemap you need is the XML sitemap. However, in February of 2016, Google changed its recommendation to include a human-readable sitemap as well. Obviously, sites like Amazon couldn’t possibly fit all their pages into such a sitemap, but most sites can; if they can’t, they should at least include the core category or other key pages. If you don’t have a sitemap, you can use a free tool like Xenu to generate one for you.
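If you’d rather script the XML sitemap than use a tool, the format is simple enough to generate with the standard library. A minimal sketch, using a hypothetical URL list:

```python
import xml.etree.ElementTree as ET

# Hypothetical URL list; a real audit would use your crawler's output.
pages = ["https://domain.example/", "https://domain.example/about"]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

xml_out = ET.tostring(urlset, encoding="unicode")
print(xml_out)
```

The sitemaps.org protocol allows optional tags such as lastmod and changefreq per URL, which can be added with further `SubElement` calls.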
Broken Internal Links
If you aren’t fixing your broken internal links (broken links from one page of your site to another) for your users’ sake, then at least do it for the SEO value. Both Screaming Frog and Xenu will report your broken links, so they are easy to identify. Aside from being very annoying to your visitors, broken links are critical to find because the Domain Authority (DA) that would normally flow to a page via a link evaporates if the link doesn’t hit its target. In other words, the DA is still assigned to that specific link, broken or not.
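If you want to verify suspect URLs yourself, a status check is straightforward with the standard library. A rough sketch (the helper names here are my own, not from any particular tool):

```python
from urllib import request, error

def link_status(url, timeout=10):
    """Return the HTTP status code for a URL, or None if unreachable."""
    try:
        req = request.Request(url, method="HEAD")
        with request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as exc:
        return exc.code          # 404, 410, 500, ...
    except (error.URLError, OSError):
        return None              # DNS failure, timeout, etc.

def is_broken(status):
    """Treat unreachable URLs and 4xx/5xx responses as broken."""
    return status is None or status >= 400

# Feed this the internal URLs exported from Screaming Frog or Xenu:
# for url in urls:
#     if is_broken(link_status(url)):
#         print("broken:", url)
```

Some servers reject HEAD requests, so a production checker would fall back to GET before declaring a link dead.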
HTTPS
Google has informed us that, starting in October 2017, it will warn users filling out a form on a non-HTTPS page that the form and its data are not secure. A big warning may not be the impression you want to give your visitors. Consequently, if you haven’t already done so, now is the time to switch to HTTPS and check your site for any remaining URLs that may trigger security warnings.
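Part of that check can be automated: scanning your markup for hard-coded http:// references that would be flagged once the site is on HTTPS. A crude regex-based sketch (a real audit should use a proper HTML parser; the page content here is made up):

```python
import re

def insecure_references(html):
    """Find http:// URLs embedded in a page's markup (form actions,
    images, scripts) that would trigger mixed-content or form warnings."""
    return re.findall(r'(?:href|src|action)="(http://[^"]+)"', html)

page = '''
<form action="http://domain.example/subscribe">...</form>
<img src="https://domain.example/logo.png">
'''
print(insecure_references(page))  # the insecure form action only
```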
Mobile-Friendliness
Another area I need to include for the sake of thoroughness is the importance of having a mobile-friendly version of your site. This has become even more crucial since November of 2016, when Google confirmed it had begun experimenting with a mobile-first index. This means a site will be judged primarily by its mobile version rather than its desktop version, so it is crucial that site owners ensure their mobile versions are easily accessible, efficient, and fully crawlable.
Site Speed
Speed is important, especially as mobile versions become increasingly important. Since people are basically impatient, a slow site can be a direct hindrance to rankings, and the resulting increase in bounce rate and negative user behavior simply adds to the damage. Google provides a measure called PageSpeed, a tool to check yours, and recommendations for making improvements in problem areas.
WebPageTest is another free tool; it measures how the different elements of your site are delivered to the end user’s browser, enabling you to isolate any bottlenecks and slow points.
Internal Link Structure
Every page on the web has a vote, and that vote gets divided among the pages it votes for. This is a brutally rudimentary description of how page juice passes between pages on a site. For example, if a page has 10 links and everything else is equal, then 10 percent of the passable page juice flows to each linked page; with 100 links, only 1 percent passes to each. The problem is that all pages are not equal, so your audit should consider both how weight flows through the site and whether the overall navigation supports that flow.
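The naive even-split model described above is easy to put in numbers:

```python
def juice_per_link(passable_juice, outlink_count):
    """Naive model: a page's passable equity splits evenly across its links."""
    return passable_juice / outlink_count

# A page with 10 outgoing links passes 10% of its equity to each target;
# with 100 links, each target receives only 1%.
print(juice_per_link(1.0, 10))   # 0.1
print(juice_per_link(1.0, 100))  # 0.01
```

Real ranking systems iterate a damped version of this calculation across the whole link graph, but even the naive split shows why link-heavy templates dilute the value each individual link passes.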
Technical SEO Resources
- Moz On XML Sitemaps
- The Complete Guide to Robots.txt
- Optimizing your internal link structure
- HTTP to HTTPS: An SEO’s guide to securing a website
On-site SEO Auditing
Now that we’ve covered some of the key technical issues, let’s address on-site content optimization.
Titles & Descriptions
You can run Screaming Frog across your site to produce a list of all the pages, including their titles and descriptions, their character and pixel length, etc. This basically provides you with a very simple spreadsheet layout that enables you to quickly sort through your pages and look for titles and descriptions that are either too long or too short.
You can also scan through these titles and descriptions to ensure that they are appealing and clickable.
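The too-long/too-short check is easy to script over a crawler’s export. The thresholds below are common rules of thumb rather than Google-published limits, and actual truncation is pixel-based, so treat the flags as rough guidance:

```python
# Rule-of-thumb character limits; truncation in search results is
# pixel-based, so these are rough flags rather than hard rules.
TITLE_MAX, DESC_MIN, DESC_MAX = 60, 70, 155

def audit_meta(title, description):
    """Flag titles and descriptions that look too long or too short."""
    issues = []
    if not title:
        issues.append("title missing")
    elif len(title) > TITLE_MAX:
        issues.append("title too long")
    if len(description) < DESC_MIN:
        issues.append("description too short")
    elif len(description) > DESC_MAX:
        issues.append("description too long")
    return issues

print(audit_meta("Ignitur SEO Software Summary", "Short."))
```

Applied row by row to a Screaming Frog export, this gives you a quick worklist of pages whose snippets need rewriting.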
Heading Tags
There is some disagreement in the industry as to the value of heading tags: some attribute great value to them, while others do not. Keep that disagreement in mind, and take the following discussion for what it is worth.
In a real world application, it would look something like this:
H1 – Ignitur SEO Software Summary
- H2 – What Ignitur Does
- H2 – Reporting
  - H3 – Report Templates
- H2 – Features
  - H3 – Link Data
  - H3 – Historical Data
- H2 – Task Management
  - H3 – Collaboration
  - H3 – Instructions
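An outline like the one above can be checked against a page’s actual markup. A minimal standard-library sketch that extracts the heading hierarchy:

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Extracts h1-h6 tags in document order as (level, text) pairs."""
    def __init__(self):
        super().__init__()
        self.outline = []
        self._level = None

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self._level = int(tag[1])

    def handle_data(self, data):
        if self._level is not None and data.strip():
            self.outline.append((self._level, data.strip()))

    def handle_endtag(self, tag):
        if len(tag) == 2 and tag[0] == "h":
            self._level = None

html = "<h1>Summary</h1><h2>Features</h2><h3>Link Data</h3>"
parser = HeadingOutline()
parser.feed(html)
for level, text in parser.outline:
    print("  " * (level - 1) + f"H{level} - {text}")
```

Printed with indentation like this, skipped levels (an H3 with no H2 above it) or multiple H1s stand out immediately.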
Not only do heading tags add value to SEO, but this consistent structure helps Google understand how your page content is laid out and how to treat the various sections. This understanding helps Google to connect the dots between the sections of content, and to evaluate thoroughness and context.
Content Optimization
Content optimization has few generic rules, since it depends largely on the niche, the content’s purpose, and a wide array of other factors. There are, however, a couple of global truths to follow:
- If it’s not 300 words or more in length, it should be. Thin content does not rank as well as in-depth content, so any page with fewer than 300 words that you hope to rank for a term should be expanded. Exceptions (such as a dictionary site, where a very short answer is understood to be desirable) are very rare.
- Ask yourself, “Does the page use words that Google would expect to see?” For example, if you were writing an article on the history of operating systems, Google would rightfully expect to see terms such as Windows and Linux on the page. If these terms are missing, the page may not be viewed as a complete history of operating systems, and a different page will likely be selected instead.
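The 300-word rule of thumb from the list above is trivial to check in bulk. A simple sketch (the tokenizing regex is a rough approximation of how audit tools count words):

```python
import re

def word_count(text):
    """Count words roughly the way a simple content audit would."""
    return len(re.findall(r"\b[\w'-]+\b", text))

def is_thin(text, minimum=300):
    """Flag pages below the rough 300-word threshold discussed above."""
    return word_count(text) < minimum

sample = "word " * 120
print(word_count(sample), is_thin(sample))  # 120 True
```

Run against each page’s extracted body text, this produces the list of candidates for expansion.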
In my global recommendations, I used to state that keywords should be used whenever possible. This is no longer as important: overall topical relevancy supersedes keyword usage. Still, it is a good idea to use your keywords at least a few times, so that both search engines and users can confidently conclude that the content is what they believe it to be.
Content SEO Resources
- How to create the right meta description
- The complete guide to optimizing content for SEO (with checklist)
Off-Site SEO Auditing
Now let’s move on to some of the core areas you need to consider in your off-site optimization.
Google My Business
Some people think of Google My Business as a local SEO factor only. This is not true. By claiming and optimizing your company’s listing, you ensure that Google understands your location and topical relevancy for local ranking. However, this also reinforces your non-local topical relevancy and your company’s entity status. If you are not familiar with entities, you can read a piece on how they influence rankings.
Social Media
I’m not talking about “set-it-and-forget-it” social media profiles; I’m talking about engaged and active profiles with content geared towards your target market and your peers. Social media serves two purposes, and both need to be considered during your audit.
- Engage with your audience. You need to engage your audience and be where they go for information related to your industry. For example, I have a client who is a property manager in Whistler, BC. We don’t just talk about their rentals on their social profiles, but also about what’s going on in Whistler — everything from the infamous Crankworx downhill biking event to the snowfall during the ski season. When auditing your social profiles, make sure you consider who your audience is, where they are (for our Whistler client, we focus on Facebook and Instagram because that’s where the traction is) and what their interests are (to determine if they would even be interested in your new launch, new product, or self-promotion).
- Engage with influencers. You also need to focus on following/friending influencers. The purpose of this is obviously to use them as amplifiers of your content in the future, or as part of link building opportunities. Again using Whistler as an example, I might want to interview some world-class skiers on techniques, how to pick your equipment, etc. If I tried to do this with no advance groundwork, it most likely would not happen. However, if I show an interest in what they’re doing over time (by commenting on their Facebook pages, retweeting, etc.), they will be more prepared to accept my request for interviews. As I work my way up the status level of my targeted influencers, one interview will pave the way to the next as they share their interview experience with each other, particularly if my selected interview topic is the “right” one.
Backlinks
There are a variety of tools for pulling backlink data, from Ignitur to Ahrefs and Majestic, and they all have their pros and cons. Basically, you want to know how your site’s backlink profile compares to those of your main competitors. In this context, it doesn’t matter who the biggest company in your niche is; what matters is who ranks the highest, because outranking them is your ultimate goal.
A complete backlink audit cannot be discussed within the context of this article. Fortunately, I have written a separate article on this subject that can be found on Search Engine Land here.
Off-Site SEO Resources
- The Simplest Ways to Make the Best of Local SEO
- Back to the Basics: How to Claim and Optimize a Local Business Listing on Google My Business
- Performing a manual backlink audit, step by step
Conclusion
In this article, we have covered some of the global truths of a site audit. As mentioned earlier, every site is unique and every situation is different. Therefore, although most of what is on this list applies globally, you will likely need to look at other areas as well. As a rule of thumb, start with Screaming Frog and Google Search Console and work your way forward.