How To Perform a Perfect SEO Audit of Your Website


Doing a perfect SEO audit is a tremendous task. You need to approach it from different angles, have a defined structure for what you wish to accomplish, and use the right tools to get the job done properly.

Can you do it yourself? Of course. Just follow the steps below:

Step 1: Perform a Crawl of the Website

In this step you'll crawl your website to find the technical problems it might have.

I recommend using Screaming Frog's SEO Spider to perform this crawl (it's free for the first 500 URIs and £99/year after that).

Alternatively, you can use Xenu's Link Sleuth; but keep in mind that this tool was designed to crawl a site to find broken links. It displays a site's page titles and meta descriptions, but it was not created to perform the level of analysis we're going to discuss.



Screaming Frog SEO Spider is a small desktop program (PC or Mac) that crawls a website's links, images, CSS, scripts and apps from an SEO perspective. It goes through every single one of your pages and looks for the following:

  • Link Errors – Client errors such as broken links & server errors (No responses, 4XX, 5XX).
  • Redirects – any permanent or temporary redirects (301, 302).
  • External Links – all of the sites you link out to and their status codes.
  • Protocol – Whether the URLs are secure (HTTPS) or insecure (HTTP).
  • URL Issues – non-ASCII characters, dynamic URLs, uppercase characters, URLs that are too long, and underscores.
  • Duplicate Pages – an MD5 checksum (hash value) check for exact duplicate pages.
  • Page Title Tag – Missing, duplicate, over 65 characters, short, pixel width truncation, same as h1, or multiple title tags.
  • Meta Description Tag – Missing, duplicate, over 156 characters, short, pixel width truncation or multiple meta description tags.
  • Meta Keywords Tag – the same checks as for the title and meta description tags. Mainly for reference, as the tag is not used by search engines like Google, Bing or Yahoo.
  • Headings Tags – the types of headings you use (h1, h2, h3) as well as keyword usage, duplicates, over 70 characters, and any missing heading tags.
  • Meta Robots – what you are allowing to be indexed or not indexed, and whether you use the tag at all (index, noindex, follow, nofollow, noarchive, nosnippet, noodp, noydir, etc.).
  • Rel Canonical – in case you are pointing search engines to a different URL (Canonical link element & canonical HTTP headers.).
  • File Size – Size of URLs & images (the smaller your file sizes, the faster your load time).
  • Page Depth Level – how many levels deep search engines have to crawl to find all of your content.
  • Internal links – what pages you are linking to within your own website.
  • Anchor Text – all the link text you use in hyperlinks, plus alt text from images wrapped in links.
  • Follow & Nofollow – which of your links are being followed or not (At page and link level).
  • Images – all URIs with image links and all images on a given page; flags images over 100kb, missing alt text, and alt text over 100 characters.
  • Search Engine Crawler Setting – this feature lets you choose which search engine crawler to emulate (e.g., Googlebot, Bingbot), helping you see what that particular crawler sees.

Once you crawl your whole website with Screaming Frog, which shouldn't take more than a few minutes, you can export all of that data into an Excel spreadsheet to help you analyze it.

Step 2: Webmaster Tools Configuration

The crawl report gives us a ton of information, but to take this SEO audit to the next level, we need to see the website the way the search engines see it. Unfortunately, search engines don't give unrestricted access to their servers, so we'll have to settle for the next best thing: webmaster tools.



Most of the major search engines offer a set of diagnostic tools for webmasters; for our purposes, we'll focus on Google Webmaster Tools and Bing Webmaster Tools. If your website isn't registered with both yet, make sure you do so now.


Through these, you can see your website's health, any crawl errors Google and Bing are experiencing, how fast your site is loading, and almost anything else you can dream of. If you want to learn about all of the features in Webmaster Tools, check out the guides below.


Step 3: Keyword Research

With the title tag, meta description, and meta keywords data from your Screaming Frog crawl (see your Step 1 data), you can get a good understanding of what your website is trying to accomplish or rank for. If you combine that data with your Google Webmaster Tools and Google Analytics keyword data, you can see what the website is actually getting traffic for.

Google Webmaster Tools Keywords Report

Google Analytics Keywords Report

If you then take the keywords from those two tools and enter them into Google's Keyword Planner, it will give you groups of keyword ideas:

Google’s Keyword Planner Suggestion Tool



What's interesting about Google's Keyword Planner is that it tells you how competitive a keyword is, and it also shows the average monthly searches within your selected country/locality.

These keyword groups will help you get a better understanding of the potential keywords you should target but currently aren't. When looking at Google's Keyword Planner, keep in mind the following:

  • Focus on local searches.
  • Don't target overly competitive keywords.

Step 4: SEO Friendly URLs

If you look at your Screaming Frog crawl report, you will see a list of all of your URLs. Here's how to analyze them:

  • Static URLs – your website's URLs should be static. Dynamic URLs usually contain characters like $, =, + and &, while static URLs typically contain only numbers, letters and dashes, which may give them a slight advantage in terms of clickthrough rate.
  • URL Length – try to keep URLs under 100 characters.
  • User Friendly URLs – ideally your URLs should be easy to remember. Cut away dashes and slashes when you don’t need them.

If you have URLs that don't fit these criteria, you should consider creating new ones. When you do, make sure you 301 redirect the old URLs to the new ones so you don't lose any links that may be pointing to the old URLs.
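For example, on an Apache server you can handle this with a one-line rule in your .htaccess file. This is just a sketch; the paths below are hypothetical placeholders you'd swap for your real old and new URLs:

Redirect 301 /old-page.html http://www.yourdomain.com/new-seo-friendly-page/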

Step 5: Title Tags

A page's title is its single most identifying characteristic. It's what appears first in the search engine results, and it's often the first thing people notice in social media. Thus, it's extremely important to evaluate the titles on your site.

Here are the rough guidelines you should use for your title tags (an example follows the list):

  • Keep title tags short (fewer than 70 characters) and test how they look on the search engine results page.
  • Make sure the title tag is interesting and that it matches the visitor's search intent.
  • Include your highest-value keywords in the beginning.
  • Add your brand name at the end of it when possible.
  • Make sure you don’t duplicate titles across the pages of your site.
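For instance, a title tag that follows these guidelines might look like this (the keyword and brand name are placeholders):

<title>Highest-Value Keyword and Supporting Phrase | Brand Name</title>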

Step 6: Meta Descriptions

The meta description tag does not affect keyword rankings, so do not try to stuff keywords into it. Instead, use it to describe the page content succinctly and accurately. Make it actionable and encourage users to click on your link, and you will see a real impact on your click-through rate.
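In HTML, the tag looks like this (the copy is a placeholder; write your own for each page):

<meta name="description" content="A short, accurate, action-oriented summary of this page that invites the click.">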

Step 7: Meta Keywords

Most search engines ignore this tag, so you gain no benefit from using it. The only thing you accomplish by adding your keywords to this tag is giving your competitors a sneak peek at your targeted terms.

Step 8: Headings

Although headings are not as important as page titles from an SEO point of view, headings (H1, H2, H3, etc.) still carry enough weight that you should make sure they are not missing and are used correctly on each page. More than that, headings have a great impact on how content is perceived by the reader, improving user experience and conversion on the page.

Under typical HTML styling, H1 tags render as the largest text on the page, which visually signals the page's main topic to readers; the sketch after the list below shows a typical structure.

  • Every page should have an H1 tag, as search engines look to the H1 to help determine the topic of a page. It should be the first thing in the body text of the page and should appear prominently.
  • H1 tags should never contain images or logos, only text. The keyword of a page needs to be used in the H1 tag and in at least half of the total heading tags on a page, if more than one heading tag is present.
  • From a usability perspective, paragraphs should never be longer than 5 lines of text, and it is wise to break up a page every 2-3 paragraphs with a sub-heading in the form of an H tag (H2 or H3) or an image. Testing has shown that when users are faced with a large block of unbroken text, most either skim over the text or skip it altogether, so content needs to be divided into usable chunks.
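As a quick illustration, a page that follows these rules might be structured like this (all of the text is placeholder copy):

<h1>Main Topic of the Page</h1>
<p>Two or three short paragraphs introducing the topic...</p>
<h2>First Sub-Topic</h2>
<p>A few more short paragraphs...</p>
<h2>Second Sub-Topic</h2>
<h3>A Finer Point Within the Second Sub-Topic</h3>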

Step 9: Site Content

You might have heard the saying "Content is King". Your pages need enough fresh content to rank well in the search engines. Having fewer than 300 words on a page (not counting the HTML tags) is considered sub-optimal. What's interesting is that pages with more than 2,400 words usually receive better rankings in the search engines.

One of the major issues that could affect your rankings in the search engines is duplicate content. Regardless if you have only one product or thousands of products on your site, it is important to make the content unique and target different keywords on each page.

In the past, duplicate content could only harm that content itself, by being filtered out by the search engine or sent to the supplemental index instead. Ever since the Panda update was released though, a duplicate content problem may impact your entire site, not just the pages that are duplicated. You can have good pages on your site (that are not duplicated) lose their rankings or even fall out of the index altogether.

To find out if you have content that exists in a similar form on another page or website you can use the Copyscape tool.

Step 10: Image File Names and Alt Text

A picture is worth a thousand words, but unfortunately only humans can see it. To make sure the search engines also understand what your pictures are about, you should include the important keywords that describe each image in two places: the file name and the alt attribute.
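For example (the file name and alt text below are placeholders describing a hypothetical product photo):

<img src="/images/blue-widget-pro-front.jpg" alt="Blue Widget Pro viewed from the front">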

For a comprehensive resource on optimizing images, read Rick DeJarnette's Ultimate Guide for Web Images and SEO.

Step 11: Internal and External linking

Internal Links

Internal links are links from one page of your site to another page on your site. They are most commonly used in the main navigation, and when done right, they improve both rankings and usability.

Both Webmaster Tools and Screaming Frog will give you data on internal links. The more you link within your own site, when relevant, the easier it will be for search engines to crawl your whole site.

Each page of your site has the potential, through its content, to link to other pages from your site. To use this potential, you should insert contextual links to other pages from your site that you would like to rank better. Just make sure you use the keywords that you would like the target pages to rank for when you link to them.

And remember that your visitors are more likely to click on a link in the text of a page, because it feels more natural.

External Links

When you link from a page of your site to a page on a different site, you send a powerful vote endorsing the target page's quality. It is therefore important to make sure your site links only to high-quality, authoritative sites; otherwise your own site's trustworthiness might be affected.

In case you must link to sites that you don’t trust, make sure you use the nofollow attribute.
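A nofollow link looks like this (example.com stands in for the site you don't vouch for):

<a href="http://example.com/some-page/" rel="nofollow">anchor text</a>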

Step 12: Robots.txt and Meta Robots Tags

In this step, the most important thing is to make sure that your content is accessible to all search engines. One mistake here and the search engines won't be able to crawl your site, which means you will get no rankings at all (you're doomed). With that in mind, let's make sure your site's pages are accessible.

Robots.txt

The robots.txt file is used to restrict search engine crawlers from accessing sections of your website. Although the file is very useful, it's also an easy way to inadvertently block crawlers.

As an extreme example, the following robots.txt entry restricts all crawlers from accessing any part of your site:
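User-agent: *
Disallow: /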


Manually check the robots.txt file, and make sure it's not restricting access to important sections of your site. You can also use your Google Webmaster Tools account to identify URLs that are being blocked by the file.

Meta Robots Tags

The meta robots tag is used to tell search engine crawlers whether they are allowed to index a specific page and follow its links.

When analyzing your site's accessibility, you want to identify pages that are inadvertently blocking crawlers. Here is an example of a robots meta tag that prevents crawlers from indexing a page and following its links:
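<meta name="robots" content="noindex, nofollow">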



Step 13: URL Canonicalization

URL canonicalization is one of the basic principles of SEO, and it's essential to creating an optimized website.

One of the most common mistakes website owners make is splitting their site's link authority by failing to redirect the non-www version of their domain correctly.

Example:

http://yourdomain.com/

should 301 redirect to:

http://www.yourdomain.com/

If you don’t do this, you are essentially telling the search engines to keep two copies of your site in the index and split the link authority between them.

How can you make sure you don’t have this problem? It’s easy to find out. Just search in Google for:

site:yourdomain.com -www

If your search does not match any documents, you should be fine. Otherwise, use the .htaccess redirect tool from the Tools section below, or adapt the sketch that follows.
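If you're on an Apache server with mod_rewrite enabled, rules like these in your .htaccess file are one common way to force the www version. This is a sketch: replace yourdomain.com with your own domain and test before deploying.

RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]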

Step 14: Broken links

Because Google and the other search engines crawl the web link to link, broken links can cause SEO problems for your site. When Google is crawling your website and hits a broken link, the crawler immediately leaves your website. If Google encounters too many broken links on your website, it may conclude that the site offers a poor user experience, which can lead to a reduced crawl rate/depth and to both indexing and ranking problems.

Unfortunately, broken links can also result from someone outside of your website linking in incorrectly. While these broken links can't be avoided (you don't control them), they can easily be fixed with a 301 redirect.

To avoid both user and search engine problems, you should routinely check Google and Bing Webmaster Tools for crawl errors and run a tool like Link Checker on your site to make sure there are no crawlable broken links.

If broken links are found, you need to implement a 301 redirect per the guidelines in the URL Redirect section.

WordPress users can monitor broken links and set up 301 redirects with a plugin like Broken Link Checker. You can also use your Google Webmaster Tools account to check for broken links that Google has found on your site.

Step 15: Page Load Speed

Website visitors have very short attention spans, and if your site takes too long to load, they will leave. Similarly, search engine crawlers have a limited amount of time they can devote to each site on the Internet. Consequently, sites that load quickly are crawled more thoroughly and more consistently than slower ones.



You can measure your site's load speed with a number of different tools. Google PageSpeed and YSlow check a given page against various best practices and then provide helpful suggestions (e.g., enable compression, leverage a content delivery network for heavily used resources, etc.).
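For instance, if one of the suggestions is to enable compression and you're on an Apache server with mod_deflate available, a sketch like this in your .htaccess file turns on gzip for text-based resources (check with your host before relying on it):

<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
</IfModule>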

Pingdom's Full Page Test and GTmetrix present an itemized list of the objects loaded by a page, their sizes, and their load times. Here's an excerpt from Pingdom's results for w3storm.com:


These tools help you identify pages that are serving as bottlenecks for your website. Then, you can itemize suggestions for optimizing those bottlenecks and improving your website's performance.

You might also see benefits from using a content delivery network (CDN) for your images, like CloudFlare or MaxCDN.

WordPress users can try a caching plugin like WP Super Cache or W3 Total Cache to help with page load speed issues, and a simple CDN can be set up via Amazon AWS for very little money.

Step 16: Inbound links

Backlinks, also known as inbound links, incoming links, inlinks, and inward links, are incoming links to a website or web page. In basic link terminology, a backlink is any link received by a web node (web page, directory, website, or top level domain) from another web node.

The most powerful inbound links that you can get from another website are those that are within the text of a page and that are surrounded by content that is relevant to both your site and the link anchor text.

When it comes to the number of incoming links, the more the better. But it is even more important to get these links from different websites (unique root domains): having 1 link from each of 10 unique websites is a lot better than having 10 links from 1 website.

To find out the number of inbound links to your site and the anchor text distribution, you can use Open Site Explorer, Ahrefs, SEO PowerSuite or Majestic SEO. All of these tools provide link metrics and detailed information that can help you audit your link profile. Through Open Site Explorer you can get a great overview of your inbound links, as shown in the following image:



Just keep in mind that these tools use their own link graphs, i.e. they crawl the Internet independently and create their own indexes. This means they can only tell you what's in their own index, not what's in Google's or Bing's.

Step 17: Social Media Audit

Social media is one of the most effective methods to influence and engage with your customers. Your ability to engage socially and become popular on social platforms will also have a great impact on your site’s ability to achieve higher search engine organic rankings.



Both Google and Bing have clearly stated that they take social signals into account when ranking websites. In other words, social media does affect SEO.

If you want to do better on the social web, consider the following 4 tips:

  • Set up your profiles and optimize them for human interaction; make sure your logo and about/bio information are in place so that your visitors recognize you.
  • Make it easy for people to share your content socially by integrating sharing features throughout your website, blog posts, etc.
  • Create content that is worthy of sharing and then reach out to people in that space via social channels to ask for feedback about said content.
  • Have a social media posting policy that your entire staff follows to maintain branding, tone and messaging consistency across all platforms.

To find out how many shares and likes a page has, you can use free social media analytics tools like Buffer, Followerwonk, Google Analytics, SumAll, Klout, or your individual social channel analytics.

Finally, you need to pay close attention to the response you get from your social audience. What level of engagement did you achieve? Was it just a "Like", or did it go further, to sharing or leaving positive comments? The higher the engagement, the more likely it is to have a major impact on the growth of your social circles.

Step 18: Competitive Analysis

If one of your main goals is to achieve high rankings for your website in the search engines, then you first need to find out which other websites already rank for the keywords you are targeting: your competitors.

Analyzing your competitors will help you get a better understanding of their strengths and whether you have a real chance of outranking them. Making a good decision when you enter a niche will save you many months or even years of work spent trying to catch up with competition that is too strong. You would be better off finding a local or smaller niche and tackling that first instead.

Usually the first thing you should look at when you analyze your competitors is their overall strength. SEMrush, Quick Sprout Website Analyzer, and Moz's Open Site Explorer are the best competitive audit tools I've used so far.

As you can see from the above picture, I put Moz, SEMrush, Quick Sprout, Ahrefs and Majestic SEO into Moz's Open Site Explorer to compare their metrics, and yes, Moz wins once again.

After you've analyzed your site and your competitors' websites, you've completed a perfect SEO audit of your website. All that remains is to turn the data into a complete SEO audit report that looks good.

You can plug the data into the audit template, a Google Document, or Microsoft Word. Once you do so, you will see an area where you can add a subjective score of how the site did overall in each category, 1 being the lowest and 10 being the highest.

Additional Resources
Just in case this article wasn't enough to feed your SEO audit hunger, here are a few more SEO audit resources you can go after:

Technical Site Audit Checklist - Geoff Kenyon provides an excellent checklist of items to investigate during an SEO audit. If you check off each of these items, you're well on your way to completing an excellent audit.

How to Perform Your First SEO Audit - Learn the steps to perform your first SEO audit with a step-by-step template.

Find Your Site's Biggest Technical Flaws in 60 Minutes - Continuing with the time-sensitive theme, this post by Dave Sottimano shows you just how many SEO-related problems you can identify in an hour.

How To Do Your Own 5-Minute SEO Audit - Here is how you can do your own 5-minute SEO audits. Don't worry if this takes you 10 or 15 minutes.

What Do You Think?
I would love to hear what you are doing differently when you audit a website. So why not comment below with your own audit method? It will help everyone make their own website audits better.
