SEO audit

Learn how to perform a comprehensive SEO audit worth $2,500, $5,000 or more, completely by yourself (or let me do it for you at a fraction of that price).

SEO Audit (Illustration: Freepik)

Time it will take

As I said earlier, this is a very comprehensive audit, so the time needed to complete it is rather significant.

Here are my best estimates:

  • 10-12 hours – if you know what you’re doing.
  • 20 hours and up – if you’re new and need to learn.

Tools you will need

In alphabetical order:

  • Ahrefs Site Explorer
  • BeamUsUp SEO Crawling Spider
  • Copyscape
  • Find Subdomains by Pentest Tools
  • Google Analytics
  • Google Chrome (with the NoFollow extension)
  • Google Search Console
  • GTmetrix
  • HTTP Status Code Checker
  • Microsoft Excel
  • Open Site Explorer by MOZ
  • Panguin Tool by Barracuda
  • Pingdom DNS Health
  • Search Console Helper
  • Siteliner
  • W3C Markup Validation Service
  • WhiteSpark Local Citation Finder

These are all the steps you will need to perform:

1. Findability

1.1. Is the site indexed?

Perform the following search in Google Search:

site:example.com

Use no spaces. Replace example.com with the website’s actual domain name. Do not use www, http:// or https://.

Searching Google using the site operator.

Yes. If Google Search returns any number of results, it means the website is indexed.

This website is indexed in Google.

No. If the website is not indexed, you will see the “Your search – – did not match any documents” message:

This website is not indexed in Google.

Troubleshooting “did not match any documents”

1. Check your spelling. Make sure you spelled the domain name correctly. If not, fix the spelling and repeat the search.

2. Check WordPress settings. Navigate to the Settings > Reading section of the website’s WordPress admin area.

Make sure the Discourage search engines from indexing this site checkbox is unchecked. Leaving it checked tells search engine bots not to index the website.

This checkbox should be unchecked.

Note: You will be able to perform this step only if you have at least Administrator capabilities in WordPress.

3. Check the robots.txt file. Navigate to example.com/robots.txt (replace example.com with the website’s actual domain name; use http:// or https:// and www before the domain name, as needed).

Look for a possible occurrence of the following piece of code:

User-agent: *
Disallow: /

This code in the robots.txt file tells all bots not to visit any pages on the site.
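
By contrast, a harmless robots.txt will often contain the following directives (an empty Disallow value allows bots to crawl the entire site):

User-agent: *
Disallow: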

4. Check the source code. Navigate to the website’s home page and press Ctrl+U or right-click inside the page and select View page source. This will open a new tab with the page’s source code.

Press Ctrl+F or click the Customize and control Google Chrome icon in the top-right corner and select Find. Type in noindex.

The presence of noindex inside the content attribute of a robots meta tag tells bots not to index that URL.

Source code containing a “noindex” meta tag.
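
For reference, the meta tag in question typically looks like this:

<meta name="robots" content="noindex">

A variant naming a specific bot (e.g. name="googlebot") affects only that bot.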

5. Check with JavaScript turned off. In Google Chrome, go to Menu > Settings > Advanced > Content settings > JavaScript (chrome://settings/content/javascript) and turn off JavaScript.

JavaScript is turned off in Google Chrome.

Now, go to the website. If you can still navigate it, all is well. If you get an error message or cannot navigate the site, it indicates a findability problem.

1.2. DNS configuration

Enter the website’s domain name into the Pingdom DNS Health testing tool.

DNS check results.

Review the results to determine if there are any configuration issues that might prevent the site from getting found.

1.3. Server uptime

Note: This analysis can be performed only when the website has a monitoring service installed and running for a reasonable period of time (at least a few weeks).

Review the monitoring results to identify any one-time incidents and/or long-term, recurring issues.

2. Indexing

2.1. Is there just one version of the site?

Go to the HTTP Status Code Checker and enter the following four variations of the website (replace example.com with the actual website’s domain name):

http://example.com
https://example.com
http://www.example.com
https://www.example.com

There should be only one version that returns a straight 200 status code. All the other versions should 301-redirect to it, with the redirect chain ending in 200.

Checking HTTP status codes for variations of the home page.
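
For example, if https://www.example.com is the canonical version, the expected results would look along these lines (a hypothetical illustration):

http://example.com → 301 → 200
https://example.com → 301 → 200
http://www.example.com → 301 → 200
https://www.example.com → 200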

2.2. Are index file names redirected?

Go to the HTTP Status Code Checker and enter the main website’s URL (from the previous step), plus 3 other variations thereof (replace example.com with the website’s actual domain name; use http:// or https:// and www before the domain name, as needed):

http://www.example.com
http://www.example.com/index.php
http://www.example.com/index.html
http://www.example.com/index.htm

Only the main website’s URL should return a straight 200 status code. All the other variations should either 301-redirect to it (ending in 200) or return a 404 status code.

Checking HTTP status codes for various index file names.

2.3. Are trailing slashes redirected?

Go to the HTTP Status Code Checker and enter a random (existing) URL in two variations, one with the trailing slash at the end and the other one without it (replace example.com with the website’s actual domain name; use http:// or https:// and www before the domain name, as needed):

http://example.com/existingpage
http://example.com/existingpage/

Only one of the two URLs should return a straight 200 status code. The other URL should 301-redirect to it, ending in 200.

Checking if trailing slashes are redirected.

2.4. Has an XML sitemap been generated/submitted?

Go to Google Search Console and navigate to Crawl > Sitemaps. You want to see a sitemap submitted with indexed URLs.

XML sitemap submitted to Google Search Console.

If the listing shows warnings, review them.

2.5. How many URLs are/should be indexed?

Go to Google Search Console and navigate to Google Index > Index status for the property in question. Record the number of indexed URLs.

The number of indexed URLs.

Next, navigate to Crawl > Sitemaps. Record the number of submitted URLs and the number of indexed URLs.

Sitemaps: submitted vs. indexed URLs.

Now, compare the following numbers:

  • Submitted (sitemap) vs. Indexed (sitemap).
  • Submitted (sitemap) vs. Indexed (index).

If there are significant differences, it could indicate issues requiring further investigation. Also, pay attention to any fluctuation of indexed pages over time.

2.6. Are the right URLs excluded?

Start the BeamUsUp SEO Crawling Spider and run it on the live version of the website’s main URL with the following configuration:

Ideal BeamUsUp configuration.

If the website is too large or you are hitting your computer’s memory limits, uncheck the Try to load external URLs option.

Click on the TO_NOTE: blocked.meta_robots item in the right sidebar:

List of excluded URLs.

Review the list; look for any URLs that should be indexed (but are currently excluded).

2.7. Are the right URLs redirected?

In the BeamUsUp crawl results, click on TO_NOTE: status.301 to display all the redirected URLs.

List of 301-redirected URLs.

Review the list; look for any URLs that should not be redirected (i.e. they should be either live or non-existent) or that redirect to the wrong destination.

Repeat the step with TO_NOTE: status.302 to review all the 302-redirected URLs.

2.8. What URLs are 404 Not Found?

In the BeamUsUp crawl results, click on ERROR: status.error to display all linked URLs that return a 404 Not Found status.

List of 404 Not Found URLs.

Review the list; look for any URLs that should be live or that should be redirected. These URLs will need to be fixed. Any incoming links to non-existing URLs will also need to be fixed.

2.9. Non-existing URLs

Go to the HTTP Status Code Checker and enter a non-existing page’s URL:

http://example.com/nonexistingpage

Replace example.com with the actual website’s domain name. The page should return a straight 404 Not Found status code, without any redirects.

Non-existing URL returning a 404 Not Found status code.

2.10. Subdomains

Go to Find Subdomains by Pentest Tools, enter the website’s domain name and Start Scan with the default settings.

Review the returned subdomains and verify that none of them contains a duplicate or similar version of the main website. If one does, check that it is excluded from the search engines (not indexed) or 301-redirected to the main website’s URL.

3. Ranking & Performance

3.1. Branded search

Go to Google Search and search for the website’s business name (main brand).

Ideally, it should rank #1, possibly with sitelinks. Most of the other spots should be occupied by the brand’s social accounts and other web properties.

Note any results in the top 10 not owned/controlled by the business in question.

3.2. Best performing pages

Go to Google Search Console and navigate to Search Traffic > Search Analytics for the property in question.

Check all: Clicks, Impressions, CTR and Position. Check Pages. In Dates, set the date range to Last 90 days.

Best performing pages in Google Search.

Review the data and look for the following:

  1. What are the best performing pages, overall?
  2. Are the most important pages among them?
  3. What pages have the highest CTR?
  4. What pages have the worst Impressions-to-Clicks ratio?
  5. What pages have the worst Position-to-Clicks ratio?

3.3. Best performing keywords

Go to Google Search Console and navigate to Search Traffic > Search Analytics for the property in question.

Check all: Clicks, Impressions, CTR and Position. Check Queries. In Dates, set the date range to Last 90 days.

Best-performing keywords in Google Search.

Review the data and look for the following:

  1. What are the best performing keywords, overall?
  2. Are the most important keywords among them?

3.4. Pages with no Impressions

Go to Google Search Console and navigate to Search Traffic > Search Analytics for the property in question.

Check: Impressions. Check Pages. In Dates, set the date range to Last 90 days.

Viewing Impressions for Pages in the last 90 days.

Note the total number of returned pages at the bottom right corner.

Less than 1,000 pages

This site has only 86 pages with impressions.

Download the data as a CSV file. Combine the data with the list of URLs in the XML sitemaps. Extract the URLs that do not have any impressions. These are the worst performing URLs on the site.
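
One simple way to combine the two lists is in Excel or Google Sheets. Assuming the sitemap URLs sit in column A of the current sheet and the downloaded GSC page URLs in column A of a sheet named GSC (both locations are illustrative), place this formula next to the first sitemap URL and copy it down:

=IF(COUNTIF(GSC!A:A, A2)=0, "No impressions", "")

Filtering the new column for “No impressions” produces the list of URLs without any impressions.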

More than 1,000 pages

If the returned total is exactly 1,000 pages, you have hit the limit in Google Search Console and there are likely more pages with at least some impressions. The next step depends on how many impressions the worst returned pages get.

Sort the pages by Impressions from lowest to highest. Note the number of impressions for the worst performing page among the 1,000 returned.

If the number is in single or low double digits, carry out the same steps as for sites with less than 1,000 pages with Impressions (download, combine with XML sitemaps, extract pages with no impressions).

If the number is much higher (like in the screenshot below), you will need a tool that can extract more data from Google Search Console.

Even the worst performing page (of the 1,000 returned) gets 244 impressions.

One such tool is Search Console Helper, which can retrieve up to 5,000 rows of data from Google Search Console.

In Search Console Helper, navigate to GSC checker. Once there, select the correct website from the dropdown list, set Date interval to Last 90 days, uncheck all Dimensions except for Pages, and uncheck Daily. Hit Fetch.

Search Console Helper: Extracting Pages with Impressions

Export the data as an Excel file. Combine the data with the list of URLs in the XML sitemaps. Extract the URLs that do not have any impressions. These are the worst performing URLs on the site.

Note: if you are hitting limits even with Search Console Helper (5,000 rows of data), you can still complete this step, but the results will also contain pages with Impressions. So, instead of producing a list of pages without Impressions, you will have a list of the worst performing pages, ranked #5,001 and worse by number of Impressions.

3.5. Pages with poor engagement

Go to Google Analytics and navigate to Acquisition > All Traffic > Source / Medium. Click on google / organic.

Set the date to show data for the last 90 days.

Replace the All Users segment with New Users.

Uncheck All Users and check New Users. Click Apply.

Set the Secondary Dimension to Landing Page.

Set Secondary Dimension to Landing Page.

Set the Show rows number in the bottom right corner to display all results.

Select a number large enough to display all results.

Export the data as an Excel file.

Open the Excel file, navigate to the Dataset1 sheet and click the Enable Editing button.

Name the last column (L) Engagement and type the following formula into the first row with data (L2):

=IF(F2 < 95%, "OK", IF(H2 < 5, "Low engagement", "OK"))
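
Assuming column F holds the Bounce Rate and column H the Avg. Session Duration in seconds (verify the column letters against your own export), the formula flags a landing page as “Low engagement” only when at least 95% of new visitors bounce and the average visit lasts under 5 seconds.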

Next, copy the formula across all rows in the Engagement column.

Finally, filter the Engagement column to show only the Landing Page URLs marked as “Low engagement”.

Filtering out Pages with low engagement values in Excel.

Review the “Low engagement” URLs to look for common patterns in the URL structure, and to understand why these pages might be causing visitors to bounce straight away.

3.6. Google penalties

Go to Google Search Console and navigate to Messages for the property in question.

Messages in Google Search Console.

Review the messages for any indication of issues with the website.

Next, navigate to Search Traffic > Manual Actions.

Any manual actions will show up here.

Finally, go to Barracuda’s Panguin Tool to view the website’s Google Analytics data against the information on various Google updates.

The Panguin Tool checks Google Analytics data against information on Google updates.

Look for any patterns, trends or correlations between the website’s traffic and individual Google updates.

4. Website architecture

Turn off JavaScript and turn on the NoFollow browser extension. Then, go to the website’s home page and start navigating the site.

Pay attention to how individual pages are interlinked and whether there is any hierarchy. Disregard any JavaScript and “nofollow” links.

4.1. Internal linking structure

Classify the website into one of three internal linking structure types:

  1. Linear – each page links to only two pages: Next and Previous. Optionally, a page could sporadically link to supporting documents or to other pages within the sequence. All pages link to the home page.

    Linear internal linking structure (Image: webstyleguide.com).
  2. Hierarchical – each page links to all (or most important) pages within the given category, but not pages in other categories. If there are multiple tiers, pages from one tier normally link to pages within 1 tier distance from them, but not further.

    Hierarchical internal linking structure (Image: webstyleguide.com).
  3. Web – all pages link to all/most other pages.

    Web-like internal linking structure (Image: webstyleguide.com).

Is this the optimal internal linking structure, given the site’s content?

4.2. Main Navigation

Review the main navigation to answer the following questions:

  1. Does it link to all key areas of the website?
  2. Is sub-navigation used appropriately to access lower-tiered areas?
  3. Do all links use appropriate anchor text?
  4. Are all links plain, text-based, and without the use of JavaScript or the nofollow attribute?

4.3. Breadcrumbs

Does the website use breadcrumbs navigation? If yes:

  1. Does the breadcrumb hierarchy correctly reflect the website’s structure?
  2. Do all links use appropriate anchor text?
  3. Are all links plain, text-based, and without the use of JavaScript or the nofollow attribute?

If no breadcrumbs navigation is found, would the website benefit from such navigation?

4.4. Sidebar/Footer navigation

Do the sidebar(s) and/or footer(s) support the website’s SEO goals by strategically linking to the most important pages?

4.5. rel="nofollow"

Is the rel="nofollow" attribute used anywhere in the internal navigation? If yes, is it used correctly to exclude pages that are not intended to rank in Google? If no, would the website benefit from using this attribute in the internal navigation?

4.6. Website depth

In the BeamUsUp crawl results, click on Show all URLs Found and scroll horizontally to view the Level column.

BeamUsUp: Analyzing website depth.

  1. Can all important pages be accessed within 1 click from any other page?
  2. Are all pages that are intended to rank, accessible within 3 clicks from the home page?

4.7. HTML sitemap

Does the website have an HTML sitemap? If yes:

  1. Is it accessible from some part of the site-wide navigation (main navigation, sidebar, footer, etc.)?
  2. Is the link to the HTML sitemap a plain, text-based, “followed” link?
  3. Does it contain links to all pages intended to rank? If broken up into sub-pages, are all sub-pages within 1-2 clicks from the main HTML sitemap page?
  4. Are all the links in the sitemap plain, text-based, and without the use of JavaScript or the nofollow attribute?

If the website does not have an HTML sitemap, would it benefit from having one?

4.8. Pagination

Does the website use pagination (either numbered or some form of “Previous/Next”)?

If yes, is the relationship between individual pages marked using the rel="next" and rel="prev" attributes, or are the URLs properly canonicalized?
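
For reference, on page 3 of a paginated series, these attributes would typically appear in the page’s <head> like this (the URLs are illustrative):

<link rel="prev" href="https://example.com/blog/page/2/">
<link rel="next" href="https://example.com/blog/page/4/">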

If no pagination is used, would the website benefit from this type of navigation?

4.9. URLs

Review the URLs and the source code to determine the following:

  1. Are hyphens used as the default delimiter between words in URLs?
  2. Are the URLs sufficiently short?
  3. Do they contain the target keyword(s)?
  4. Are only absolute URLs used for linking?
  5. Are all URLs lowercase?
  6. Are all uppercase URL versions properly redirected/canonicalized to their lowercase counterparts?
  7. Are all trailing slashes standardized and automatically redirected/canonicalized to the chosen version?
  8. If session IDs are used, are the URLs properly canonicalized to the plain URL version (see the example after this list)?
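
For reference, canonicalization is done with a link element in the page’s <head>. For example (a hypothetical URL):

<link rel="canonical" href="https://example.com/existingpage/">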

5. On-page optimization

5.1. Title tags

In the BeamUsUp crawl results, click on ERROR: title.duplicate to display all URLs that contain duplicate title tags.

URLs with duplicate title tags.

Review the data to identify URLs that need to be fixed.

In the BeamUsUp crawl results, click on WARNING: title.too_long to display all URLs that contain title tags over 70 characters long.

URLs with title tags that are too long.

Review the data to identify URLs that need to be fixed.

5.2. Meta descriptions

In the BeamUsUp crawl results, click on WARNING: meta.description.duplicate to display all URLs that contain duplicate meta description tags.

URLs with duplicate meta description tags.

Review the data to identify URLs that need to be fixed.

In the BeamUsUp crawl results, click on TO_NOTE: meta.description.missing to display all URLs that are missing meta description tags.

URLs with missing meta description tags.

Review the data to identify URLs that need to be fixed.

In the BeamUsUp crawl results, click on TO_NOTE: meta.description.too_long to display all URLs with meta description tags that are too long.

URLs with too long meta description tags.

Review the data to identify URLs that need to be fixed.

5.3. Headings

In the BeamUsUp crawl results, click on TO_NOTE: h1.missing to display all URLs that are missing H1 headings.

URLs with missing H1 headings.

Review the data to identify URLs that need to be fixed.

5.4. Structured data

Go to Google Search Console and navigate to Search Appearance > Structured data.

Google Search Console: Structured Data.

Review which data types are used, how and where they are used, and whether they are used correctly; study any errors or warnings.

5.5. Mobile usability

Go to Google Search Console and navigate to Search Traffic > Mobile Usability.

Google Search Console: Mobile usability.

Review each reported usability issue.

5.6. Social sharing

  1. Are links to the website’s social accounts present in the header(s), footer(s) and/or sidebar(s)?
  2. Are social sharing icons present and visible on all relevant content pages?

5.7. Broken links

In the BeamUsUp crawl results, click on ERROR: links.broken to display all URLs containing broken links.

List of URLs with broken links.

To find all broken links, click on a URL in the main window, and then on the Outgoing links tab in the window below. Sorting the outgoing links by Response Code will give you a list of broken links to fix.

Note: This will work for external broken links only if you ran the BeamUsUp scan with the Try to load external URLs option checked.

5.8. Page speed

Analyze the website’s main URL with GTmetrix.

Analyzing website speed with GTmetrix.

Review the results in all main tabs, including PageSpeed, YSlow, Waterfall and Timings, and recommend actions.

5.9. Images

View the source code of a few randomly selected pages that contain images.

  1. Are the images relevant to the topic of the page?
  2. Are they correctly scaled?
  3. Are they responsive (see the example after this list)?
  4. Are they served in the smallest file size possible?
  5. Do the file names correctly reflect the image content?
  6. Do all images have relevant alt text?
  7. Are captions used correctly?
  8. Are keywords used correctly in the image meta data?
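
For reference, a responsive image with a descriptive file name and alt text might look like this (all values are illustrative):

<img src="red-running-shoes-800.jpg"
     srcset="red-running-shoes-400.jpg 400w, red-running-shoes-800.jpg 800w"
     sizes="(max-width: 600px) 400px, 800px"
     alt="Red running shoes on a gravel path">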

5.10. Code validation

Run the website’s home page URL through the W3C Markup Validation Service.

Code validation using the W3C Markup Validation Service.

Review the results and recommend actions.

5.11. Plagiarism

Go to Copyscape and check a few URLs containing the website’s most important content.

Checking for plagiarized content on the web.

Review the top results for each page. Look for high percentages of matching content (~25% and up) and for matches on well-known websites.

5.12. Internal duplicate content

Go to Siteliner and run an analysis on the website.

Internal duplicate content analysis.

Review the data. If your total duplicate content is higher than the average for other sites, analyze the reasons and recommend actions.

5.13. Outgoing links

In Siteliner results, review the data on External Links per Page.

External links per page.

If the data falls below average, recommend corrective actions.

Also, randomly check a few pages to make sure they are linking to relevant, trustworthy sources.

6. Local SEO

Note: this section is relevant only for businesses with physical locations and/or competing in specific geographical areas.

6.1. Google My Business

Perform a search for the business name in each location in Google Maps.

Business listing on Google Maps.

Review the returned information.

  1. Is the data correct?
  2. Does each location have a separate listing?
  3. Is the data consistent across all locations?
  4. Does each location have a unique phone number?
  5. Does each location have an actual physical address (no P.O. Box or virtual address)?
  6. Are Google reviews enabled?

6.2. Name, address, phone number (NAP)

  1. Does the website have a “Contact Us” page featuring the complete name, address, and phone number of each location?
  2. (If less than 10 locations) Is the complete name, address, and phone number of each location present in the site-wide navigation (e.g. footer)?
  3. Is Schema markup used on all location data (see the example after this list)?
  4. Is the phone number highly visible on the website and clickable on mobile devices?
  5. Is the business name, address, and phone number (NAP) consistent everywhere it is mentioned?
  6. Does each location have a separate web page?
  7. Are links to all location pages present in the site-wide navigation (e.g. footer)?
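
For illustration, minimal LocalBusiness markup in JSON-LD might look like this (all values are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>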

6.3. Citations

Run a Business Search on WhiteSpark’s Local Citation Finder.

Local citation finder on WhiteSpark.

7. Off-page optimization

Enter the website’s domain into the Ahrefs Site Explorer. Replace example.com with the website’s actual domain name. Do not use www, http:// or https://. Keep the default *.domain/* setting to perform the analysis on all subdomains.

Ahrefs Site Explorer.

7.1. Referring domains

In Ahrefs Site Explorer results, click on Referring domains in the left sidebar.

Ahrefs: Referring domains.

Note the total number of results.

Next, order the list by DR (Domain Rating), from lowest to highest.

Referring domains ordered by domain rating from lowest to highest.

Review the list of low-quality domains. Look for:

  • Spun-content sites.
  • Excessive number of foreign domains.
  • Bad neighborhood sites.
  • Paid links.
  • Private blog networks.
  • Forum spam.
  • Comment spam.
  • Low quality directories.

7.2. Anchor text distribution

In Ahrefs Site Explorer results, click on Anchors in the left sidebar.

Checking anchor text distribution.

Ideally, the most frequent anchor text should be the domain name and/or the company’s brand name(s). Any excessive amount of keyword-specific links needs to be investigated further.

7.3. Most linked pages

In Ahrefs Site Explorer results, click on Best by links in the left sidebar.

Pages with links from the most referring domains.

Review the results. Look for any unusual linking activity (e.g. disproportionate number of referring domains).

7.4. Broken backlinks

In Ahrefs Site Explorer results, click on Broken in the left sidebar.

List of broken incoming links.

Export the list for future work on fixing the links.

8. Competition

8.1. Main competitors

In Ahrefs Site Explorer results, click on Competing domains in the left sidebar. Sort by the number of Common keywords and focus on sites that directly compete with the website in question (ignore large brand names with little content overlap).

List of competing domains.

Note the top competing domains.

8.2. Comparison

Go to Open Site Explorer by MOZ and enter the website’s home page URL. Then, navigate to Compare Link Metrics. Enter the home page URLs of 4 main competitors (identified in the previous step).

Comparing link metrics to competitors.

Note the results for page specific and root domain metrics. Analyze the differences.


The most comprehensive SEO audit on the market done for you (you don’t have to lift a finger)

If you need to audit your (or your client’s) website but don’t have the time to do it, or you simply value your time too much, I’m happy to help.

My team and I will perform this exact SEO audit for any-size website and deliver the results to you within 10 business days (often much faster).

You will get…

…a single, easy-to-work-with Google spreadsheet with the full results of the audit, plus the following supporting data:

  • List of excluded URLs.
  • List of redirected URLs (301 and 302, separately as two lists).
  • List of URLs returning a 404 Not Found status code.
  • List of best performing URLs.
  • List of best performing keywords.
  • List of URLs without Impressions.
  • List of URLs with low engagement.
  • List of URLs with duplicate title tags.
  • List of URLs with too long titles.
  • List of URLs with missing meta descriptions.
  • List of URLs with duplicate meta descriptions.
  • List of URLs with too long meta descriptions.
  • List of URLs with missing H1 headings.
  • List of URLs with mobile usability issues.
  • List of URLs with outgoing broken links.
  • List of local citations (if applicable).
  • List of URLs with broken incoming links.

Price

$599 – just the results.

$699 – results, plus a 60-90 min consultation over Skype.

Order NOW!

Contact me
