
40 Points You Must Check in a Website SEO Audit & Analysis

Imagine you have fully optimized your website and it is still not ranking in search results. What will you do? Most likely, the following questions will arise in your mind:

Is my website SEO friendly? Are there any technical, on-page, design, coding, or other issues on the website that I am not aware of?

Is there any other error on the website that is affecting its performance in search results?

Figuring out the answers to these questions is not simple, but a website SEO audit and analysis can answer all of them.

Website SEO Audit & Analysis

A website SEO audit & analysis covers all the factors to check across different areas such as technical, on-page, design, coding, link building, and analytics.

Mainly, a website SEO audit is divided into six parts:

  1. Technical SEO Audit
  2. On-page SEO Audit
  3. Website Content SEO Audit
  4. Backlink SEO Audit
  5. Analytics
  6. Design & User Experience

Technical SEO Audit

1. Information Architecture Structure

The IA structure is the website's information architecture, i.e. the overall flow of information on the website. It represents two things: 'user flow' and 'data flow'.

User flow is the website's navigation structure as visible to users coming to the site. It should include all the important pages of the website.

Data flow is for crawlers that come to the website to crawl and cache its pages. It should include all the pages of the website.

Recommendations

  • Create a logical structure for the best user experience
  • Include all the important pages at the top level for better sitelinks
  • Maintain a proper hierarchy

See the SEO-friendly IA structure example below:

[Image: an example site structure that is good for SEO. Image credit: Moz]

2. Footer Structure

Include all the important links in the footer of the website, and use your target keywords as anchor text for those links.

You can also include links to pages that are not present in the main navigation. If your website has city-specific pages, include all the important city pages in the footer with keyword-rich anchor text.

3. Robots file

Check whether a robots file is present in the root folder of the website at /robots.txt. Check that all bad bots are blocked and that unimportant files, unimportant URLs, and the admin panel are blocked.

Also check whether any important URL folder is blocked, making that folder non-crawlable for Googlebot.
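
As a rough illustration, a minimal robots.txt might look like the sketch below; the paths and bot name are placeholders, not recommendations for your specific site:

    User-agent: *
    Disallow: /admin/        # block the admin panel
    Disallow: /tmp/          # block unimportant files/folders
    Disallow: /search?       # block unimportant parameterized URLs

    User-agent: BadBot       # block a known bad bot entirely
    Disallow: /

    Sitemap: https://www.example.com/sitemap.xml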

4. Use (noindex, nofollow) Meta Tags

Blocking URLs via the robots.txt file stops the crawler from crawling the page content, but the URLs can still be present in the index. Adding a noindex, nofollow meta robots tag stops the crawlers from indexing those URLs.
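
For example, a page you want kept out of the index entirely would carry this generic tag in its <head>:

    <meta name="robots" content="noindex, nofollow">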

5. Crawl errors

Rectify all the crawl errors reported in Webmaster Tools. Redirect all the 404 error URLs to their respective working URLs and mark them as fixed. This will help Googlebot crawl the website properly and efficiently.

6. Page Load Time

Page load time is one of Google's roughly 200 ranking factors, so improve your website's page load time to improve its overall performance.

Check your Google PageSpeed Insights score & recommendations.

Check your GTmetrix/webpagetest.org scores & recommendations.

7. Broken Links

A broken link stops the crawler or user from visiting the next web page, which affects site performance. Check the website with one of the broken link checker tools available online and rectify all the broken links present on it.

Use the Check My Links Google Chrome extension to check for broken links on your web pages.

8. Fix Canonical Domain Redirect

During the website development phase, many versions of the homepage are often created and left available for Google to crawl & index, for example /index.html, /default.html, etc. It is recommended to keep only one single version of the homepage available for Googlebot to cache & index, and to redirect the others to the main version.

For example, suppose two versions of the homepage URL exist: www.xyz.com & www.xyz.com/index.html. It is recommended to keep only one of these two URLs. How do you decide which one?

Go to Analytics, track the all-time visits for both URLs, and check which URL has the higher number of visits.

Find the number of links for both URLs and check which URL has the higher number of inbound external links.

The URL with the higher numbers wins.
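
On an Apache server, a common way to collapse /index.html into the root URL is a mod_rewrite rule in .htaccess. This is only a sketch, assuming Apache with mod_rewrite enabled; adapt it to your own server:

    RewriteEngine On
    # Externally 301-redirect direct requests for /index.html to /
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html [NC]
    RewriteRule ^index\.html$ / [R=301,L]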

9. Use Self-canonical tags (rel=canonical)

A self-referencing canonical tag on each of the website's pages is very important to stop the creation of duplicate pages. This tag is especially important for e-commerce websites.

Many strings get attached to the end of a URL, which results in multiple URLs for the same page and causes content duplication.

To avoid this, it is recommended to add a self-pointing rel=canonical tag on each of the website's pages.
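
The tag itself is a single line in the page's <head>. For a page reached with tracking parameters, the canonical points back at the clean URL (example.com is a placeholder domain):

    <!-- on https://www.example.com/shoes/?utm_source=newsletter -->
    <link rel="canonical" href="https://www.example.com/shoes/">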

10. Caching & Crawling of the Website

It is very important to know how Googlebot sees your website and how your web pages are stored in Google's index. See the Fetch as Google version of your website through the option available in Google Webmaster Tools.

Check the caching date of all the important web pages on the website and figure out how frequently Google bot is crawling your web pages.

Use the 'cache:<site URL>' search operator in Google to check the cache date.

Check the text-only cached version of the web pages and see whether any part of a web page is not visible to Googlebot, i.e. blocked for the crawlers.

Rectify all the crawl errors present on the website and redirect all the 404 URLs.

Download all the URLs from the All Pages section in Analytics and run a status check on all of them to find the 404 pages. Then fix them by redirecting to the relevant pages.

Use Xenu Link Sleuth to find all the error URLs on the website and fix them.
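
If you prefer to script the status check yourself, here is a minimal Python sketch using the requests library. It assumes your exported URLs sit in a plain-text file named urls.txt, one per line:

    import requests

    # Read the URLs exported from Analytics (one URL per line).
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        try:
            # HEAD is enough to read the status code without downloading the body.
            resp = requests.head(url, allow_redirects=False, timeout=10)
            if resp.status_code == 404:
                print("404 Not Found:", url)
            elif resp.status_code in (301, 302):
                print(resp.status_code, "->", resp.headers.get("Location"), ":", url)
        except requests.RequestException as exc:
            print("Error fetching", url, ":", exc)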

11. Implement Proper Redirections (301 or 302)

A 301 redirect passes the link juice from the old URL to the new one, while a 302 redirect does not. So check all your redirects and make them permanent 301s.

Check whether the older versions of important website pages are properly redirected to the new URLs using 301 redirects.
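
On Apache, a permanent redirect for an old page is one line of mod_alias configuration in .htaccess; the URLs shown here are placeholders:

    # Old URL permanently moved to its new location
    Redirect 301 /old-page.html https://www.example.com/new-page/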

12. Check your Server Response Time

A high server response time affects overall website performance by increasing the page load time. Check your website's server response time and rectify any delay.

Go to the Google PageSpeed Insights tool to check whether there is any delay in server response time.

13. Use of JavaScript is not SEO Friendly

For a long time, Google could not crawl JavaScript (it can now render it to an extent), but it is still recommended not to put content in JavaScript on the website. Remember these few points:

  • Put the page content in HTML text to make it crawlable for the bots
  • Do not use JavaScript in the navigation/footer; otherwise, the links there will not get crawled
  • Do not use JavaScript-enabled pop-up content
  • It is recommended not to use any JavaScript on the page, but if it is necessary, combine multiple JavaScript files into a single one to improve page performance (see the sketch below)
  • Do not place JavaScript in the upper fold of the web page; it will increase the page load time and hinder page rendering

Learn more about how Googlebot crawls JavaScript in a test conducted by Search Engine Land.
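
As a sketch of the last two points, load one combined script at the end of the body, or with the defer attribute, so it does not block rendering of the above-the-fold content; the file name is a placeholder:

    <body>
      <!-- page content in plain HTML text, crawlable by bots -->
      <h1>Page Heading</h1>
      <p>Main page content here.</p>

      <!-- one combined, deferred script instead of many blocking ones -->
      <script src="/js/combined.min.js" defer></script>
    </body>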

14. No Ajax

Just like JavaScript, Ajax content has traditionally not been crawlable, so it is better not to use it on the website. But things are changing now; read this article from Moz about how Ajax content gets crawled by Google.

15. In Which Country is Your Website Hosted?

Check the country where the website is hosted and, if possible, change it to the same geo-location the website targets. This will reduce the server response time and improve overall page load time.

16. Use Content Delivery Network

Got a heavy website? Use a content delivery network (CDN) to optimize the page speed of the website. Google has confirmed that page speed is one of its ranking factors, and it can affect your website's performance in search engines.

Check this article on Search Engine Land about how CDNs can impact SEO.

17. Is your Content Internally Linked?

Internal linking is important for:

  • It passes the value & authority of a page on to other pages of the website
  • It makes it easier for users to navigate from one page to other pages of the website
  • It improves the crawling of the website
  • Use rich keywords as anchor text in internal links; this helps improve rankings for those keywords (see the example below)
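
An internal link with keyword-rich anchor text is just a normal HTML anchor; the page path here is hypothetical:

    <p>Before launch, run a full <a href="/website-seo-audit/">website SEO audit</a> to catch crawl issues early.</p>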

18. Remove Duplicate Pages on your Website

The same page served at multiple different URLs causes a page duplication issue. It confuses Google when deciding which page should be ranked. If the content is duplicate, the page may not be considered by Googlebot and may not get crawled & indexed.

Following are the methods of removing the duplicate pages from the website:

  • Go to the Xenu Link Sleuth tool and run your domain URL
  • Download the results and filter out the text/HTML URLs
  • Run a status check and filter the 200 OK & permanently moved URLs
  • Mark all the URLs with the same title tag, then redirect all the duplicate page URLs to the main URL

19. Remove Insignificant Sitelinks

Type your brand name into Google search and you will see the results for your website. Check the search results to see whether any insignificant sitelink is appearing.

Go to Google Webmaster Tools and select Search Appearance -> Sitelinks -> Demote for the insignificant sitelink. Demoting insignificant sitelinks is very important so that the proper sitelinks appear in the search results.

20. Use Navigational Breadcrumbs

Navigational breadcrumbs are very important for any website to rank well in search results. They improve user experience and also help with proper crawling of the website. Check whether they are implemented properly on the website.

You can also implement Schema.org tags to show the breadcrumbs in search results, as in the sketch below.
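
Google documents breadcrumb markup via Schema.org's BreadcrumbList type. A minimal JSON-LD example for a two-level path, with placeholder names and URLs:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1,
          "name": "Home", "item": "https://www.example.com/" },
        { "@type": "ListItem", "position": 2,
          "name": "SEO Guides", "item": "https://www.example.com/seo-guides/" }
      ]
    }
    </script>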

On-Page SEO Audit

21. On Page Meta Tags

On-page meta tags are very important for any page to rank in search results. Follow the guidelines below to write optimized on-page tags:

  • Write catchy page titles with target keywords in them; keep the title length around 55-60 characters
  • Write descriptive meta descriptions not exceeding 155 characters
  • Include keyword-rich H1, H2 & H3 heading tags
  • Include keywords in image file names, alt tags & title attributes
  • Link your content internally, using target keywords as anchor text (a combined sample follows this list)
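
Putting those guidelines together, the head and top of a page might look like this; the title, description, and file names are illustrative only:

    <head>
      <title>Website SEO Audit Checklist - 40 Points to Check</title>
      <meta name="description" content="A step-by-step website SEO audit covering technical, on-page, content and backlink checks.">
    </head>
    <body>
      <h1>Website SEO Audit Checklist</h1>
      <img src="/images/seo-audit-checklist.png"
           alt="website SEO audit checklist"
           title="Website SEO Audit Checklist">
    </body>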

22. Remove Duplicate Meta Tags

Run your website URL through Screaming Frog and filter out all the URLs with duplicate titles and descriptions. You can also check whether any duplicate title or description is present on the website in Webmaster Tools, under Search Appearance -> HTML Improvements.

23. Add an Updated XML Sitemap

Generate an XML sitemap and include all the important URLs in it. Do not include any broken links in the XML sitemap, and assign a priority to each URL as per its importance. Check the best practices for XML sitemaps.

It helps increase the discovery and crawling of the website's pages. For bigger websites, it is recommended to use automated sitemaps.

It is recommended to use individual sitemaps for images and web pages. You can create automated, individual sitemaps through the Yoast SEO plugin.
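
The sitemap format itself is simple. Here is a two-URL example following the sitemaps.org protocol, with illustrative URLs and priorities:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://www.example.com/services/</loc>
        <priority>0.8</priority>
      </url>
    </urlset>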

24. Regularly Update HTML Sitemap

Do you have a large number of pages on the website? Create an HTML sitemap and list all the website pages in a proper hierarchy to make it easier for users to find the page they want.

In the case of an e-commerce website, it is recommended to list only the top-level category and sub-category pages in the HTML sitemap to keep it clean.

25. SEO Friendly URL Structure

Use a proper static URL structure for your website. It makes it easier for crawlers and users to understand the structure of the website's pages and hence improves the website's performance.

Read this article from Moz to understand SEO best practices for structuring URLs.

26. Use HTML Content on the Website

The HTML version of content is the easiest form for Googlebot to understand. Put the content on your web pages in HTML form to make it easier for Googlebot to crawl & index. HTML pages are also light, which improves site speed and performance.

Website Content SEO Audit

27. Check the Keywords Targeted on the Pages

A single keyword should be targeted on only one page. Create a list of keywords and map them to the website's pages, making sure that no single keyword is targeted across multiple pages.

The keywords you target on a page should be relevant, and the page should be talking about the same topic as the keyword.

28. Do Not Stuff your Content with Keywords

Stuffing your content with your target keywords is not going to rank your website. The ideal keyword density is around 3-4%, and you should stick to it.

It is also recommended not to stick with only one single keyword but to use different variations of the keyword and its synonyms.

29. Publish Unique High-quality Content

Google prefers unique content in the top search results. So if you want to rank at the top, publish only unique, high-quality content.

30. Check Hidden Content in Webpage Source Code

Using hidden content on the website is not a practice recommended by Google. Google says that whatever Googlebot sees on the website should also be visible to the users.

Check the web page source code for hidden content. Check the text-only cached version and figure out whether any piece of content is present in the page source that is crawlable by Googlebot but not visible to the users.

31. No Thin Content on the Website

Check whether any pages with thin content are present on the website. Thin content can lead to a Panda penalty. It is recommended to add enough unique content to each of the website's pages.

Check the details of Google Panda Update.

Backlink SEO Audit

32. Audit your Backlink Profile

Check whether your website's links are natural or unnatural. Audit your website's backlink profile and figure out the harmful links to disavow. Follow the steps below for a backlink audit of your website:

Use Majestic or Ahrefs to download the backlink data for your website. Calculate the links per referring domain.

Check the referring domains and look for irrelevant domains linking back to your website, i.e. any illegal, spam, or adult site linking to your site.

Check whether the anchor texts are all non-branded keywords or a mix of branded & non-branded keywords.

If a bad-quality link is coming from any website, disavow it.

33. Check for the Penalty/Manual Action

Check in Google Webmaster Tools whether any penalty or manual action is present on the website. If you are starting a new website, do a domain history check first.

Also check the backlink history of the domain using Majestic & Ahrefs to make sure the domain you are using is clean and carries no backlink penalty.

34. Do not use Sitewide Links

Giving a link to a sponsor's website from the footer is a very old way of link building, and these days Google considers it spam linking.

So remove sitewide links from the footer or top/side navigation, or make them nofollow. Linking different products of the same brand or partner websites is fine, but make sure the links are nofollow, as in the snippet below.
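
Marking such a link nofollow is a one-attribute change; the sponsor URL is a placeholder:

    <a href="https://sponsor.example.com/" rel="nofollow">Our Partner</a>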

Check your Website Analytics & GWT

35. Check the Google Analytics Code Placement

Do you have a Google Analytics account for your website? Check that the correct GA code is present on all the website's pages and that Analytics is tracking and receiving data accurately.

Remove the GA code from the <body> section and place it just above the closing </head> tag. If you have multiple properties in Google Analytics, make sure you have placed the correct Google Analytics property code on the respective website.
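
Structurally, the placement looks like the sketch below; the snippet body itself should be copied from your own Google Analytics property rather than from here:

    <head>
      <title>Page Title</title>
      <!-- Google Analytics tracking snippet, pasted just before
           the closing </head> tag -->
      <script>
        /* GA snippet copied from your own GA property goes here */
      </script>
    </head>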

36. Check for Errors & Messages in Google Webmaster Account

Create a GWT account for your website to keep track of the website's performance in search results and of any errors on the website.

Regularly check the messages in your Google Webmaster account to see whether any serious issues are present on your website.

Design & User Experience

37. Social Sharing/Like Buttons Present on the Website

Implement social sharing/like buttons on the website to improve its social presence. There are many social button widgets available online that can also track performance.

38. Check for Custom 404 Page

What if a user types a wrong or misspelled URL and lands on a page that is not found? In that case, you need a custom 404 Not Found page to navigate users to the correct website pages.

Check whether a custom 404 page is available on the website. It should contain links to the homepage & other important pages of the website.

39. Good Website Design

Use proper fonts, colors & a wide-width template for proper readability of the content. The design is the first thing that makes an impact on a user's mind.

So if users like your website, they are more likely to visit again.

40. Are the Images Optimized?

Use lightweight images on the website to keep the page size small. Heavy images take more time to load, which increases the overall page load time of the website and makes for a bad user experience.

So, have you finished the SEO audit for your website? Please tell us if you face any problem checking or implementing anything on your website, and leave your suggestions and queries in the comments section below.
