An SEO Audit is a complete and detailed analysis of a website that covers every element relevant to Search Engines, from technical aspects through to content structure.
An SEO Audit is a fundamental activity because it provides a 360° view of the current status of any type of site: blog, e-commerce or other. Through this analysis it is possible to assess the overall state of the site, detect the main structural errors, and identify any critical issues that may affect both its organic performance and its business objectives.
Furthermore, starting from the SEO Audit it is possible to identify margins for growth and improvement in terms of visibility and organic sessions, and to develop a strategy aimed at making the site competitive on Search Engines within its reference market.
Let’s now see, step by step, the main aspects to evaluate when carrying out an SEO Audit.
First of all, it is essential to check the status of the site in terms of traffic, rankings and organic performance. With the support of tools such as Google Search Console you can access data that provides valuable information about a site’s health, such as rankings, clicks, impressions and CTR. You can then dig deeper into performance for specific queries or pages.
These data are the first indicators of the current state of the site, and comparing organic performance against a given period and/or a previous one already makes it possible to draw fundamental conclusions about how the site is performing. For example, a strong discrepancy between clicks and impressions is a first signal that the reasons for the difference need to be investigated. In a situation like this it is certainly useful to check what type of content populates the SERP for the query of interest: the presence of competitors, ads on the Google Ads Search network, Google Shopping ads, images or the Google Knowledge Graph are just some of the reasons that could explain a substantial gap between total clicks and impressions.
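As a starting point for this check, a small script can surface the queries worth investigating. The sketch below is an illustrative assumption (the thresholds and sample rows are invented, not Google Search Console defaults): it flags queries with many impressions but very few clicks, the exact discrepancy described above.

```python
# Hypothetical audit helper: flag queries whose CTR is far below what their
# impressions would suggest, using rows as exported from Google Search Console.
# Thresholds and sample data are illustrative assumptions.

def flag_low_ctr(rows, min_impressions=1000, max_ctr=0.01):
    """Return (query, CTR) pairs with many impressions but a CTR below max_ctr."""
    flagged = []
    for row in rows:
        ctr = row["clicks"] / row["impressions"] if row["impressions"] else 0.0
        if row["impressions"] >= min_impressions and ctr < max_ctr:
            flagged.append((row["query"], round(ctr, 4)))
    return flagged

# Sample rows as they might appear in a GSC performance export.
sample = [
    {"query": "buy red shoes", "clicks": 4, "impressions": 5200},
    {"query": "red shoes size guide", "clicks": 310, "impressions": 4100},
    {"query": "shoe repair near me", "clicks": 2, "impressions": 800},
]

print(flag_low_ctr(sample))  # queries worth checking against the live SERP
```

Each flagged query is then a candidate for a manual SERP check, as described above.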
After an initial evaluation of the current status of the site, it is necessary to check that all the desired pages are accessible to Search Engines and to evaluate the indexing status of the site.
You can quickly evaluate the indexing status using Google Search Console. The tool detects the difference between the number of pages present in the Google index and the total number of pages on the site. A strong discrepancy between these two values undoubtedly indicates a problem related to indexing and visibility. It is therefore necessary to examine the pages excluded from the index and assess why they are excluded.
At this stage, one of the main elements to analyse is robots.txt, a text file containing directives that specify which pages should be accessible to crawlers and which should not. It is necessary to verify that robots.txt is written correctly and does not block access to resources that should be accessible (or allow access to ones that should not be). It is also important to make sure that the full URL of the sitemap.xml is declared at the end of the file.
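These robots.txt checks can be automated with Python’s standard library. The sketch below uses an illustrative robots.txt and example URLs (swap in your own file and pages) to verify that key pages stay accessible, blocked sections stay blocked, and the sitemap is declared:

```python
# A minimal sketch of a robots.txt check using only the standard library.
# The rules and URLs below are illustrative examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Key pages should be fetchable; blocked sections should stay blocked.
print(rp.can_fetch("*", "https://example.com/blog/post"))     # expected: True
print(rp.can_fetch("*", "https://example.com/private/page"))  # expected: False
print(rp.site_maps())  # the sitemap URL(s) declared in the file (Python 3.8+)
```

In a real audit the same checks would be run against the live robots.txt (via `rp.set_url(...)` and `rp.read()`) for every page template on the site.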
The sitemap.xml must also be analysed. The sitemap contains information about all the resources on the site (pages, images, videos and other file types) and is used by Search Engines to crawl the site more efficiently. It is important to make sure the sitemap is error-free and that all pages are correctly included in it.
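A first validation step is simply parsing the sitemap and extracting the listed URLs, so they can be cross-checked against the pages you expect to be indexed. A minimal sketch, using an illustrative sitemap fragment:

```python
# Parse a sitemap.xml and extract the listed URLs for cross-checking.
# The sitemap content here is an illustrative example.
import xml.etree.ElementTree as ET

sitemap_xml = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post</loc></url>
</urlset>
"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(urls)
```

Comparing this list with a crawl of the site quickly reveals pages missing from the sitemap, or sitemap entries that no longer exist.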
Finally, you must verify the correct implementation of Canonical tags. The URL indicated as canonical tells Search Engines which version of a particular page should be indexed. The canonical URL usually points to the page itself, but setting it correctly helps solve possible duplication and/or cannibalisation problems. Conversely, if the Canonical link is not correctly set and implemented across the site, it can cause various problems, some of them very serious, in terms of both indexing and content duplication.
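This check can also be scripted: extract the `rel="canonical"` link from a page’s HTML and compare it with the page’s own URL. The sketch below uses the standard library’s HTML parser on an illustrative fragment; the page URL (with a tracking parameter) is an invented example:

```python
# Extract the rel="canonical" URL from a page and compare it with the page URL.
# The HTML fragment and URLs are illustrative examples.
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

page_url = "https://example.com/blog/post?utm_source=newsletter"
html = '<head><link rel="canonical" href="https://example.com/blog/post"></head>'

p = CanonicalParser()
p.feed(html)
print(p.canonical)  # the version of the page that should be indexed
print(p.canonical == page_url.split("?")[0])  # True: parameters consolidated
```

Run across a crawl, this surfaces pages with a missing canonical, a canonical pointing to an unexpected URL, or parameterised duplicates that fail to consolidate.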
The information architecture is the way contents are structured and organised within a site: it is the navigation tree and the definition of the hierarchical importance of the site’s topics. Analysing the information architecture serves to assess how well the contents are linked to each other and how accessible they are to both Users and Search Engines.
At this stage, attention must be paid to two aspects in particular:
Status codes are the codes with which the server responds to a browser’s request for a resource. There are many status codes and, among those useful to identify within the SEO Audit, the most common are:
- 200 (OK): the resource is served correctly;
- 301 (Moved Permanently): a permanent redirect that passes ranking signals to the destination URL;
- 302 (Found): a temporary redirect, to be replaced with a 301 when the move is permanent;
- 404 (Not Found): the resource does not exist; internal links pointing to 404 pages should be fixed;
- 410 (Gone): the resource has been removed intentionally;
- 5xx (Server Error): server-side errors that prevent crawling and must be resolved as a priority.
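When reviewing the status codes collected during a crawl, it helps to map each code to a follow-up action. The helper below is an illustrative sketch (the action wording is an assumption, not a standard), showing a typical triage:

```python
# Hypothetical triage helper: map a status code found during a crawl to the
# follow-up action an audit would typically recommend. Wording is illustrative.
def triage(status):
    if status == 200:
        return "ok"
    if status in (301, 308):
        return "permanent redirect: update internal links to the final URL"
    if status in (302, 307):
        return "temporary redirect: confirm it should not be a 301"
    if status in (404, 410):
        return "missing: fix or remove internal links pointing here"
    if 500 <= status < 600:
        return "server error: escalate, it blocks crawling"
    return "review manually"

for code in (200, 301, 404, 503):
    print(code, "->", triage(code))
```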
In addition to the aspects seen so far, there is a whole series of other elements to consider that relate to the analysis of HTML code.
The HTML code of the site is what Search Engine Spiders analyse.
One of the first things to do is to verify the absence of internal links marked nofollow. The nofollow attribute tells Search Engines not to follow a given link, so it is essential to limit its use on internal links in order not to hinder access to the pages.
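A simple scan for this is to parse each page and collect internal links carrying `rel="nofollow"`. The sketch below works on an illustrative HTML fragment and, as a simplifying assumption, treats relative URLs as internal:

```python
# Scan a page's HTML for internal links marked rel="nofollow".
# The HTML fragment is illustrative; relative URLs are treated as internal.
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.internal_nofollow = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        rel = (a.get("rel") or "").lower()
        href = a.get("href") or ""
        if tag == "a" and "nofollow" in rel and href.startswith("/"):
            self.internal_nofollow.append(href)

html = (
    '<a href="/category/shoes" rel="nofollow">Shoes</a>'
    '<a href="/about">About</a>'
    '<a href="https://partner.example" rel="nofollow sponsored">Partner</a>'
)

f = NofollowFinder()
f.feed(html)
print(f.internal_nofollow)  # internal links that Spiders are told not to follow
```

Note that the external nofollow link is deliberately ignored: nofollow on outbound links is often legitimate, while on internal links it usually signals a problem.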
Another aspect to consider is the use of breadcrumbs, which indicate the position of a page within the information architecture of the site and are important for two reasons: they help Users understand where they are in the site and navigate back through its levels, and they provide Search Engines with additional signals about the site’s structure and the hierarchy of its contents.
There are also other factors to analyse in relation to the HTML code, and it is necessary to verify that OnPage SEO Best Practices are respected. Elements to consider and evaluate at this stage include title tags, meta descriptions, heading structure (H1–H6), image alt attributes and URL formatting.
The loading speed of a site is a very important factor to consider, both for the User’s browsing experience and because it has long been one of the main ranking factors. It is therefore essential to analyse the performance of the site and verify that content loads quickly and that the User can start interacting with the page within a few seconds.
At this stage it is necessary to rely on tools such as Google PageSpeed Insights (Lighthouse) or GTmetrix. With these tools you can test individual URLs, make a general assessment of the site’s performance and identify any problems.
In addition, these tools offer suggestions for solving the problems detected by providing reference documentation.
One of the last but fundamental stages of the SEO Audit is the verification of structured data.
Structured data are snippets of code added to a site’s pages (or to individual elements on a page) to describe content to Search Engines in a standard (structured) way. They add information and details that help Search Engines better understand what the content represents and show the most relevant information possible in the search results, bringing advantages in terms of both rankings and CTR. To implement structured data on the site you can refer to Schema.org, a project created by the main search engines (Google, Microsoft, Yahoo and Yandex).
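Structured data is most often embedded as JSON-LD inside `<script type="application/ld+json">` tags, so a first audit step is extracting those blocks and listing the Schema.org types they declare. A minimal sketch on an illustrative HTML fragment:

```python
# Extract JSON-LD structured data from a page and list the Schema.org types.
# The HTML fragment is an illustrative example.
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.types = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "script" and a.get("type") == "application/ld+json":
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld and data.strip():
            obj = json.loads(data)
            self.types.append(obj.get("@type"))

html = """
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product", "name": "Red shoes"}
</script>
"""

x = JsonLdExtractor()
x.feed(html)
print(x.types)
```

Comparing the types found per template against the types the content could support immediately shows where markup is missing.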
Using Google’s structured data testing tool, you can test individual URLs to check whether structured data is present on the site, which types of structured data have been implemented, and identify any errors that need to be corrected.
Starting from the analysis of the structured data already present on the site it is possible to identify further types of structured data that could be added to enrich the description of the contents of the individual pages.
The most common Structured Data types include Article, Product, BreadcrumbList, Organization, LocalBusiness, FAQ and Review.
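As an example of one of these types, a Product markup can be built as a plain dictionary and serialised to JSON-LD, ready to be embedded in a `<script type="application/ld+json">` tag. All field values below are invented for illustration:

```python
# An illustrative Product JSON-LD sketch; values are example assumptions.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Red running shoes",
    "image": "https://example.com/img/red-shoes.jpg",
    "offers": {
        "@type": "Offer",
        "price": "79.90",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}

print(json.dumps(product, indent=2))
```

The generated snippet should then be validated with Google’s structured data testing tool before deployment.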
An SEO Audit, in conclusion, is a technical document that explores a website’s problems in depth. All the analyses and insights covered in this article aim to improve how Search Engine Spiders crawl the site’s content. It is essential to improve navigability and the correlation between contents, HTML code and meta tags, structured data and page response speed, but one of the most important things to take into account during an SEO Audit is the Crawl Budget, that is, the time Google (and Search Engine Spiders in general) dedicates to crawling the pages of our site. One of the main objectives of an SEO optimisation activity is to make the best use of the time Spiders spend crawling our site, serving them the best and most relevant content for our core business.