
How to Conduct a Technical SEO Audit

Posted: Sept. 29, 2021

Updated: Feb. 22, 2024

Featured image by Pixlr AI Image Generator


Technical SEO is unrelated to the actual content on your site, but it still affects organic rankings. This post serves as a guide to what to check in a technical SEO audit.


XML Sitemap

An XML sitemap is a file that helps search engines understand your website structure and crawl it. It lists the pages on your site, along with their priority, when they were last modified, and how frequently they are updated. Usually, the pages will be categorized by topic, post, product, etc.
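
For reference, the entries in a sitemap.xml file look something like this (the URL, dates, and values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/technical-seo-audit/</loc>
    <lastmod>2024-02-22</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>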


You'll probably check the sitemap at the beginning of a technical audit. Find the sitemap of most sites by appending /sitemap.xml to the domain, for example,

https://datachai.com/sitemap.xml. If the site has multiple sitemaps, check /sitemap_index.xml instead.


Register your sitemap with Google Search Console, which includes several tools to check technical SEO metrics such as mobile optimization and page speed. The Sitemaps report in Search Console shows which submitted URLs Google has discovered, giving you the insight you need to keep a 1:1 match between the URLs added to the site and the URLs in the sitemap.


A sitemap is actually optional: ideally, your site has an internal linking structure that connects all pages efficiently, so search engines can discover them without one. For large sites, however, a sitemap is best practice.


If your site has a sitemap, it's best practice to include only URLs that return 200 OK and to keep a 1:1 match between the URLs in the sitemap and the URLs on the site. 4xx and 5xx URLs, orphaned pages, and parameterized URLs should be removed.


Server Response Code and Redirects

Bulk check server response status codes with this Google Apps Script:

// Get the HTTP server response status code for a URL
function getStatusCode(url) {
  var options = {
    'muteHttpExceptions': true,  // don't throw on 4xx/5xx responses
    'followRedirects': false     // report 301/302 codes instead of following them
  };
  var statusCode;
  try {
    statusCode = UrlFetchApp.fetch(url, options).getResponseCode().toString();
  } catch (error) {
    // Fall back to parsing the 3-digit code out of the error message
    var match = error.toString().match(/returned code (\d\d\d)/);
    statusCode = match ? match[1] : error.toString();
  }
  return statusCode;
}

// Work around the IMPORTXML limit in Google Sheets:
// fetch a page and return the first capture group of a regex
function importRegex(url, regexInput) {
  var output = '';
  var fetchedUrl = UrlFetchApp.fetch(url, {muteHttpExceptions: true});
  if (fetchedUrl) {
    var html = fetchedUrl.getContentText();
    if (html.length && regexInput.length) {
      var match = html.match(new RegExp(regexInput, 'i'));
      if (match && match.length > 1) {
        output = match[1];
      }
    }
  }
  Utilities.sleep(1000);  // throttle to stay within UrlFetchApp quotas
  return unescapeHTML(output);
}

// Minimal helper to decode common HTML entities in the matched text
function unescapeHTML(str) {
  return str
    .replace(/&amp;/g, '&')
    .replace(/&lt;/g, '<')
    .replace(/&gt;/g, '>')
    .replace(/&quot;/g, '"')
    .replace(/&#39;/g, "'");
}
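
Once saved in the Apps Script editor attached to a Google Sheet, these can typically be called as custom functions directly from a cell, e.g. =getStatusCode(A2) to check the URL in cell A2, which makes it easy to check a whole column of URLs.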

Then, use RegEx redirects if you need to bulk redirect multiple source URLs to the same destination, or the .htaccess file for smaller-scale redirects. However, if your site is hosted on WordPress, be careful about relying on .htaccess, because it will be deprecated in PHP 7.4 and subsequent versions. WP Engine suggests alternatives such as using RegEx redirects directly in WordPress or managing redirects in Yoast SEO Premium. Finally, if you are completely removing a page, orphan it (remove all internal links to it) and then serve a 410 status code so that Google can drop it from the index more quickly.
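
As a rough illustration, regex-based redirect rules in an Apache .htaccess file might look like the following (the paths are placeholders, and any rule should be tested before deploying):

# Permanently redirect an entire retired section to a single destination
RedirectMatch 301 ^/old-blog/.*$ /blog/

# Mark a permanently removed page as gone (410)
Redirect gone /discontinued-page/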


JavaScript SEO

JavaScript is increasingly used across the web. Sites built with JavaScript frameworks such as React, Angular, or Vue.js face unique SEO challenges compared to sites built with a CMS such as WordPress, because Googlebot must render the JavaScript before it can see content that is injected client-side. If you run into problems getting Google to crawl and index your site, you can troubleshoot with the URL Inspection tool in Google Search Console, which shows the page as Googlebot rendered it.



Page Speed

Page speed best practices include using a fast DNS provider and minimizing HTTP requests by keeping CSS style sheets, scripts, and plugins to a minimum. You can also compress web pages by reducing image file sizes and cleaning up the code, especially for content above the fold. PageSpeed Insights is a free tool to check your page speed, and it also provides specific recommendations on how to improve it.
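
Two simple markup-level optimizations, for example, are deferring non-critical scripts and lazy-loading below-the-fold images so they don't block the first render (the file names below are placeholders):

<script src="/js/analytics.js" defer></script>
<img src="/images/team-photo.jpg" width="800" height="600" loading="lazy" alt="Team photo">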


robots.txt

The robots.txt file, also called the robots exclusion protocol or standard, is a text file that tells search engines which pages to crawl or not crawl. You can see the robots.txt file for any website by adding /robots.txt to the end of the domain. For example, https://www.datachai.com/robots.txt.


Search engines check robots.txt before crawling a site, so a disallowed page will not be crawled. Keep in mind, though, that disallowing a URL does not guarantee it stays out of search results; if other sites link to it, it can still be indexed, so use a noindex directive for pages that must be excluded entirely.
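
A minimal robots.txt might look like this (the disallowed paths and sitemap URL are placeholders):

User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml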


Crawl Budget

Crawl budget is the number of pages Google crawls and indexes on a website within a given timeframe. If your site has more pages than its crawl budget covers, Googlebot will not crawl and index the remainder, which can negatively affect your rankings.


Performing regular log file analysis can provide insights about how Googlebot (and other web crawlers and users) are crawling your website, giving you the information you need to optimize the crawl budget. If your site is large and has crawl budget issues, you can adjust the crawl rate via Google Search Console.


SSL (Secure Sockets Layer)

SSL (secure sockets layer) is a security technology that creates an encrypted link between a web server and a browser. You can tell whether a website uses SSL because the URL will start with https (hypertext transfer protocol secure) rather than http. In 2014, Google announced that it wants to see HTTPS everywhere and that websites using SSL get a slight ranking boost. Google Chrome now displays a warning whenever a user visits a site that does not use SSL.


These days, most top website builders such as Wix include SSL by default. If not, simply install an SSL certificate on your website.


Canonical Link Element

Even if you don't have multiple parameter-based URLs for each page, different versions of your pages (https vs. http, www vs. non-www, trailing .html, etc.) can quickly add up. That's where the rel=canonical tag comes in: it lets you manage duplicate content by specifying the canonical, or preferred, version of your page. This flags the duplicates and tells Google to consolidate their ranking signals, so your page won't be disadvantaged.


If you are using a CMS like Wix or Squarespace, your web hosting service might automatically add canonical tags with the clean URL. For example, my homepage already has one:

<link rel="canonical" href="https://www.datachai.com"/>

Schema

Schema, also called structured data markup, enhances search results through the addition of rich snippets. For example, you can add star ratings or prices for your products. Adding schema by itself is not a technical SEO factor, but it is recommended by Google and can indirectly help improve rankings and increase page views.


Add Schema via WordPress Plugin

If your website is hosted on WordPress, you can use the Schema plugin to add structured markup to your pages. This plugin uses JSON-LD, which is recommended by Google and also supported by Bing.


Add Schema Manually

If your site isn't hosted on WordPress or you prefer not to rely on a plugin, you can manually add schema with a few more steps. Schema is usually added to the page header, although it's possible to add it to the body or footer as well. Some recent WordPress themes include specific text blocks for adding schema to the body.


Note that adding schema through Google Tag Manager is not recommended. If you use Google Tag Manager, the structured data will be hidden within a container, making it difficult for Google's algorithms to read and give it appropriate weight.


To add schema manually, first, use a tool like MERKLE to generate the baseline markup. Although this is already fine to use as is, you can paste the baseline markup in a text editor and continue to edit and customize the structured data for each page.
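
For illustration, a baseline Article markup pasted into the page header might look like the following (the values are placeholders to replace with your own):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Conduct a Technical SEO Audit",
  "datePublished": "2021-09-29",
  "dateModified": "2024-02-22",
  "author": {
    "@type": "Person",
    "name": "Author Name"
  }
}
</script>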


If you are using Sublime Text, go to View > Syntax > JavaScript > JSON to set your syntax appropriately.


Finally, insert additional properties that were not available on MERKLE as needed.


Add html Strings to Schema

You can add certain basic HTML strings to your schema markup, for example, if you'd like to include a bulleted list or a hyperlink. The important thing to remember is that the JSON wrapping your markup already uses double quotes, so double quotes inside the HTML must be escaped; the simplest approach is to use single quotes in the HTML instead.
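
For example, the link attribute in this description property uses single quotes so it doesn't break the surrounding JSON (the property value and URL are placeholders):

"description": "Read the <a href='https://www.example.com/guide'>full guide</a> for details."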


Google Search displays the following HTML tags; all other tags are ignored: <h1> through <h6>, <br>, <ol>, <ul>, <li>, <a>, <p>, <div>, <b>, <strong>, <i>, and <em>


Validate Schema

Use Google's Rich Results Test to make sure that your schema markup is being read properly, and the Schema Markup Validator (formerly the Structured Data Testing Tool) to see all of the structured data on the page. The final step is to request indexing for the page where you added markup, via Google Search Console. Within a few days, you should see your markup under the Enhancements section in the sidebar.


Note that in June 2021, Google limited FAQ rich results to a maximum of two per snippet, so your snippet real estate may be a bit smaller. If you have three or more FAQs marked up, Google will show the two that are most relevant to the search query.


Log File Analysis

The log file is your website’s record of every request made to your server. Each entry includes important information such as the requested URL, HTTP status code, timestamp, client IP address, user agent making the request, request method (GET/POST), and referrer.
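
For reference, a single request in an Apache access log (combined format) looks roughly like this, with all of those fields on one line (the values are illustrative):

66.249.66.1 - - [22/Feb/2024:10:15:32 +0000] "GET /blog/technical-seo-audit/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"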


By performing log file analysis, you can gain insights about how Googlebot (and other web crawlers and users) are crawling your website. The log file analysis will help you answer important technical questions such as:

  • How frequently is Googlebot crawling your site?

  • How is the crawl budget being allocated?

  • How often are new and updated pages being crawled?


You can identify where the crawl budget is being used inefficiently, such as unnecessarily crawling static or irrelevant pages, and make improvements accordingly.


Obtain the Log File

The log file is stored on your web server, and you can access it via your server control panel’s file manager, the command line, or an FTP client (recommended).


The server log file is commonly found in the following locations.

  • Apache: /var/log/access_log

  • Nginx: logs/access.log

  • IIS: %SystemDrive%\inetpub\logs\LogFiles


Tools and Software

You can convert your .log file to a .csv and analyze it in Microsoft Excel or Google Sheets, or use a dedicated log file analyzer such as SEMRush or the Screaming Frog Log File Analyser. The best log file analyzer will depend on your website and what tools you might already be using for technical SEO.
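
If you prefer to script the analysis yourself, here is a minimal sketch in Node.js (the access.log file name and regex are assumptions to adapt to your setup) that tallies Googlebot requests per URL:

// count_googlebot.js - tally Googlebot requests per URL from a combined-format access log
const fs = require('fs');

const lines = fs.readFileSync('access.log', 'utf8').split('\n');
const counts = {};

for (const line of lines) {
  if (!line.includes('Googlebot')) continue;             // keep only Googlebot hits
  const match = line.match(/"(?:GET|POST) (\S+) HTTP/);  // extract the requested path
  if (match) counts[match[1]] = (counts[match[1]] || 0) + 1;
}

// print the most-crawled URLs first
Object.entries(counts)
  .sort((a, b) => b[1] - a[1])
  .forEach(([url, n]) => console.log(n, url));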


Limitations

Performing regular log file analysis can be extremely useful for technical SEO, but there are some limitations. Page access and actions that occur via cache memory, proxy servers, and AJAX will not be reflected in the log file. If multiple users access your website with the same IP address, it will be counted as only one user. On the other hand, if one user uses dynamic IP assignment, the log file will show multiple accesses, overestimating the traffic count.


Core Web Vitals

In May 2020, Google introduced core web vitals as the newest way to measure user experience on a webpage. About one year later, in June 2021, Google began rolling out an algorithm update that uses core web vitals as ranking signals.


There are three core web vitals:

  • Largest Contentful Paint (LCP)

  • First Input Delay (FID)

  • Cumulative Layout Shift (CLS)


The idea is that core web vitals point to a set of user-facing metrics related to page speed, responsiveness, and stability, which should help SEOs and web developers improve overall user experience. Let’s see each of the core web vitals in more depth.


Largest Contentful Paint (LCP)

Largest contentful paint (LCP) is the time from when the page begins loading, to when the largest text block or image element is rendered. The idea is to measure perceived pagespeed by estimating when the page’s main contents have finished loading.


Of course, lower (faster) scores are better. In general, LCP <2.5s is considered to be good, and >4s should be improved.


LCP is one of the more difficult core web vitals to troubleshoot because there are many factors that could cause slow load speed. Some common causes are slow server response time, render-blocking JavaScript or CSS, or the largest content resource being too heavy.

Note that if the largest text block or image element changes while the page is loading, the most recent candidate is used to measure LCP. Also, if it's difficult to pass core web vitals in your industry (e.g., most corporate sites have graphics-heavy pages), keep in mind that your pages are compared against your close competitors.


First Input Delay (FID)

First input delay (FID) is the time from when a user first interacts with your site, to when the browser can respond. Only single interactions count for FID, such as clicking on a link or tapping a key. Continuous interactions that have different performance constraints are excluded, such as scrolling or zooming.


FID of <100ms is generally considered good, while >300ms should be improved. If your FID score is high, it could be because the browser’s main thread is overloaded with JavaScript. You can reduce FID by addressing issues such as long JavaScript execution time and the impact of third-party code.


Cumulative Layout Shift (CLS)

Cumulative layout shift (CLS) is about visual stability on a webpage. Instead of a time measurement, CLS is measured by a layout shift score, which is a cumulative score of all unexpected layout shifts within the viewport that occur during a page’s lifecycle. The layout shift score is the product of impact fraction and distance fraction. Impact fraction is the area of the viewport that the unstable element takes up, and distance fraction is the greatest distance that the unstable element moves between both frames, divided by the viewport’s largest dimension.


A CLS below 0.1 is considered good, and above 0.25 is generally considered poor. Common causes of poor CLS include images or ads with undefined dimensions, resources loaded asynchronously, and DOM elements dynamically added above existing content. The best practice is to always include size attributes for your images and videos.
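
For example, giving an image explicit dimensions lets the browser reserve the space before the file loads, so the content below it doesn't jump (the file name is a placeholder):

<img src="/images/hero-banner.jpg" width="1200" height="600" alt="Hero banner">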


How to Measure Core Web Vitals

Core web vitals are incorporated into many Google tools that you probably already use, such as Search Console, Lighthouse, and PageSpeed Insights. In addition, a new Chrome extension called Web Vitals is now available to measure the core web vitals in real time.
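
If you want to watch these metrics fire in the browser yourself, here is a minimal sketch using the standard PerformanceObserver API, pasted into the DevTools console or your page's JavaScript (note that the running CLS below is a simple sum, whereas Google's reported CLS uses session windows):

// Log each Largest Contentful Paint candidate as it renders
new PerformanceObserver(function (list) {
  list.getEntries().forEach(function (entry) {
    console.log('LCP candidate at', entry.startTime, 'ms:', entry.element);
  });
}).observe({type: 'largest-contentful-paint', buffered: true});

// Log unexpected layout shifts and keep a running total
var clsTotal = 0;
new PerformanceObserver(function (list) {
  list.getEntries().forEach(function (entry) {
    if (!entry.hadRecentInput) {  // ignore shifts caused by user input
      clsTotal += entry.value;
      console.log('Layout shift:', entry.value, '| running total:', clsTotal);
    }
  });
}).observe({type: 'layout-shift', buffered: true});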


User experience and pagespeed often depend on the user’s connection environment and settings as well. Every time a page is loaded, LCP, FID, and CLS will be slightly different. Your site has a pool of users that make up a distribution - some people see the pages fast, others see it slower.

For the purpose of core web vitals, Google measures what the 75th percentile of users see. This and other concepts are discussed in a recent episode of Search Off the Record.


Troubleshooting

Google offers several ways to report a suspected bug, such as when your core web vitals numbers are poor even though your site has been tested and shown to perform well. You can join the web-vitals-feedback Google Group email list to provide feedback to Google, which will be considered when the metrics are modified going forward.


If you're looking for individual support, you can file a bug with the core web vitals template on crbug.com. You'll need to do some technical work - for example, write a little JavaScript that has a performance observer that shows the issue.


Finally, note that the core web vitals explained above are part of Google’s page experience signal. Of course, core web vitals are not the only user experience metrics to focus on. Other web vitals such as total blocking time (TBT), first contentful paint (FCP), speed index (SI), and time to interactive (TTI) are non-core web vitals. As Google continuously improves its understanding of user experience, it will update the web vitals regularly. As of November 2021, Google is already preparing two new metrics: smoothness and overall responsiveness.


Mobile UX

Finally, Google has been giving higher rankings to websites with a responsive or mobile site since April 2015. At the same time, it released the mobile-friendly testing tool to help SEOs make sure they would not lose rankings after that algorithm update. Also look into AMP (Accelerated Mobile Pages), an open-source framework that aims to speed up the delivery of mobile pages via AMP HTML.
