SEO Teacher: website audit & optimization

Learn search engine optimization with us. We will teach you how to perform an SEO audit of your website and create an actionable digital marketing strategy.

Website optimization flow: SEO audit and strategy

An SEO workflow starts with defining a website optimization and digital marketing strategy, followed by keyword research and semantic mapping. Then we must perform an SEO audit and identify and fix all technical issues on the website. The work continues with on-site optimization, on-page optimization, SEO copywriting, content optimization, link building, and monitoring of page rankings and keyword performance.

It all starts with an SEO audit of your website

Why do you need an SEO audit? Because it allows us to establish a technical and semantic diagnosis of the website. It also provides an analysis of the visibility of the pages in Google search results. This data lets us find all website optimization opportunities and ways to boost the website's visibility and rankings in search engine results.

SEO tools to do a website audit

To do a comprehensive website audit, we strongly advise using a powerful SEO tool such as Semrush, Moz, DeepCrawl, or Ahrefs. You can also do a quick SEO audit online with a free tool. Such an SEO platform performs an in-depth analysis of the website by checking multiple ranking factors. Based on the audit results, it highlights how efficient the separate elements of your website are. The report identifies SEO errors and suggests how to fix them.

Why do an SEO audit?

An SEO audit helps us find ways to improve your company's online visibility and identify all website optimization opportunities. Particular attention is paid to critical SEO errors that might block indexation. Effective SEO and high Google rankings always start with a comprehensive analysis that lets us map out a website optimization roadmap.

SEO audit: key points

  1. Header tags verification (robots, canonical)
  2. Broken links check
  3. 3xx, 4xx, 5xx status codes check
  4. Sitemap verification
  5. Robots.txt file check
  6. Title tags analysis
  7. Meta description tag analysis
  8. Open Graph tags analysis
  9. ALT attribute verification on images
  10. SEO-friendly URLs test
  11. Heading structure analysis (H1 – H4)
  12. Structured Data test
  13. Content analysis
  14. Internal linking analysis
  15. Backlinks analysis
  16. Page speed analysis
  17. Image optimization test
  18. Mobile-friendly test

Technical audit: head tags, Sitemap, Robots.txt

Sitemap verification

An XML sitemap is a file that lists the URLs of a site. It can include additional information about each URL: its indexation priority, when it was last updated, and how often it changes. This file helps search engine robots index the website.

A sitemap helps Google and other search engines understand the structure and the organization of your website content. Search engine web crawlers read this file to more intelligently scan your site.

The sitemap can provide valuable data, such as when the page was last updated, how often the page is changed, and the importance of the page relative to other URLs in the site.

Make your sitemap available to Google by adding it to your robots.txt file and submitting it to Search Console. Test your sitemap using the Search Console Sitemap testing tool.
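
For illustration, here is a minimal sitemap sketch; the URL, date, and values are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://site-example.com/awesome-url/</loc>
        <lastmod>2024-01-15</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>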

Robots.txt file audit

A robots.txt file is a file at the root of the website that indicates which parts of your site you do and don't want search engine crawlers to access. The robots.txt file defines user-agents and gives them Allow and Disallow directives.

User-agents are search engine robots. Disallow is a directive that tells a user-agent not to access a particular folder or type of file.

Robots.txt helps you restrict the access of search engine crawlers and prevent them from reaching certain URLs or directories. The robots.txt file also points crawlers to the XML sitemap location. That's why a robots.txt file is very important for SEO. A bad or insufficient robots.txt configuration may have an impact on the website's rankings. If robots.txt is not set up properly, search engine robots may have problems finding the most valuable content on the site and crawling and indexing its pages.
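
A minimal robots.txt sketch; the paths and the sitemap URL are placeholders:

    User-agent: *
    Disallow: /admin/
    Allow: /admin/public-page.html
    Sitemap: https://site-example.com/sitemap.xml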

Broken links test

Broken links send users to non-existent web pages. They hurt the website's usability and reputation, which impacts SEO and rankings as well. Broken links confuse users and can reduce both traffic and the ranking ability of pages. Finding and fixing broken links on the website improves both user experience and search engine rankings.
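
Broken links can also be found programmatically. Here is a minimal Python sketch; the audited URL is a placeholder, and the third-party requests and beautifulsoup4 packages are assumed. It extracts every link from a page and reports those that return an error:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    PAGE = "https://site-example.com/"  # hypothetical page to audit

    soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")

    for a in soup.find_all("a", href=True):
        url = urljoin(PAGE, a["href"])
        if not url.startswith("http"):
            continue  # skip mailto:, tel:, and fragment links
        try:
            # HEAD keeps the check lightweight; follow redirects to the final URL
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None  # network error, DNS failure, timeout...
        if status is None or status >= 400:
            print(status, url)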

3xx Status Codes – Redirection Responses

3xx status codes indicate a redirection: the user agent (a web browser or a crawler) needs to take further action to reach the requested resource. Generally, the web server automatically forwards the user agent to another resource (URL) without any interaction with the user.

Since users don't see the originally requested URL, search engines will not index the original URL but instead index the final redirect target. That is why 3xx status codes are more important in search engine optimization than other groups of status codes.

  • 300 Multiple Choices
  • 301 Moved Permanently
  • 302 Found (Moved Temporarily)
  • 305 Use Proxy
  • 307 Temporary Redirect
  • 308 Permanent Redirect
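
A quick sketch of how a redirect chain can be inspected in Python; the URL is a placeholder and the requests package is assumed:

    import requests

    response = requests.get("https://site-example.com/old-url/", timeout=10)

    for hop in response.history:               # each intermediate 3xx response
        print(hop.status_code, hop.url)
    print(response.status_code, response.url)  # the final destination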

4xx Status Codes – Client Error

This group of HTTP status codes indicates that the request for the resource contains bad syntax or cannot be fulfilled for some other reason, presumably through the fault of the client sending the request.

  • 400 – Bad Request
  • 401 – Unauthorized
  • 402 – Payment Required
  • 403 – Forbidden
  • 404 – Not Found
  • 406 – Not Acceptable
  • 408 – Request Timeout

5xx Status Codes – Server Error

A 5xx code means the problem was caused by the server. With a 5xx code, the request can be re-sent without changes, and you will get the requested result once the server has been fixed.

  • 500 Internal Server Error. This error indicates that the server has encountered an unexpected condition.  
  • 501 Not Implemented. This error indicates that the HTTP method sent by the client is not supported by the server.
  • 502 Bad Gateway. This error is usually due to improperly configured proxy servers.  
  • 503 Service Unavailable. This error occurs when the server is unable to handle requests due to a temporary overload or due to the server being temporarily closed for maintenance. 
  • 504 Gateway Timeout. This error occurs when a server somewhere along the chain does not receive a timely response from a server further up the chain. 
  • 507 Insufficient Storage. This error indicates that the server has run out of free storage space.

Check website language and region set in the <head>

The language defined in the HTML should match the language of the text.

Language and region specifications are set up at the beginning of the HTML document. For English in the United States it should be:

<html lang="en-US">…</html>

The charset encoding (UTF-8) must be set correctly.
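
A minimal sketch of a document head with the language, region, and charset declared:

    <!DOCTYPE html>
    <html lang="en-US">
      <head>
        <meta charset="utf-8">
        <title>Your Awesome Title</title>
      </head>
      ...
    </html>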

Robots tag verification

The robots meta tag can inform the search engines which pages on your site should be indexed. We use it to prevent the search engines from indexing specific pages on the website. A robots meta tag also instructs the search engine to follow (or not) any links on the page.

  • Noindex — prevents search engines from indexing a page;
  • Nofollow — prevents search engines from following ALL the links on the page.
  • <meta name="robots" content="index, follow" />
  • <meta name="robots" content="index, nofollow" />
  • <meta name="robots" content="noindex, nofollow" />

If you don't add a robots meta tag, the default for crawlers is to index your page and follow its links. If you want the page to be indexed, make sure that there is no “noindex” on it.
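
This check can be automated. A sketch in Python; the URL is a placeholder, and the requests and beautifulsoup4 packages are assumed:

    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://site-example.com/awesome-url/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    robots = soup.find("meta", attrs={"name": "robots"})
    if robots and "noindex" in robots.get("content", "").lower():
        print("Page is blocked from indexing:", robots["content"])
    else:
        print("No noindex directive found.")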

Canonical tag analysis

A canonical tag (rel=canonical) lets you tell search engines which URL is “the true one” that should be indexed, and which ones are its copies. Using the canonical tag prevents issues with duplicate content appearing on multiple URLs.

When you have several pages with identical content, you can use a canonical tag to tell the search engines which page should be indexed as the “original” and true one. Instead of confusing Google and decreasing your ranking in the search results, we guide the crawlers.

Additionally, we save our crawl budget (the number of pages a search engine robot can crawl in one visit), as we exclude all unnecessary pages. Each page on the website with unique content should have a canonical tag specified. In Drupal it's the variable $URL.

In HTML it may look like this: <link href="https://site-example.com/awesome-url/" rel="canonical">.
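
A sketch of a programmatic canonical check; the URL is a placeholder, and the requests and beautifulsoup4 packages are assumed:

    import requests
    from bs4 import BeautifulSoup

    page = "https://site-example.com/awesome-url/"  # hypothetical URL to audit
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

    link = soup.find("link", rel="canonical")
    if link is None:
        print("No canonical tag found.")
    elif link.get("href", "") != page:
        print("Canonical points elsewhere:", link.get("href", "(missing href)"))
    else:
        print("Page is self-canonical.")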

Meta tags analysis in SEO audit

Title tag

A <title> tag is a key on-page SEO element. This title appears in browsers and on the search results page. It helps users understand what your web page is about.

The title tag is used by the search engines to determine the subject of a web page and display it in SERPs. A title tag is one of the first things that users see in Google search results. That is the place where they decide to click or not to click.

In HTML a title tag looks like this: <title>Your Awesome Title</title>

The perfect Title Tag

  • Starts with a keyword;
  • Has a length between 35 and 65 characters (512px);
  • Describes the content of the page;
  • Is exciting and encourages users to click;
  • Includes the brand name at the end, separated by a hyphen.
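
A hypothetical title following these rules (keyword first, brand name at the end):

    <title>SEO Audit Checklist: 18 Key Points to Review - SEO Teacher</title>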

Meta description tag

The meta description is a short text in the HTML <head> section of a web page. It is usually displayed in Google search result snippets after the title and URL.

The meta description tag is used by search engines to display the page’s description in search results. An efficient meta description tag encourages users to click on it and go to see your web page.

The meta description helps search engines to determine the subject of the page and allows users to understand whether the page matches their query. Though meta description is not a ranking factor, it still influences a user’s decision – click or not to click.

In HTML it looks like this: <meta name="description" content="Your gorgeous meta description">

The perfect Meta Description:

  • Is around 150 characters long;
  • Describes the content of the page;
  • Uses the page keywords;
  • Is easy to read, interesting and exciting;
  • Focuses on the goal of the page;
  • Encourages users to click.

The meta description influences the click-through rate. And this generates visits, which influences rankings in Google. An accurate description that matches the page content will reduce the bounce rate, and that's another ranking factor.

Search engines also bold query keywords in the descriptions. Such bolded words draw users' attention. Therefore, optimizing your meta description for target keywords is important.
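
A sketch of a length check based on the guidelines above; the URL is a placeholder, and the requests and beautifulsoup4 packages are assumed:

    import requests
    from bs4 import BeautifulSoup

    soup = BeautifulSoup(
        requests.get("https://site-example.com/", timeout=10).text, "html.parser"
    )

    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = desc_tag.get("content", "").strip() if desc_tag else ""

    if not 35 <= len(title) <= 65:
        print(f"Title is {len(title)} characters; aim for 35-65.")
    if not desc:
        print("Meta description is missing.")
    elif len(desc) > 160:
        print(f"Description is {len(desc)} characters; aim for around 150.")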

Open Graph tags

Open Graph (OG) tags are additional meta tags in the HTML <head> section of a page. They allow any webpage to get a rich snippet in social media.

For example, the og:image tag specifies which image will appear when a page is shared on social media. It's an important CTR factor, as a nice snippet with a great picture will be clicked far more often than a plain page title.

Facebook developed the Open Graph protocol to enable the integration of a website with its social media platform. This protocol allows you to control how the website is presented when it’s shared on social media, in terms of image, title, and description.

In HTML it can look like this:

  • <meta property="og:title" content="Your Fantastic Open Graph Title for Social Media" />
  • <meta property="og:type" content="article" /> (or any other type)
  • <meta property="og:description" content="Great description for social media." />
  • <meta property="og:site_name" content="Your site name" />
  • <meta property="og:image" content="URL of your image" />

HTML structure analysis

Header tags (h1, h2, h3, h4…)

Header tags are part of webpage content. They are the headings that developers use to build an HTML structure of a web page. These tags improve the user experience and make reading easier.

H1 – H6 tags are designed to separate text segments containing the keywords in a logical way. They help search engines understand the theme of the web content and see the hierarchical structure of the website's content. Search engines read H1-H6 headings to determine the importance of a web page's sections, and they use this information as a ranking factor as well.

Headings optimization increases page quality in the eyes of the search engine robots. As a result, an optimized page, with a single H1 and multiple H2-H5 headings, will rank higher. The headings should be neither too short nor too long, and they should include target keywords.

A good headings structure optimizes navigation on the page. Use the Hn tags to structure the page and divide the content into several logical parts. But do not overuse subtitle tags, as that may disorient users.

Include the most important keywords in the web page's headings, and never duplicate the title tags. You can have only one H1 title on a web page. Use an H2 title at least once and multiple H3 and H4 titles to structure the text.

Keep a hierarchical order of headings. The order of header tags (h1 to h5) shows the level of importance of each section.

Use only one h1 per page and further break up your content logically into sections with h2. And don’t use any other tags with text before h1. If necessary, segment h2 sections with h3, h4, and so on. If there is not enough text content, do not stuff a page with an array of headings. Use them logically and strategically to introduce the main point of each section of a page.

Quick check of headings structure

  • The H1 heading should be specified. A page should have only ONE H1. There should be no other tags with text content (like H2, H3, or p) before the H1 tag.
  • Use the Hn tags to structure the page and divide the content into several logical parts. Use an H2 heading at least once and multiple H3 and H4 titles to structure the text.
  • The structure of headings should not miss levels. Do not skip heading levels.
  • Don't hide headings and paragraphs in the code. It is a black-hat technique that search engines penalize.
  • The number of headings should be in proper relation to the amount of text. Do not overuse subtitle tags when you have little text content, as it may disorient users and seem suspicious to Googlebot.
  • Include the most important keywords in the web page's headings, and never duplicate the title tags.

A correct heading structure may look like this, with an automated check sketched after it:

    <h1>Title of content</h1>
    <p>Paragraph of content</p>
    <p>Second paragraph of content</p>
    <h2>Sub title</h2>
    <p>Paragraph of content</p>
    <h2>Sub title2</h2>
    <p>Paragraph of content</p>
    <h3>Section1</h3>
    <p>Paragraph of content</p>
    <h3>Section2</h3>
    <p>Paragraph of content</p>
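
A sketch of an automated check for these rules; the URL is a placeholder, and the requests and beautifulsoup4 packages are assumed:

    import requests
    from bs4 import BeautifulSoup

    soup = BeautifulSoup(
        requests.get("https://site-example.com/", timeout=10).text, "html.parser"
    )

    # Collect heading levels in document order: <h2> becomes 2, <h3> becomes 3...
    levels = [int(tag.name[1]) for tag in
              soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])]

    if levels.count(1) != 1:
        print(f"Expected exactly one H1, found {levels.count(1)}.")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            print(f"Skipped heading level: H{prev} is followed by H{cur}.")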

ALT attribute for images

The ALT attribute is used to describe the contents of an image file. This image attribute specifies the text that will be displayed on the web page if the image fails to load for any reason.

ALT is used by search engines to understand the contents of the images. It improves the experience of visually impaired users and those who have disabled images in their browsers. 

ALT provides Google with useful information about the image. Search engines then use this information to help determine the best image to return for a user's query.

ALT is also a hidden SEO opportunity, as it can earn better placement in search results. Write ALT text that is relevant to the image and covers important keywords, but avoid keyword stuffing.

We check whether the page has images without an ALT attribute or with an empty one. We implement ALT in an SEO-friendly way and flag any issues with irrelevant ALT text. In HTML it may look like this: <img src="url" alt="Your clear-cut image description">
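
This check can also be scripted. A minimal sketch; the URL is a placeholder, and the requests and beautifulsoup4 packages are assumed:

    import requests
    from bs4 import BeautifulSoup

    soup = BeautifulSoup(
        requests.get("https://site-example.com/", timeout=10).text, "html.parser"
    )

    for img in soup.find_all("img"):
        if not img.get("alt", "").strip():
            print("Missing or empty ALT:", img.get("src", "(no src)"))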
