Technical SEO

All the magic that happens in the background!


Technical SEO covers the practices that deal with crawling and indexation. The main issues we address are:

Crawling issues: We make sure search engine bots can discover all the pages we want to rank by managing:

  • the XML sitemaps (no 404s, no redirects, no URLs with canonical tags pointing elsewhere; only URLs that we want to rank should be in the sitemap)
  • the meta robots
  • the robots.txt file
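
A minimal robots.txt illustrating these controls might look like the sketch below. The paths and domain (example.com) are hypothetical; the real rules depend on the site's structure:

```text
# robots.txt — keep bots out of crawl-wasting paths,
# and point them at the sitemap of URLs we want ranked.
User-agent: *
Disallow: /cart/
Disallow: /search

Sitemap: https://www.example.com/sitemap.xml
```

For page-level control, a meta robots tag such as <meta name="robots" content="noindex, follow"> keeps an individual page out of the index while still letting bots follow its links.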

Crawl budget optimisation:

  • We compare the pages indexed by Google with the pages on the live website
  • We find old URLs that are still indexed by Google and de-index them
  • If we have access to server logs, we also analyse the hits made by Google's bots to see which pages Google visits most often
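
The log analysis above can be sketched as a small script. The log lines and URLs here are made up for illustration, and a real crawler check should also verify the client IPs, since anyone can fake a Googlebot user agent:

```python
import re
from collections import Counter

# Hypothetical sample of combined-format access log lines.
LOG_LINES = [
    '66.249.66.1 - - [10/May/2024:06:25:01 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/May/2024:06:25:09 +0000] "GET /old-page HTTP/1.1" 404 312 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/May/2024:06:26:00 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

# Pull the requested path out of the quoted request line.
REQUEST_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP')

def googlebot_hits(lines):
    """Count requests per URL made by clients identifying as Googlebot."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        match = REQUEST_RE.search(line)
        if match:
            hits[match.group(1)] += 1
    return hits

print(googlebot_hits(LOG_LINES).most_common())
```

Sorting the resulting counts shows which URLs Google spends its crawl budget on, and surfaces bot hits on 404s and old URLs that should be cleaned up.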

Hierarchy and Internal linking: The hierarchy should be clear. We make sure no page carries more than 300 links, that there are no redirected or "404 Not Found" links, and that all important content is reachable within 3 clicks.
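
The "3 clicks" check is a breadth-first search over the internal-link graph. A minimal sketch, using a hypothetical toy site map:

```python
from collections import deque

# Toy internal-link graph (hypothetical URLs): page -> pages it links to.
LINKS = {
    "/": ["/category", "/about"],
    "/category": ["/category/page-2", "/product-a"],
    "/category/page-2": ["/product-b"],
    "/about": [],
    "/product-a": [],
    "/product-b": [],
}

def click_depths(links, start="/"):
    """Breadth-first search from the homepage:
    minimum number of clicks needed to reach each page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depths(LINKS)
too_deep = [page for page, d in depths.items() if d > 3]
print(depths, too_deep)
```

In this toy graph the deepest page (/product-b) sits exactly 3 clicks from the homepage, so nothing is flagged; on a real crawl, anything in too_deep (or missing from depths entirely, i.e. orphaned) needs better internal linking.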

Duplications: Almost every website faces the issue of duplicate content, especially transactional ones. Some common types of duplication are harmless and ignored by Google. To correct faceted-navigation, content or meta-tag duplication, we use several methods to signal to Google which page variant we want indexed.
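
The most common of these methods is the canonical tag. A sketch, with hypothetical URLs:

```html
<!-- On a faceted variant such as /shoes?color=red&sort=price,
     point search engines at the version we want indexed: -->
<link rel="canonical" href="https://www.example.com/shoes">
```

Every parameter variant then consolidates its signals into the one canonical URL instead of competing with it.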

Pagination: Much paginated content on the web is not marked with pagination tags (rel="next" and rel="prev") that indicate to Google that the content is spread across multiple URLs.
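
The markup would look like the sketch below (hypothetical URLs). Note that Google announced in 2019 that it no longer uses these tags as an indexing signal, though they remain valid markup and other search engines may still read them:

```html
<!-- In the <head> of page 2 of a paginated category (/shoes?page=2): -->
<link rel="prev" href="https://www.example.com/shoes">
<link rel="next" href="https://www.example.com/shoes?page=3">
```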

Redirects: We want to avoid all non-friendly redirects, such as JavaScript, 302, 307 and meta-refresh redirects, as well as chained redirects (Google does not follow more than 5 redirects in a chain).
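
Detecting chains and loops can be sketched offline against a redirect map extracted from a crawl. The URLs and the 5-hop limit below follow the description above; the map itself is hypothetical:

```python
# Hypothetical redirect map from a site crawl: source URL -> target URL.
REDIRECTS = {
    "/old-a": "/old-b",
    "/old-b": "/old-c",
    "/old-c": "/final",
    "/loop-1": "/loop-2",
    "/loop-2": "/loop-1",
}

def redirect_chain(url, redirects, max_hops=5):
    """Follow a chain of redirects; flag loops and chains over max_hops."""
    chain = [url]
    seen = {url}
    while url in redirects and len(chain) - 1 < max_hops:
        url = redirects[url]
        chain.append(url)
        if url in seen:
            return chain, "loop"
        seen.add(url)
    if url in redirects:
        return chain, "too long"
    return chain, "ok"

print(redirect_chain("/old-a", REDIRECTS))
print(redirect_chain("/loop-1", REDIRECTS))
```

Every chain longer than one hop is a candidate for collapsing into a single 301 straight to the final destination.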

Structured Data: We advise using Schema.org markup where possible, or the Data Highlighter tool in Google Search Console.
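
A minimal Schema.org example in JSON-LD form, for a hypothetical product page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Markup like this makes the page eligible for rich results such as price and availability snippets, and can be validated with Google's Rich Results Test.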

Site Loading Speed: We optimise based on recommendations from Google's PageSpeed Insights and third-party tools such as WebPageTest and GTmetrix. We work on reducing image sizes, minifying the CSS, JavaScript and HTML code, cleaning the code of everything that is not necessary and, where possible, prioritising the loading of visible content first.
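
Prioritising visible content often comes down to how assets are referenced in the head. A sketch, with hypothetical file paths:

```html
<!-- Load render-critical CSS early; defer non-critical JavaScript
     so it does not block the first paint. -->
<link rel="preload" href="/assets/css/critical.css" as="style">
<link rel="stylesheet" href="/assets/css/critical.css">
<script src="/assets/js/app.min.js" defer></script>
```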

Multi Language SEO: For multi-language websites, correct usage of the hreflang tag is mandatory. We have worked with large multinational companies in over 70 countries, doing the mapping and generating all the XML files needed for setup and migrations.
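
A sketch of the hreflang annotations for a two-language site (hypothetical URLs); each variant must list itself and every alternate, and the same annotations can instead be carried in the XML sitemap:

```html
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/">
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr-fr/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">
```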

Different content served to Users vs Search Engines: We check whether users and search engine bots see the same content, and whether the bots can access the CSS and JavaScript files used to render the page.
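
The "can bots see the CSS and JavaScript" part of this check can be sketched with Python's standard-library robots.txt parser. The robots.txt content and paths below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that accidentally blocks the asset folders.
ROBOTS_TXT = """\
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot needs these files to render the page the way users see it.
for resource in ("/assets/css/main.css", "/assets/js/app.js", "/index.html"):
    allowed = parser.can_fetch("Googlebot", resource)
    print(resource, "allowed" if allowed else "BLOCKED")
```

Here the CSS and JavaScript directories are blocked, so Googlebot would render the page unstyled; the fix is to allow bots to fetch every asset the page depends on.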
