
What is Technical SEO

This article looks at the basics of technical SEO and answers some of the common questions that arise when striving for a high Google organic ranking. Getting the technical side of SEO right can mean the difference between a high-ranking site and a site that doesn't rank at all. Technical SEO isn't easy, and elements of it change regularly. We aim to explain, in a jargon-free way, which aspects you should (ask your website designer to) pay attention to when working on the technical structure of your website.

What is technical SEO?

Technical SEO refers to improving the technical aspects of a website in order to increase the ranking of its pages in the search engines. The pillars of technical optimisation are making a website faster, understandable for search engines and easier to crawl. Technical SEO is part of on-page SEO, which focuses on improving elements on your website to get higher rankings, whereas off-page SEO is about generating exposure for a website through other channels.

Other website and SEO articles that you may find beneficial include Web Marketing Benefits, Website Marketing Planning and the Top 5 Small Business Website Marketing Tips.


Why should you technically optimise your site?

Fundamentally, Google and other search engines want to present their users with the best possible results for their query. If you are searching for “stainless steel swimming pools”, Google wants to try its hardest to make sure the sites that you find organically are the best sites for “stainless steel swimming pools”. It doesn’t want unrelated sites to appear, as this would result in a poor search experience. To achieve this, Google’s robots crawl and evaluate web pages on a number of factors.

Some factors are based on the user’s experience, like how quickly a page loads (this is where you will see page load times of 3 to 5 seconds quoted as targets).

Other factors help search engine robots understand and identify what your website pages are about. By improving the technical aspects of your website, you help search engines crawl and understand your site. This is what, amongst other things, structured data does (a complex subject in its own right, covered in a separate comprehensive article). If technical SEO is done well, you might be rewarded with higher Google rankings, or even rich results and knowledge graph results.

If you make serious technical mistakes on your site, it works the other way around and can harm your search results. For example, by accidentally adding a trailing slash in the wrong place in your robots.txt file, you can block search engines entirely from crawling your site.
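As a minimal illustration of how small that mistake can be, compare the two robots.txt snippets below (the /private/ path is just a placeholder):

    # One stray slash blocks every robot from the ENTIRE site:
    User-agent: *
    Disallow: /

    # This only blocks robots from the /private/ directory:
    User-agent: *
    Disallow: /private/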

It’s really important to be aware that you shouldn’t focus on the technical details of a website just to please search engines. Fundamentally, for users your website should be clear, easy to use and fast. Creating a strong technical foundation often coincides with providing a better experience for both users and search engines.

The characteristics of a technically optimised website

A technically sound website is fast for users and easy for search engine robots to crawl. A correct technical setup helps search engines understand what a website is about and prevents confusion caused by, for example, duplicate content. Below we consider some important characteristics of a technically optimised website.

1. Website Speed

In today’s impatient, I-want-it-now society, web pages need to load fast because people don’t want to wait too long for a page to open. Research has shown that over fifty percent of mobile website visitors will leave if a web page doesn’t open within three seconds. So, in simple terms, if your website is slow people will get frustrated and try another website. Even though your site has ranked and been found, you end up missing out on the traffic.

Slow web pages offer a less than optimal experience; Google knows this, and therefore prefers web pages that load faster. So, with all things being equal, a slower-loading web page ends up further down the search results than a faster equivalent, resulting in even less traffic.

Wondering if your website is fast enough? There is a wide range of tools (such as Google’s PageSpeed Insights) that make it easy to test your site speed, and thousands of articles that provide pointers on what to improve.

2. Search engine crawlability

Search engines use robots to crawl (or spider) your website. These robots follow links to discover and explore the content on your website. A well-developed internal linking structure will make sure they understand what the most important content on your site is. There are several ways to guide and control robots:

  • Block them from crawling certain content if you don’t want them to go there.
  • Let them crawl a page, but tell them not to show this page in the search results.
  • Tell them not to follow the links on that page.

Robots.txt file – You can give robots directions on your site using the robots.txt file. It’s a powerful tool that should be handled carefully, because a small mistake might prevent robots from crawling (important parts of) your site. People sometimes unintentionally block their site’s CSS and JS files in the robots.txt file. These files contain code that tells browsers what your site should look like and how it works, so if they are blocked, search engines can’t find out if your site works properly.

The robots meta tag is a piece of code that you won’t see on the page as a visitor. It sits in the source code, in the so-called head section of a page. Robots read this section when they find a page; in it, they discover what they’ll find on the page and what they need to do with it.

If you want search engine robots to crawl a page but keep it out of the search results for some reason, you can tell them with the robots meta tag. You can also use it to instruct them to crawl a page but not follow the links on that page.
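As a minimal sketch, both instructions look like this in the head section of a page:

    <!-- Crawl this page, but keep it out of the search results: -->
    <meta name="robots" content="noindex, follow">

    <!-- Show this page, but don't follow any of the links on it: -->
    <meta name="robots" content="index, nofollow">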


3. Fix dead links

Whilst slow websites are frustrating, it can be even more annoying for a visitor to land on a page that doesn’t exist at all. When a link leads to a non-existing page on your site, people will encounter a 404 error page, and at that point your carefully planned user experience goes out of the window.

Importantly, search engines don’t like to find these 404 error pages either. In fact, search engines tend to find even more dead links than visitors do, because they follow every link they bump into, even when they are hidden.

A website is (or should be) a continuous work in progress: people make things and break things, which means that, unfortunately, most sites have at least some dead links. A technical SEO audit will involve using tools that can help you retrieve the dead links on your site.

To prevent unnecessary dead links, you should always redirect the URL of a page when you delete or move it. Ideally, you would redirect it to a direct replacement page. Redirect plugins are available (some at a premium price) that can make these redirects for you, which in some instances can be beneficial.
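As a minimal sketch of what such a redirect does, assuming an Apache-based server (the paths and domain are placeholders), a permanent 301 redirect can be added to the .htaccess file:

    # Permanently redirect a deleted page to its direct replacement
    Redirect 301 /old-page/ https://www.example.com/new-page/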

4. Avoid duplication, duplication

If you have the same content on multiple pages of your site, or even on other sites (for instance if you have a number of micro sites to support your main website), then there is the potential for search engines to get confused. If these pages show the same content, which one should rank best? Unable to choose, search engines might rank all of the pages with the same content lower.

Unfortunately, you might have a duplicate content issue without even knowing it. For technical reasons, different URLs can show the same content: for example, the same page may be reachable both with and without www. For a visitor this doesn’t make any difference, but for a search engine it does; it’ll see the same content on a different URL.

A technical solution to this issue is the canonical link element, with which you can indicate the original page, or the page you’d like to rank, in the search engines. A plugin like Yoast SEO adds self-referencing canonical links to all your pages, which helps prevent duplicate content issues that you might not even be aware of.
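As a minimal sketch (with a placeholder URL), the canonical link element sits in the head section of the duplicate page and points at the version you want to rank:

    <link rel="canonical" href="https://www.example.com/original-page/">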

5. Safe and secure

A secure website is a technically optimised website. Making your website safe for users and guaranteeing their privacy is a basic requirement nowadays. There are many things you can do to make your (WordPress) website secure, and one of the most crucial is implementing HTTPS.

  • HTTPS makes sure that no-one can intercept the data that’s sent between the browser and the site.
  • So, fundamentally, if a person logs in to your site, their credentials are safe.
  • An SSL certificate is required to implement HTTPS on your website.
  • Google acknowledges the importance of security and has made HTTPS a ranking signal: secure websites rank higher than unsafe equivalents.
  • To check whether your website uses HTTPS, look at the left-hand side of your browser’s address bar: a padlock means the site is secure. If you see the words “not secure”, you (or your website designer) have some work to do!
  • With our website design service, where we control the domain, we provide a free SSL certificate so you get HTTPS for free.
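Once an SSL certificate is in place, all traffic should be sent to the HTTPS version of the site. As a minimal sketch, assuming an Apache-based server with mod_rewrite enabled, a forced redirect in the .htaccess file might look like this:

    RewriteEngine On
    # If the request did not arrive over HTTPS, redirect it permanently
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]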

Want to pick up the phone and speak to us about your Strategy, Website, Marketing or Business Development project?
Call us on: 01733 361729
Email: solutions@bdolphin.co.uk

6. Structured data

Structured data helps search engines understand your website, your content, or even your business better. With structured data you can tell search engines exactly what kind of products or services you have on your site.

There is a fixed format (described on Schema.org) which details how you should provide this information, so that search engines can easily find and understand it. This helps the search engines place your website content within a bigger picture.

Implementing structured data also makes your content eligible for rich results: those results with stars or extra details that stand out at the top of the search results page.
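As a minimal sketch, structured data for a product might look like the JSON-LD snippet below, placed in the page’s HTML (the product name, description and brand are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Stainless Steel Swimming Pool",
      "description": "A made-to-measure stainless steel swimming pool.",
      "brand": { "@type": "Brand", "name": "Example Brand" }
    }
    </script>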

7. XML sitemap

At its simplest, an XML sitemap is a list of all the pages of your website. Imagine it as a roadmap that makes sure search engines don’t miss any important content on your site. The XML sitemap is often categorised into pages, posts, tags or other custom post types, and includes the number of images and the last modified date for every page.

Potentially, a website doesn’t need an XML sitemap: if the internal linking structure connects everything correctly, robots won’t need one. However, if your website doesn’t have a great structure, having an XML sitemap will be beneficial.
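As a minimal sketch, an XML sitemap with a single entry looks like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/stainless-steel-swimming-pools/</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
    </urlset>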

8. International websites

If your website targets multiple countries where the same language is spoken, for example the UK and the USA, search engines need a little help to understand which countries or languages you’re trying to reach.

Hreflang tags help with this: for each page, you can define which country and language it is meant for. This also overcomes a possible duplicate content problem: even if your USA and UK sites show exactly the same content, Google will know it’s written for different regions.
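As a minimal sketch, with placeholder URLs, the hreflang tags for a page served to both regions would sit in the head section of each version:

    <link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />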


FREE Website Performance Check

  • Speed plays an important part in website performance: how well does yours perform on desktop and mobile?
  • Is your site mobile responsive? If not, how many customers are you losing?
  • Is your site HTTPS?
  • With GDPR in place, is your site legally compliant?
  • Find out about loads more website performance issues.