DIGITAL HEALTH CHECK

SEO basics for councils in the UK and Ireland


In this article we look at how councils in the UK and Ireland are approaching the basics of Search Engine Optimisation (SEO). We focus on how they use sitemaps and robots.txt files to help search engines crawl and index their websites.

A sitemap and a robots.txt file are basic requirements for effective SEO and something all councils should have in place. Both are easy to set up, and there is very little excuse for not doing them well.

Before reviewing the information gathered from councils let’s consider the purpose of sitemaps and robots.txt in relation to SEO.

Sitemaps

A sitemap is a file which presents the structure of a website. This includes pages and content and the various relationships between them. Sitemaps are used by search engines to crawl websites more efficiently.

There are two types of sitemaps which can be used:

  • XML sitemaps are machine-readable files written in XML (Extensible Markup Language) for search engine crawlers. They are the preferred sitemap format for search engines such as Google.
  • HTML sitemaps look like regular web pages and help users navigate the website. They were once a popular way to improve a website’s navigation and provide links to all your pages in one place. They are used less now, but they can still help with SEO: they improve internal linking and offer an alternative layer of navigation for websites with many pages. As council websites tend to be large, it is easy to understand the popularity of HTML sitemaps.
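To illustrate the XML format, a minimal sitemap might look like the following sketch. The URLs are hypothetical; only the <loc> element is required for each entry, while <lastmod>, <changefreq> and <priority> are optional hints for crawlers.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is the only required child element -->
  <url>
    <loc>https://www.council.gov.uk/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.council.gov.uk/council-tax</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```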

Robots.txt

A robots.txt file helps search engines to identify what to crawl and what not to crawl on a website. The robots.txt file should be added to the root folder of a website (for example https://www.council.gov.uk/robots.txt). The robots file should also ideally link to the XML sitemap.
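As a sketch, a minimal robots.txt that allows all crawlers and links to the sitemap might look like this (the domain is the same hypothetical example used above):

```
# Allow all crawlers to access the whole site
User-agent: *
Disallow:

# Tell search engines where to find the XML sitemap
Sitemap: https://www.council.gov.uk/sitemap.xml
```

An empty Disallow line means nothing is blocked; the Sitemap line is how the robots file "links" to the sitemap as recommended above.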

Reviewing SEO basics for councils in the UK and Ireland

Do council websites have a sitemap.xml?            Yes: 324 (79.0%)   No: 86 (21.0%)
Do council websites have an HTML sitemap?          Yes: 101 (24.6%)   No: 309 (75.4%)
Do council websites have a robots.txt file?        Yes: 322 (78.5%)   No: 88 (21.5%)
Is the sitemap.xml referenced in the robots.txt?   Yes: 102 (24.9%)   No: 216 (52.7%)   N/A: 92 (22.4%)




34 councils (8.3%) that do not add a sitemap.xml include an HTML sitemap instead, while 51 (12.4%) add neither type of sitemap to their website.

About a quarter of all councils in the UK and Ireland (101) add an HTML sitemap, and 67 (16.3%) provide both a sitemap.xml and an HTML sitemap.

18 councils (4.4%) add neither a sitemap (XML or HTML) nor a robots.txt file. 2 of these councils include a reference in their page metadata to encourage search engines to index them. A similar metadata reference was also found on 37 other council websites.
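The metadata reference described above is most commonly a robots meta tag; assuming that is what these sites use, a typical example looks like:

```html
<!-- Placed in the <head> of each page; tells crawlers they may index
     the page and follow its links (this is also the default behaviour) -->
<meta name="robots" content="index, follow">
```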

The format of the robots file varies greatly from council to council. Some contain the bare minimum, simply allowing all search engines to crawl their sites. The majority allow search engines and then specify a list of items to disallow from being crawled, such as login pages and specific applications used on the site.
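A robots.txt of this fuller kind might look like the following sketch. The paths are illustrative, not taken from any specific council site:

```
User-agent: *
# Keep login and admin areas out of search engine crawls
Disallow: /login/
Disallow: /admin/
# Exclude a specific application used on the site
Disallow: /apps/payments/

Sitemap: https://www.council.gov.uk/sitemap.xml
```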

As mentioned earlier, the robots.txt file should ideally link to the website sitemap. Only 102 councils (24.9%) do this, while 216 (52.7%) have both files but omit the reference. Those that currently omit it are encouraged to add it so that search engines can find the sitemap more easily.

The problem with the robots.txt file is that it is generally set up at the launch of a site and then forgotten. There is a temptation to focus on the needs of the site at that time and never revisit the file as the website changes. Reviewing the robots.txt file should be a periodic task, particularly after any significant changes to the website. Leaving it unchecked may not cause major issues, but examples were found that would clearly benefit from an update.

Examples include:

  • Cambridgeshire County Council
    Robots.txt points to a temporary location containing only Lorem Ipsum placeholder text
  • The Royal Borough of Kensington and Chelsea
    Robots.txt points search engines to the sitemap.xml for Westminster City Council
  • Rugby Borough Council
    Robots.txt points search engines to a redirect location that is no longer available

Five top tips for councils around SEO basics

  1. Add a sitemap.xml and a robots.txt file to your website if you don’t already do so.
  2. Make sure that your robots.txt includes a current link to your website sitemap.xml.
  3. Take the time to consider what to include in your robots.txt file and disallow sensitive content such as login pages or parts of the server that may compromise security.
  4. Periodically review your sitemap.xml and robots.txt to make sure that they remain current and useful.
  5. Submit your sitemap.xml to search engines such as Google using webmaster tools such as Google Search Console.




SEO basics for council websites in the UK & Ireland – Research

You can view our research on SEO basics for council websites below. A variety of charts and a search facility can be found on the directory by selecting the Extensions option. Please get in contact if you have any feedback or want to report any updates to the research.