How to carry out a technical SEO audit: Tips for beginners


Technical SEO. Even its name sounds intimidating.

If you’re a regular reader of SEO websites, or have read any generalised SEO best practice guides, no doubt you’ll have come across many references to the importance of technical SEO.

Technical SEO, which refers to the elements of search optimisation that help search engines crawl and index your website, is often described as the “foundation” of search optimisation. It can also be – as the name implies – technical, and often technical SEO techniques will overlap as much with the work of a developer as they do with the work of a marketer.

However, that doesn’t mean that you need to be a technical specialist to carry out a technical SEO audit of your website – even if you’re a complete beginner. At last week’s Brighton SEO conference, Helen Pollitt, Head of SEO at Reflect Digital, presented a highly accessible introduction to technical SEO for beginners, packed with tips to get you started on your first SEO audit.

Here are some of her tips.

1. Checking your website

One big obstacle to spotting potential technical SEO issues is that most of the time, you’ll be accessing your website using the same device and the same browser – often a desktop computer, with a widely-used browser like Chrome or Safari.

But people will be accessing your website from all manner of different devices and browsers, and ideally, you’d like each of them to get the same great experience. So, the first tip that Pollitt gave was to “dust off Internet Explorer” – yes, really – and view your website in a browser that is still used by 3-7% of web users. How well does – or doesn’t – it perform?

Your second port of call should be to disable JavaScript, and again check your website to see what’s still working. Many people disable JavaScript to get rid of unwanted ads, or bandwidth-sucking applications.
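If you'd like a quick, repeatable version of this check, here is a minimal sketch – assuming Python with the requests library installed, with a placeholder URL and phrase you'd swap for your own – that fetches a page's raw HTML (roughly what a visitor or crawler without JavaScript receives) and confirms a key piece of content is there before any scripts run.

```python
# Minimal sketch: confirm that key content appears in the raw HTML,
# i.e. before any JavaScript has run. Assumes the `requests` library;
# the URL and phrase are placeholders.
import requests

URL = "https://www.example.com/"   # replace with a page on your site
MUST_CONTAIN = "Our services"      # text that should be visible without JS

html = requests.get(URL, timeout=10).text  # raw response - no JS is executed

if MUST_CONTAIN.lower() in html.lower():
    print("OK: key content is present in the raw HTML.")
else:
    print("Warning: key content only appears once JavaScript runs, or is missing.")
```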

On top of this, Pollitt noted that not all search engines are good at following links that require JavaScript to work – so the lovely animated navigation features on your site like drop-down and collapsible menus might not be benefitting your SEO at all.

Then, of course, there’s mobile. We are firmly in an age where mobile use unquestionably outstrips desktop, and Pollitt stressed that the version of your website that appears on mobile should be the best version of your site.

You should also avoid any uses of Flash like the plague: HTML5 will give you all of the same functionality, without any of the resulting issues with mobile.

A presentation slide reading 'No flash for a mobile site', with a cross next to the Flash logo and a tick next to the HTML 5 logo.

2. Crawling your website

Search engines use crawlers, also called search spiders, to index the contents of a website. However, crawlers are also tools that you can use to find out what’s going on “under the hood” of your website.

Screaming Frog is a great tool for this, and probably the most widely recommended by SEOs. Its SEO Spider is free for up to 500 URLs.

One handy thing you can do to check for technical SEO issues on mobile is to set your crawl bot to mimic Googlebot for Smartphone – though Pollitt warned that some websites may unfortunately block bots that are imitating Google, in which case you’ll need to fall back on a regular crawler.
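As a rough preliminary check before a full crawl, the sketch below (assuming Python with the requests library; the user-agent string is illustrative, so check Google's documentation for the current one, and the URL is a placeholder) compares how your server responds to a normal request versus one presenting a Googlebot for Smartphone style user agent.

```python
# Minimal sketch: compare a normal request with one presenting a
# Googlebot for Smartphone style user agent. The UA string below is
# illustrative (check Google's documentation for the current value)
# and the URL is a placeholder.
import requests

GOOGLEBOT_SMARTPHONE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0 Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)
URL = "https://www.example.com/"  # replace with a page on your site

normal = requests.get(URL, timeout=10)
as_bot = requests.get(URL, headers={"User-Agent": GOOGLEBOT_SMARTPHONE_UA}, timeout=10)

print("Normal request:   ", normal.status_code, len(normal.text), "bytes")
print("Googlebot request:", as_bot.status_code, len(as_bot.text), "bytes")
# A 403/429 or an empty body on the second request suggests the site
# blocks imitation bots - fall back to a regular crawl in that case.
```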

Other things to look out for when you crawl your website:

  • Make sure you have set your bot to crawl sub-domains, like m.domain.com
  • Assuming that your site uses HTTPS – which it should! – check for any HTTP resources that turn up during a crawl, which indicate that your site isn’t as secure as you thought it was
  • Check that none of your pages are returning anything other than a 200 status code (which is an “OK” status) – and especially that there aren’t any 404s (a quick status-code check is sketched below)
  • Check your directives (robots meta tags and related markup): noindex, nofollow, canonical tags, rel=next and rel=prev
  • Look out for orphaned pages (pages with no internal links pointing to them) and pages with very few internal links – by not linking to a page, you’re effectively telling Google it’s unimportant, since you don’t consider it worth directing visitors to. Your crucial pages should have plenty of internal links pointing to them
  • Look out for “spider traps” – pages that might have been created automatically without you intending to (faceted navigation and calendar widgets are common culprits). If a relatively small website takes a very long time to crawl, that’s a tell-tale sign that there might be pages on your site you didn’t know about.

Configuring your web crawler to crawl all subdomains.
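Here is the status-code check mentioned above – a minimal sketch assuming Python with the requests library; the URLs are placeholders, and in practice you would export the list from your crawl.

```python
# Minimal sketch: flag any URL that doesn't return a 200 status code.
# Assumes the `requests` library; the URLs are placeholders - in practice
# you'd export them from your crawl.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/old-page",
]

for url in urls:
    try:
        # HEAD is lighter than GET; allow_redirects=False also exposes 301/302s.
        # Some servers mishandle HEAD - switch to requests.get if results look odd.
        response = requests.head(url, allow_redirects=False, timeout=10)
        if response.status_code != 200:
            print(f"{response.status_code}  {url}")
    except requests.RequestException as error:
        print(f"ERROR  {url}: {error}")
```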

3. Using Google Search Console

Google Search Console is like a “dashboard” for webmasters and SEOs to check up on the status of their site in search. You’ll need to sign up in order to use it (it’s free!) and add your website to your Search Console, which involves verifying you own the site.

There are various ways to do this, including uploading an HTML file to the site, adding an HTML meta tag, or adding a Google Analytics tracking code or Google Tag Manager container snippet. Depending on your level of access and the tools you use, you might need a developer to carry out the verification.
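If you’ve verified via the HTML meta tag, a quick way to confirm it’s still in place (for example after a template change) is something like the sketch below – assuming Python with the requests library, with the homepage URL as a placeholder.

```python
# Minimal sketch: check the homepage HTML for the Search Console
# verification meta tag (name="google-site-verification").
# Assumes the `requests` library; the URL is a placeholder.
import requests

URL = "https://www.example.com/"  # replace with your homepage

html = requests.get(URL, timeout=10).text

if "google-site-verification" in html:
    print("Verification meta tag found.")
else:
    print("No verification meta tag found - check your other verification method.")
```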

The Notifications section of Google Search Console flags up developments with your site that you need to be aware of, such as index coverage issues it encounters while crawling your site, and manual actions (hopefully you won’t get any of these!).

Even if you’ve already used a crawler to check for issues on your site, Pollitt noted that the big advantage of using Google Search Console is that Googlebot can find pages that are linked elsewhere online that you might not know existed – whereas regular crawlers are confined to what’s linked to on your site.

You should also check your disavow file (a list of links you’ve asked Google not to take into account when assessing your site, e.g. spammy links) for any important, genuine links that might have been disavowed by accident, possibly because they seemed suspect at the time.

Disavowing links in Google Search Console

4. Submitting your XML sitemap

An XML sitemap is a file saved in the format .xml which lays out the structure of your website and helps search engines to crawl it. It can help indicate to search engines which pages on your site are important and how often they’re updated, as well as tell search engines about pages that exist which might not have been indexed.

Screaming Frog’s SEO Spider tool can generate an XML sitemap for you after it has finished crawling your site. Here is a step-by-step guide to using the SEO Spider to generate a sitemap.

If you have an existing XML sitemap – which can also be created via a plugin on a CMS like WordPress, or built manually – you can also use Screaming Frog to audit it for errors by uploading it in list mode (BuiltVisible has a step-by-step guide here). Check that no URLs in your sitemap return anything other than a 200 status code, and that no important pages are missing from it.

Uploading a sitemap in list mode on Screaming Frog
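As a lightweight alternative check, the sketch below (assuming Python with the requests library, a placeholder sitemap URL, and a single flat sitemap rather than a sitemap index) fetches your XML sitemap and flags any URLs that don’t return a 200.

```python
# Minimal sketch: fetch an XML sitemap, extract its URLs and flag any
# that don't return a 200. Assumes the `requests` library and a single
# flat sitemap (sitemap index files aren't handled); the URL is a placeholder.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # replace with your sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

print(f"{len(urls)} URLs found in the sitemap")
for url in urls:
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status}  {url}")
```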

Finally, upload your XML sitemap to Google Search Console to help Google’s crawlers understand your site. The Sitemaps Report in Search Console allows you to view and submit sitemaps, and check for processing errors.

5. Using robots.txt

Robots.txt, explained Pollitt, is a file on your website “for communicating with robots”. Not our future robot overlords, mind you, but web robots – most notably search engine crawlers.

A robots.txt file indicates which parts of a website web robots can and cannot crawl. Your site’s robots.txt file can be found at [yourdomain]/robots.txt.

Pollitt advised checking your robots.txt file to make sure you aren’t blocking crucial pages or resources. In particular, you must not block CSS and JavaScript files, as Google uses these to check whether your website works properly, and if they’re blocked, it won’t be able to render it as intended. Many SEO tools now also render webpages and use JavaScript, and would therefore encounter the same issues.

Be aware also that if you’ve blocked a page using robots.txt, search spiders won’t know whether you’ve added a noindex meta tag to the page’s HTML (to prevent it from being indexed), and it might still appear in search results, for example if another website links to it.
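A quick way to sanity-check your robots.txt rules against specific URLs is Python’s built-in parser, as in the sketch below – the domain, paths and user agent are placeholders you’d replace with your own (CSS and JavaScript paths are included precisely because, as noted above, they shouldn’t be blocked).

```python
# Minimal sketch: check whether a user agent is allowed to fetch key URLs
# and resources under your robots.txt rules, using Python's built-in parser.
# The domain and paths are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

checks = [
    ("Googlebot", "https://www.example.com/"),
    ("Googlebot", "https://www.example.com/assets/main.css"),
    ("Googlebot", "https://www.example.com/assets/app.js"),
]

for user_agent, url in checks:
    allowed = parser.can_fetch(user_agent, url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}  {user_agent}  {url}")
```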

6. Speeding up your site

Site speed is extremely important for a good user experience, and it is also a Google ranking factor on both desktop and mobile search (though at present, Google’s mobile search only penalises the very slowest websites).

Pollitt recommended the Pingdom Website Speed Test for checking desktop speed, and Google’s mobile speed checker for mobile.

The server is often the worst culprit for a slow website, and it’s something you’ll usually need to talk to a developer or your hosting provider about in order to fix.
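For a rough sense of whether server response time is part of the problem, a sketch like the one below (assuming Python with the requests library; the URLs are placeholders) times how long the server takes to respond – though the tools mentioned above will give you far more detail.

```python
# Minimal sketch: time the server's response for a few pages as a rough
# proxy for server speed. Assumes the `requests` library; the URLs are
# placeholders. Use the tools above for proper measurement.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

for url in urls:
    response = requests.get(url, timeout=30)
    # response.elapsed covers sending the request up to the response headers
    # arriving - dominated by server and network time, not page rendering.
    print(f"{response.elapsed.total_seconds():.2f}s  {response.status_code}  {url}")
```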

Other site speed fixes include storing some of your code in Google Tag Manager, or using a Content Delivery Network (CDN) – a geographically distributed group of servers working together to deliver content quickly. A CDN isn’t a substitute for web hosting, but it can relieve some of the pain points of traditional hosting, for example by using caching to reduce bandwidth.

To take your technical SEO auditing to the next level, download Econsultancy’s SEO Best Practice Guide to Technical SEO.

Econsultancy also offers training in SEO.


