Exploring the after-effects of Google’s 2018 Core Algorithm Update



Last March, Google’s new Search Liaison Danny Sullivan confirmed a “broad core algorithm update”. Although his tweets were typically vague as to the particulars, the search community has broken it down into two major areas, which I will explore in this article.

Websites that got hit were not necessarily penalised for low quality

Whenever a new update rolls out, site owners are quick to assume that its aim is to penalise low-quality websites and/or spammy content. Google’s recent March update, however, is aimed at improving how the organic search results look by giving relevant content a boost. This has resulted in two things:

Highly Relevant Content Has Benefited

Last April, John Mueller confirmed that the update is about relevance and has nothing to do with content quality. In other words, it is focused on improving the quality of the SERPs, not on penalising sites.

For instance, if your content provides a better answer to a given search query, then Google will rank it higher than pages that are not necessarily 100% related to that search query. In other words, highly specific and targeted content will outrank content that is more generalised or broad in scope. Given this, those who were negatively affected likely lost rankings for a range of long-tail searches, rather than seeing an across-the-board decline on all web pages.

Let’s look at an example. According to a Search Engine Watch analysis of Google UK results for the query “What’s the best toothpaste”, this Business Insider article (although not updated since October 2017) was pushed up to page one in the SERPs, whilst this Colgate page aimed at a US audience was pushed down to page five and this Amazon best-seller page was pushed down to page three.

Interestingly, during the weeks that followed the roll-out, a couple of more relevant websites that had no previous visibility also appeared on the first page of the SERPs.


It Does Not Benefit Duplicate (Or Near Duplicate) Content

Websites, regardless of their following, lost rankings on content that is duplicated from, or similar to, content on other sites.

Clearly, in this era of fake news, Google sees the need for more accountability for facts. Analysis from Rocket Mill has shown that, as a result, UK tabloid newspapers have lost out the most during the recent update.

Their research shows The Mirror lost 27% of organic search visibility, The Sun lost 21%, Mail Online and the Daily Express both lost 20%, and The Daily Record lost 24%. Even BuzzFeed saw a 30% drop in visibility.

Moreover, dictionaries and song lyrics were demoted in favour of Google’s featured snippets. Mobile-optimised and digital-friendly broadsheets, on the other hand, have gained visibility in both UK and US SERPs.

The same results were demonstrated in Marie Haynes’ analysis, in which big brands that rewrite content found on other sites all lost visibility.
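
To make “near duplicate” a little more concrete, below is a minimal Python sketch a site owner could use to compare two pieces of copy by word-shingle overlap (Jaccard similarity). It is purely illustrative: this is not how Google detects duplication, and the file names and the 0.5 threshold are arbitrary assumptions for the example.

    # Illustrative only: measure word-shingle overlap between two blocks of copy.
    # A high score suggests near-duplicate text; this is NOT Google's method.
    import re

    def shingles(text, n=5):
        # Lowercase, strip punctuation, and build overlapping n-word shingles.
        words = re.findall(r"[a-z0-9']+", text.lower())
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    def jaccard_similarity(text_a, text_b, n=5):
        a, b = shingles(text_a, n), shingles(text_b, n)
        if not a or not b:
            return 0.0
        return len(a & b) / len(a | b)

    # Hypothetical usage: compare your article against the page it may echo.
    score = jaccard_similarity(open("my_article.txt").read(), open("source_article.txt").read())
    print(f"Shingle overlap: {score:.2f}")
    if score > 0.5:  # arbitrary illustrative threshold, not an official cut-off
        print("High overlap - consider rewriting this page in your own words.")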

You Can’t “Fix” Lost Rankings


Google was clear: there is currently no way to recover visibility lost in the March update. On the plus side, however, this does not mean that you were doing something wrong. It only means that your website is not entirely relevant to the search queries in question.

Moreover, Google most likely recognised that one (or more) of your competitors has provided much better content than what’s available on your website.

The only remedy to this is to improve your website. Here are some basics to get you started.

  • Don’t panic; run audits. In the event of visibility loss, rule out penalties and other causes first. Check whether you have any technical mishaps (e.g. broken links, page-speed issues, etc.) and fix them; a minimal broken-link check is sketched after this list. If your site is all good on that front, then the lost rankings could well be down to the recent update. Keep an eye on your rankings for further changes.
  • Follow and discuss with the search community. When an algorithm changes, plenty of other websites will be affected. Monitor other sites in your industry or niche and confirm that what your site is experiencing isn’t an isolated case.
  • Read Google’s Quality Raters’ Guidelines. Although it’s 160 pages long and hardly a riveting read, you’ll likely glean some valuable insight into how you can improve your site content.
  • Analyse your competitors. If possible, have someone else work on this so you get an impartial comparison of where your site stands among your competitors and how else you can improve. Among other points, find out whether there is content (e.g. guides, videos, reviews) that you could produce so your audience will come to your site instead.
  • Improve current content. If you have existing content that can be improved (e.g. by adding more data, turning text into video, or updating findings and conclusions), make sure you beef it up, as Google is more likely to favour this over older, out-of-date content.
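
As a starting point for the audit step above, here is a minimal Python sketch that fetches a single page and flags links returning an error status. It assumes the requests and beautifulsoup4 packages are installed; the URL is a hypothetical placeholder for a page on your own site, and a proper technical audit would of course go much further than this.

    # Minimal broken-link check for one page - a starting point, not a full audit.
    import requests
    from urllib.parse import urljoin
    from bs4 import BeautifulSoup

    START_PAGE = "https://www.example.com"  # placeholder: replace with a page on your site

    response = requests.get(START_PAGE, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    for anchor in soup.find_all("a", href=True):
        url = urljoin(START_PAGE, anchor["href"])
        if not url.startswith("http"):
            continue  # skip mailto:, javascript:, fragment links, etc.
        try:
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            print(f"Broken or unreachable link: {url} (status: {status})")

Run something like this against a handful of key pages; anything it flags is worth fixing before attributing a ranking drop to the algorithm update itself.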

Take a look at Monitor Backlink’s strategy for recovering from the broad core update. To sum up, they updated valuable (i.e. long-form) blog posts that had lost rankings and wrote better content for high-quality keywords. Their recovery was quick, as the updated blog posts soon climbed back up the SERPs.

Work On Your Content

To reiterate, this recent update is not a quality issue; rather, it’s a relevance issue. Make sure that your content is relevant to the keywords you’re ranking for. Update your current content or create new content that is immensely better than your competitors’.

It’s no secret that Google only wants to provide better search results to its users and, as such, will keep refining its algorithm. If your M.O. has always been providing high-quality, valuable content to your target audience, then it won’t be hard for you to bounce back from this update.




