How SEO is changing in 2020

Every digital marketing department is scrambling to respond to the core algorithm updates that rolled out in early 2020. Are you up to speed with the latest?

On January 13, 2020, Google began rolling out the January 2020 Core Update after giving just one hour’s notice. Google’s search algorithms exist to search through the relevant indexes and deliver results for search queries; they guide Google in ranking web pages. The algorithms used to be updated rarely, but spammy content from website owners and advances in technology have made regular updates necessary. SEO analysts, online marketers and content developers find it hard to keep up and to understand the algorithms’ patterns.

Olivia Hernandez, Marketing Lead at Search Recon

A number of websites were affected by the January 2020 Core Update. Google advised website owners that if they start losing rankings they don’t need to do anything other than keep “great content” on their websites. But as Olivia Hernandez of the agency Search Recon explains: “Businesses should keep in mind that Google’s comment relates to the latest update. However, a series of updates has rolled out over recent months and years – and if you’re still making use of outdated tactics, now is a good time to stop and reconsider your game plan. It is amazing to see how some sites regain their traffic when they make appropriate changes.”

Importance of Google updates

Many small updates go unnoticed, but sometimes Google releases updates that have a huge effect on website rankings and penalize sites that run against its policies. These need to be analyzed closely to understand the fluctuations in Google rankings and in websites’ organic traffic, and to adjust SEO strategies accordingly.

Latest Google algorithm updates

Here are the latest major Google algorithm updates; understanding them can help SEO analysts, online marketers and content writers adjust their SEO strategies.

Broad core algorithm updates

On March 9, 2018, Google announced via Twitter a major update that it called the Broad Core Algorithm Update.

The aim of the update was to improve search results and help pages that had been under-rewarded despite offering great user value, encouraging their owners to keep creating quality content. When the update rolled out, it automatically down-ranked some sites, and there was little those sites could do to restore their rankings.

Maccabees update

The Maccabees update unsettled many site owners in December 2017 because it was designed to catch pages built around large sets of long-tail keyword permutations. It cost hundreds of websites up to 30% of their traffic because they had multiple pages stuffed with keyword permutations. Most of the sites affected by Maccabees were affiliate, e-commerce, real estate and travel websites. The reasoning behind the update was that long-tail keywords are huge traffic drivers, and Google wanted to make those search results more relevant.

History of major Google algorithm changes

The following major updates over the years caused shock waves, as some sites gained good rankings while others lost them.

Panda

Google’s Panda update was released in February 2011 with the aim of returning higher-quality sites near the top of search results and lowering the rank of websites that had scant or poor-quality content. Panda’s search filter is updated quite frequently and sites that make the required changes escape the filter.

Panda targets plagiarized content, thin content, duplicate content, user-generated spam and keyword stuffing. It assigns quality scores to web pages based on content quality, and those scores feed into how the pages rank in the SERPs. To comply, website owners should regularly check their websites for plagiarized content, scant content and keyword stuffing using tools such as Copyscape and Siteliner.
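As a rough illustration of what such an audit involves (and not a reproduction of the tools named above), the sketch below fetches a couple of hypothetical page URLs, then flags pages with very little text or near-duplicate bodies. The URLs, the 300-word threshold and the 0.9 similarity cut-off are assumptions chosen for the example.

```python
# Illustrative sketch only: flag thin and near-duplicate pages on a small site.
# The URLs, the 300-word threshold and the 0.9 similarity cut-off are assumptions.
from difflib import SequenceMatcher
from html.parser import HTMLParser
from itertools import combinations
from urllib.request import urlopen


class TextExtractor(HTMLParser):
    """Very crude HTML-to-text conversion, just for the sake of the example."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)


def page_words(url):
    extractor = TextExtractor()
    extractor.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
    return " ".join(extractor.chunks).split()


urls = ["https://example.com/", "https://example.com/about"]  # hypothetical pages
texts = {url: page_words(url) for url in urls}

# "Thin content" heuristic: very few words on the page.
for url, words in texts.items():
    if len(words) < 300:
        print(f"Thin page ({len(words)} words): {url}")

# Near-duplicate heuristic: high text similarity between two pages.
for (u1, w1), (u2, w2) in combinations(texts.items(), 2):
    ratio = SequenceMatcher(None, " ".join(w1), " ".join(w2)).ratio()
    if ratio > 0.9:
        print(f"Possible duplicate content: {u1} vs {u2} ({ratio:.2f})")
```

A dedicated tool or a proper crawler will do a far better job; the point is simply that thin and duplicate pages can be surfaced systematically rather than by eye.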

Penguin

Penguin was released in April 2012 with the aim of filtering out websites that use spammy links, links with over-optimized anchor text and irrelevant links to boost their rankings in the SERPs. Spammy links are links that are bought or obtained from sites whose sole purpose is to sell links that boost Google rankings. Penguin checks the quality of backlinks and down-ranks sites that have low-quality links. Website owners can comply by regularly auditing their backlinks for quality and tracking the growth of their link profile with tools such as SpyGlass.
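To give a sense of what a very basic backlink audit might look like outside any particular tool, the sketch below reads a hypothetical CSV export with `source_domain` and `anchor_text` columns and flags links from a hand-maintained blocklist as well as exact-match anchors on a money keyword. The file name, column names, blocklist and keyword are all assumptions for the example.

```python
# Illustrative sketch only: flag suspicious backlinks from a hypothetical CSV export
# with columns "source_domain" and "anchor_text". The blocklist and the
# exact-match keyword below are assumptions, not data from any real tool.
import csv

BLOCKLISTED_DOMAINS = {"cheap-links.example", "paid-directory.example"}
MONEY_KEYWORD = "best running shoes"   # anchor text we suspect is over-optimized

with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = row["source_domain"].strip().lower()
        anchor = row["anchor_text"].strip().lower()

        if domain in BLOCKLISTED_DOMAINS:
            print(f"Blocklisted source: {domain} -> '{anchor}'")
        elif anchor == MONEY_KEYWORD:
            print(f"Exact-match anchor (possible over-optimization): {domain}")
```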

Hummingbird

Hummingbird’s significance lies in its ability to rank pages for the meaning of the complete query instead of matching individual terms within it. Even when a page doesn’t contain a query’s exact terms, Hummingbird can still rank it well in the SERPs if its content matches the query’s intent. The update targets keyword stuffing and low-quality content. Website owners can adapt by deepening their keyword research and focusing on conceptual queries, related queries, co-occurring words and synonyms. Google Autocomplete and Google Related Searches will help.

Pigeon

Pigeon was released in 2014 with the aim of improving the rankings of local listings based on the user’s location. It is linked to Google Maps and ties the traditional core algorithm to the local algorithm. This update targets poorly optimized on-page and off-page SEO. Website owners have to improve both: they can start with on-page SEO and then adopt the best off-page SEO methods (e.g. getting into local listings) to rank high in Google’s SERPs.

Mobile-friendly update

The Google mobile-friendly update, first released in April 2015, aims to boost mobile-friendly pages in mobile search results while filtering out or down-ranking pages that are not optimized for mobile. It targets poor mobile user interfaces and a lack of mobile optimization. Website owners should make their pages usable on mobile devices and reduce loading times. Google’s mobile-friendly test is a good tool to use.
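As a quick, rough sanity check (not a substitute for Google’s mobile-friendly test or a real performance audit), the sketch below fetches a hypothetical URL, times the download and looks for a responsive viewport meta tag. The URL and the 3-second threshold are assumptions.

```python
# Illustrative sketch only: a rough mobile-readiness sanity check.
# The URL and the 3-second threshold are assumptions; this is no substitute
# for Google's mobile-friendly test or a proper performance audit.
import time
from urllib.request import urlopen

URL = "https://example.com/"   # hypothetical page

start = time.monotonic()
html = urlopen(URL).read().decode("utf-8", errors="ignore")
elapsed = time.monotonic() - start

print(f"Fetched {len(html)} bytes in {elapsed:.2f}s")
if elapsed > 3.0:
    print("Warning: the page took more than 3 seconds to download.")

if 'name="viewport"' not in html:
    print("Warning: no responsive viewport meta tag found.")
```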

RankBrain

RankBrain is a machine-learning system released in October 2015 with the aim of understanding queries better, processing search results more efficiently and providing more relevant content to the user. It targets poor user experience, irrelevant features on the web page and insubstantial content. Website owners can survive RankBrain by analyzing the competition and optimizing web pages for comprehensiveness and their content for relevance. Tools such as SpyFu and SEMrush are useful for such analysis.

Possum

The Possum update, first released in September 2016, is the most significant local-search update since Pigeon. It focuses on ranking web pages for local searches. A business located close to where the user is searching from is likely to appear at the top of the local search results, including businesses just outside the physical city limits. Website owners can benefit by expanding their keyword lists and performing location-specific rank tracking.

Fred

Fred was released in March 2017. It targets content that is thin, low-quality, ad-centered or affiliate-heavy, i.e. pages that violate Google’s webmaster guidelines. If a web page exists mainly to drive traffic and generate revenue rather than to help its audience, it is likely to suffer from the Fred update. Websites can adapt to Fred by reviewing their content against Google’s Search Quality Guidelines and removing all thin content. Any website that runs ads should make sure they appear on pages with high-quality content that is useful to users.

Survival through compliance

Algorithm updates are Google’s way of encouraging website owners to maintain the trust of their audience by providing relevant and useful content. They discourage the use of tricks and ruses to gain a high rank in the SERPs. Website owners who are serious about their business must comply in order to rank well.


