SEO: 7 Reasons Your Site’s Indexation Is Down


Misuse of page removal tools at Bing Webmaster Tools and Google Search Console can lower overall indexation of a site.

Without indexation, there is no possibility of ranking in natural search results. When indexation numbers drop, there are a handful of potential reasons.

Site Speed

Increasingly, slow page loads masquerade as 500 server errors in Google Search Console and Bing Webmaster Tools, and those errors drag down indexation.

When search engine crawlers can’t access a page at all, or can’t load it within the maximum time allotted, the failure registers as a mark against that page. With enough failed crawl attempts, search engines will demote the page in the rankings and eventually remove it from the index. When enough pages are affected, it becomes a sitewide quality issue that can erode rankings for the entire site.
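A quick way to see what crawlers are seeing is to request a page from the command line and check the status code and total load time. A minimal spot check with curl, using a placeholder URL:

curl -o /dev/null -s -w "HTTP %{http_code} in %{time_total}s\n" "https://www.example.com/category/widgets"

A 5xx status code, or load times stretching past a few seconds across many URLs, mirrors the failed crawl attempts described above.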

Duplicate Content

There’s no value to a search engine in indexing two or more copies of the same page. So when duplicate content starts to creep in, indexation typically starts to drop. Rather than deciding which of several identical-looking pages should be indexed, search engines may pass on the whole group and index none of them.

This extends to very similar pages as well. For example, if the browse grids for two subcategories share 75 percent of the same products, there’s little upside for a search engine in indexing both.

Duplicate content can also be introduced accidentally, when pages that are genuinely different look identical or nearly identical to search engines because they lack the unique signals engines rely on: distinct title tags, headings, and indexable body copy. This plagues ecommerce sites in particular, because browse grids can start to look very similar when the reason each one exists isn’t clearly stated in the on-page copy.
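As an illustration, the hypothetical markup below gives two near-identical subcategory pages distinct, indexable signals; the store name and paths are placeholders:

<!-- /outdoor/patio-chairs -->
<title>Patio Chairs | Example Store</title>
<h1>Patio Chairs</h1>

<!-- /outdoor/patio-benches -->
<title>Patio Benches | Example Store</title>
<h1>Patio Benches</h1>

Even if the two grids share most of their products, distinct titles, headings, and a sentence or two of unique copy give search engines a reason to treat the pages as separate and index both.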

New Architecture or Design

Changes to a site’s header and footer navigational structures often impact categories and pages. When areas of the site are removed from those sitewide navigational elements, search engines demote the value of those pages because they receive fewer internal links. Demoted value can result in deindexation.

Likewise, changes in design can affect indexation if the amount of content on the page is reduced or the text is suddenly concealed within an image as opposed to being readily indexable as plain HTML text. As with duplicate content, a page can have value that isn’t readily apparent to search engines; make sure it’s apparent via indexable text to retain indexation.
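As a rough sketch of the difference, with placeholder file names and copy:

<!-- Not readily indexable: the message lives inside an image -->
<img src="/banners/spring-sale.png" alt="Spring sale">

<!-- Readily indexable: the same message as plain HTML text -->
<h2>Spring Sale: 20 Percent Off Patio Furniture</h2>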

New URLs

Ecommerce platforms can make unexpected changes to URLs based on changes to taxonomy or individual product data.

When a URL changes but the content does not, search engines face a dilemma. Do they continue to index the old URL, which they already know how to rank? Do they index the new URL, with which they have no history? Do they index both, or neither? All four outcomes are possible. If both versions are indexed, indexation doubles; if neither is, it falls to zero.

Page Deletion or Redirection

Likewise, when a page is removed from the site, or redirected to another URL, the number of viable URLs for that site decreases. In that case, you’d expect indexation to decrease as well.
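When a page does have to move, a permanent redirect at least points crawlers, and the old URL’s accumulated value, to its replacement. A minimal sketch for an Apache .htaccess file, with placeholder paths:

Redirect 301 /old-page https://www.example.com/new-page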

Robots.txt and Meta Robots Noindex

The robots commands have great power to affect crawl and indexation rates. They are always the first and easiest place to look when you have concerns about indexation.

Robots.txt is an archaic plain text file that tells search bots which areas of the site they may crawl and which they should stay out of. Each bot can choose whether to obey the file, though the major search engines usually respect it. Thus a decrease in indexation can result from newly disallowing bots from crawling certain files and directories.
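For example, a robots.txt file like the hypothetical one below tells compliant bots to stay out of entire sections of the site; the paths are placeholders:

User-agent: *
Disallow: /checkout/
Disallow: /search/
Disallow: /category/clearance/

Disallowing a directory that was previously crawlable is a common, and easily overlooked, cause of shrinking indexation.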

Similarly, the noindex directive of the robots meta tag instructs bots not to index an individual page. The content will still be crawled, but the major search engines typically obey the command not to index, and therefore not to rank, pages that bear the noindex stamp.
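The directive itself is a single tag in the page’s head. A typical example:

<meta name="robots" content="noindex">

A variant such as content="noindex, follow" keeps the page out of the index while still allowing its links to be followed.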

Page Removal Tools

Last but not least, Google Search Console and Bing Webmaster Tools offer page removal tools. These tools are powerful and effective: URLs submitted there will be removed from the index, provided they meet the requirements stated by the engines.

However, it’s easy to remove too much and accidentally deindex large swaths of the site. After checking the robots.txt file and meta robots tags, make these tools your second stop to check for any recent manual removals.


