Google Downgrades Nofollow Directive. Now What?

On March 1, Google will no longer consider nofollow attributes as commands. Instead, they will be hints, much like canonical tags.

Until now, nofollow attributes have been a protective barrier between your site’s authority and the potentially questionable sites it links to. It’s the equivalent of telling Google, “Hey, I don’t know this guy; I can’t vouch for him.”

For example, anyone can leave a comment on a product page or blog with a link to her own site. You wouldn’t want that link to damage, by association, your reputation and authority.

Placing a nofollow attribute on a link’s anchor tag or in a page’s meta robots tag has always been a reliable tool for a discipline — search engine optimization — that deals in gray areas.
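
For reference, here is what both placements look like. The syntax is standard; the URL is just a placeholder:

  <!-- Link-level nofollow: applies only to this one link -->
  <a href="https://example.com/page" rel="nofollow">Example link</a>

  <!-- Page-level nofollow in the meta robots tag: asks bots not to follow any link on the page -->
  <meta name="robots" content="nofollow">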

Some sites use nofollow links in another way: to limit the indexation of internal pages with no organic-search value. This tactic could be effective if every link to the page included the nofollow directive. However, if even one “followed” link found its way to a page that was linked elsewhere with nofollow attributes, that page could be included in the index.

Regardless, all that changed last fall with Google’s announcement that it would downgrade the nofollow directive to a hint. At that time, Google also introduced two new attributes for link anchor tags only: ugc (for user-generated content, such as reviews and comments) and sponsored (for links in ads).

If you haven’t already, review your site’s nofollow attributes by March 1 to determine whether you need to use other methods to control link authority and indexation — see “Restricting Indexation,” below.

Protecting Links

You can use nofollow, ugc, and sponsored attributes to hint that you don’t want the link to pass authority. But remember that it’s just a request, not a command.
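
The markup is identical for all three attributes; only the rel value changes, and values can be combined with a space. The URLs below are placeholders:

  <a href="https://example.com/commenter-site" rel="ugc">Link left in a review or comment</a>
  <a href="https://example.com/advertiser" rel="sponsored">Paid or affiliate link</a>
  <a href="https://example.com/commenter-site" rel="ugc nofollow">Both hints on a single link</a>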

Affiliate sites once used 302 redirects (“moved temporarily”) to strip authority from their links. The authority-stripping value is questionable now, however, since Google declared a couple of years ago that 302 redirects pass as much link authority as 301s (“moved permanently”).

Now the only foolproof way to avoid passing link authority to questionable sites is to remove the links. For example, if your site suffers from review or comment spam, where visitors post irrelevant links to their own sites, you could remove the offending comments or reviews. If the volume is too high, consider eliminating comments or reviews altogether.

Unfortunately, that would also prevent legitimate customers from submitting reviews and comments that could boost your relevance to search engines.

If the content is relevant but you don’t want to vouch for included links, consider removing the anchor tag that forms the link. Such a drastic step, however, is necessary only if you know you’re linking to spammy sites, intentionally or not.
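
In practice, removing the anchor tag means unwrapping the link so that only its text remains. A simplified before-and-after, with a placeholder URL:

  <!-- Before: a live link that could pass authority -->
  <p>Great post! Check out <a href="https://spammy-site.example.com">my site</a> for more.</p>

  <!-- After: the anchor tag is gone; only plain text remains -->
  <p>Great post! Check out my site for more.</p>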

Restricting Indexation

It’s always best — especially now that nofollow attributes are hints — to use a method that search engines will interpret as a command. The only surefire way to prevent a page from appearing in Google’s index is to remove it from your site or 301 redirect its URL.

Otherwise, here are four options:

  • Meta robots noindex tag. Placing this meta tag in the head of a page’s HTML (shown in the snippet after this list) directs search engines not to index that page. They have to crawl the page to discover the tag, though, and continue to crawl it to confirm the tag remains in place. Thus pages with noindex tags still waste crawl budget, limiting the number of new pages that bots can discover with each crawl, even though they don’t get indexed.
  • Robots.txt disallow command. Robots.txt is a text file at the root of your site. Including a disallow directive for a page or group of pages (also shown in the snippet after this list) prevents search engine bots from even crawling them. It stops new indexation and preserves crawl budget, but it can take a long time for already-discovered pages to be purged from the index.
  • Password protection. Bots don’t fill out forms or use login credentials. So adding password protection would stop the crawl and prevent indexation. It’s too extreme for most ecommerce sites because it places a barrier between the products and customers. But it is an option for some forms of content and is essential for account and cart pages.
  • Request removal. In Google Search Console, you can submit a request to remove a URL from the index. If approved, the removal is temporary, lasting just six months.
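
For the first two options, the syntax is short. The paths below are placeholders; substitute the pages you actually want kept out of the index:

  <!-- Option 1: meta robots noindex tag, placed in the page's <head> -->
  <meta name="robots" content="noindex">

  # Option 2: robots.txt entries blocking a directory and a single page
  User-agent: *
  Disallow: /internal-search/
  Disallow: /print-version.html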


