Robots.txt tip from Bing: Include all relevant directives if you have a Bingbot section


Frédéric Dubut, a senior program manager at Microsoft working on Bing Search, said on Twitter Wednesday that when you create a specific section in your robots.txt file for Bing's Bingbot crawler, you should make sure to list all the default directives in that section as well.

Specify directives for Bingbot. “If you create a section for Bingbot specifically, all the default directives will be ignored (except Crawl-Delay),” he said. “You MUST copy-paste the directives you want Bingbot to follow under its own section,” he added.

What does it mean? This probably means Bing has heard from a number of sites complaining that Bingbot is crawling areas of their websites they do not want crawled. It’s likely some webmasters assumed that if they gave Bingbot a few specific instructions, it would still follow the rest of the default directives they had not repeated. Instead, if your robots.txt file has a section for Bingbot, Bingbot will only follow the directives you’ve specifically listed in that section (plus Crawl-delay, per Dubut). If there is no Bingbot-specific section, Bingbot will follow the default directives.
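For example, with a robots.txt file along these lines (the paths here are hypothetical), Bingbot would obey only the Disallow rules under its own section, so the default Disallow rule has to be copied there; per Dubut, the Crawl-delay from the default section is the one directive Bingbot would still pick up:

User-agent: *
Disallow: /private/
Crawl-delay: 5

User-agent: Bingbot
Disallow: /private/
Disallow: /staging/

Other crawlers that do not have their own section would continue to follow the default rules.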

Why it matters. Make sure that when you set up your robots.txt file, all the search engine crawlers can efficiently crawl your site. If you set up directives for blocking, crawl delays or anything else, make sure all the search engine crawlers are actually following them. They may not if the file has syntax errors, if you do not follow each engine’s protocol or if the crawlers have trouble accessing the file.
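One quick way to sanity-check how a given crawler resolves your rules is to run the file through a standard robots.txt parser, such as Python’s urllib.robotparser. This is only a rough sketch using the hypothetical rules above, and a generic parser does not reproduce every engine’s quirks (it does not apply Bing’s Crawl-delay exception, for instance), but it does make the section-matching behavior visible:

from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the example above.
rules = """\
User-agent: *
Disallow: /private/
Crawl-delay: 5

User-agent: Bingbot
Disallow: /private/
Disallow: /staging/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Bingbot matches its own section, so only the rules listed there apply to it.
print(parser.can_fetch("Bingbot", "https://example.com/staging/page"))    # False
print(parser.can_fetch("Bingbot", "https://example.com/private/page"))    # False

# Crawlers without a dedicated section fall back to the default (*) rules.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.crawl_delay("Googlebot"))                                    # 5

# This parser returns None here; per Dubut, Bingbot itself would still
# honor the Crawl-delay set in the default section.
print(parser.crawl_delay("Bingbot"))                                      # None

Each engine’s own robots.txt tester remains the authoritative check.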

For more on setting up a robots.txt for Bing, see the help documents.


About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.


