The MSNBot, or Bingbot, crawls your website to index its content and show the results in the Bing search engine.
Bing supports the directives of the Robots Exclusion Protocol (REP), which can be listed in a site's robots.txt file. The bot's crawl rate can be controlled through this file; to slow it down, add the following lines:
User-agent: msnbot
Crawl-delay: 1
Bing recommends using the lowest crawl-delay value that still meets your needs. Here are the values you can use and the crawl rates they correspond to:
- No crawl delay set – Normal
- 1 – Slow
- 5 – Very Slow
- 10 – Extremely Slow
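If you want to confirm that your robots.txt parses the way you expect before deploying it, Python's standard library can read crawl-delay directives. This is a small sketch, not an official Bing tool; the rules mirror the example above:

```python
from urllib.robotparser import RobotFileParser

# The same directives shown earlier, as they would appear in robots.txt
rules = [
    "User-agent: msnbot",
    "Crawl-delay: 1",
]

parser = RobotFileParser()
parser.parse(rules)

# Report the crawl delay the parser extracted for the msnbot user agent
print(parser.crawl_delay("msnbot"))
```

Running this prints `1`, confirming the directive is well-formed and associated with the right user agent.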
An alternative way to manage Bing's crawl rate for your website is to register your site with Bing Webmaster Tools and configure the crawl settings there.