If you want to allow SharePoint 2010 or 2013 to crawl your web site, add the following to your robots.txt file:
User-agent: MS Search 6.0 Robot
Disallow:
Even though the crawler sends Mozilla/4.0 (compatible; MSIE 4.01; Windows NT; MS Search 6.0 Robot) as its user agent string, that full string is not what you should match against in robots.txt — only the name token, MS Search 6.0 Robot. Logical… nah, but it is what it is.
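You can see the same token-vs-full-string behavior with Python's standard-library robots.txt parser, which matches entries against the name token rather than the whole user agent string. A minimal sketch (the example.com URL is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Block everyone by default, but allow the SharePoint crawler by its
# robots.txt name token (not its full user agent string).
robots_txt = """\
User-agent: *
Disallow: /

User-agent: MS Search 6.0 Robot
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

url = "http://example.com/page"

# The name token matches the dedicated entry, so crawling is allowed:
print(rp.can_fetch("MS Search 6.0 Robot", url))   # True

# The full user agent string the crawler actually sends starts with
# "Mozilla/...", so it falls through to the catch-all entry and is blocked:
ua = "Mozilla/4.0 (compatible; MSIE 4.01; Windows NT; MS Search 6.0 Robot)"
print(rp.can_fetch(ua, url))                      # False
```

This is why the robots.txt entry has to name MS Search 6.0 Robot rather than the Mozilla-prefixed string from the server logs.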
Total cost to figure this out: 6h
Reference: The SharePoint Server crawler ignored directives in Robots.txt