Quote:
Originally Posted by Chop Smith
If this was in Celtic's robots.txt file, would it have caused his problem?
"User-agent: *
Crawl-Delay: 20"
I don't know why that would have caused a problem, unless it was misunderstood or mistaken for something else. All it should do is tell a bot how long to wait between requests. Crawl-Delay only works when a bot supports it, which I believe includes the bots for some search engines: Yahoo, MSN, Ask, etc. (It's not supported by Google's bot, I believe... anyone know for sure?)
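For what it's worth, you can check how a parser reads that directive yourself. Here's a small sketch using Python's standard-library `urllib.robotparser` (the bot name "SomeBot" is just a placeholder) fed the exact two lines from the quote above:

```python
from urllib import robotparser

# Parse the robots.txt content quoted above.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-Delay: 20",
])

# A well-behaved bot that supports Crawl-Delay would wait this many
# seconds between requests; bots that don't support the directive
# simply ignore it, which is why it can't really hurt anything.
print(rp.crawl_delay("SomeBot"))  # 20
print(rp.can_fetch("SomeBot", "/any/page.html"))  # True - nothing is disallowed
```

Note that `crawl_delay()` returns `None` for agents with no matching rule, so even a bot that parses the file but ignores the delay just falls back to its own crawl rate.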
I could see someone getting upset at things you might do with the noindex, nofollow, or Disallow directives. But a crawl delay of 20 seconds can't be used to do anything nasty or tricky as far as I know... someone please correct me if I'm missing something here.