Greenguy's Board > Link Lists & Getting Listed
2006-08-10, 09:08 AM   #1
Simon
That which does not kill us, will try, try again.
Join Date: Aug 2003
Location: Conch Republic
Posts: 5,150
Quote:
Originally Posted by Chop Smith
If this was in Celtic's robots.txt file, would it have caused his problem?
"User-agent: *
Crawl-Delay: 20"
I don't see why that would have caused a problem, unless it was misunderstood or mistaken for something else. All it should do is tell a bot how many seconds to wait between requests. Crawl-delay only works when the bot supports it, which I believe is the case for the crawlers of some search engines: Yahoo, MSN, Ask, etc. (It's not supported by Google's bots, I believe... anyone know for sure?)
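Since support varies by crawler, one common approach is to give the delay only to the bots known to honor it, each in its own User-agent section. A hypothetical example (Slurp is Yahoo's crawler; the paths and values here are made up for illustration):

```
# Yahoo's crawler honors Crawl-delay
User-agent: Slurp
Crawl-delay: 20

# MSN's crawler honors it too
User-agent: msnbot
Crawl-delay: 20

# Everyone else: no delay directive, normal rules apply
User-agent: *
Disallow: /cgi-bin/
```

Bots that don't recognize Crawl-delay are supposed to just ignore the line, so it should be harmless either way.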

I could see someone getting upset about things you might do with the noindex, nofollow, or Disallow directives. But a crawl-delay of 20 seconds can't be used to do anything nasty or tricky as far as I know... and someone please correct me if I'm missing something here.
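For what it's worth, a well-behaved crawler written in Python can read that directive with the standard library's robots.txt parser. A minimal sketch, assuming the rules Chop quoted plus a Disallow line added for illustration (the example.com URLs are made up):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, matching the one quoted above
# with an extra Disallow rule for demonstration.
robots_txt = """\
User-agent: *
Crawl-delay: 20
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A polite bot sleeps this many seconds between requests...
delay = rp.crawl_delay("*")

# ...and checks permission before each fetch.
allowed = rp.can_fetch("*", "http://example.com/index.html")
blocked = rp.can_fetch("*", "http://example.com/private/page.html")
print(delay, allowed, blocked)
```

The parser just reports the delay; actually waiting between requests is up to the crawler, which is exactly why the directive only matters for bots that choose to support it.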
__________________
"If you're happy and you know it, think again." -- Guru Pitka
Powered by vBulletin® Version 3.8.1
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
© Greenguy Marketing Inc