Quote:
Originally posted by amadman
I believe hitbots request everything like a surfer.
If so then a hitbot hit would burn as much as a real surfer hit.
Wrong.
Bots, crawlers, and spiders (excluding offline-browsing site downloaders and image-search bots) use very little of your bandwidth compared to browsers.
When a surfer uses a browser, the browser reads the HTML code and renders it: it downloads every CSS/script/JPEG/GIF/BMP file that the page's source code says it needs. Otherwise the surfer would see none of the images you put "on the page".
For every img src, stylesheet href, and script src tag, the browser sends a separate request to your server to receive that file (using your bandwidth, of course).
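To see how many extra requests a single page triggers, here is a small sketch using Python's standard html.parser. The sample page and its file names are made up for illustration; a real browser would send one request for each URL the parser finds:

```python
from html.parser import HTMLParser

# Hypothetical page source, just for illustration
SAMPLE_PAGE = """
<html><head>
<link rel="stylesheet" href="style.css">
<script src="main.js"></script>
</head><body>
<img src="logo.gif"><img src="banner.jpg">
</body></html>
"""

class SubResourceFinder(HTMLParser):
    """Collects every extra URL a browser would have to request."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.resources.append(attrs["src"])
        elif tag == "script" and "src" in attrs:
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.resources.append(attrs["href"])

finder = SubResourceFinder()
finder.feed(SAMPLE_PAGE)
# Each entry here is one more request hitting your server
print(finder.resources)  # ['style.css', 'main.js', 'logo.gif', 'banner.jpg']
```

An HTML-only bot would stop after the first request; a browser would go on to make four more.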
On the other hand, bots, crawlers, spiders etc. usually visit your site only to index it or to validate part of its source code.
Since the source code (the page HTML) is the only element required for indexing or for validating part of your code (to verify a recip link, for example), only the source is downloaded.
Bots that crawl for images or for offline browsing, however, will most likely use as much bandwidth as browsers do.
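A bit of toy arithmetic makes the difference concrete. The file sizes below are invented for the sake of the example, not measurements:

```python
# Hypothetical sizes in KB for one page and its sub-resources
sizes_kb = {
    "index.html": 20,   # the page source itself
    "style.css": 10,
    "main.js": 15,
    "logo.gif": 40,
    "banner.jpg": 120,
}

bot_kb = sizes_kb["index.html"]      # HTML-only crawler: just the source
browser_kb = sum(sizes_kb.values())  # browser: source plus every sub-resource

print(bot_kb, browser_kb)  # 20 205
```

With these made-up numbers the crawler burns about a tenth of what one real surfer's visit does; the exact ratio depends entirely on how image-heavy your pages are.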
If you look at Google's search results, for example, you will see the size of each document right after the URL, before the "Cached" link.