Greenguy's Board


Old 2006-05-11, 03:21 PM   #4
juggernaut
Join Date: Apr 2005
Location: Central Jersey! If I was rich and powerful I would dress as my avatar does.
Posts: 1,448
http://www.webmonkey.com//templates/...ex4a_meta.html

"Counting Pageviews
The number of pageviews you count is not the actual number of pageviews of your site. "How can this be?" you ask. "I'm simply counting records in my Web server's access log." Well, the fact is, a lot of requests never make it into your access log.
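
A minimal sketch of the naive approach described here - counting pageviews by counting access-log records. It assumes the common Apache-style log format; the regex field names and the list of excluded file extensions are illustrative, not definitive:

```python
import re

# Matches the start of an Apache-style access log line:
# ip, two dash fields, [timestamp], "METHOD path ...", status code.
LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+)[^"]*" (?P<status>\d{3})'
)

def count_pageviews(log_lines):
    """Count successful GET requests for pages, skipping images and assets."""
    views = 0
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m:
            continue  # malformed or unrecognized line
        if (m.group("method") == "GET"
                and m.group("status").startswith("2")
                and not m.group("path").endswith((".gif", ".jpg", ".css", ".js"))):
            views += 1
    return views
```

As the article goes on to explain, whatever number this returns is a floor, not the truth - cached and proxied requests never reach the log at all.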

First, browsers - at least Netscape and Internet Explorer - have caches. If a person requests a page from your site and soon requests it again, the browser may not go back to your server for the page a second time. Instead, it may simply retrieve it from its cache - and you would never know. You can try using "expires" or "no cache" tags to stop browsers from caching your pages, but you can never be sure whether your tags are honored.
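
For reference, the "expires" / "no cache" tags mentioned above correspond to standard HTTP response headers. A small sketch of what a server would send (the header names are standard HTTP/1.0 and HTTP/1.1; whether any given browser or proxy respects them is, as the article says, out of your hands):

```python
def no_cache_headers():
    """Response headers that ask caches not to store or reuse a page."""
    return {
        "Cache-Control": "no-cache, no-store, must-revalidate",  # HTTP/1.1 caches
        "Pragma": "no-cache",                                    # HTTP/1.0 caches
        "Expires": "0",                                          # treat as already stale
    }
```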

Second, let's say that a user's browser doesn't retrieve your page from its cache but actually re-requests the page from your server. Many ISPs use proxy servers, and proxy servers cache pages just like browsers. If a person using an ISP with a proxy server makes a request, the proxy server first checks its cache. If the page is there, it serves that page to the person, instead of going to your server. And you would never know.

Again, you can try using the tags I've described above, but there's no Proxy Server Police making sure proxy servers respect your tags.

Another tracking obstacle is bots, or spiders. These software programs scour the Web, either cataloging pages for search engines or looking for information for their owners.

Do you care if your pageview counts include hits from bots? If you do, then you'd better find a way to ignore these hits. You can create a list of IP addresses to ignore, but with new bots born every day, the list will always be one step - or 100 steps - behind. Similarly, you can filter on the requester's user-agent string, but there's nothing keeping a bot's developers from sending any old string they please. Lastly, you can take a daily count of hits and simply ignore repeat hits from the same IP address once their total passes some threshold. But then you run the risk of accidentally ignoring hits from an ISP whose proxy server sends its own IP address instead of a different IP address for each user.
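
The two imperfect filters described above - a user-agent denylist and a per-IP daily threshold - could be sketched like this. The bot substrings and the threshold value are illustrative placeholders, not a vetted list:

```python
from collections import Counter

BOT_SUBSTRINGS = ("bot", "spider", "crawler")  # illustrative, always incomplete
DAILY_THRESHOLD = 500  # hits/day before an IP is presumed to be a bot (or a proxy!)

def filter_bot_hits(hits):
    """hits: list of (ip, user_agent) tuples for one day.
    Returns only the hits that pass both filters."""
    per_ip = Counter(ip for ip, _ in hits)
    kept = []
    for ip, ua in hits:
        if any(s in ua.lower() for s in BOT_SUBSTRINGS):
            continue  # user-agent filter: trivially spoofable
        if per_ip[ip] > DAILY_THRESHOLD:
            continue  # threshold filter: may wrongly drop all users behind one proxy
        kept.append((ip, ua))
    return kept
```

The comments mark exactly the failure modes the article warns about: the substring list lags behind new bots, and the threshold punishes proxies that funnel many real users through one address.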

With no perfect solutions, it's up to you to decide which method you can learn to live with. "
 


Powered by vBulletin® Version 3.8.1
Copyright ©2000 - 2026, Jelsoft Enterprises Ltd.
© Greenguy Marketing Inc