Quote:
Originally Posted by Halfdeck
Tag cloud is a good idea, but you might try temporarily robots disallowing them (say, for 14 days) and see if a few pages pop into the main index.
It is the oddest of coincidences that I just removed /pervetags from my robots.txt last night after looking at my crawl stats in Google's webmaster tools.
It reported 2,467 URLs restricted by robots.txt - and most of them were tag pages. So I said screw it and allowed the hungry bot in.
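For anyone following along, the rule I pulled was just a standard Disallow line - something like this (a sketch, not a paste of my actual file):

User-agent: *
Disallow: /pervetags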
I've been reading your blog, as well as those you link to, in order to get a better grip on PageRank distribution and its relationship with the supplemental index. So I'm not completely oblivious to what you're saying - just damn close.
Here's my one sticking point: if I'm getting traffic from those supplemental pages, isn't that better than missing out on that keyword traffic completely? The theory is that by blocking most of my superfluous pages (the tag and title pages) with robots.txt, my PageRank would be concentrated on my category pages, hopefully raising their value to Google and therefore [fingers crossed] gaining me more traffic to those pages.

But I've never had much success with Google on my other sites, which is how I've ended up where I am now: creating a ton of pages in the hope that a few of them will pick up some hits. Yes, I'm throwing shit at the wall and hoping that at least its odor will stick.
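Just to sanity-check that theory for myself, here's a quick toy simulation in Python - a rough sketch of the textbook PageRank power iteration on a made-up site graph, not anything Google actually does. The page names and link structure are invented for illustration:

# Rough sketch: plain power-iteration PageRank over a dict of
# {page: [outlinks]}. Toy model only - Google's real system is
# obviously far more involved.
def pagerank(links, d=0.85, iters=50):
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}           # start uniform
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}  # teleport share
        for p, outs in links.items():
            if outs:
                share = d * pr[p] / len(outs)  # each link gets an equal cut
                for q in outs:
                    new[q] += share
            else:                              # dangling page: spread evenly
                for q in pages:
                    new[q] += d * pr[p] / n
        pr = new
    return pr

# Hypothetical site with tag pages crawlable: the homepage's vote
# is split ten ways (2 category pages + 8 tag pages).
with_tags = {"home": ["cat1", "cat2"] + ["tag%d" % i for i in range(8)],
             "cat1": ["home"], "cat2": ["home"]}
for i in range(8):
    with_tags["tag%d" % i] = ["home"]

# Same site with the tag pages blocked: the vote splits two ways.
without_tags = {"home": ["cat1", "cat2"],
                "cat1": ["home"], "cat2": ["home"]}

print("cat1 share, tags crawled:", round(pagerank(with_tags)["cat1"], 3))
print("cat1 share, tags blocked:", round(pagerank(without_tags)["cat1"], 3))

The absolute numbers aren't directly comparable (PageRank sums to 1 over however many pages exist in each graph), but the category page's share of the total jumps once the ten-way split at the homepage becomes a two-way split - which is the whole argument for blocking the tag pages in the first place.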