Bill, what interests me about that Matt Cutts transcript are the concepts of "continuum" and "range" (warning: MY INTERPRETATION). BTW, not taking Matt Cutts at face value is a given. I've already pointed out on a different thread where he misreported how Google treats <STRONG> as opposed to <B>. I also questioned him on his blog when he said invalid markup doesn't cause problems for Google. That's not his department. He wasn't even aware of the recent TBPR update until he read about it on the Web. In this thread, I qualified his transcript by adding "use this info at your own risk." I do NOT blindly trust polls, consensus, popular opinions, rumors, forum posts, or people working at wikipedia or cnn (cnn.com may rank #1 on Google for "news", but that doesn't make them seo experts, a.k.a. snake oil salesmen), or statements made by so-called "authorities."

During the spaces.live.com migration, according to Matt Cutts, a flag was tripped due to millions of pages suddenly being added to live.com. The trust "rating" of each page took a serious enough hit that Google refused to index the new pages without manual intervention. He explained this is usually not a problem unless you're talking about hundreds of thousands of new pages. That's an example of a "range": publishing 100 new pages a day goes under the radar, but publishing millions of pages a day trips a flag.

As for the concept of a continuum... What if there were a hard threshold value, say 5 million new pages in the span of a day? That would mean adding 5 million new pages in a day trips a filter, but adding 4.9 million pages in a day, or 2.5 million pages a day for three days, doesn't? That makes no sense. A continuum, on the other hand, would gauge each factor on a scale. So, for example, having 100 sites on the same IP with the same whois may be better than having 200 sites on the same IP, and having 1000 sites on the same IP may be better than having 2000 - assuming Google algorithmically runs whois on domains (there are cases where thousands of domains belonging to different people sit on the same IP on a virtual host) instead of only when other flags are tripped, and assuming there's at least a weak connection between what Matt Cutts says and reality.

Add to this the concept of co-dependent factors (quote from caveman, a mod at wmw).
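To make the threshold-vs-continuum distinction concrete, here's a minimal sketch in Python. It's pure illustration under my own assumptions - the factor names, weights, curves, and the 5-million cutoff are made up for the example and are NOT Google's actual signals or values:

```python
# Illustration only: contrasts a hard threshold ("filter trips at N") with a
# continuum ("the more extreme the factors, the lower the score").
# Factor names, weights, and cutoffs are hypothetical, not real Google signals.

import math

PAGE_CUTOFF = 5_000_000  # hypothetical hard threshold: new pages per day


def threshold_flag(new_pages_per_day: int) -> bool:
    """Hard threshold: 5,000,000 new pages trips the filter, 4,999,999 does not."""
    return new_pages_per_day >= PAGE_CUTOFF


def continuum_score(new_pages_per_day: int, sites_on_same_ip: int) -> float:
    """Continuum: every increase in a factor nudges the score down a little,
    and co-dependent factors are combined rather than judged in isolation."""
    # Squash each factor to a 0..1 "suspicion" value on a smooth curve.
    page_suspicion = 1 - math.exp(-new_pages_per_day / 1_000_000)
    ip_suspicion = 1 - math.exp(-sites_on_same_ip / 500)
    # Co-dependent combination: either factor alone hurts less than both together.
    combined = 0.5 * (page_suspicion + ip_suspicion)
    return 1 - combined * (0.5 + 0.5 * min(page_suspicion, ip_suspicion))


if __name__ == "__main__":
    # Threshold model: a one-page difference flips the outcome.
    print(threshold_flag(4_999_999), threshold_flag(5_000_000))  # False True
    # Continuum model: 200 sites on one IP scores a bit worse than 100,
    # 2000 scores worse than 1000, with no single magic cutoff.
    for sites in (100, 200, 1000, 2000):
        print(sites, round(continuum_score(100, sites), 3))
```

The point of the sketch is just the shape of the behaviour: a one-page difference flips the threshold model from pass to fail, while the continuum model only ever nudges the score, and the combined score depends on how the factors interact, not on any one of them alone.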
There is no simple yes or no answer. What worked for me isn't necessarily going to work for you.
__________________
Success is going from failure to failure without a loss of enthusiasm.