#1 |
old enough to be Grandma Scrotum
You're right, of course, Walrus. In theory, sites that are good for users should rank well in Google.
There's actually a post in that WebmasterWorld thread that says: "Sigh. Here we are, building sites for Google rather than surfers."
#2
You can now put whatever you want in this space :)
I know grandmascrotum is looking for a second opinion, so I should really keep my mouth shut, but here's my take anyway:
http://googlewebmastercentral.blogsp...e-content.html

Written by Adam Lasnik, a Googler - a PR guy, but IMO reliable. I would not trust what's posted about duplicate content on WMW. Quote:

- Block appropriately
- Use 301s
- Syndicate carefully
- Minimize boilerplate repetition. "For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details."
- Avoid publishing stubs (what I call "thin pages" - e.g. a blog category page with just one post)
- Understand your CMS. "Make sure you're familiar with how content is displayed on your Web site, particularly if it includes a blog, a forum, or related system that often shows the same content in multiple formats."
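On the "Use 301s" point, here's a minimal sketch of the most common case - collapsing www/non-www duplicates onto one hostname. example.com is just a placeholder, swap in your own domain:

Code:
<?php
// Permanently redirect the non-www duplicate of every URL to the
// www version, so only one copy of each page gets crawled.
// example.com is a placeholder domain, not from the article.
if ($_SERVER['HTTP_HOST'] === 'example.com') {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.example.com' . $_SERVER['REQUEST_URI']);
    exit;
}
?>

Same idea for any duplicate URL: pick one canonical version and 301 everything else at it.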
__________________
Success is going from failure to failure without a loss of enthusiasm.

Last edited by Halfdeck; 2007-04-11 at 01:31 AM.
#3
Oh no, I'm sweating like Roger Ebert
Quote:
If any of the SE gods like it, fine; if not, fuck 'em. I sleep at night and don't dream about what the next change of the SE gods' wind may bring.
#4 |
Lord help me, I'm just not that bright
I use a variation of that:
Code:
<?php
// index,follow only on page 1 of the home page, single posts and
// static pages; everything else (paginated views, archives, etc.)
// gets noindex,follow so the duplicates stay out of the index.
if ( is_home() || is_single() || is_page() ) {
    if ( $paged < 2 ) {
        echo '<meta name="robots" content="index,follow" />';
    } else {
        echo '<meta name="robots" content="noindex,follow" />';
    }
} else {
    echo '<meta name="robots" content="noindex,follow" />';
}
?>
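For anyone copying that: it belongs in the theme's header.php inside <head>, and $paged is WordPress's pagination value - if it isn't already in scope there, set it first with $paged = get_query_var('paged');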
#5 |
Shut up brain, or I'll stab you with a Q-tip!
Join Date: Dec 2005
Posts: 118
I wouldn't worry too much about duplicate content. I have a vBulletin board just like this one, and it has duplicate content virtually everywhere - from the archive to the single-post view. If Google couldn't tell good duplicate content from bad, it would have to ban every vBulletin board. It doesn't, so Google must be quite smart about it.
#6 |
Oh no, I'm sweating like Roger Ebert
Here are a couple of different articles on duplicate-content-proofing your WP blog.

I like this method (no plug-in), but this article links to a WP plug-in which others may like.
#7 |
Rock stars ... is there anything they don't know?
Join Date: May 2007
Posts: 19
I just use robots.txt to tell the SEs not to index the category and archive pages, so only the post pages get indexed. Seems to work fine for me.
__________________
Promote my wife's hardcore solo girl site - JustineCash - POTD | FHG's etc.
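For reference, a minimal robots.txt sketch along those lines. The paths are assumptions based on a typical WordPress permalink setup, so check them against your own URLs before using it:

Code:
User-agent: *
# Keep category and archive pages out of the crawl.
# Paths depend on your permalink settings - adjust to match.
Disallow: /category/
Disallow: /archives/

One caveat: robots.txt blocks crawling rather than indexing as such, so a blocked URL can still show up in results as a bare listing; the meta noindex approach in #4 avoids that.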
#8
bang bang
Quote:
Cheers, B