#1
old enough to be Grandma Scrotum
Blogs and Duplicate Content in Google
I've been delving into the rather scary, complicated, and insane world of SEO over the last few days, and it's been a bit of a roller coaster ride, to say the least.
One of the big things I've been looking at is duplicate content, because we all know it's something Google hates. Today I read that some people are concerned about the amount of duplication that occurs in WordPress thanks to category, date and tag pages:
http://www.yellowhousehosting.com/re...googles-index/
http://www.webmasterworld.com/google/3097706.htm
Various solutions were offered. One was to make sure index, category and date pages only show partial posts, so that only the post pages themselves are unique. Someone suggested putting this code in the header:
HTML Code:
<?php
if ( is_home() || is_single() || is_page() ) {
    echo '<meta name="robots" content="index,follow">';
} else {
    echo '<meta name="robots" content="noindex,follow">';
}
?>
And then someone else said: "I have heard Google handles WordPress out of the box with no problem. I can't speak from experience since all my blogs are small. Matt Cutts uses WordPress. Search for the character that Matt dressed up as last year. Matt seems to rank ok for that term."
What do the good bloggers here think about this?
#2
old enough to be Grandma Scrotum
This page sums up the theories on this:
http://www.seoresearcher.com/how-to-...ntent-safe.htm
#3
Oh no, I'm sweating like Roger Ebert
I don't know shit but here goes my take.
First, while I do believe there is a duplicate content penalty, I think most people get too excited when they hear about such things. I pretty much posted my feelings on it here: http://www.greenguysboard.com/board/...7&postcount=44

Content is also defined by more than just the text and images; presentation fits into content too. With date, category and tag pages you are sorting your posts in a manner that is actually beneficial to the surfer. You're allowing the information to be presented in the manner he is most interested in. In other words, you are enhancing the surfer experience. I find it only logical that Google et al would see that as a positive thing rather than penalizing you for it.

Finally... damn, I forgot what finally was. If I remember I guess I'll just post again.
#4
old enough to be Grandma Scrotum
You're right of course, Walrus. In theory, sites that are good for users should rank well in Google.
There's actually a post in that WebmasterWorld thread that says: "Sigh. Here we are, building sites for Google rather than surfers."
#5
You can now put whatever you want in this space :)
I know grandmascrotum is looking for a second opinion, so I should really keep my mouth shut, but here's my take anyway:
http://googlewebmastercentral.blogsp...e-content.html Written by Adam Lasnik, a Googler - PR guy but IMO reliable. I would not trust what's posted about duplicate content on WMW. Quote:
- Block appropriately
- Use 301s
- Syndicate carefully
- Minimize boilerplate repetition. "For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details."
- Avoid publishing stubs (what I call "thin pages" - e.g. a blog category page with just one post)
- Understand your CMS. "Make sure you're familiar with how content is displayed on your Web site, particularly if it includes a blog, a forum, or related system that often shows the same content in multiple formats."
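To make the "Use 301s" point concrete, here's a minimal sketch of a 301 redirect in an Apache .htaccess file, collapsing the non-www hostname onto the www one so the same pages don't get indexed under two URLs. The example.com domain is a placeholder and this assumes mod_rewrite is available; it's an illustration, not the only way to do it.

```apache
# Illustrative only: permanently (301) redirect non-www requests
# to the www hostname so only one version of each URL gets indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```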
__________________
Success is going from failure to failure without a loss of enthusiasm.

Last edited by Halfdeck; 2007-04-11 at 01:31 AM.
#6
Oh no, I'm sweating like Roger Ebert
Quote:
If any of the SE gods like it, fine; if not, fuck 'em. I sleep at night and don't dream about what the next change of the SE gods' wind may bring.
#7
Lord help me, I'm just not that bright
I use a variation of that:
Code:
<?php
if ( is_home() || is_single() || is_page() ) {
    // Index only the first page; noindex paged archives (/page/2/ and beyond)
    if ( $paged < 2 ) {
        echo '<meta name="robots" content="index,follow" />';
    } else {
        echo '<meta name="robots" content="noindex,follow" />';
    }
} else {
    echo '<meta name="robots" content="noindex,follow" />';
}
?>
#8
Shut up brain, or I'll stab you with a Q-tip!
Join Date: Dec 2005
Posts: 118
I wouldn't worry too much about duplicate content. I have a vBulletin board just like this board, which has duplicate content virtually everywhere, from the archive to the single-post view. If Google couldn't tell good duplicate content from bad, it would end up banning every vBulletin board. It doesn't, so it must be quite smart about it.
#9
Oh no, I'm sweating like Roger Ebert
Here are a couple of different articles on duplicate-content-proofing your WP blog.
I like this method (no plug-in), but this article links to a WP plug-in which others may like.
#10
Rock stars ... is there anything they don't know?
Join Date: May 2007
Posts: 19
I just use a robots.txt to tell the SEs not to index the category and archive pages, so only the post pages get indexed. Seems to work fine for me.
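For anyone curious what that looks like, here's a rough sketch of such a robots.txt. The paths assume a default WordPress setup with a /category/ base, a /tag/ base and date-based archive URLs; adjust them to your own permalink structure (and note you'd need one line per archive year):

```
# Illustrative robots.txt: block category/tag/date archives,
# leave the individual post pages crawlable.
User-agent: *
Disallow: /category/
Disallow: /tag/
Disallow: /2007/
```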
__________________
Promote my wife's hardcore solo girl site - JustineCash - POTD | FHG's etc.
#11
bang bang
Quote:
Cheers, B
#12
Rock stars ... is there anything they don't know?
Join Date: May 2007
Posts: 19
Quote:
__________________
Promote my wife's hardcore solo girl site - JustineCash - POTD | FHG's etc.
#13
Oh! I haven't changed since high school and suddenly I am uncool
Join Date: Dec 2004
Posts: 251
The way I see it, this is such a common problem that Google has to be able to handle it. Although they would *like* the "one URL = one piece of information" idea to be real, both we and they know that in real life one piece of information may end up on many URLs.
So they filter (as stated above). Our only concern as webmasters should be that we are not one of the ones filtered out. Using text that exists on another site is not wise, since we are likely to be the ones filtered; but on our own sites? Who cares -- one of our pages will show, and that'll do just fine for me.
__________________
$$$ Webcam programs that earn for me -- YMMV $$$