#1 |
Certified Nice Person
My understanding of the situation is that in order to be penalized for duplicate content, page B would have to be substantially similar to page A. Simply aggregating other people's posts and letting them mix with your own, on your unique site, shouldn't bring you anywhere close to a penalty. With RSS feeds being published on, and pulled from, major mainstream sites, I get the sense that very few people who are "in the know" are worried about being penalized. Articles and blog posts have been republished across this mighty web for years now. If Google treated every re-use of content as duplicate, we would have known by now.
Also, blogs are not static -- or at least they shouldn't be. Blog content is always updating and changing. You'd be hard pressed to convince me that the average blog needs to worry about the dupe penalty.
__________________
Click here to purchase a bridge I'm selling.
#2 |
Rock stars ... is there anything they don't know?
Join Date: Jun 2006
Posts: 10
Well, these things are definitely true if one takes the page as a whole. But the SEs know how to tease out the content of individual posts in a blog and treat them each as a single unit. They also know how to tell the difference between syndicated content (i.e. AP/Reuters stories on CNN, Yahoo, MSN, etc) and duplicate content that is published as if it is original content.
That being said -- I don't think there'd be a serious penalty for the duplication, like losing PR or something, so much as Google wouldn't index your syndicated post in the same way that it would index an original post. I think Walrus found some good info on how this would play out. If you do something to make the syndicated post your own (like CNN, Yahoo, MSN etc often do with AP/Reuters stories) then you may be able to get the SEs to treat your version as original content.
#3
With $10,000, we'd be millionaires! We could buy all kinds of useful things like ... love!
Quote:
With another WP plugin, the "extra" content in some RSS feeds can be used. For example, on Adult Weblogger I am using an RSS feed from Rabbit's Reviews. In the RSS are links to a graphic of the front page of each reviewed site, and sometimes links to galleries (if Rabbit has them). Although they are in the RSS, these 'extras' don't show on my syndicated content -- only on my site, as I have coded them directly into the index.php of the theme, not with the post as it is stored in the database. Although the wording remains the same, there is more to the article than on other sites, so is this seen as 'different'?

Just a thought... Google themselves can be 'guilty' of using duplicate content. Do they not use the data from DMOZ.org for their directory? Yes, they have the option to present the listings in a different order (by PageRank), but it is essentially the same data. Add to that the myriad of other 'directories' that are simply a copy of DMOZ with their own .css. Some of them rank quite high on Google.
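For what it's worth, the idea above -- pulling the feed's "extras" out at the template level instead of storing them with the post -- can be sketched roughly like this. This is a Python stand-in for the WP/PHP setup, and the sample feed and element names are made up for illustration, not Rabbit's actual RSS:

```python
# Hypothetical sketch: parse an RSS item and separate the shared
# description (which every syndicating site shows) from the "extra"
# enclosure links, which only a site that renders them in its own
# template will display.
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example review feed</title>
    <item>
      <title>Example Site Review</title>
      <link>http://example.com/review</link>
      <description>Review text that every syndicating site shows.</description>
      <enclosure url="http://example.com/frontpage.jpg" type="image/jpeg"/>
    </item>
  </channel>
</rss>"""

def extract_items(rss_text):
    """Return (title, description, extra_urls) for each feed item."""
    root = ET.fromstring(rss_text)
    items = []
    for item in root.iter("item"):
        title = item.findtext("title", "")
        desc = item.findtext("description", "")
        # The "extras" ride along as enclosure elements; rendering them
        # in your own template makes your page differ from other
        # syndicators of the same text.
        extras = [e.get("url") for e in item.findall("enclosure")]
        items.append((title, desc, extras))
    return items

for title, desc, extras in extract_items(SAMPLE_RSS):
    print(title, extras)
```

The point of the split is the same as coding the extras into index.php: the stored/syndicated text stays identical everywhere, while your rendered page carries additional markup the copies don't.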
__________________
Playboy Webmasters - The name says it all! $35 per signup or 60% revshare.