Greenguy's Board

-   Blogs and Blogging (http://www.greenguysboard.com/board/forumdisplay.php?f=8)
-   -   Thanx from a lazy fuck (http://www.greenguysboard.com/board/showthread.php?t=32430)

dr_montelbaun 2006-06-30 07:35 PM

Just thought I'd jump in with my $.02 on the duplicate content issue. Search engines are sure to recognize duplicate content across multiple sites and will penalize for it. To deal with that, you'll need to either a) not care about search engines, or b) make your own mods to the aggregated content as it comes in.

Obviously option b) defeats the hands-off approach, which isn't very attractive. But it's still a lot easier than coming up with all that new content out of whole cloth.
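
For what it's worth, option b) doesn't have to mean rewriting everything by hand. Here's a totally hypothetical PHP sketch of the idea -- the feed URL and the save_post() function are just stand-ins for whatever your own setup uses:

Code:

<?php
// Sketch only: pull an RSS feed and lightly rework each item before
// republishing, so the stored copy isn't a byte-for-byte duplicate.
$feed = simplexml_load_file('http://example.com/feed.xml'); // placeholder URL
if ($feed === false) die('could not read feed');

foreach ($feed->channel->item as $item) {
    // Trim the syndicated text to an excerpt and wrap it in your own words.
    $excerpt = substr(strip_tags((string) $item->description), 0, 300);
    $body = "<p>My quick take: worth a look.</p>\n"
          . "<p>" . $excerpt . "...</p>\n"
          . '<p><a href="' . (string) $item->link . '">Read the original</a></p>';

    save_post((string) $item->title, $body); // stand-in for your blog's storage
}

Even a canned intro line plus an excerpt instead of the full text makes the page noticeably different from the source.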

oast 2006-06-30 10:08 PM

Duplicate content is bound to happen... there are 8+ billion webpages out there! It's the same way all the major news sites use the same 3 or 4 main sources for their content.
As long as a few 'original' posts are also included on the pages, I don't see a problem... but maybe I'm wrong.

docholly 2006-06-30 11:05 PM

And since all blogs have different keywords and unique names, we don't think it's really going to be that much of an issue...

If someone searching for "x-rated nudes" found an entry on Simon's blog, someone else searching for "foxy nudes" could end up at the same entry on Sue's.

There are so many variables, and like oast said, lots of news feeds go everywhere every day. As long as there is some other relevant text within the blog, the pages won't be the same.

But then again, the way Google has acted, stranger things have happened.

walrus 2006-06-30 11:26 PM

I sure as hell ain't no SE guru, but this topic has been popping up a bit lately, so I thought I'd look into it. I use RSS feeds in a couple of places, including sponsor feeds, and I also have my own feeds published in a number of places, so an SE penalty would really hurt.

My conclusion is that there definitely is a duplicate content penalty. That having been said, I also believe the duplication has to be on a fairly large scale in order to take effect, and I've decided it probably isn't the type of penalty that will kill you in the long run anyway.

According to James D. Brausch, the self-proclaimed coiner of the term "Duplicate Content Penalty," if you were to submit an article to 300 sites, it would take Google about 6 weeks to see it and react to it. But I'm not so sure I'd consider what happens next a penalty. According to Mr. Brausch, you would then see the following: "In order to show you the most relevant results, we have omitted some entries very similar to the 16 already displayed. If you like, you can repeat the search with the omitted results included."

If it's my content, I don't see a penalty here as long as all my links and affiliate codes are left intact. If they're not, well, that's a whole different story, since that moves it from a duplicate content penalty to content theft. So even if I don't end up in one of the top 16 slots before Google cuts off the listing, I can still be pretty content that my post is getting exposure, and that sites higher in the SERPs than mine are linking to my content, thus improving my overall positioning within Google.

Now if I'm using someone else's content and don't figure into the top 16, then I guess a small penalty has been imposed. But given the 6-week time frame, I doubt that will hurt much, and from everything I can tell, it has no adverse effects on anything else having to do with my site. At least, I've yet to find any concrete proof that it does.

In fact, the whole myth aspect of this is probably rooted in the fact that it takes such a high number of instances of the content being copied that it goes unnoticed by the honest webmaster who uses a few feeds here and there. The only place you might need to be careful is with sponsor feeds, or with building a blog out of nothing but sponsor feeds. Those could easily fall into that category, and if that's the only content on your blog, it could strangle your traffic down to just the few visitors you can get from link trades and directory submissions.

Useless 2006-07-01 12:27 AM

My understanding of the situation is that in order to be penalized for duplicate content, page B would have to have many similarities to page A. Simply aggregating other people's posts and letting them mix with your own, on your own unique site, shouldn't even bring you close to a penalty. With the popularity of RSS feeds being used on, and pulled from, major mainstream sites, I get the sense that very few people who are "in the know" are worried about being penalized. Articles and blog posts have been republished across this mighty web for a long time now. If Google viewed the re-use of some content as duplicate, we would have known by now.
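
To put a rough number on "many similarities": one classic way people approximate near-duplicate pages is word shingles plus Jaccard overlap. A toy example of the idea -- the function names are mine and the 0.8 cutoff is completely made up:

Code:

<?php
// Toy near-duplicate check: split pages into word 5-grams ("shingles")
// and measure how much the shingle sets overlap (Jaccard similarity).
function shingles(string $text, int $n = 5): array {
    $words = preg_split('/\W+/', strtolower($text), -1, PREG_SPLIT_NO_EMPTY);
    $set = array();
    for ($i = 0; $i + $n <= count($words); $i++) {
        $set[implode(' ', array_slice($words, $i, $n))] = true;
    }
    return $set;
}

function jaccard(string $a, string $b): float {
    $sa = shingles($a);
    $sb = shingles($b);
    $inter = count(array_intersect_key($sa, $sb));
    $union = count($sa) + count($sb) - $inter;
    return $union > 0 ? $inter / $union : 0.0;
}

$syndicated = 'The quick brown fox jumps over the lazy dog again and again today.';
$mixed = 'My own intro paragraph goes here. The quick brown fox jumps over the '
       . 'lazy dog again and again today. And my own closing thoughts follow.';

// Mixing your own writing around a syndicated post dilutes the overlap,
// so this prints "distinct enough" even though one sentence is copied whole.
echo jaccard($syndicated, $mixed) > 0.8 ? "near-duplicate\n" : "distinct enough\n";

The point being: wrap a syndicated post in enough of your own writing and the page-level overlap drops fast.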

Also, blogs are not static - or at least they shouldn't be. Blog content is always updating and changing. You'd be hard pressed to convince me that the average blog needs to worry about the dupe penalty.

dr_montelbaun 2006-07-01 02:25 AM

Well, these things are definitely true if one takes the page as a whole. But the SEs know how to tease out the individual posts in a blog and treat each one as a single unit. They also know how to tell the difference between syndicated content (e.g. AP/Reuters stories on CNN, Yahoo, MSN, etc.) and duplicate content that is published as if it were original content.

That being said -- I don't think there'd be a serious penalty for the duplication, like losing PR, so much as Google wouldn't index your syndicated post the same way it would index an original post. I think Walrus found some good info on how this would play out. If you do something to make the syndicated post your own (like CNN, Yahoo, MSN, etc. often do with AP/Reuters stories), then you may be able to get the SEs to treat your version as original content.

oast 2006-07-01 10:26 AM

Quote:

Originally Posted by dr_montelbaun
If you do something to make the syndicated post your own (like CNN, Yahoo, MSN, etc. often do with AP/Reuters stories), then you may be able to get the SEs to treat your version as original content.

That is something I'm looking into with two of my new "blogs".

With another WP plugin, the "extra" content in some RSS feeds can be used as well. For example, on Adult Weblogger I'm using an RSS feed from Rabbit's Reviews. The feed includes links to a graphic of the front page of each reviewed site, and sometimes links to galleries (if Rabbit has them). Although they're in the RSS, these 'extras' don't show up in my syndicated content... only on my site, as I have coded them directly into the index.php of the theme, not into the post as it's stored in the database.
Although the wording remains the same, there is more to the article than on other sites, so is this seen as 'different'?
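
For anyone curious, the shape of it is something like this (not my exact code -- I'm faking the plugin here with WP's bundled MagpieRSS, the feed URL is a placeholder, and matching items by title is just one way to do it):

Code:

<?php
// In the theme's index.php: the syndicated post body comes out of the
// database as usual; the feed 'extras' (screenshot link, gallery links)
// are fetched and printed at render time only, so they're never stored
// with the post or passed along when my own feed gets syndicated.
require_once ABSPATH . WPINC . '/rss.php'; // MagpieRSS; path varies by WP version

$rss = fetch_rss('http://example.com/rabbits-feed.xml'); // placeholder URL
$extras = ($rss && !empty($rss->items)) ? $rss->items : array();

if (have_posts()) : while (have_posts()) : the_post();
    the_title('<h2>', '</h2>');
    the_content(); // the stored (syndicated) text

    // The extras: shown on my pages, but absent from the stored post.
    foreach ($extras as $item) {
        if ($item['title'] == get_the_title() && !empty($item['link'])) {
            echo '<p><a href="' . $item['link'] . '">Front page screenshot</a></p>';
        }
    }
endwhile; endif;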

Just a thought... Google themselves can be 'guilty' of using duplicate content. Don't they use the data from DMOZ.org for their directory?

Yes, they have the option to present the listings in a different order (by PageRank), but it is essentially the same data. Plus there's the myriad of other 'directories' that are simply copies of DMOZ with their own .css. Some of them rank quite high on Google.

walrus 2006-07-08 11:49 AM

Quote:

Originally Posted by walrus
I haven't actually had a chance to try this... hopefully I will soon.

Well, I finally got my lazy ass in gear and did this, and honestly I think it turned out pretty cool. Not only am I adding the feed to a page... I've also added the chicklets so that people can subscribe directly to the feeds themselves. If interested, you can check it out here.
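
In case anyone wants to do the same, the chicklet part is basically just feed links in the theme. Rough sketch, assuming WordPress -- the button images are placeholders, so adjust to taste:

Code:

<?php // somewhere in the theme's header.php or sidebar.php ?>
<link rel="alternate" type="application/rss+xml"
      title="Site feed" href="<?php bloginfo('rss2_url'); ?>" />

<a href="<?php bloginfo('rss2_url'); ?>">
  <img src="/images/rss-chicklet.gif" alt="Subscribe via RSS" />
</a>
<a href="http://add.my.yahoo.com/rss?url=<?php bloginfo('rss2_url'); ?>">
  <img src="/images/myyahoo-chicklet.gif" alt="Add to My Yahoo!" />
</a>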

