2006-06-27, 01:43 PM | #26 | |
Lonewolf Internet Sales
|
Quote:
|
|
2006-06-27, 03:55 PM | #27 |
Took the hint.
|
If you allow your feeds to be aggregated, then you should not be putting other aggregated material in them. Quite simply, if I continue to see duplicate posts on Pornpig, I will eliminate one or the other of the feeds and assume that there is not enough original material on the blog.
If you want to aggregate stuff, start your own aggregation site and list your sites on it, rather than screwing up your existing site. |
2006-06-27, 04:09 PM | #28 |
Took the hint.
|
I just want to add this: Pornpig's intention is to aggregate UNIQUE blogs into a format that drives traffic, views, and exposure to your unique work. When your blog becomes mostly someone else's work, I can no longer aggregate the content, because the risk of duplication (and see here) is a bit overwhelming, even in small numbers.
Jel, for the moment I have disabled your feed into the blog and removed the obvious duplicate posts from the front page of the site. If your blog returns to being unique, I don't have a problem re-adding it, but I cannot aggregate blogs that pre-aggregate other source material in this manner. Anyone else interested or wanting to discuss it, email alex@monolithnet.com |
2006-06-27, 05:08 PM | #29 |
Oh no, I'm sweating like Roger Ebert
|
I haven't actually had a chance to try this... hopefully I will soon.
It should be possible to add the feeds into pages on your blog instead of posts. I know there is a plug-in that will let you run PHP in a page, and I'm thinking of trying an aggregator like CARP to actually add the feeds onto the page. You'd only have the most recent posts (however many the sponsor outputs) and not the history, but I still think it could be cool, plus it would allow you to use the feeds and maintain your blog's individuality. I think I saw something... somewhere... that would allow you to treat page links differently than the default WP install does. If anyone has any ideas, I'd like to take a look at them... or I guess I'll investigate over the weekend. |
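The "most recent posts only" idea above can be sketched in a few lines. This is a hypothetical illustration, not CARP itself: the function name, the naive regex parsing, and the sample feed are all invented for the sketch, and a real aggregator should use a proper XML parser.

```javascript
// Hypothetical sketch: trim a feed down to its N most recent items
// before rendering them on a page. Regex parsing is a simplification;
// real-world code should use an actual RSS/XML parser.
function latestItems(rssXml, limit) {
  const items = [];
  const itemRe = /<item>([\s\S]*?)<\/item>/g;
  let m;
  while ((m = itemRe.exec(rssXml)) !== null && items.length < limit) {
    const t = m[1].match(/<title>([\s\S]*?)<\/title>/);
    const l = m[1].match(/<link>([\s\S]*?)<\/link>/);
    items.push({
      title: t ? t[1].trim() : '',
      link: l ? l[1].trim() : ''
    });
  }
  return items;
}
```

Since feeds only carry the sponsor's latest entries anyway, capping the list like this just makes that limit explicit on your own page.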
2006-06-27, 05:27 PM | #30 | |
Certified Nice Person
|
Quote:
Try to ignore the sloppiness of the next site link I give you - I use it for testing stuff out. Anyway, this is a WP page with Exec-PHP activated, using that new tool we released on Monday: http://www.maladapted.com/?page_id=288 So yes, you can run feeds on a WP page. You should be able to install a third-party aggregator and pull any feeds you wish. There may be a plugin aggregator for use on pages, but I haven't stumbled on one yet. And, as you say, this would allow bloggers to use feeds on blogs they submit to aggregators like "Porn Pig" without the worry of sponsor-aggregated content killing the uniqueness or decreasing the value of the blog's own feed. Of course, this would also deny you the opportunity of getting your affiliate-coded posts into Technorati and the like.
__________________
Click here to purchase a bridge I'm selling. Last edited by Useless; 2006-06-27 at 05:33 PM.. |
|
2006-06-27, 07:25 PM | #31 | |
Oh no, I'm sweating like Roger Ebert
|
Quote:
|
|
2006-06-27, 07:33 PM | #32 |
Certified Nice Person
|
That hurts.
__________________
Click here to purchase a bridge I'm selling. |
2006-06-27, 09:44 PM | #33 | |
Certified Nice Person
|
Quote:
Please help me make sense of this stuff.
__________________
Click here to purchase a bridge I'm selling. |
|
2006-06-27, 10:14 PM | #34 |
Oh no, I'm sweating like Roger Ebert
|
Exactly....
|
2006-06-27, 10:24 PM | #35 | |
Certified Nice Person
|
Quote:
Looks like I have to go edit 8 copies of quicktags.js
__________________
Click here to purchase a bridge I'm selling. |
|
2006-06-28, 12:00 AM | #36 |
Eighteen 'til I Die
|
Damn, I have got to quit reading this thread, or I will post some mushy crap about how much this board and its members have meant to me and the 'staff's' little program.
|
2006-06-28, 01:28 AM | #37 | |
I'm the only guy in the world who has to wake up to have a nightmare
Join Date: Feb 2004
Location: London, United Kingdom
Posts: 1,895
|
Quote:
I totally see where you're coming from and that's no problem at all. I'm looking at the other posts and trying to work out how I can get around it. I just recently woke up here, so I'm a little foggy in the brain right now, lol. |
|
2006-06-28, 07:54 AM | #38 |
That which does not kill us, will try, try again.
|
Could someone hit me up later on ICQ about what needs to be changed in quicktags.js to make the tagging work automatically? (Or maybe it's worth posting here for others too.)
Working on new mantra: "If it's not worth automating, it's not worth doing again."
__________________
"If you're happy and you know it, think again." -- Guru Pitka |
2006-06-28, 08:30 AM | #39 | |
Certified Nice Person
|
Quote:
Open wp-includes/js/quicktags.js and look for this (pretty much at the end of the file):
Code:
function edInsertLink(myField, i, defaultValue) {
	if (!defaultValue) {
		defaultValue = 'http://';
	}
	if (!edCheckOpenTags(i)) {
		var URL = prompt('Enter the URL', defaultValue);
		if (URL) {
			edButtons[i].tagStart = '<a href="' + URL + '">';
			edInsertTag(myField, i);
		}
	} else {
		edInsertTag(myField, i);
	}
}
Change this line:
edButtons[i].tagStart = '<a href="' + URL + '">';
to:
edButtons[i].tagStart = '<a href="' + URL + '" rel="tag">';
Coincidentally, this is also where you could add a mouseover to your anchors, if you wished, because that is the code that inserts the linking structure when you hit the "link" button.
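The before/after edit can be sanity-checked outside WordPress with a tiny snippet. This is purely illustrative: the helper below is made up for the test and isolates just the string that the edited `tagStart` line builds, with no need for `edButtons` or `edInsertTag`.

```javascript
// Hypothetical helper mirroring the edited tagStart line, so the
// resulting anchor markup can be checked in isolation.
function buildTagStart(url) {
  return '<a href="' + url + '" rel="tag">';
}
```

If the string it returns carries `rel="tag"`, the edited quicktags line will emit the same markup when you hit the "link" button.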
__________________
Click here to purchase a bridge I'm selling. |
|
2006-06-28, 11:06 AM | #40 | |
With $10,000, we'd be millionaires! We could buy all kinds of useful things like ... love!
|
Quote:
Code:
function edInsertLink(myField, i, defaultValue) {
	if (!defaultValue) {
		defaultValue = 'http://';
	}
	if (!edCheckOpenTags(i)) {
		var URL = prompt('Enter the URL', defaultValue);
		var REL = prompt('Enter the rel', 'external');
		if (URL) {
			edButtons[i].tagStart = '<a href="' + URL + '" rel="' + REL + '">';
			edInsertTag(myField, i);
		}
	} else {
		edInsertTag(myField, i);
	}
}
__________________
Playboy Webmasters - The name says it all! $35 per signup or 60% revshare. |
|
2006-06-30, 07:35 PM | #41 |
Rock stars ... is there anything they don't know?
Join Date: Jun 2006
Posts: 10
|
Just thought I'd jump in with $.02 on the duplicate content issue. Search engines are sure to recognize duplicate content across multiple sites and will penalize for it. So to deal with that, you'll need to either a) not care about search engines, or b) make your own mods to the aggregated content as it comes in.
Obviously option b) foils the hands-off approach, which isn't very attractive. But it is still a lot easier than coming up with all that new content out of whole cloth. |
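Option b) above, modifying aggregated content as it comes in, can be sketched simply. Everything here is hypothetical: the function name, the item shape, and the wrapper text are invented for illustration; the point is only that adding your own framing text makes the stored post differ from the source.

```javascript
// Hypothetical sketch of option b): wrap each incoming aggregated item
// in your own commentary so the saved post is not a byte-for-byte copy.
function localizeItem(item, intro) {
  return {
    title: item.title,
    body: intro + '\n\n' + item.body + '\n\n(Via ' + item.source + ')'
  };
}
```

Even a one-line intro per post keeps the workflow mostly hands-off while making each page less identical to the feed it came from.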
2006-06-30, 10:08 PM | #42 |
With $10,000, we'd be millionaires! We could buy all kinds of useful things like ... love!
|
Duplicate content is bound to happen... there are 8+ billion web pages out there! It's the same as all the major news sites using 3 or 4 main sources for their content.
As long as a few 'original' posts are also included on the pages, I don't see a problem... maybe I'm wrong.
__________________
Playboy Webmasters - The name says it all! $35 per signup or 60% revshare. |
2006-06-30, 11:05 PM | #43 |
Nothing funnier than the ridiculous faces you people make mid-coitus
|
And since all blogs have different keywords and unique names, it's not going to be that much of an issue, we don't think.
If someone was searching for "x-rated nudes" they might find an entry on Simon's blog, but someone searching for "foxy nudes" could find the same entry and end up at Sue's. There are so many variables, and like oast said, lots of news feeds go everywhere every day; as long as there is some other relevant text within the blog it won't be the same. But then again, the way Google has acted, stranger things have happened. Last edited by docholly; 2006-06-30 at 11:30 PM.. Reason: so i can wave at UW, Chop, Tart, Celtic |
2006-06-30, 11:26 PM | #44 |
Oh no, I'm sweating like Roger Ebert
|
I sure as hell ain't no SE guru, but this topic has been popping up a bit lately, so I thought I'd look into it. Especially since I use RSS feeds in a couple of places, including sponsor feeds, and I also have my feeds published in a number of places. So an SE penalty would really hurt.
My conclusion is that there definitely is a duplicate content penalty. That said, I also believe the duplication has to be on a fairly large scale to take effect, and I've decided it probably isn't the type of penalty that will kill you in the long run anyway. According to James D Brausch, the self-proclaimed originator of the term "Duplicate Content Penalty", if you were to submit an article to 300 sites, it would take Google about 6 weeks to see it and react to it. But I'm not so sure I would consider what happens next a penalty. According to Mr. Brausch, you would then see the following: "In order to show you the most relevant results, we have omitted some entries very similar to the 16 already displayed. If you like, you can repeat the search with the omitted results included."
If it's my content, I don't see a penalty here as long as all my links and affiliate codes are left intact; if they are not, well, that's a whole different story, since it moves from a duplicate content penalty to content theft. So even if I don't end up in one of the top 16 slots before Google cuts off the listing, I can still be pretty content that my post is getting exposure and that sites higher in the SERPs than me are linking to my content, thus improving my overall positioning within Google. Now if I'm using someone else's content and don't figure into the top 16, then I guess a small penalty has been imposed, but looking at the time frame, 6 weeks, I doubt that will hurt much, and from everything I can tell it has no adverse effects on anything else having to do with my site. At least, I've yet to find any concrete proof that it does. In fact, the whole myth aspect of this is probably rooted in the fact that it takes such a high number of instances of the content being copied that it's unnoticeable to the honest webmaster who used a few feeds here and there.
The only place where you might need to be careful is with using sponsor feeds, or creating a blog from nothing other than sponsor feeds. These could easily start to fall within that category, and if that is the only content on your blog, it could strangle it down to just the few visitors you can get from link trades and directory submittals. |
2006-07-01, 12:27 AM | #45 |
Certified Nice Person
|
My understanding of the situation is that in order to be penalized for duplicate content, page B would have to have many similarities to page A. Simply aggregating other people's posts and allowing them to mix with your own, on your unique site, shouldn't even bring you close to a penalty. With the popularity of RSS feeds being used on, and pulled from, major mainstream sites, I get the sense that very few people who are "in the know" are worried about being penalized. Articles and blog posts have been republished across this mighty web for a considerable period now. If Google viewed the re-use of some content as duplicate, we would have known by now.
Also, blogs are not static - or at least they shouldn't be. Blog content is always updating and changing. You'd be hard-pressed to convince me that the average blog would need to worry about the dupe penalty.
__________________
Click here to purchase a bridge I'm selling. |
2006-07-01, 02:25 AM | #46 |
Rock stars ... is there anything they don't know?
Join Date: Jun 2006
Posts: 10
|
Well, these things are definitely true if one takes the page as a whole. But the SEs know how to tease out the content of individual posts in a blog and treat each one as a single unit. They also know how to tell the difference between syndicated content (i.e. AP/Reuters stories on CNN, Yahoo, MSN, etc.) and duplicate content published as if it were original.
That being said -- I don't think there'd be a serious penalty for the duplication, like losing PR or something, so much as Google wouldn't index your syndicated post the same way it would index an original post. I think Walrus found some good info on how this would play out. If you do something to make the syndicated post your own (like CNN, Yahoo, MSN, etc. often do with AP/Reuters stories), then you may be able to get the SEs to treat your version as original content. |
2006-07-01, 10:26 AM | #47 | |
With $10,000, we'd be millionaires! We could buy all kinds of useful things like ... love!
|
Quote:
With another WP plugin, the "extra" content in some RSS feeds can be used. For example, on Adult Weblogger I am using an RSS feed from Rabbit's Reviews. In the RSS are links to a graphic of the front page of each reviewed site, and sometimes links to galleries (if Rabbit has them). Although they are in the RSS, these 'extras' don't show up in my syndicated content... only on my site, as I have coded them directly into the index.php of the theme, not into the post as it is stored in the database. Although the wording remains the same, there is more to the article than on other sites, so is this seen as 'different'? Just a thought... Google themselves can be 'guilty' of using duplicate content. Don't they use the data from DMOZ.org for their directory? Yes, they have the option to present the listings in a different order (by PageRank), but it is essentially the same data. Plus the myriad other 'directories' that are simply a copy of DMOZ, but with their own .css. Some of them rank quite highly on Google.
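The extra-fields technique described above can be sketched generically. This is a hypothetical illustration only: the `<gallery>` tag name and the function are invented, real feeds name their extra elements differently, and a proper XML parser is preferable to a regex in production.

```javascript
// Hypothetical sketch: pull an "extra" element (here an invented
// <gallery> tag) out of a feed item's XML so the theme can render it
// separately from the post body stored in the database.
function extractExtra(itemXml, tagName) {
  const re = new RegExp('<' + tagName + '>([\\s\\S]*?)</' + tagName + '>');
  const m = itemXml.match(re);
  return m ? m[1].trim() : null;
}
```

Rendering fields like this in the theme, rather than into the saved post, is what keeps them off other sites that syndicate the same feed.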
__________________
Playboy Webmasters - The name says it all! $35 per signup or 60% revshare. |
|
2006-07-08, 11:49 AM | #48 | |
Oh no, I'm sweating like Roger Ebert
|
Quote:
|
|