Greenguy's Board > Possible Cheaters
2006-09-22, 04:02 PM   #1
Mr. Blue
Searching for Jimmy Hoffa
Join Date: Jan 2005
Location: Long Island, NY
Posts: 771
Quote:
Originally Posted by Halfdeck
I know LL owners like to get all the backlinks they can get their hands on (so would I), but you're forgetting one thing:

Duplicate content.

Getting linkbacks from supplemental pages is not going to do anyone any good.

EDIT:

Not to mention low quality backlinks from free sites aren't going to make or break your ranking on Google (though MSN probably eats them up). In a few years, who knows, Google may ignore them altogether.

One way to look at recips is advertising your LL via increasing brand awareness. Approach them as a means of inflating your SE position -- and you're in violation of Google guidelines.

Preventing duplicate content is a legitimate reason for disallowing mirrors, however. A large percentage of supps under a domain may negatively impact the entire domain.
This is one of the reasons why, when I was submitting freesites regularly, I'd only submit to around 12 LLs, so there wouldn't be duplicate page penalties. If I wanted to submit to more than 12, I'd just change the page content enough to avoid the duplicate page penalty... you can keep the template the same and change the sales text, alt tags, title tags, and so on, enough that you wouldn't get hit with a duplicate page penalty. A little extra work, but feasibly one freesite can be optimized for a whole bunch of keywords, depending on how many LLs you're submitting to and how much you're changing the page.
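
To illustrate the kind of per-LL tweak I mean (made-up keywords, same template underneath):

<title>Keyword One Free Pics - Site Name</title>
<meta name="description" content="Sales text rewritten around keyword one">
<img src="01.jpg" alt="keyword one sample pic">

...and on the copy submitted to the next batch of LLs:

<title>Keyword Two Free Movies - Site Name</title>
<meta name="description" content="Sales text rewritten around keyword two">
<img src="01.jpg" alt="keyword two sample pic">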
__________________
69Blue.com
ICQ #223487665
2006-09-22, 08:28 PM   #2
Halfdeck
You can now put whatever you want in this space :)
Join Date: Oct 2004
Location: New Haven, CT
Posts: 985
Quote:
This is actually more of the point that I was interested in.
I hear you, Preacher. However, META NOINDEX should solve that problem. It's also a more reliable alternative to robots.txt for keeping URLs out of Google's index.
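
For anyone unfamiliar, that's a single line in each mirror page's <head> (a minimal example):

<meta name="robots" content="noindex">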

Quote:
Originally Posted by Mr. Blue
A little extra work, but feasibly one freesite can be optimized for a whole bunch of keywords, depending on how many LLs you're submitting to and how much you're changing the page.
Mr. Blue, that's as realistic a statement as saying it only takes a little extra work for LL owners to send hand-written rejection emails.

Quote:
Unfortunately picxx you were a victim of bad advice. I'd suggest you no longer listen to the person that gave you this advice in the first place
http://www.greenguysboard.com/board/...ghlight=robots

In that thread, I wrote:

Quote:
1000 incoming links are worthless if the links point to a page Google considers spam. If Google chooses an index.html with small LLs linking to it over an index.html with links from penisbot, LOR, etc, then you're shit out of luck... I wouldn't leave it up to Google to decide that for me.

Customizing your free sites to make them appear unique to SEs seems like a waste of time to me.

One suggestion I saw was to use robots.txt to prevent Google from indexing the duplicate content... so you'd get nice SE traffic off your primary index.html and then pull nice LL traffic off the duplicate doorways.

Personally, I'd build 1 site and submit to 20. The 20 LLs will give you, say, 30,000 uniques over a few months and then 5,000 or whatever a month from SEs.

If you spent time submitting to the other 80, let's be optimistic and say you'd get 5,000 uniques from them a month. Still, if you had to choose between 5k from Google or 5k from LLs... which would you choose?
Linkster wrote:

Quote:
First - I'm glad to see people thinking along these lines - unfortunately you are all giving Google a heck of a lot more credit than it's due.
If the pages are all on one domain - yeah, Google will pick one and call the rest dupes - if you don't change the pages - if you change them AND change the inside pages for each copy of the site - then you might have a chance of getting more than one copy in Google.

As far as the robots.txt thingie - you are submitting free sites to a LL - if the LL bans you because of this then they are not a LL - they are trying to build a SE hub - granted, we would like to get some SE benefit, but it sure isn't the main reason to run a LL - or to make decisions on listing someone - there are definitely much better ways for a LL to get the phrases they want in a SE - which is why, after testing niche recips, we went back to the single recip.

I will repeat what I have said in other threads about this - the sites you submit to LLs should NOT be the copies you are trying to get in the SEs - a site that is built specifically for the SEs will do much better and can use a few more aggressive ways of getting surfers to buy - of course that's just my way of doing things, but it seems that it has worked well for a bunch of us over the years.
__________________
Success is going from failure to failure without a loss of enthusiasm.
2006-09-22, 08:48 PM   #3
virgohippy
Madness is like gravity. All it takes is a little... push.
Join Date: Feb 2006
Location: California
Posts: 1,679
Interesting thread Halfdeck... now my head is spinning.
__________________
~Warm and Fuzzy.
2006-09-22, 08:59 PM   #4
Halfdeck
You can now put whatever you want in this space :)
Join Date: Oct 2004
Location: New Haven, CT
Posts: 985
Quote:
Seems to me most submitters aren't able to produce and submit more than a small handful of mirrors anyway.
It only takes two to tango.

Mirror sites and robots.txt disallow both lead to your LL likely getting no link juice from recips whatsoever. If you want decent backlinks, you might think about accepting only unique free sites. Even then, if you're linking to each other, chances are the link is completely ignored by Google.
__________________
Success is going from failure to failure without a loss of enthusiasm.
2006-09-22, 10:04 PM   #5
virgohippy
Madness is like gravity. All it takes is a little... push.
Join Date: Feb 2006
Location: California
Posts: 1,679
Quote:
Originally Posted by Halfdeck
Mirror sites and robots.txt disallow both lead to your LL likely getting no link juice from recips whatsoever. If you want decent backlinks, you might think about accepting only unique free sites. Even then, if you're linking to each other, chances are the link is completely ignored by Google.
Seems to me there's a bit of give somewhere in that statement. If that were true, my LL should be worthless to Google - right now it's only a notch or two above worthless.

But I see your point. In my own experiments I've noticed that backlinks from unique pages that themselves have backlinks from other unique pages give a much higher return than backlinks from non-unique pages... well, from Google at least.
__________________
~Warm and Fuzzy.
2006-09-23, 01:01 PM   #6
Preacher
There's Xanax in my thurible!
Join Date: Apr 2005
Location: Wherever they screw on my head
Posts: 2,441
Quote:
Originally Posted by Halfdeck
...Even then, if you're linking to each other, chances are the link is completely ignored by Google.
I don't think the link is completely ignored, but I do think it becomes highly devalued.
__________________
NSCash * This Depraved World
2006-09-25, 04:50 PM   #7
Halfdeck
You can now put whatever you want in this space :)
Join Date: Oct 2004
Location: New Haven, CT
Posts: 985
Quote:
Out of those options the most sane and easy one is to submit to a very small, high-quality group of linklists that you know will list you regularly.
I agree, Mr. Blue.

Quote:
btw Halfdeck, this is a purely academic debate on my part and I respect your opinions... you may be completely right and I may be wrong on it. On the surface, at least to me, the robots.txt thing seems wrong, but I do understand your points on the topic and I'd be curious to see what others think on it as well.
Same here. I don't think we're talking about right/wrong anyway. We're comparing odds.

Quote:
So, there's a certain futility in following the robots.txt advice if it prevents you from easily getting listed at the LLs you're submitting to.
In any case, robots.txt isn't the best way to keep pages out of Google's index. Matt Cutts has stated that even if you disallow URLs using robots.txt, if other sites link to a disallowed URL, Google may still list it in its index, albeit URL-only (no title/description). If you want to hide a URL completely from Google's index, the commonly recommended course of action is labeling pages with the META noindex tag. I don't know LL scripts, but I doubt a META robots tag would interfere with their crawling. Else I dare say they should be rewritten.

Quote:
Essentially you are breaking a link list's rules, because you're completely negating any benefit the link list owner was trying to get by having category-specific recips.
If robots.txt is one of your LL rules.

My point, though, is that if you accept mirror free sites, chances are you're getting linked from a supplemental page, which does you no good anyway. Also, Google seems to be getting pickier about duplicate content, especially on unknown, untrusted, 1-month-old domains, so just tweaking the title/META tags and on-page text may not always be enough to keep a page in the main index.

Let me post an example.

I have a list of free sites here:

http://www.nastyxvids.com/sitemap/

Mind you, I built these free sites before I was even aware of search engines, so this isn't exactly scientific (also, site: search is a bit quirky lately, and you may see something different from what I'm seeing depending on which DC you're hitting). The domain is a little short of 2 years old.

Pages listed in Google's main index:

http://www.google.com/search?q=site%...en-US:official
http://www.google.com/search?hs=6Db&...2F&btnG=Search
http://www.google.com/search?hs=7tv&...2F&btnG=Search
http://www.google.com/search?hs=duv&...2F&btnG=Search
http://www.google.com/search?hs=SaG&...2F&btnG=Search
http://www.google.com/search?hs=JGb&...2F&btnG=Search
http://www.google.com/search?hs=YbG&...2F&btnG=Search
http://www.google.com/search?hs=ewv&...2F&btnG=Search
http://www.google.com/search?hs=eHb&...2F&btnG=Search
http://www.google.com/search?hs=Hxv&...2F&btnG=Search

Most of the LLs I submitted to are getting no link love from my submissions on that domain.

------------------------------------

The way I'd go about free site mirrors now would be this:

/index.html
/main.html
/gallery1.html
/gallery2.html
/doorway1.html -> links to main.html
/doorway2.html -> links to main.html

Provided /doorway1.html is significantly different from /index.html, and assuming the 100s of templates a submitter uses are significantly different from each other (and the 10,000s of submitted free sites are unique enough in terms of on-page text/HTML structure), and assuming further that a submitter builds free sites on a one-year+ old, trusted, TBPR 3+ domain, with plenty of unique text (200-300+ words) on each page... I think all pages will be indexed as unique pages in Google, and no robots.txt disallow is needed.
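
For comparison, the robots.txt disallow being debated would look something like this against the layout above (example paths only):

User-agent: *
Disallow: /doorway1.html
Disallow: /doorway2.html

That stops crawling, but as I said above, a disallowed URL can still show up URL-only in the index if other sites link to it.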

Still, my main objection would be against tactics aimed at artificially boosting your SE rankings. I wouldn't assume grey hat methods like recips (they're not genuine citations or "votes", and they carry minimal traffic value) will work indefinitely.

Quote:
I don't think the link is completely ignored, but I do think it becomes highly devalued.
I don't think anything - which is why I said "chances are" - because I have no concrete evidence either way.

Quote:
Advertising's fine
Buying links for PR: bad
Google senses much
Adam Lasnik (Google's new PR guy):
http://www.webmasterworld.com/google/3079355.htm

Whether he's bluffing or not, who knows. I do know Google already detects and kills PageRank transfers on *some* bought links, and I assume the same is happening with some traded, "made for SE ranking" links.

Another relevant quote (Matt Cutts):

Quote:
After looking at the example sites, I could tell the issue in a few minutes. The sites that fit “no pages in Bigdaddy” criteria were sites where our algorithms had very low trust in the inlinks or the outlinks of that site. Examples that might cause that include excessive reciprocal links, linking to spammy neighborhoods on the web, or link buying/selling.
http://www.mattcutts.com/blog/indexing-timeline/

I still do not agree with the mentality of "how can I improve/optimize my ranking without getting penalized?" which seems to be driving this robots.txt discussion. A better question imo would be "how can I make my site more valuable to visitors, and more visible, so more people will find what they're looking for?"

Bottom line: I see nothing wrong with blocking duplicate content pages using robots.txt or meta noindex tag - that's commonly recommended SEO practice. A free site submitter doesn't gain PageRank by disallowing / noindexing a page. It only prevents duplicate content from being indexed. Tagging a free site page with NOFOLLOW would send me a different signal (a free site submitter trying to hog PageRank), but that's another issue.
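
To make that distinction concrete, the standard robots META values look like this (plain HTML, nothing LL-specific):

<meta name="robots" content="noindex, follow"> <!-- page stays out of the index; its links still count, at least in theory -->
<meta name="robots" content="nofollow"> <!-- the PageRank-hogging signal -->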

P.S. Off topic, but if I ran a LL, I would think about tagging links to free sites with NOFOLLOW, as Technorati does on its tag pages, which are starting to rank very well on Google. You eliminate the reciprocal linking issue (turning all free site links into one-way links) and the possible negative trust brought on by linking to supplemental/duplicate content pages on untrusted domains.
__________________
Success is going from failure to failure without a loss of enthusiasm.

Last edited by Halfdeck; 2006-09-25 at 04:54 PM..
2006-09-26, 12:25 AM   #8
Mr. Blue
Searching for Jimmy Hoffa
Join Date: Jan 2005
Location: Long Island, NY
Posts: 771
Excellent post Halfdeck, you've really explained the topic perfectly and now I do agree with your stance on it. Just a few things.

Quote:
Originally Posted by Halfdeck
The way I'd go about free site mirrors now would be this:

/index.html
/main.html
/gallery1.html
/gallery2.html
/doorway1.html -> links to main.html
/doorway2.html -> links to main.html
That's the way I used to do my freesite linking for mirrored pages, but it seems like a lot of linklists don't like that format. I started doing the linking differently to meet the requirements, but it annoyed me enough that I decided to go down to a small, tightly focused group of linklists instead.

Quote:
P.S. Off topic, but if I ran a LL, I would think about tagging links to free sites with NOFOLLOW, as Technorati does on its tag pages, which are starting to rank very well on Google. You eliminate the reciprocal linking issue (turning all free site links into one-way links) and the possible negative trust brought on by linking to supplemental/duplicate content pages on untrusted domains.
Just two comments on it. When you refer to tagging links you mean the links coming in would have a rel="tag" type tagging? So, to use your above example domain if I were a LL owner...the link would be included as

<a href="http://www.nastyxvids.com/keyword/index.html" rel="tag">Keyword</a>

Something like that?

Also, since we're really talking about small link lists and their rules (big linklists probably don't have to worry about the NOFOLLOW rule): wouldn't it be wiser for smaller link lists to use recips more like TGPs use recips? Sorry, I'm more of a TGP guy, so I'm going to explain this in those terms... for TGPs we use a single recip with almost no hope of getting SE benefit off that recip. What we use it for is more or less getting a percentage of traffic from other TGPs that list the same gallery (hopefully getting listed with TGPs of equal size or bigger than your own). So the focus of the recip is heavily on branding the domain name / name of the TGP - or, if it's a niche TGP, highlighting the niche quality of your TGP that might draw people off a general TGP to your site if they have a specific fetish.
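
Roughly the difference I mean, with made-up URLs:

<a href="http://example-ll.com/blondes/">Blonde Freesites</a> <!-- category recip: built to feed the LL's keyword/SE value -->
<a href="http://example-ll.com/">ExampleLL - Daily Reviewed Freesites</a> <!-- branding recip: built so surfers remember the name -->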

Shouldn't smaller link lists move to that single recip that heavily brands instead of a category recip?
__________________
69Blue.com
ICQ #223487665

Last edited by Mr. Blue; 2006-09-26 at 12:28 AM..
2006-09-26, 04:03 AM   #9
virgohippy
Madness is like gravity. All it takes is a little... push.
Join Date: Feb 2006
Location: California
Posts: 1,679
Quote:
Originally Posted by Mr. Blue
Just two comments on it. When you refer to tagging links you mean the links coming in would have a rel="tag" type tagging? So, to use your above example domain if I were a LL owner...the link would be included as

<a href="http://www.nastyxvids.com/keyword/index.html" rel="tag">Keyword</a>

Something like that?
I think Halfdeck is referring to this:

<a href="http://www.nastyxvids.com/keyword/index.html" rel="nofollow">Keyword</a>

A nofollow link is like wearing a condom.
__________________
~Warm and Fuzzy.
2006-09-26, 04:01 AM   #10
virgohippy
Madness is like gravity. All it takes is a little... push.
Join Date: Feb 2006
Location: California
Posts: 1,679
Quote:
Originally Posted by Halfdeck
Still, my main objection would be against tactics aimed at artificially boosting your SE rankings. I wouldn't assume grey hat methods like recips (they're not genuine citations or "votes", and they carry minimal traffic value) will work indefinitely.
For the moment, there may be some mutual benefit, albeit small, but it's certainly reassuring to know that in the future things can only get more strict/difficult.

Quote:
Originally Posted by Halfdeck
P.S. Off topic, but if I ran a LL, I would think about tagging links to free sites with NOFOLLOW, as Technorati does on its tag pages, which are starting to rank very well on Google. You eliminate the reciprocal linking issue (turning all free site links into one-way links) and the possible negative trust brought on by linking to supplemental/duplicate content pages on untrusted domains.
I can think of a few other webmasters who stopped submitting to a few lists because of this. As a word of caution, just as a LL owner wouldn't want to receive a nofollow in their link, I don't think a submitter would be too happy to get one either.

If I had a choice, I'd rather link to fewer sites and send more of my traffic to submitters who make quality sites and unique warning pages. IMO, doing so not only benefits both of us as far as SEO is concerned, but it also maintains a quality brand for my own LL.
__________________
~Warm and Fuzzy.
2006-09-23, 02:56 AM   #11
Mr. Blue
Searching for Jimmy Hoffa
Join Date: Jan 2005
Location: Long Island, NY
Posts: 771
Quote:
Originally Posted by Halfdeck
Mr. Blue, that's as realistic a statement as saying it only takes a little extra work for LL owners to send hand-written rejection emails.
That's not entirely true. If you're working on a template system, it takes only a little more time to edit the pages for multiple LLs... hell, it doesn't really take much time to change the meta tags, sales text, and alt tags. If you run 12 recips per freesite and you're submitting to roughly 36 LLs, that's only 3 freesites you have to edit and optimize. It's not like submitting to TGPs, where you may have 2000+ pages to edit.

Now, I personally always submitted to a small LL grouping to avoid the duplicate page penalty, but if there were more LLs that interested me, I would now do as I stated above.

As for whether it's good advice or not to use robots.txt... if it's setting off people's scripts, or it's too much hassle for them to review your freesites, reviewers will do exactly as Preacher and Jel did... they'll most likely not even bother reviewing or listing your sites.

So, there's a certain futility in following the robots.txt advice if it prevents you from easily getting listed at the LLs you're submitting to. As a submitter you have the option of either submitting to very few LLs (that was my choice when I was regularly submitting), changing the pages enough to avoid the duplicate page penalty (not hard to do when you're working with a template system), or using robots.txt and not getting listed on a number of the sites you're submitting to.

Out of those options the most sane and easy one is to submit to a very small, high-quality group of linklists that you know will list you regularly.

You posted something Linkster said, and maybe he can stop in and help me out on this point, as I will gladly defer to his expertise in this area - I know he knows wayyyy more about this topic than I ever will. In that post you quoted, Linkster says that linksforsex went to a single recip, but a few months ago he went back to category-specific recips, and that's still the case today. Now, why would a link list switch back to category recips? As far as I can tell, category recips have mainly been put in place for SEO; I'm not sure what benefit category-specific recips would have other than SEO.

If LLs are using category-specific recips, it seems they're doing so for SEO. If that's the case, then it would behoove all involved to make Google as happy as they can and take the effort to avoid the duplicate page penalty - but at the same time not negate the category recips that so many LLs use by putting up a robots.txt that blocks the engine from crawling those pages. Essentially you are breaking a link list's rules, because you're completely negating any benefit the link list owner was trying to get by having category-specific recips.
__________________
69Blue.com
ICQ #223487665