#1
Searching for Jimmy Hoffa
Join Date: Jan 2005
Location: Long Island, NY
Posts: 771
Unfortunately, picxx, you were a victim of bad advice. I'd suggest you no longer listen to the person who gave you this advice in the first place.

There's a lot of bad information tossed around on boards, and you really have to be careful about who you listen to. This board is a good place to post questions about LLs, since you won't get steered wrong here; you'll be getting the info straight from the horse's (owner's) mouth.
#2
I'm the only guy in the world who has to wake up to have a nightmare
Join Date: Feb 2004
Location: London, United Kingdom
Posts: 1,895
Quote:
#3
You can now put whatever you want in this space :)
I know LL owners like to get all the backlinks they can get their hands on (so would I), but you're forgetting one thing: duplicate content. Getting linkbacks from supplemental pages is not going to do anyone any good.

EDIT: Not to mention that low-quality backlinks from free sites aren't going to make or break your ranking on Google (though MSN probably eats them up). In a few years, who knows, Google may ignore them altogether.

One way to look at recips is as advertising your LL by increasing brand awareness. Approach them as a means of inflating your SE position, though, and you're in violation of Google's guidelines. Preventing duplicate content is a legitimate reason for disallowing mirrors, however: a large percentage of supplemental pages under a domain may negatively impact the entire domain.
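The robots.txt scenario being debated here can be checked mechanically. This is a minimal sketch using Python's standard `urllib.robotparser`; the `/links/` directory and URLs are hypothetical stand-ins for the recip pages a submitter might block:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks the directory holding the recip/link pages.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /links/",
])

# A crawler that honors robots.txt will never fetch the recip page,
# so any links on it can pass no value back to the LL.
print(rp.can_fetch("Googlebot", "http://example.com/links/recips.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))         # True
```

The same check works against a live file via `rp.set_url(...)` and `rp.read()`, which is a quick way for a reviewer to see whether a submitted freesite is hiding its link page.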
__________________
Success is going from failure to failure without a loss of enthusiasm. Last edited by Halfdeck; 2006-09-22 at 11:59 AM.
#4
on vacation
Quote:
BUT make sure your meta tags don't say anything different, that's all. Now, I know this was an honest mistake, but it's a good opportunity for everyone to make sure they check the little details.
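The "check the little details" advice is easy to automate. Below is a small standard-library sketch that extracts the robots meta tag from a page so you can confirm it doesn't contradict your robots.txt; the page content here is made up for illustration:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and d.get("name", "").lower() == "robots":
            self.robots.append(d.get("content", ""))

# Hypothetical freesite page whose meta tag quietly blocks indexing.
page = '<html><head><meta name="robots" content="noindex,nofollow"></head><body></body></html>'
parser = RobotsMetaParser()
parser.feed(page)
print(parser.robots)  # ['noindex,nofollow']
```

If this list comes back non-empty with `noindex` or `nofollow` while you thought the page was open, that's exactly the kind of mismatch being described above.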
#5
Searching for Jimmy Hoffa
Join Date: Jan 2005
Location: Long Island, NY
Posts: 771
Quote:
#6
You can now put whatever you want in this space :)
Quote:
Quote:
Quote:
In that thread, I wrote: Quote:
Quote:
__________________
Success is going from failure to failure without a loss of enthusiasm.
#7
Madness is like gravity. All it takes is a little... push.
Join Date: Feb 2006
Location: California
Posts: 1,679
Quote:
__________________
~Warm and Fuzzy.
#8
You can now put whatever you want in this space :)
Quote:
Both mirror sites and a robots.txt disallow mean your LL likely gets no link juice from recips whatsoever. If you want decent backlinks, you might think about accepting only unique free sites. Even then, if you're linking to each other, chances are the link is completely ignored by Google.
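A rough way an LL owner might screen for "unique" submissions is a text-similarity check. This sketch uses Python's `difflib` with toy page text; real duplicate detection (shingling, simhash) is more involved, so treat this as a first-pass filter only:

```python
import difflib

def similarity(page_a: str, page_b: str) -> float:
    """Return a 0..1 similarity ratio between two pages' text."""
    return difflib.SequenceMatcher(None, page_a, page_b).ratio()

original = "Free gallery site with unique descriptions and layout."
mirror   = original  # an exact mirror submission
rewrite  = "A hand-built gallery with its own text and template."

print(similarity(original, mirror))   # 1.0 -- an exact duplicate
print(similarity(original, rewrite))  # well below 1.0 -- substantially unique
```

A reviewer could reject anything above some threshold ratio against pages already listed; where to set that threshold is a judgment call.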
__________________
Success is going from failure to failure without a loss of enthusiasm.
#9
Madness is like gravity. All it takes is a little... push.
Join Date: Feb 2006
Location: California
Posts: 1,679
Quote:
But I see your point. In my own experiments, I've noticed that backlinks from unique pages, which themselves have backlinks from other unique pages, give a much higher return than backlinks from non-unique pages... well, from Google at least.
__________________
~Warm and Fuzzy.
#10
There's Xanax in my thurible!
I don't think the link is completely ignored, but I do think it becomes highly devalued.
#11
Searching for Jimmy Hoffa
Join Date: Jan 2005
Location: Long Island, NY
Posts: 771
Quote:
Now, I personally always submitted to a small LL grouping to avoid the duplicate page penalty, but if there were more LLs that interested me, I would now do as I stated above.

As for whether it's good advice or not to use robots.txt: if it's setting off people's scripts, or if it's too much hassle for them to review your freesites, reviewers will do exactly as preacher and jel did; they'll most likely not even bother reviewing or listing your sites. So there's a certain futility in following the robots.txt advice if it prevents you from easily getting listed at the LLs you're submitting to.

As a submitter, you have three options: submit to very few LLs (that was my choice when I was regularly submitting), change the pages enough to avoid the duplicate page penalty (not hard to do when you're working with a template system), or use robots.txt and not get listed on a number of the sites you're submitting to. Of those, the most sane and easy one is to submit to a very small, high-quality group of link lists that you know will list you regularly.

You posted something Linkster said, and maybe he can stop in and help me out on this point, as I will gladly defer to his expertise here; he knows wayyyy more about this topic than I ever will. In the post you quoted, Linkster says that linksforsex went to a single recip, but a few months ago he went back to category-specific recips, and that's still the case today. Now, why would a link list switch back to category recips? As far as I can tell, category recips have mainly been put in place for SEO; I'm not sure what benefit category-specific recips would have other than SEO.

If LLs are using category-specific recips, it seems they're doing so for SEO. If that's the case, then it would behoove all involved to make Google as happy as they can and take the effort to avoid the duplicate page penalty, while at the same time not negating the category recips that so many LLs use by deploying a robots.txt that blocks the engine from crawling those pages. Essentially, you are breaking a link list's rules, because you're completely negating any benefit the link list owner was trying to get by having category-specific recips.
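The "change the pages enough" option mentioned above is cheap with any template system. Here is a minimal sketch using Python's `string.Template`; the category names, blurbs, and example-ll.com URL are all hypothetical:

```python
from string import Template

# One shared layout, but per-category text so recip pages aren't exact mirrors.
page = Template(
    "<h1>$category freesite</h1>\n"
    "<p>$blurb</p>\n"
    '<a href="http://example-ll.com/$category/">$category link list</a>'
)

blurbs = {
    "amateur": "A hand-written description used only on the amateur page.",
    "mature": "A different write-up, so this page is not a duplicate.",
}

pages = {cat: page.substitute(category=cat, blurb=blurb)
         for cat, blurb in blurbs.items()}

print(pages["amateur"].splitlines()[0])  # <h1>amateur freesite</h1>
```

Because each page carries the category-specific recip link plus its own unique text, the submitter satisfies the LL's recip rules without producing near-identical mirror pages.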