#1
Searching for Jimmy Hoffa
Join Date: Jan 2005
Location: Long Island, NY
Posts: 771
Quote:
#2
You can now put whatever you want in this space :)
Quote:
Quote:
Quote:
In that thread, I wrote: Quote:
Quote:
__________________
Success is going from failure to failure without a loss of enthusiasm.
#3
Madness is like gravity. All it takes is a little... push.
Join Date: Feb 2006
Location: California
Posts: 1,679
Quote:
__________________
~Warm and Fuzzy.
#4
You can now put whatever you want in this space :)
Quote:
Mirror sites and robots.txt disallows both mean your LL likely gets no link juice from those recips whatsoever. If you want decent backlinks, you might think about accepting only unique free sites. Even then, if you're linking to each other, chances are the link is completely ignored by Google.
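Just to make the robots.txt scenario concrete - a rough sketch, with made-up paths rather than anyone's actual site structure - a submitter who mirrors a free site and blocks the copies with something like:

User-agent: *
Disallow: /freesite1/mirror2/
Disallow: /freesite1/mirror3/

is telling Google never to crawl those mirror pages. The recip link back to your LL sits on pages the spider never fetches, so it can't pass anything.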
__________________
Success is going from failure to failure without a loss of enthusiasm.
#5
Madness is like gravity. All it takes is a little... push.
Join Date: Feb 2006
Location: California
Posts: 1,679
Quote:
But I see your point. In my own experiments I've noticed that backlinks from unique pages (which themselves have backlinks from other unique pages) give a much higher return than backlinks from non-unique pages... well, from Google at least.
__________________
~Warm and Fuzzy.
#6
There's Xanax in my thurible!
I don't think the link is completely ignored, but I do think it becomes highly devalued.
#7
You can now put whatever you want in this space :)
Quote:
Quote:
Quote:
Quote:
My point, though, is that if you accept mirror free sites, chances are you're getting linked from a supplemental page, which does you no good anyway. Also, Google seems to be getting pickier about duplicate content, especially from unknown, untrusted, 1 month old domains, so just tweaking the title/meta tag and on-page text may not always be enough to keep a page in the main index.

Let me post an example. I have a list of free sites here: http://www.nastyxvids.com/sitemap/

Mind you, I built these free sites before I was even aware of search engines, so this isn't exactly scientific (also, site: search is a bit quirky lately, and you may see something different from what I'm seeing depending on which DC you're hitting). The domain is a little short of 2 years old. Pages listed in Google's main index:

http://www.google.com/search?q=site%...en-US:official
http://www.google.com/search?hs=6Db&...2F&btnG=Search
http://www.google.com/search?hs=7tv&...2F&btnG=Search
http://www.google.com/search?hs=duv&...2F&btnG=Search
http://www.google.com/search?hs=SaG&...2F&btnG=Search
http://www.google.com/search?hs=JGb&...2F&btnG=Search
http://www.google.com/search?hs=YbG&...2F&btnG=Search
http://www.google.com/search?hs=ewv&...2F&btnG=Search
http://www.google.com/search?hs=eHb&...2F&btnG=Search
http://www.google.com/search?hs=Hxv&...2F&btnG=Search

Most of the LLs I submitted to are getting no link love from my submissions on that domain.

------------------------------------

The way I'd go about free site mirrors now would be this:

/index.html
/main.html
/gallery1.html
/gallery2.html
/doorway1.html -> links to main.html
/doorway2.html -> links to main.html

Provided doorway1.html is significantly different from /index.html, and assuming the 100s of templates a submitter uses are significantly different from each other (and assuming the 10,000s of submitted free sites are unique enough in terms of on-page text/HTML structure), and assuming further that a submitter builds free sites on a one year+ old, trusted, TBPR 3+ domain, with plenty of unique text (200-300 words+) on each page..... I think all pages will be indexed as unique pages in Google, and no robots.txt disallow is needed.

Still, my main objection would be against tactics aimed at artificially boosting your SE rankings. I wouldn't assume grey hat methods like recips (they're not citations or "votes", and they carry minimal traffic value) will work indefinitely.

Quote:
Quote:
http://www.webmasterworld.com/google/3079355.htm

Whether he's bluffing or not, who knows. I do know Google already detects and kills PageRank transfers on *some* bought links, and I assume the same is happening with some traded, "made for SE ranking" links.

Another relevant quote (Matt Cutts):

Quote:
I still do not agree with the mentality "how can I improve/optimize my ranking without getting penalized?" which seems to be driving this robots.txt discussion. A better question imo would be "how can I make my site more valuable to visitors, and more visible, so more people will find what they're looking for?"

Bottom line: I see nothing wrong with blocking duplicate content pages using robots.txt or a meta noindex tag - that's commonly recommended SEO practice (rough example at the end of this post). A free site submitter doesn't gain PageRank by disallowing / noindexing a page. It only prevents duplicate content from being indexed. Tagging a free site page with NOFOLLOW would send me a different signal (a free site submitter trying to hog PageRank), but that's another issue.

P.S. Off topic, but if I ran a LL, I would think about tagging links to free sites with NOFOLLOW, as Technorati does on its tag pages, which are starting to rank very well on Google. You eliminate the reciprocal linking issue (turn all free site links into one way links), and the possible negative trust brought on by linking to supplemental/duplicate content pages on untrusted domains.
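By "blocking with a meta noindex tag" I mean something like this in the head of the duplicate page (the robots.txt version is the Disallow line I sketched in my earlier post; this is the standard robots meta syntax, shown here on a hypothetical page):

<meta name="robots" content="noindex, follow">

That keeps the page itself out of the index while still letting the spider follow the links on it - which is a different animal from putting rel="nofollow" on the links, since nofollow tells the engine not to count the link at all.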
__________________
Success is going from failure to failure without a loss of enthusiasm.

Last edited by Halfdeck; 2006-09-25 at 04:54 PM.
#8
Searching for Jimmy Hoffa
Join Date: Jan 2005
Location: Long Island, NY
Posts: 771
Excellent post, Halfdeck - you've really explained the topic perfectly, and now I do agree with your stance on it. Just a few things.
Quote:
Quote:
<a href="http://www.nastyxvids.com/keyword/index.html" rel="tag">Keyword</a>

Something like that?

Also, since we're really talking about small link lists and their rules (big link lists probably don't have to worry about the No Follow rule), wouldn't it be wiser for smaller link lists to use recips more like TGPs use recips? Sorry, I'm more of a tgp guy so I'm going to explain this in those terms... but for TGPs we use a single recip with almost no hope of getting anything SE-wise off that recip. What we use it for is more or less getting a percentage of traffic from other tgps that list the same gallery (hopefully getting listed with tgps of equal size or bigger than your own). So the focus of the recip is heavily on branding the domain name / name of the tgp, or, if it's a niche tgp, highlighting the niche quality of your tgp that might draw people off of a general tgp to your site if they have a specific fetish.

Shouldn't smaller link lists move to that single recip that heavily brands instead of a category recip?

Last edited by Mr. Blue; 2006-09-26 at 12:28 AM.
#9
Madness is like gravity. All it takes is a little... push.
Join Date: Feb 2006
Location: California
Posts: 1,679
Quote:
<a href="http://www.nastyxvids.com/keyword/index.html" rel="nofollow">Keyword</a>

A nofollow link is like wearing a condom.
__________________
~Warm and Fuzzy.
#10
Madness is like gravity. All it takes is a little... push.
Join Date: Feb 2006
Location: California
Posts: 1,679
Quote:
Quote:
If I had a choice, I'd rather link to fewer sites and send more of my traffic to submitters who make quality sites and unique warning pages. IMO, doing so not only benefits both of us as far as SEO is concerned, but it also maintains a quality brand for my own LL.
__________________
~Warm and Fuzzy.
#11
Searching for Jimmy Hoffa
Join Date: Jan 2005
Location: Long Island, NY
Posts: 771
Quote:
Now, I personally always submitted to a small LL grouping to remove the duplicate page penalty, but if there were more LLs that interested me, I would now do as I stated above.

As for whether it's good advice or not to use the robots.txt... if this is setting off people's scripts, or if it's too much hassle for them to review your freesites, reviewers will do exactly as preacher and jel did... they'll most likely not even bother reviewing or listing your sites. So there's a certain futility in following the robots.txt advice if it prevents you from easily getting listed at the LLs you're submitting to.

As a submitter you have the options of either submitting to very few LLs (that was my choice when I was regularly submitting), changing the pages enough to avoid the duplicate page penalty (not hard to do when you're working with a template system), or using the robots.txt and not getting listed on a number of sites you're submitting to. Out of those options the most sane and easy one is to submit to a very small, high quality group of linklists that you know will list you regularly.

You posted something Linkster said, and maybe he can stop in and help me out on this point, as I will gladly defer to his expertise in this area because I know he knows wayyyy more about this topic than I ever will. In that post you quoted, Linkster says that linksforsex went to a single recip, but a few months ago he went back to the category specific recips, and that's still the case today. Now, why would a link list switch back to category recips? As far as I can tell, category recips have mainly been put in place for SEO. I'm not sure what benefit category specific recips would have other than SEO.

If LLs are using category specific recips, it seems they're doing so for SEO. If that's the case, then it would behoove all involved to make Google as happy as they can and take the effort to remove the duplicate page penalty, but at the same time not negate the category recips that so many LLs use by using a robots.txt to block the engine from crawling those pages. Essentially you are breaking a link list's rules, because you're completely negating any benefit the link list owner was trying to get by having category specific recips.
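Just so it's clear what I mean by the two kinds of recip, here's a rough sketch (the link list name and URLs are made up for illustration, not anyone's actual recip code):

<!-- category specific recip: keyword anchor text pointing at a category page -->
<a href="http://www.example-linklist.com/hardcore.html">Hardcore Movies</a>

<!-- single branding recip: the LL's name pointing at its main page -->
<a href="http://www.example-linklist.com/">Example Link List</a>

The first one is really only there for SEO; the second one is about getting surfers to remember and type in the name, the same way TGP recips work.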