Greenguy's Board
2006-09-23, 02:56 AM   #31
Mr. Blue
Searching for Jimmy Hoffa
Join Date: Jan 2005
Location: Long Island, NY
Posts: 771
Quote:
Originally Posted by Halfdeck View Post
Mr. Blue, that's as realistic a statement as saying it only takes a little extra work for LL owners to send hand-written rejection emails.
That's not entirely true. If you're working on a template system, it only takes a little more time to edit the pages for multiple LLs...hell, it doesn't really take much time to change the meta tags, sales text, and alt tags. If you run 12 recips per freesite and you're submitting to roughly 36 LLs, that's only 3 freesites you have to edit and optimize. It's not like submitting to TGPs, where you may have 2000+ pages to edit.

Now, I personally always submitted to a small LL grouping to remove the duplicate page penalty, but if there were more LLs that interested me, I would now do as I stated above.
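The "edit the template per LL group" workflow described above can be sketched in a few lines of Python. This is only an illustration of the idea, not any real submission tool; the placeholder names, group names, and page text are all made up for the example.

```python
from string import Template

# Hypothetical freesite page template. The placeholders stand in for
# the bits Mr. Blue says are quick to vary: title, meta description,
# alt text, and sales text.
PAGE = Template("""<html><head>
<title>$title</title>
<meta name="description" content="$description">
</head><body>
<img src="pic1.jpg" alt="$alt_text">
<p>$sales_text</p>
</body></html>""")

# One set of substitutions per link-list group, so each group's copy
# of the freesite carries unique on-page text and avoids looking like
# an exact duplicate.
variants = {
    "group-a": {"title": "Sample Title A", "description": "Description A",
                "alt_text": "Alt text A", "sales_text": "Sales copy A"},
    "group-b": {"title": "Sample Title B", "description": "Description B",
                "alt_text": "Alt text B", "sales_text": "Sales copy B"},
}

# Render one distinct page per group from the single template.
pages = {name: PAGE.substitute(vals) for name, vals in variants.items()}
```

Three freesite edits for 36 LLs, as in the 12-recips-per-site arithmetic above, would just mean three entries in `variants` instead of two.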

As for whether it's good advice or not to use robots.txt: if it's setting off people's scripts, or it's too much hassle for them to review your freesites, reviewers will do exactly as Preacher and Jel did...they'll most likely not even bother reviewing or listing your sites.

So, there's a certain futility in following the robots.txt advice if it prevents you from easily getting listed at the LLs you're submitting to. As a submitter you have three options: submit to very few LLs (that was my choice when I was regularly submitting), change the pages enough to avoid the duplicate page penalty (not hard to do when you're working with a template system), or use robots.txt and not get listed on a number of the sites you're submitting to.
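For the third option, what the robots.txt advice amounts to is blocking crawlers from the recip pages while leaving the rest of the freesite crawlable. The sketch below shows the idea with Python's standard-library `urllib.robotparser`; the `/recips/` path and example.com URLs are hypothetical, not from any actual freesite.

```python
from urllib.robotparser import RobotFileParser

# A robots.txt along these lines (path is hypothetical) is the kind
# of file the "block your recip pages" advice produces:
robots_txt = """\
User-agent: *
Disallow: /recips/
"""

# Parse the rules the same way a well-behaved crawler would.
rfp = RobotFileParser()
rfp.parse(robots_txt.splitlines())

# The category-recip pages are off-limits to the engine...
recip_allowed = rfp.can_fetch("Googlebot", "http://example.com/recips/amateur.html")

# ...while the freesite's main page stays crawlable.
index_allowed = rfp.can_fetch("Googlebot", "http://example.com/index.html")
```

Which is exactly the trade-off described above: the engine never sees the duplicate recip pages, but it also never sees the category recips the LL owner wanted indexed.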

Out of those options the most sane and easy one is to submit to a very small high quality group of linklists that you know will list you regularly.

You posted something Linkster said, and maybe he can stop in and help me out on this point, as I will gladly defer to his expertise in this area; I know he knows wayyyy more about this topic than I ever will. In the post you quoted, Linkster says that linksforsex went to a single recip, but a few months ago he went back to the category-specific recips, and that's still the case today. Now, why would a Link List switch back to category recips? As far as I can tell, category recips have mainly been put in place for SEO; I'm not sure what benefit category-specific recips would have other than SEO.

If LLs are using category-specific recips, it seems they're doing so for SEO. If that's the case, then it would behoove all involved to make Google as happy as they can and take effort to remove the duplicate page penalty, but at the same time not negate the category recips that so many LLs use by putting up a robots.txt that blocks the engine from crawling those pages. Essentially you're breaking a Link List's rules, because you're completely negating any benefit the Link List owner was trying to get by having category-specific recips.
__________________
69Blue.com
ICQ #223487665

