2006-09-22, 08:28 PM   #6
Halfdeck
Quote:
Originally Posted by Preacher
This is actually more of the point that I was interested in.
I hear you, Preacher. However, META NOINDEX should solve that problem. It's also a more reliable alternative to robots.txt for keeping URLs out of Google's index.
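To be concrete, here's a minimal sketch of the tag I mean - it goes in the <head> of any page you want kept out of the index:

Code:
<!-- standard robots meta tag: don't index this page, but still follow its links -->
<meta name="robots" content="noindex, follow">

Unlike a robots.txt Disallow, which only stops Googlebot from crawling a URL (the URL can still end up in the index off links alone), NOINDEX tells Google to drop the page itself.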

Quote:
Originally Posted by Mr. Blue
A little extra work, but feasibly one freesite can be optimized for a whole bunch of keywords depending on how many LLs you're submitting to and how much you're changing the page.
Mr. Blue, that's as realistic a statement as saying it only takes a little extra work for LL owners to send hand-written rejection emails.

Quote:
Unfortunately, picxx, you were a victim of bad advice. I'd suggest you no longer listen to the person who gave you this advice in the first place.
http://www.greenguysboard.com/board/...ghlight=robots

In that thread, I wrote:

Quote:
1000 incoming links are worthless if the links point to a page Google considers spam. If Google chooses an index.html with small LLs linking to it over an index.html with links from penisbot, LOR, etc., then you're shit out of luck... I wouldn't leave it up to Google to decide that for me.

Customizing your free sites to make them appear unique to SEs seems like a waste of time to me.

One suggestion I saw was to use robots.txt to prevent Google from indexing the duplicate content... so you'd get nice SE traffic off your primary index.html and then pull nice LL traffic off the duplicate doorways.

Personally, I'd build 1 site and submit to 20. The 20 LLs will give you, say, 30,000 uniques over a few months and then 5,000 or whatever a month from SEs.

If you spent time submitting to the other 80, let's be optimistic and say you'd get 5,000 uniques from them a month. Still, if you had to choose between 5k from Google or 5k from LLs... which would you choose?
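To pin down the robots.txt suggestion from that quote, it would look something like this on the free site's domain (the directory names are made up for illustration):

Code:
# robots.txt - keep crawlers out of the duplicate doorway copies
# while leaving the primary index.html crawlable
User-agent: *
Disallow: /copy2/
Disallow: /copy3/

Keep in mind robots.txt only blocks crawling, not indexing, which is why I'd lean on META NOINDEX instead.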
Linkster wrote:

Quote:
First - I'm glad to see people thinking along these lines - unfortunately, you are all giving Google a heck of a lot more credit than it's due.
If the pages are all on one domain - yeah, Google will pick one and call the rest dupes if you don't change the pages. If you change them AND change the inside pages for each copy of the site, then you might have a chance of getting more than one copy in Google.

As far as the robots.txt thing goes - you are submitting free sites to an LL. If the LL bans you because of this, then they are not an LL - they are trying to build an SE hub. Granted, we would like to get some SE benefit, but it sure isn't the main reason to run an LL, or to make decisions on listing someone. There are definitely much better ways for an LL to get the phrases they want into an SE - which is why, after testing niche recips, we went back to the single recip.

I will repeat what I have said in other threads about this - the sites you submit to LLs should NOT be the copies you are trying to get into the SEs. A site built specifically for the SEs will do much better and can use a few more aggressive ways of getting surfers to buy. Of course, that's just my way of doing things, but it seems to have worked well for a bunch of us over the years.
__________________
Success is going from failure to failure without a loss of enthusiasm.