First - I'm glad to see people thinking along these lines - unfortunately you are all giving Google a heck of a lot more credit than it's due.
If the pages are all on one domain - yeah, Google will pick one and call the rest dupes if you don't change the pages. If you change them AND change the inside pages for each copy of the site, then you might have a chance of getting more than one copy into Google.
As far as the robots.txt thing goes - you are submitting free sites to a LL. If the LL bans you because of this, then they are not really a LL - they are trying to build a SE hub. Granted, we would all like to get some SE benefit, but it sure isn't the main reason to run a LL, or to make decisions on listing someone. There are definitely much better ways for a LL to get the phrases it wants into a SE - which is why, after testing niche recips, we went back to the single recip.
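For anyone unfamiliar with the robots.txt side of this: a minimal sketch of what blocking spiders looks like, assuming you wanted to keep crawlers away from certain copies (the paths here are hypothetical placeholders, not anyone's actual setup):

```
# robots.txt - sketch only, example paths
# Block all crawlers from a hypothetical duplicate-copy directory
User-agent: *
Disallow: /copy2/

# Or block a specific bot entirely
User-agent: Googlebot
Disallow: /
```

The file just sits at the domain root and well-behaved spiders check it before crawling - it doesn't hide pages from visitors or from a LL editor reviewing the site by hand.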
I will repeat what I have said in other threads about this: the sites you submit to LLs should NOT be the copies you are trying to get into the SEs. A site that is built specifically for the SEs will do much better and can use a few more aggressive ways of getting surfers to buy. Of course, that's just my way of doing things, but it seems to have worked well for a bunch of us over the years.