Hello everyone, I'm pretty new to this board, so maybe I should introduce myself first
...ok, ok, hold on, I know this is not the place for the "hello thing". I have already posted here:
http://www.greenguysboard.com/board/...179#post111179
so everybody can see what I do and who I am.
And now to the problem described in this thread: this is a serious problem that Google has. Google is pretty limited in its capacity for indexing sites. So if, let's say, every site/page that is currently indexed/cached by Google were copied, that would mean there were twice as many sites/pages that Google "should" index/cache as it is actually able to handle.
So what they have to do is decide which sites they want to index and which ones they don't. That might be the reason why they use a "duplicate content filter".
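Just to illustrate the basic idea of such a filter (this is my own toy sketch in Python, not how Google actually does it, and all the URLs and function names are made up): exact or near-exact copies can be caught by normalizing page text and hashing it, so only one copy per fingerprint needs to be kept in the index.

```python
import hashlib

def content_fingerprint(text: str) -> str:
    # Normalize whitespace and case so trivially re-formatted copies
    # still hash to the same value (a deliberately simple heuristic).
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages: dict) -> dict:
    # Group page URLs by fingerprint; any group with more than one URL
    # is a set of (near-)identical duplicates the index could skip.
    groups = {}
    for url, body in pages.items():
        groups.setdefault(content_fingerprint(body), []).append(url)
    return {fp: urls for fp, urls in groups.items() if len(urls) > 1}

if __name__ == "__main__":
    # Hypothetical example pages, purely for illustration.
    pages = {
        "http://example.com/a": "Same article text here.",
        "http://example.com/b": "Same   ARTICLE text here.",
        "http://example.com/c": "A completely different page.",
    }
    print(find_duplicates(pages))
```

A real search engine would of course use something much smarter (shingling, near-duplicate detection, etc.), but the point stands: if two pages fingerprint the same, indexing both is wasted capacity.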