After going through many articles and reviews about duplicate content, I have come to the conclusion that Google does not treat multiple copies as duplicate content worthy of a penalty; it simply discounts them. In some cases, sites using duplicate content have even seen their traffic increase dramatically. The likely reason is that Google weighs the "relevancy" of a particular site and how the content compares to the original source of that content.
Matt's view on this: "Do not worry about G penalizing for this. Different top level domains: if you own a .com and a .fr, for example, don't worry about dupe content in this case. General rule of thumb: think of SEs as a sort of hyperactive 4-year-old kid that is smart in some ways and not so in others: use the KISS rule and keep it simple. Pick a preferred host and stick with it…such as domain.com or www.domain.com".
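In practice, "picking a preferred host" usually means permanently redirecting the non-preferred hostname to the preferred one with a 301 redirect, so search engines consolidate both under a single address. A minimal sketch for Apache, assuming mod_rewrite is enabled and `example.com` (a placeholder domain) is the preferred host:

```apache
# Hypothetical .htaccess fragment: send www.example.com to example.com
# with a permanent (301) redirect, preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

To prefer the www host instead, swap the hostnames in the condition and the rule target. Either way, the point is consistency: one canonical host, enforced with a permanent redirect.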
It seems the worst that is likely to happen is that the page with the duplicate content will be ignored, and this should not be considered a penalty. You can even split your articles into parts and create printable versions. It is better to write 8 to 10 short articles and submit each one to the same number of sites: use article sites like a book abstract, and add pages to them like a book. Note that some duplicate content may be treated as spam, because a given search engine may only allow two pages per site in a SERP; the additional content will end up in the supplemental results, but no penalty will be given.
However, I would suggest avoiding duplicate content as far as possible.
Article by Paul Johnson, CMO, http://www.eTechSupport.net