No duplicate content, because all three sites would be penalized for non-unique content. The goal is to test a combination of built-in and community-supported SEO tools to see which combination provides the best overall SEO benefit.
Actually, duplicate content is fine provided it is on unique domains. News stories inside HTML5 content tags are near-duplicates across thousands of sites, depending on the velocity of the story, without issue.
If you're testing software for SEO purposes, then duplicate content is exactly what you want, because each platform's default markup differs slightly by design, and that difference is the uniqueness being measured.
The idea is that you should be able to search for a phrase unique to the sites under test, and Google will list those domains in an order based on how it has scored each one for credibility (i.e., likeness of code), not content. You are not testing content; you are testing software code with built-in SEO features.
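One way to run that check, as a rough sketch: plant a phrase that appears nowhere else on the web on each test site, then query it as an exact match and note the order in which the domains appear. The phrase below is hypothetical; only the URL construction is shown, using Python's standard library.

```python
from urllib.parse import quote_plus

# Hypothetical unique phrase planted on every test site
phrase = "zephyr quokka benchmark article 2024"

# Wrapping the phrase in double quotes makes it an exact-match query,
# so results are restricted to pages containing it verbatim
query_url = "https://www.google.com/search?q=" + quote_plus(f'"{phrase}"')
print(query_url)
```

The order of domains returned for that query then reflects how each platform's output is scored, since the phrase (and the rest of the content) is identical everywhere.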
The content must be identical across every platform tested so that content is not a factor in ranking; only each platform's default coded layout is.
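To make the point concrete: two platforms serving the exact same article text can still emit very different markup skeletons, and that skeleton is the variable under test. A minimal sketch using Python's standard-library `html.parser` (the two HTML snippets are invented examples of differing CMS defaults):

```python
from html.parser import HTMLParser

class TagSkeleton(HTMLParser):
    """Collect only the tag structure of a page, ignoring all text content."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def skeleton(html):
    parser = TagSkeleton()
    parser.feed(html)
    return parser.tags

# Hypothetical default output from two different platforms serving the same article
site_a = "<html><body><div class='post'><h1>Title</h1><p>Same text.</p></div></body></html>"
site_b = ("<html><body><article><header><h1>Title</h1></header>"
          "<p>Same text.</p></article></body></html>")

print(skeleton(site_a))  # ['html', 'body', 'div', 'h1', 'p']
print(skeleton(site_b))  # ['html', 'body', 'article', 'header', 'h1', 'p']
```

Identical text, different skeletons; any ranking difference between the two sites is then attributable to the markup, not the content.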