We added the following to the bottom of our robots.txt:
# Block all query strings
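A comment line by itself doesn't block anything, so for context, a typical way to express this intent (assuming the goal is to keep crawlers away from all URLs containing a query string) would be a wildcard Disallow rule like:

```
User-agent: *
# Block all query strings
Disallow: /*?
```

The exact directive in our file may differ; this is just the common pattern for blocking parameterized URLs.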
The site has been crawled and evaluated twice since then according to Webmaster Tools, but we still have pages showing up with duplicate content, for example:
I have seen this issue logged by others as well. Is there any solution, given that Webmaster Tools does not allow you to batch-upload removal requests for old pages?