We added this to the bottom of our robots.txt:

Code:
# Block all query strings
Disallow: /*?
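Note that a Disallow rule only takes effect inside a User-agent group; assuming the intent is to block every crawler, the minimal standalone form would be:

Code:
# Block all query strings for every crawler
User-agent: *
Disallow: /*?
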
The site has been re-evaluated twice in Webmaster Tools since then, but we still have pages showing up with duplicate content, for example:

/updates
/updates?interviews=1
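
As a sanity check, the pattern itself does seem to cover the second URL. Here is a quick illustrative snippet that approximates Googlebot's wildcard matching with a regex (the paths are just our example URLs):

Code:
import re

# Googlebot treats '*' in a robots.txt rule as "match any sequence of
# characters", so "Disallow: /*?" roughly corresponds to this pattern:
rule = re.compile(r"^/.*\?")

for path in ["/updates", "/updates?interviews=1"]:
    print(path, "->", "blocked" if rule.match(path) else "allowed")
# /updates -> allowed
# /updates?interviews=1 -> blocked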

I have seen this issue logged by others as well. Is there any solution to this, given that Webmaster Tools does not let you batch-upload removal requests for old pages?