Neither MSN nor Slurp supports the Allow directive, so the above will block them.
Robots.txt is not a very flexible tool, Nancy; you will need to disallow bots on an individual basis. Plus, there are many more traffic-sending search engines than those three, so blanket-blocking everything else, even if you could do it, would be a poor idea. And even if you could, as you say, why would they honor your robots.txt file? Most aggressive bots are malicious and won't take any notice of it at all!
If they are genuine bots that will honor robots.txt, they will usually identify themselves with a recognizable user-agent string and a web address where the bot is documented.
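To illustrate the per-bot approach described above, here is a minimal robots.txt sketch that disallows two bots individually while leaving everyone else unrestricted. The bot names "BadBot" and "OtherBot" are placeholders, not real crawlers; substitute the actual user-agent strings of the bots you want to block.

```
# Block each unwanted bot by its user-agent name (placeholder names shown).
User-agent: BadBot
Disallow: /

User-agent: OtherBot
Disallow: /

# All other crawlers may access everything (an empty Disallow blocks nothing).
User-agent: *
Disallow:
```

Remember that this only works for well-behaved crawlers; as noted above, malicious bots simply ignore robots.txt, so server-side blocking (e.g. by IP or user agent at the web server) is the only real enforcement.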