The META robots tag is almost entirely useless/disused; very few robots even look for it anymore. The only reliable way to control spiders is a ROBOTS.TXT file, where you give explicit instructions telling all bots to stay out of certain files/areas of your site, or instructions aimed only at specific bots...
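For instance, a minimal robots.txt sits at the root of the site (e.g. http://yoursite.com/robots.txt) and might look like this... the directory names here are just placeholders, swap in whatever you actually want hidden:

    # Applies to every bot that honors robots.txt
    User-agent: *
    # Keep all bots out of these areas
    Disallow: /cgi-bin/
    Disallow: /private/

A blank line separates one User-agent group from the next, and "Disallow: /" by itself shuts a bot out of the whole site.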
For example, the attached file is what I've used on shadowstorm for years. It blocks all the bad spiders from indexing the site (of course, I could write a bot that simply ignores robots.txt, anyone could... so it only stops the common siphons/harvesters and other well-known naughty spiders).
And apple... you block certain bots/spiders that you know do bad things, like email harvesting or content siphoning. You can also block particular search engines from indexing certain areas of your site (for example, block Google from viewing any area of your site that is not built specifically for best Google results... though this method is not as effective any longer, considering improved spider redirects).
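A rough sketch of that kind of selective blocking is below. EmailSiphon and EmailCollector are user-agent strings that turned up in the classic bad-bot lists (treat them as examples, your own logs will tell you who's actually hitting you), and the /printable/ path is purely hypothetical:

    # Shut known email harvesters out of the whole site
    User-agent: EmailSiphon
    Disallow: /

    User-agent: EmailCollector
    Disallow: /

    # Keep Google's crawler out of pages not built for it
    User-agent: Googlebot
    Disallow: /printable/

Remember the catch from above, though: this only works on bots polite enough to read robots.txt in the first place. The genuinely nasty harvesters ignore it, so a block list like this is a filter for the common offenders, not a security measure.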