Hi:

I have a site that has both categories and geographic areas, and provides sorting and filtering by either. This results in some URLs that look like this:

Code:
www.mysite.com/category-1-0/Category Name.html
wherein "1" = the category number and "0" = ALL geographic areas. This also results in a boatload of virtual urls (for all 11 areas) that look like this:

Code:
www.mysite.com/category-1-1/Category Name.html
www.mysite.com/category-1-2/Category Name.html
www.mysite.com/category-1-3/Category Name.html
and so forth. The end result is 1,500+ URLs I don't want crawled. I've tried multiple Disallow schemes, none of which seem to be working. My latest looks like this:

Code:
Disallow: /category-*-1/* 
Disallow: /category-*-2/* 
Disallow: /category-*-3/* 
Disallow: /category-*-4/* 
Disallow: /category-*-5/* 
Disallow: /category-*-6/* 
Disallow: /category-*-7/* 
Disallow: /category-*-8/* 
Disallow: /category-*-9/* 
Disallow: /category-*-10/* 
Disallow: /category-*-11/* 
Disallow: /category-*-12/* 
Disallow: /category-*-13/* 
Disallow: /category-*-14/*
And it is simply NOT working. I've been having this discussion with webado2 over at Google's GSoftCrawler discussion group (as part of tweaking my sitemap), but apparently she's also out of ideas as to why this isn't working vis-à-vis Googlebot. I know that wildcards may not be honored by some bots, but I have to get this under control at least partially. Interestingly, this:
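
For reference, here's a quick Python sketch of how I understand Google's documented wildcard matching to work ("*" matches any run of characters, rules are anchored at the start of the URL path; it ignores Allow rules and longest-match precedence). The robots_match helper and the sample paths are just my own stand-ins, not anything Google publishes, but by this reading the rules above should match the area URLs and leave the "-0" (ALL areas) URL alone:

Code:
import re

def robots_match(pattern, path):
    # Sketch of Google-style rule matching: a rule is anchored at the
    # start of the URL path, "*" matches any run of characters, and a
    # trailing "$" anchors the end of the path.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

# Stand-in paths modeled on the URLs above
paths = [
    "/category-1-0/Category Name.html",   # ALL areas - should stay crawlable
    "/category-1-1/Category Name.html",   # area 1 - should be blocked
    "/category-1-11/Category Name.html",  # area 11 - should be blocked
]

rules = ["/category-*-1/*", "/category-*-11/*"]

for path in paths:
    blocked = any(robots_match(rule, path) for rule in rules)
    print(path, "->", "blocked" if blocked else "allowed")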

Code:
www.mysite.com/category-15-11/real-estate-and-property/land-for-sale/offer_wanted-all.html

with a disallow of this:

Code:
/*/*/*/offer_wanted-all.html
IS working, so it's not my robots.txt file in general that's ferschplutzed.
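
Running that working rule through the same robots_match sketch above also reports a match against that path, which at least is consistent with what Googlebot is doing:

Code:
# The rule that does work, checked with the same sketch:
print(robots_match(
    "/*/*/*/offer_wanted-all.html",
    "/category-15-11/real-estate-and-property/land-for-sale/offer_wanted-all.html"))
# -> True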

Does ANYONE have any suggestions as to what might work in this situation? (I didn't write the original PHP code, so please don't hose me over the naming conventions, thanks.) I'd appreciate any help; I've perused myriad articles on this but can't seem to sort out what I'm getting wrong.

Thanks,

ClancyCat