I have the following as part of my .htaccess file:
RewriteRule ^(.*)robots\.txt$ $1/robots.php [L]
This allows me to do some PHP stuff with page metrics before actually sending the robots.txt directives, and it makes the serving of robots.txt seamless, as expected.
However, today Google threw me a curve ball by requesting robots.php out of the blue. I don't want them (or anyone else) to do that so I need to block direct access to robots.php, issuing a 404. To do that, I tried the following:
RedirectMatch 404 ^(.*)robots\.php$
And it worked...too well. Now my robots.txt returns my 404 page as well.
I presume this happens because the internal rewrite counts just the same as a direct GET request for the page (which I had not intuited), so the RedirectMatch fires on the rewritten request too.
So is there a way to simultaneously serve robots.php when robots.txt is requested AND issue a 404 error when robots.php is directly requested?
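For what it's worth, one direction I've wondered about is keying the block off %{THE_REQUEST}, since that variable holds the raw request line from the client and (as I understand it) is not changed by internal rewrites. A rough, untested sketch of what I mean:

```apache
RewriteEngine On

# %{THE_REQUEST} is the original request line, e.g. "GET /robots.php HTTP/1.1",
# so this condition should only match when the client asked for robots.php
# directly, not when my rewrite below maps robots.txt onto it internally.
RewriteCond %{THE_REQUEST} /robots\.php [NC]
RewriteRule ^ - [R=404,L]

# Existing rule: serve robots.txt via robots.php
RewriteRule ^(.*)robots\.txt$ $1/robots.php [L]
```

I haven't verified this, though, so I'd welcome confirmation that this is the right approach (or a cleaner one).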