Are you trying to check these files by their content (the code)? That seems like it would be a lengthy process, and scripting it could be risky if the files aren't all generated the same way. For instance, searching for specific keywords sounds like what you want, but one page might be missing a keyword while another contains it for an unrelated reason (an external link, a generic webpage keyword, or something else).
If that is what you want, I can tell you from the start that it is possible in PHP. However, if you don't have a PHP server installed, or don't know the language, I would look around for a better-suited language (someone else may suggest one that works better; I'm not too sure).
The main functions I would look into are scandir, is_dir, file_get_contents, and stripos (or strpos, depending on whether you want it case-sensitive), plus a foreach loop to walk through each file/directory.
Since I don't know your exact file storage structure or exactly how you are going to search, I can't really post code that is guaranteed to fit your case.
Here is a sample of a file crawler (untested, so treat it as a sketch):
$directory = getcwd();
$filelist = array();
foreach (scandir($directory) as $name) {
    if ($name === '.' || $name === '..') {
        continue; // skip the current/parent entries scandir() returns
    }
    $filelist[] = $directory . '/' . $name;
}
I haven't tested the code, but it would be something like that; note it only lists the top-level directory, so it will probably need some editing.
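If you do need to descend into subdirectories, a recursive variant using is_dir could look like the sketch below. The function name listFiles is just something I made up for this example, and it's untested beyond the basics:

```php
<?php
// Recursive variant of the crawler above: descend into subdirectories
// with is_dir() and collect every file path into one flat array.
// listFiles() is a hypothetical name for this sketch.
function listFiles($directory) {
    $filelist = array();
    foreach (scandir($directory) as $name) {
        if ($name === '.' || $name === '..') {
            continue; // skip the current/parent entries scandir() returns
        }
        $path = $directory . '/' . $name;
        if (is_dir($path)) {
            // merge in whatever the subdirectory contains
            $filelist = array_merge($filelist, listFiles($path));
        } else {
            $filelist[] = $path;
        }
    }
    return $filelist;
}
```

You would call it as $files = listFiles(getcwd()); and get back a flat array of full paths.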
As for the content checking, once the listing above works, you could do something like this:
$keywords = array(); // list your words here
foreach ($filelist as $file) {
    $content = file_get_contents($file);
    foreach ($keywords as $word) {
        if (stripos($content, $word) === false) {
            continue 2; // missing a keyword, skip this file
        }
    }
    // do whatever here for files that passed the check
}
Again, I haven't tested this, nor do I know for sure what you need, but if you have a decent knowledge of PHP it should definitely get you going.
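To make the two pieces easier to reuse, here is how they could be combined into one function. filterByKeywords is again just a name I picked for this sketch; it returns only the files whose contents contain every keyword, case-insensitively via stripos:

```php
<?php
// Combined sketch: given a list of file paths and a list of keywords,
// return only the files that contain every keyword (case-insensitive).
// filterByKeywords() is a hypothetical name, not a built-in.
function filterByKeywords($filelist, $keywords) {
    $matches = array();
    foreach ($filelist as $file) {
        $content = file_get_contents($file);
        foreach ($keywords as $word) {
            if (stripos($content, $word) === false) {
                continue 2; // missing a keyword, skip this file
            }
        }
        $matches[] = $file; // contained every keyword
    }
    return $matches;
}
```

With an empty keyword list it returns every file, so you would fill $keywords in before relying on the result.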