11-14-2002, 08:04 PM
Is there a way to block, at file, directory, or even server level, calls to external URLs (i.e. external to the domain)?
I have a private directory where we store some pages grabbed on the Web; I'd like to prevent external elements (graphics, ad banners, etc.) from loading automatically when an archived page is displayed -- without having to manually edit each file.
Thanks for any idea.
11-15-2002, 01:34 AM
Perhaps replace "http://" with nothing when you display the archived pages, unless the URL string contains your domain?
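A rough sketch of that idea in PHP (the function name and "example.com" are made-up placeholders -- substitute your own domain):

```php
<?php
// Rough sketch: strip "http://" from every absolute URL except our own.
// "sterilize" and "example.com" are placeholder names.
function sterilize($html, $own_domain) {
    // Protect our own absolute URLs with a marker byte...
    $html = str_replace("http://" . $own_domain, "\x01" . $own_domain, $html);
    // ...break every other absolute URL by removing the scheme...
    $html = str_replace("http://", "", $html);
    // ...then put our own URLs back. (A "www." subdomain would need
    // the same treatment.)
    return str_replace("\x01" . $own_domain, "http://" . $own_domain, $html);
}
```

With the scheme gone, the browser treats the leftover ("ads.net/banner.gif") as a relative path on your own server, so the external site is never contacted.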
11-15-2002, 07:36 AM
Err... sounds good but...
How can I automate that?
The idea is *not* to have to manually edit files.
Pages will mostly be displayed using PHP's include(), so I reckon there may be a way to trim the http:// part. But some pages have something in their source HTML (I haven't really looked into it yet) that prevents them from displaying properly (or at all) via PHP, so I'd like to keep the option of displaying a page directly.
So the question remains how exactly can I automatically "sterilize" outbound links at server level?
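For the include() route, something along these lines might work (a sketch only; the function name and "www.example.com" are assumptions):

```php
<?php
// Sketch: point any off-domain src/href at about:blank so the browser
// never contacts the external server. The domain is an assumption.
function neutralize_external($html, $own_domain) {
    $pattern = '!(src|href)\s*=\s*"http://(?!'
             . preg_quote($own_domain, '!') . ')!i';
    return preg_replace($pattern, '$1="about:blank#', $html);
}
```

A small wrapper, say show.php?f=page.html (names made up), could then do echo neutralize_external(file_get_contents("archive/" . basename($_GET['f'])), "www.example.com") instead of include()-ing the file directly, so the archived files themselves never need editing.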
11-17-2002, 11:58 AM
Could someone please tell me at least in which direction to enquire...
11-17-2002, 01:53 PM
I'm not quite sure how you could do it in php (although I wouldn't give up hope - there may be a way!).
Keep in mind I use ASP every day, not PHP.
If you do end up manually editing files, you can open a hundred of them at once with Textpad and just do a global search and replace of "http://" with nothing... I know, not the best solution, but it should work in a pinch. Maybe someone else will come along with a better suggestion.
Like some way of denying any direct access to the archived folders (except from your script), I would imagine. That should be possible, I would think.
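If "denying direct access" means stopping browsers from fetching the raw files themselves, an .htaccess in the archive directory along these lines (Apache 1.3/2.0 syntax) would do it, while a PHP script outside the directory can still read the files from disk:

```apache
# archive/.htaccess -- refuse direct HTTP requests for the stored pages.
# PHP reads them through the filesystem, so include()/readfile() from a
# script outside this directory keeps working.
Order deny,allow
Deny from all
```

Note this also rules out the "display the page directly" option mentioned earlier, so it only fits if everything goes through the script.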