if !in_array "allow" or "deny" ?

03-13-2009, 11:37 AM
I have 4 pages set up with the code below; deny.php doesn't have it.
It's not working like it's meant to: it keeps going to deny.php.

$referer = $_SERVER['HTTP_REFERER'];
$fromURL = array("1.php","2.php","3.php","4.php");
if (!in_array($referer, $fromURL)) {
    header('Location: deny.php');
    exit;
}

Any ideas on how to verify that a visitor came from pages within the site?
This code is so bad it's actually an embarrassment.


03-13-2009, 11:52 AM
You're right, you should be embarrassed ;)
I'm just kidding. It's close. I'm about 95% certain that HTTP_REFERER includes all the URL information. So, use parse_url (http://php.ca/manual/en/function.parse-url.php) to get just the targeted script name.
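A rough sketch of that (untested, and assuming your allow list holds bare filenames like "1.php"):

// parse_url() splits the full referer URL into its parts; basename()
// then strips the directory so only the script name is compared.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$path    = parse_url($referer, PHP_URL_PATH); // e.g. "/somedir/1.php"
$page    = basename($path);                   // e.g. "1.php"

$fromURL = array("1.php", "2.php", "3.php", "4.php");
if (!in_array($page, $fromURL)) {
    header('Location: deny.php');
    exit;
}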

This isn't a great method though. The problem with it is that a proxy can be used with the same page names, and it would validate to true. Instead, consider using sessions to verify that pages are from the current site. If you set $_SESSION['curpage'] to always equal $_SERVER['SCRIPT_NAME'] somewhere on a load, then when you hit this page it will contain the previous call to SCRIPT_NAME (which should correspond with one of your pages in the array).

03-13-2009, 11:58 AM
So I could use this?

session_start();

$page = basename($_SERVER['SCRIPT_NAME']);
$page = explode(".", $page);
$page = strtolower($page[0]);
$_SESSION['curpage'] = $page;

03-13-2009, 12:15 PM
Lessee, yeah that looks like it would work. Then use $_SESSION['curpage'] as the value on the next page.
Instead of using basename + explode, consider using pathinfo instead.
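A rough sketch of what I mean (assuming you want the lowercased name without the extension, like your snippet above):

session_start();

// PATHINFO_FILENAME is the file name without its extension, so this
// replaces the basename() + explode() combination in a single call.
$page = pathinfo($_SERVER['SCRIPT_NAME'], PATHINFO_FILENAME);
$_SESSION['curpage'] = strtolower($page);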


Something like that.

Oh yeah, I should mention: SCRIPT_NAME refers to the executing script, not the specific file being included:

// include.php
echo basename($_SERVER['SCRIPT_NAME']);

// index.php:
require 'include.php'; // Prints index.php, not include.php

This is likely what you're wanting to do.

03-13-2009, 12:32 PM
Forgive me for not fully understanding...

$referer = $_SERVER['HTTP_REFERER']; // does this go or stay?
$fromURL = array("1.php","2.php","3.php","4.php"); // do these change or not?
if (!in_array($referer, $fromURL)) {
    header('Location: deny.php');
    exit;
}

03-13-2009, 12:44 PM
Going from the session side?

session_start();

$page = isset($_SESSION['curpage']) ? $_SESSION['curpage'] : '';
$aAllowed = array("1.php","2.php","3.php","4.php");
if (!in_array($page, $aAllowed)) {
    header("Location: deny.php");
    exit;
}


This assumes that curpage was successfully set in the session by the referring page.

03-13-2009, 12:47 PM
Based on that;

$page = pathinfo($_SERVER['SCRIPT_NAME'], PATHINFO_BASENAME); // just added this in to test
//$page = isset($_SESSION['curpage']) ? $_SESSION['curpage'] : ''; // this must come out to work
$aAllowed = array("1.php","2.php","3.php","4.php");
if (!in_array($page, $aAllowed)) {
    header("Location: deny.php");
    exit;
}

<a href="1.php">link1</a> <a href="2.php">link2</a> <a href="3.php">link3</a> <a href="4.php">link4</a>

Each clicked link goes to deny.php. Is that right?

03-13-2009, 12:59 PM
You don't want to set $page that way on this one (sorry, in the way that you are using it). For this page, you'd use $_SERVER['HTTP_REFERER'] instead of $_SERVER['SCRIPT_NAME']. The script name would be used in conjunction with $_SESSION to set the session value on the previous page (before loading this one).
With the links at the bottom of this page, it looks like this isn't quite what you want to do. Once you click one of those links, the referer and the script name will become whatever __FILE__ is (so whatever the name of this script is).
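To illustrate the two-part flow (hedged sketch; the file names are just examples):

// 1.php (a normal page) -- run once the page has loaded:
session_start();
$_SESSION['curpage'] = basename($_SERVER['SCRIPT_NAME']); // stores "1.php"

// gate.php (the protected page) -- run at the top, before any output:
session_start();
$page = isset($_SESSION['curpage']) ? $_SESSION['curpage'] : '';
if (!in_array($page, array("1.php", "2.php", "3.php", "4.php"))) {
    header('Location: deny.php');
    exit;
}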
Does that make sense?

03-13-2009, 01:11 PM
Sort of...
$_SERVER['SCRIPT_NAME'] works better than $_SERVER['HTTP_REFERER'] here.
I may have explained wrong.

When opening a web site, say "example.com"
Clicking on any link in "example.com" must be validated against other links/pages that exist in that site, otherwise denied.

I am trying to get rid of human/bot comment spammers.
Unless these bots can actually click links, they are denied.

I hope what I am saying makes sense.

03-13-2009, 01:16 PM
Bots can be told not to follow any links (and are supposed to adhere to that).
The problem is, none of this actually prevents the spam. The idea of checking the referrer is to ensure that the request was made from your site, and not from another. This technique tends to be used more often for forms, though.

This all depends a lot on what you're allowing from your users. If you force a user login system, then only those users should be able to, say, post comments. If you want anonymous users, a captcha often does the trick.
The biggest problem with the $_SERVER superglobal is that most of it can be altered by the browsing user: HTTP_REFERER, CONTENT_TYPE, and the rest of the HTTP_* headers can be set to whatever the client likes (REMOTE_ADDR is about the only value it can't trivially fake), so it's not overly reliable.

Could you try to explain what you're trying to achieve in a use-case or sequence style? (For example: user clicks on 'Control Panel', we want to check if they have access.) I'm just trying to figure out exactly what you're trying to do.

03-13-2009, 01:27 PM
I'll try my best.
I followed a tutorial on how to build a link exchange system.
To add a link you click on or open the submission page with a link.

Every day I get about 5 or so dummy links; they are added to pages for other bots to harvest. (I must have done someone in for this to happen.)
I have tried two separate captcha systems, which work wonders on my other sites.
I have tried renaming pages/forms/inputs, even password-protecting the page, and still no luck.

Do you need more info? // that was a stupid question
If anyone tries to add a link at example.com or example.com/links, then unless they came from my site or are otherwise allowed, they should be denied with a 403 Forbidden.
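For the 403 itself, a quick sketch of what I have in mind (untested):

// Refuse the request outright instead of redirecting to deny.php.
header('HTTP/1.1 403 Forbidden');
exit('403: Forbidden');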

03-13-2009, 01:43 PM
Is your edit what is currently happening, or what you'd like to happen?

Sounds like you're just getting spammed. We get that a lot on the forums here; last night I banned 3 accounts and cleaned up about 10 - 15 posts. Jeremy was online too, and doing some of his own cleaning from the looks of it.
If the links are dummy links, you can probably try to cURL them first for a HEAD request. If it succeeds, it's likely a real site; otherwise, I'd just deny. I'm a little curious about what's happening; can you post a link to the tutorial you followed?
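Something like this for the HEAD check (rough sketch, untested):

// Returns true if the submitted URL answers a HEAD request with a sane status.
function url_responds($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request, skip the body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code >= 200 && $code < 400;
}

Anything that fails the check could then be rejected before it ever reaches your links table.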

03-13-2009, 01:49 PM
I would like that to happen.

Each page would need the code, or the .htaccess could allow or disallow based on input added by me.

It seems like a complicated method, but I think that each site should have an access list of sorts.
If you ask nicely and follow the rules you get in; otherwise, no.

Forgot the link.
link exchange tut (http://www.sebastiansulinski.co.uk/web_design_tutorials/dreamweaver/link_exchange_system_part_1_a.php)

I test my .htaccess with this tester (http://www.botsvsbrowsers.com/SimulateUserAgent.asp)

03-13-2009, 02:06 PM
I'll check the links out when I get home (getting close to ending work, so I guess I gotta do some work).
Until then, here is a link I found for combating spam with .htaccess:

That may be of some help. Looks like its intention is to directly protect the posted-to script.

03-13-2009, 02:09 PM
Using this in the meantime; too bad it's not dynamic!

RewriteEngine on
# Options +FollowSymlinks

# Send a 403 Forbidden to any request whose referer matches a listed domain.
RewriteCond %{HTTP_REFERER} badsite\.com [NC,OR]
RewriteCond %{HTTP_REFERER} anotherbadsite\.com [NC]
RewriteRule .* - [F]
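For the "dynamic" part, one rough idea (sketch, assuming a plain-text blacklist.txt that I'd maintain, one domain per line):

$referer  = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$host     = parse_url($referer, PHP_URL_HOST);
$badHosts = array_map('trim', file('blacklist.txt'));
if (in_array($host, $badHosts)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}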

That one seems better than mine, thanks for your time here Fou-Lu (http://www.codingforums.com/member.php?u=2783)

03-13-2009, 02:12 PM
np, I'm just at work anyway ;)