  • #1
    Regular Coder zro@rtv's Avatar
    Join Date
    Feb 2005
    Location
    on the network
    Posts
    433
    Thanks
    0
    Thanked 1 Time in 1 Post

    Parsing $_SERVER['REQUEST_URI'] for includes and such

    I was going to do this, but I thought I might be tangling up my logic or overlooking something, whether elegance or security... I was hoping someone could weigh in.

    Is there any problem with doing something like this?

    htaccess:
    Code:
    RewriteRule ^[a-zA-Z0-9/_-]+/?$ /index.php
    index.php:
    PHP Code:
    $uriArr = explode('/', $_SERVER['REQUEST_URI']);
    include($uriArr[1] . '.php');
    Is there something obviously bad about this?

    It seems nice because if, say, 'domain.com/article/title' were requested,
    index.php would include article.php, which could then query the db for $uriArr[2]. It could also use a switch/case, so that if count($uriArr) < 2 it would list all articles. And it would be endlessly expandable.
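    To make that concrete, here is a minimal sketch of the dispatch idea, with a whitelist of known handlers so a raw URI segment is never passed straight to include(). The handler names ('article', 'contact') and the route() helper are hypothetical, not from the original post:

    PHP Code:
    ```php
    <?php
    // Minimal dispatch sketch: map a request URI to a handler script via a
    // whitelist, so arbitrary URI segments never reach include() directly.
    // Handler names here are hypothetical.
    function route($requestUri, $handlers) {
        $path  = parse_url($requestUri, PHP_URL_PATH); // strip any query string
        $parts = explode('/', trim($path, '/'));
        if (isset($parts[0]) && in_array($parts[0], $handlers, true)) {
            return $parts[0] . '.php';                 // e.g. article.php
        }
        return null;                                   // caller sends the 404
    }

    // index.php would then do something like:
    // $file = route($_SERVER['REQUEST_URI'], array('article', 'contact'));
    // if ($file === null) { header('HTTP/1.0 404 Not Found'); include('404.php'); }
    // else { include($file); }
    ```
    With this shape, article.php can still read $uriArr[2] (here $parts[1]) for the title, but a request for an unknown section falls through to the 404 branch instead of a failed include.
    
    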

    One problem I can think of is that it may be greedy.
    As in, if the request URI was 'domain.com/article/non-existent/nonsense', it might not return a 404 if the error checking wasn't robust?

    What are the implications of not returning 404's when they should, and anything relevant to this method?
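    One way to avoid the "soft 404" (serving a 200 page for a URL that doesn't really exist) is to check for the handler file before including it. A sketch, assuming handler scripts sit next to index.php (the helper name is made up for illustration):

    PHP Code:
    ```php
    <?php
    // Sketch: send a real 404 status when the resolved handler file is
    // missing, instead of quietly serving index.php with a 200.
    function sendNotFoundIfMissing($file) {
        if (!is_file($file)) {
            header('HTTP/1.0 404 Not Found'); // real status, not a "soft 404"
            return true;  // caller includes an error page and stops
        }
        return false;     // caller includes $file as normal
    }
    ```
    Without the real 404 status, search engines will happily index every nonsense URL as duplicate content, and link checkers can't tell broken links from working ones.
    
    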
    ._-zro
    zro@redtv
    zro.redtv.org

    "If HTML and the Web made all the online documents look like one huge book, RDF, schema, and inference languages will make all the data in the world look like one huge database"
    -Tim Berners-Lee, Weaving the Web, 1999

  • #2
    ess
    Regular Coder
    Join Date
    Oct 2006
    Location
    United Kingdom
    Posts
    865
    Thanks
    7
    Thanked 29 Times in 28 Posts
    I don't think there are any problems with your approach. I personally use something similar, though stricter, for example on servers that don't support mod_rewrite.

    I am not going to hand over any code here, but here is a clue as to how I go about handling requests.

    After configuring the server to forward all requests, including error requests, to a specific page such as index.php, I create a collection of regular expressions in that page to match specific URL patterns, and handle every request accordingly. For example:

    Code:
    /^articles\/([a-z]+)\/([a-z0-9\_]*\.html)$/i
    The above expression would match something like:
    Code:
    articles/java/mysql_connections01.html
    Depending on the number of different URLs you wish to serve, you can create an array of regular expressions, each customized to point to a specific page that knows how to handle a given request.
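    A sketch of that pattern-to-handler table might look like the following; the route entries and handler file names are illustrative, not ess's actual code:

    PHP Code:
    ```php
    <?php
    // Sketch of a pattern-to-handler table: each regex maps a URL shape to
    // the page that knows how to serve it (entries are illustrative).
    function matchRoute($path, $routes) {
        foreach ($routes as $pattern => $handler) {
            if (preg_match($pattern, $path, $m)) {
                return array($handler, $m); // handler page plus captured pieces
            }
        }
        return null; // nothing matched: the caller serves a 404
    }

    $routes = array(
        '/^articles\/([a-z]+)\/([a-z0-9_]*\.html)$/i' => 'article_view.php',
        '/^articles\/?$/i'                            => 'article_list.php',
    );
    ```
    The first matching pattern wins, and the captured groups (category, file name) travel along to the handler, so the handler never has to re-parse the raw URI.
    
    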

    However, you should handle all requests and ensure that when a resource is not available, a 404 error is thrown and caught correctly without generating pages vulnerable to XSS attacks.
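    On the XSS point: the classic mistake is a 404 page that echoes the requested URL back verbatim, letting a crafted link inject markup. A minimal sketch using PHP's htmlspecialchars (the wrapper function is hypothetical):

    PHP Code:
    ```php
    <?php
    // Sketch: if the 404 page echoes the requested path back to the user,
    // escape it first so a crafted URL can't inject script markup (XSS).
    function safeNotFoundMessage($requestedPath) {
        $clean = htmlspecialchars($requestedPath, ENT_QUOTES, 'UTF-8');
        return "Sorry, '" . $clean . "' was not found.";
    }
    ```
    
    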

  • #3
    Regular Coder zro@rtv's Avatar
    Join Date
    Feb 2005
    Location
    on the network
    Posts
    433
    Thanks
    0
    Thanked 1 Time in 1 Post
    Thanks for the input, ess.
    Sometimes when I have an idea that seems simple enough and looks like it will work, I like to bounce it off someone first.
    In the past I've thought something was a good idea, then realized later the obvious logic flaw or blatant security hole that I simply overlooked.

    Any other input from anyone would be interesting as well.
    Thanks again!

