  1. #1
    New Coder (joined Mar 2004; 58 posts)

    dynamic pages and SEO

    I am running into a little SEO problem with my PHP-generated pages. The search engines don't like the variables at the end of the URL (?name=xyz). I have decided to keep my pages dynamic, but the way I do it is by having a PHP page called file.php whose entire content is something like include "anyfile?name=xyz". I basically pull a dynamically created page into a statically named page.

    This is fine with 20 or even more files.

    I now want to do an entire directory of about 50K entries. I can't manually generate those files, so I was wondering if there are any tools or scripts out there that I can run to automatically generate the files for me. The file names need to come from a database or Excel, whatever works.

    Any ideas on how to do this a smarter way?

    thanks
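A minimal, self-contained sketch of the wrapper pattern described above (the file names anyfile.php and widget-xyz.php are made up for illustration). Note that PHP's include treats "anyfile?name=xyz" as a literal local filename, so the usual trick is to set the parameter and then include the script:

```php
<?php
// Demo of the wrapper pattern. For the demo we first generate a tiny
// "dynamic" script, then a statically named wrapper that includes it.
// (Both file names are hypothetical.)

// The real dynamic page: reads $_GET['name'].
file_put_contents('anyfile.php',
    '<?php echo "Entry: " . htmlspecialchars($_GET["name"]);');

// The statically named wrapper the search engine sees. Instead of
// include "anyfile.php?name=xyz" (which PHP would treat as a literal
// local filename), set the parameter and include the script locally.
file_put_contents('widget-xyz.php',
    '<?php $_GET["name"] = "xyz"; include "anyfile.php";');

include 'widget-xyz.php';   // prints "Entry: xyz"
```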

  • #2
    raf, Master Coder (joined Jul 2002; 6,589 posts)
    I don't really understand it.

    You can use PHP to dynamically create the files (using data from the db or from whatever other source) and save them, but what has this to do with the query-string issue?
    Posting guidelines I use to see if I will spend time to answer your question : http://www.catb.org/~esr/faqs/smart-questions.html
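A sketch of that idea: loop over the entry names and write one wrapper file per entry. The names would come from your database (e.g. a PDO query) or a CSV exported from Excel; here they are passed in as an array, and "anyfile.php" stands in for the real dynamic script.

```php
<?php
// Generate one statically named wrapper file per entry name.
function generate_wrappers(array $names, string $dir = 'pages'): int {
    if (!is_dir($dir)) { mkdir($dir, 0755, true); }
    $count = 0;
    foreach ($names as $name) {
        // Sanitise the name so it is safe to use as a filename.
        $file = preg_replace('/[^A-Za-z0-9_-]/', '-', $name) . '.php';
        // Each generated file just sets the parameter and includes
        // the one real dynamic script ("anyfile.php" is a placeholder).
        $code = '<?php $_GET["name"] = ' . var_export($name, true)
              . '; include "anyfile.php";';
        file_put_contents("$dir/$file", $code);
        $count++;
    }
    return $count;
}

// With a database you would fetch the names first, something like:
//   $names = $pdo->query('SELECT name FROM entries')->fetchAll(PDO::FETCH_COLUMN);
echo generate_wrappers(['xyz', 'abc']), " files written\n";
```

For 50K entries this runs in one batch from the command line; rerunning it simply overwrites the wrappers, so it doubles as a refresh script.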

  • #3
    Senior Coder (joined Jun 2002; near Oswestry; 4,508 posts)
    It is generally held, though obviously Google won't give the details, that it limits how deeply it indexes URLs with query data in them. The reason is just that it's too easy for a robot to get lost indexing hundreds of pages that are all basically the same.

    But what you can do is use a "clean URLs" approach, where you parse a normal-looking URL and create CGI parameters from it. For example:
    file.php?name=xyz
    Could become:
    file/xyz/
    But it's done transparently - the address still looks the same. You end up with URLs that are easier to remember and type, and which look to a search engine just like folders.

    Anyway, there are loads of ways of doing it, depending on your environment. PHP and mod_rewrite is what I do at http://www.udm4.com/ and I believe http://www.alistapart.com/ does something similar. Google for "clean URLs" and that should steer you right.
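A sketch of the mod_rewrite approach under assumed names (the path segment "file" and the parameter "name" are only examples from this thread): the rewrite rule maps the clean path back onto the query string, and the PHP script itself stays unchanged. The same mapping can also be done in PHP, as the helper below shows.

```php
<?php
// Clean-URL sketch. With Apache mod_rewrite, an .htaccess rule such as
//
//   RewriteEngine On
//   RewriteRule ^file/([^/]+)/?$ file.php?name=$1 [L]
//
// rewrites /file/xyz/ to file.php?name=xyz behind the scenes, so the
// browser and the search engine only ever see the folder-like address.
// Alternatively, do the mapping in PHP itself from the request path:
function name_from_path(string $path): ?string {
    if (preg_match('#^/file/([^/]+)/?$#', $path, $m)) {
        return $m[1];
    }
    return null;   // not a clean URL we recognise
}

echo name_from_path('/file/xyz/');  // prints "xyz"
```

In a live script you would call it as name_from_path($_SERVER['REQUEST_URI']) and fall back to $_GET['name'] when the path does not match.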
    "Why bother with accessibility? ... Because deep down you know that the web is attractive to people who aren't exactly like you." - Joe Clark

