
To use or not to use (GET variables)



bowser1111
03-14-2007, 02:55 PM
This is a bit of a dumb question, but I tend to see two different kinds of sites out there... the ones that look like this:

domain.com/widgets.php

and the ones that look like this:

domain.com/index.php?q=widgets

On one end of the spectrum, it seems like you could simply have one index.php for your entire site, a "q" variable for everything you could possibly want to see or do, and just a bunch of stuff thrown into an "include/" directory to make it all happen... but I dunno... this seems somehow... creepy.

On the other side of the spectrum, you could just as easily have different .php files for everything you want to do, includes for just stuff like the header and footer, and only use GET variables for small stuff... but something about having a ton of files on the server also seems ... creepy.


I guess my question is where is the balance in all this? Is there some reason not to lean too far to one extreme? Do performance issues come into play here or no?

mlseim
03-14-2007, 03:38 PM
Bowser ...

That's the power of PHP. It makes no difference to a search engine
because the PHP scripting is creating valid HTML/XHTML just like a
static webpage.

See this site for example: http://www.cgrelayforlife.com

The navigation is a PHP script (using PHP "include")... all pages are
displayed using a PHP script, called "page.php" ... and content for
each page is taken from a database ... again, using "includes".

So there are basically two pages, index.php and page.php

A page can be instantly added by creating another text block in the database.
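mlseim's actual page.php isn't posted, but the general shape of such a script might look like this. The `pages` table and its column names are assumptions for illustration, and the database credentials are placeholders; a prepared statement keeps the id parameter safe.

```php
<?php
// page.php -- a guess at the two-file pattern described above: one
// script renders every page, pulling its content from a database.
// Table name "pages" and its columns are assumptions, not mlseim's code.

function fetch_page(PDO $db, $id)
{
    $stmt = $db->prepare('SELECT title, body FROM pages WHERE id = ?');
    $stmt->execute(array((int) $id));       // cast defuses SQL injection
    return $stmt->fetch(PDO::FETCH_ASSOC);  // false if no such page
}

if (isset($_GET['id'])) {
    // Placeholder DSN/credentials -- adjust to your own server.
    $db   = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
    $page = fetch_page($db, $_GET['id']);
    if ($page === false) {
        header('HTTP/1.0 404 Not Found');
        exit('No such page');
    }
    include 'header.php';   // shared layout via include, as described
    echo '<h1>' . htmlspecialchars($page['title']) . '</h1>';
    echo $page['body'];
    include 'footer.php';
}
?>
```

Adding a page is then just another row in the table, exactly as described.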

Servers are so fast and efficient now, bandwidth limits are so high ...
it just makes sense to use PHP or Perl for creating dynamic webpages.

Using PHP is especially necessary for large, fluid types of content, like
photo galleries for example. You can create photo galleries dynamically
and not have to worry about creating static pages.
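As a rough sketch of that idea (the directory name is an assumption), a gallery page can simply list whatever image files exist instead of maintaining a static page per photo:

```php
<?php
// gallery.php -- sketch of a dynamic gallery: render whatever is in
// the photos/ directory. The directory name is made up for the example.

function gallery_images($dir)
{
    $files = glob($dir . '/*.jpg');
    return $files === false ? array() : $files;  // glob() can return false on error
}

if (PHP_SAPI !== 'cli') {   // render only when running under a web server
    foreach (gallery_images('photos') as $file) {
        $name = htmlspecialchars(basename($file));
        echo '<img src="photos/' . $name . '" alt="' . $name . '">' . "\n";
    }
}
?>
```

Drop a new .jpg into the directory and it shows up on the next page load, with no static page to edit.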

aedrin
03-14-2007, 04:15 PM
I guess my question is where is the balance in all this? Is there some reason not to lean too far to one extreme? Do performance issues come into play here or no?

That is the right question to ask: where is the balance?

For me, if I have to make something temporary in a short amount of time with 3-5 pages max, then I'll go for individual PHP files per page. However, if I know it's for something more permanent, I'll set up an index.php file with GET parameters, simply because it is much easier to manage.
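For what it's worth, the single-entry-point setup doesn't have to be fancy. A minimal sketch (the page names and file paths here are made up; the whitelist is the important part):

```php
<?php
// index.php -- minimal single-entry-point routing on a GET parameter.
// Page names and include paths are invented for the example.

// Map ?q= values to include files explicitly. Never build the path
// directly from $_GET, or a visitor could include arbitrary files.
function resolve_page($q, array $pages, $default = 'home')
{
    return isset($pages[$q]) ? $pages[$q] : $pages[$default];
}

$pages = array(
    'home'    => 'includes/home.php',
    'widgets' => 'includes/widgets.php',
    'contact' => 'includes/contact.php',
);

if (PHP_SAPI !== 'cli') {   // only render when running under a web server
    include 'includes/header.php';
    include resolve_page(isset($_GET['q']) ? $_GET['q'] : 'home', $pages);
    include 'includes/footer.php';
}
?>
```

Unknown `q` values fall back to the home page, so a mistyped URL can't pull in a file you didn't intend to expose.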

bowser1111
03-14-2007, 05:48 PM
Servers are so fast and efficient now, bandwidth limits are so high ...
it just makes sense to use PHP or Perl for creating dynamic webpages.

THIS is exactly the kind of information I'm just dying to hear, but don't think to ask... I mean in my head I'm always curious about performance issues and the like, but what I really just need is for someone to tell me "look, performance issues were a factor in 1992... it's 2007... just shut up and use PHP for all it's worth".

So, just to flesh this idea out a little more... let's say my server was dishing out 2,000 pages a day (or any number you wish that's "quite a bit" without being "unreasonable") and I was using a few includes (let's say like 4 or 5) rather than static pages... even if my site were to grow this large (here's hoping), should I even be worried about performance issues where includes and simple GET variables are concerned??

What if I were to use extensive PHP? What about the GD library? In simple terms (if this is even possible), at what point should I start worrying about performance issues at all?

bowser1111
03-14-2007, 05:51 PM
@Aedrin

This might be a bit arbitrary, but it seems to me that it would be reasonable to have a different .php file for each folder or area of the site. That is, if I were to have an "admin" area, then one .php file (with different GET variables) would exist off the admin/ directory and would control whatever I need to do (edit entries, add/delete entries, etc), whereas a different area of the site would have a different .php off its directory.
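That pattern could be sketched roughly like this (the action names and file layout are invented for the example):

```php
<?php
// admin/index.php -- sketch of the one-file-per-area idea: a single
// script off admin/ that switches on a GET variable. Action names and
// the actions/ layout are made up for illustration.

function admin_handler($action)
{
    switch ($action) {
        case 'add':    return 'actions/add_entry.php';
        case 'edit':   return 'actions/edit_entry.php';
        case 'delete': return 'actions/delete_entry.php';
        default:       return 'actions/list_entries.php';  // safe fallback
    }
}

if (PHP_SAPI !== 'cli') {   // only dispatch when running under a web server
    include 'header.php';
    include admin_handler(isset($_GET['action']) ? $_GET['action'] : '');
    include 'footer.php';
}
?>
```

Each area of the site gets its own small dispatcher like this, rather than one giant index.php for everything.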

Does this seem reasonable?

JohnDubya
03-14-2007, 05:59 PM
bowser, some really good questions! For my own personal use, I try not to use $_GET's too much because I don't like junking up the URL. But they are very useful at times. It would be really nice and organized to use index.php?page=edit_user for all the links on your site to make it dynamic. That would allow you to only create one page for the header/footer, and the main content could be included. I have yet to build a site like this, but I can see where it would be really handy and convenient.

And even though you didn't ask me specifically... :) I would definitely make a different index page for the normal section and the admin section. Better to keep them separated.

_Aerospace_Eng_
03-14-2007, 06:06 PM
I use it on most of my sites. I posted the script I use in this thread if you are wondering how it could be done.
http://www.codingforums.com/showthread.php?t=106420
A good example can be found here on a site I made.
http://www.thesidewalkstudio.com/gallery.php
Click on the subnavs on the left. Notice the urls.

bowser1111
03-15-2007, 02:25 AM
@JohnDubya... yes that makes total sense, and I'd prefer not to junk up the URL either...

But wait! The one thing I had forgotten is the search engines! The last time I looked into it, I read that Google could only handle a few GET variables at most (and maybe none at all) in its indexing. Does that still hold true today?

So for example is there a chance (however small) that:

domain.com/index.php?article=11003

wouldn't get indexed? What if you used mod_rewrite on an Apache server to make it:

domain.com/articles/11003

(as I'm planning to do)?

aedrin
03-15-2007, 04:12 PM
I had read that Google can only handle a few GET variables tops (and maybe none at all) in its indexing.

I always wonder why people think that.

The only reason Google would exclude large parameter counts is because it could indicate a dynamic page that is not useful for information.

And there is no reason to not use this method to create your sites.

Troy297
03-15-2007, 04:48 PM
Well.... for me what I always do to make things really simple and easy to edit is:

a) Make a file called head.php which contains all the code above main content
b) Make a file called foot.php which contains all the code below the main content
c) Then create separate pages for everything that all look like this:


<?php require('head.php');?>
Page Content Here
<?php require('foot.php');?>

That way I know where to find the content of every page when I want to edit it, and if I ever want to tweak the layout I only need to edit one file.

Hope this helps!

aedrin
03-15-2007, 05:41 PM
I think that's one of the most popular reasons people use PHP. ;)

bowser1111
03-16-2007, 03:16 AM
I always wonder why people thought that?

In my case I read it in a book specifically about SEO (and not that old, either).


The only reason Google would exclude large parameter counts is because it could indicate a dynamic page that is not useful for information.

Wouldn't that be reason enough to worry if you were trying to get the page in question indexed?

felgall
03-16-2007, 04:05 AM
You can always add a mod_rewrite rule to your .htaccess file to rewrite the URLs so that you can link to the pages as if they were actual separate pages.
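A minimal sketch of such a rule, following the articles example from earlier in the thread (the pattern and parameter name are assumptions; adjust them to your own setup, and make sure mod_rewrite is enabled on the server):

```apache
# .htaccess -- rewrite clean article URLs to the real query string.
RewriteEngine On
# /articles/11003  ->  /index.php?article=11003
# The browser (and Google) only ever sees the clean URL.
RewriteRule ^articles/([0-9]+)$ index.php?article=$1 [L,QSA]
```

The `[L]` flag stops further rewriting once the rule matches, and `[QSA]` keeps any extra query-string parameters the visitor appended.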


