04-04-2007, 01:41 PM
Imagine such a scenario:
a database on the server.
A client runs a query whose result is 300 records.
PHP must send only 30 records, but include links on the page so that the client can continue receiving the rest of his query's result.
I know how to run a query on the server and get and send the result.
But I wonder what the right technique is to deliver many records in parts, without re-running the query on the server every time.
04-04-2007, 06:52 PM
Hmm, alright. I'd use MySQL's LIMIT clause for that.
In your page, have code like this:
$resByPage = 20; //number of results per page

//get the total record count (COUNT(*) avoids fetching every row just to count them)
$totalRes = mysql_result(mysql_query("SELECT COUNT(*) FROM table"), 0);
$TotalPages = ceil($totalRes / $resByPage); //number of pages is rounded to the upper unit

//if the user clicked a page #, use it; otherwise default to page 1
if(isset($_GET['page']) && $_GET['page'] > 0)
    $page = (int) $_GET['page'];
else
    $page = 1;

//get a resultset of only the records of the current page
$query = mysql_query("SELECT * FROM table LIMIT ".(($page - 1) * $resByPage).", ".$resByPage);

//print the page links
for($i = 1; $i <= $TotalPages; $i++)
    echo "<a href='results.php?page=$i'> $i </a> ";

//output the rows of the current page
while($row = mysql_fetch_array($query))
    echo $row[0]; //display each record however you like
If anything, the code is really basic, and it's probably not the best way to do it. (You do run the query every time, but at least you only fetch the records you need.) I also assumed you were working with MySQL. I took an old forum script for a PHP-based game and realized how wrong I was with it back then... Oi, I need to fix that. Anyway, it should work now, I'm pretty sure of it, I just don't know if that's really what you wanted.
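One small refinement (a sketch, assuming MySQL 4.0+): SQL_CALC_FOUND_ROWS lets MySQL compute the total count from the same query that fetches the page, so you can drop the separate counting query:

```php
<?php
// fetch one page, and ask MySQL to also compute the count without LIMIT
$query = mysql_query("SELECT SQL_CALC_FOUND_ROWS * FROM table LIMIT "
    .(($page - 1) * $resByPage).", ".$resByPage);

// FOUND_ROWS() returns the count the previous query would have had without LIMIT
$totalRes = mysql_result(mysql_query("SELECT FOUND_ROWS()"), 0);
$TotalPages = ceil($totalRes / $resByPage);
?>
```

The LIMIT offset arithmetic is the same either way: page 1 starts at row 0, page 2 at row $resByPage, and so on.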
04-04-2007, 07:56 PM
without re-running the query on the server every time.
It is possible to cache the query's results, but it is easier and often more efficient to use LIMIT as Krokador suggested.
Unless you have a really expensive query (many joins/calculations); then you'd be better off storing the results somewhere while the visitor browses them.
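If the query really is that expensive, the "store the results somewhere" idea could look roughly like this (just a sketch; the session key and table name are made up):

```php
<?php
session_start();

$resByPage = 20;
$page = (isset($_GET['page']) && $_GET['page'] > 0) ? (int) $_GET['page'] : 1;

// run the expensive query only once per visitor, then keep the rows in the session
if (!isset($_SESSION['cached_results'])) {
    $rows = array();
    $query = mysql_query("SELECT * FROM table"); // the expensive query
    while ($row = mysql_fetch_assoc($query))
        $rows[] = $row;
    $_SESSION['cached_results'] = $rows;
}

// paginate straight out of the cached array, no further queries needed
$pageRows = array_slice($_SESSION['cached_results'],
    ($page - 1) * $resByPage, $resByPage);
?>
```

Keep in mind the cached copy goes stale if the table changes while the visitor browses, and large result sets will bloat the session storage.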