View Full Version : not sure where to post this one

07-08-2005, 09:52 PM
I have a script that searches a MySQL db, which currently has 16,500 records. I get this error:

Fatal error: Maximum execution time of 30 seconds exceeded in e:\deduper.php on line 22

I am running this on phpdev423. Does anybody know if this is an Apache error code? MySQL? Or PHP?
The script seems to be running correctly, because it finds 3 duplicates before the error is displayed and the process stops. Could it be a PHP problem? I will post the script if it will help.


07-08-2005, 09:58 PM
It means the PHP script takes too long. Change the max_execution_time setting in your php.ini.
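For what it's worth, you can also raise the limit from inside the script itself, so you don't have to edit php.ini at all. A minimal sketch (set_time_limit and ini_set are standard PHP; the 300-second value is just an example, and raising the limit only buys time, it doesn't make the script faster):

```php
<?php
// Put one of these at the top of deduper.php to raise the limit
// for this script only:
set_time_limit(300);                  // allow up to 300 seconds
ini_set('max_execution_time', '300'); // equivalent via the ini setting
// set_time_limit(0);                 // or remove the limit entirely (use with care)
?>
```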

07-08-2005, 11:10 PM
Post the script. You could have a bad loop in there.

07-08-2005, 11:24 PM
I must have, or I'm just not checking efficiently. I currently have my execution timeout set to 3000 seconds and it dies out somewhere in the 18000 range.

$sql = "SELECT * FROM agents";
$result = mysql_query($sql);
$rows = mysql_num_rows($result);
if ($rows == 0) {
    echo "MySQL Error = " . mysql_error();
} else {
    echo "There are a total of " . $rows . " rows in the database." . "<br />";
    while ($line = mysql_fetch_assoc($result)) {
        $trec = $line['trec'];
        $uid  = $line['uid'];

        // for each row, count how many rows share the same trec
        $sql2 = "SELECT uid FROM agents WHERE trec = $trec";
        $result2 = mysql_query($sql2);
        $trecrows = mysql_num_rows($result2); // this assignment was missing above
        if ($trecrows > 1) { echo $uid . "<br />"; }
    }
}

What I am doing here: I pull everything out of the database, then iterate through the result set, re-querying the database on each row to see how many rows share the same trec value. Is there a better way to do this? It just occurred to me that maybe if I created two arrays and compared them to each other it might go faster instead of using the db. Any thoughts or suggestions would be appreciated.
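The loop above runs one extra query per row, so with ~19,500 rows that's ~19,500 SELECTs. A hedged sketch of an alternative: let MySQL find the duplicate trec values itself in a single GROUP BY ... HAVING query (table and column names `agents`, `trec`, `uid` are taken from the script above; the `dupes` alias is mine):

```php
<?php
// One query instead of one-per-row: group rows by trec and keep
// only the groups that occur more than once.
$sql = "SELECT trec, COUNT(*) AS dupes
        FROM agents
        GROUP BY trec
        HAVING COUNT(*) > 1";
$result = mysql_query($sql) or die("MySQL Error = " . mysql_error());
while ($line = mysql_fetch_assoc($result)) {
    // each row here is one duplicated trec value and how often it occurs
    echo $line['trec'] . " appears " . $line['dupes'] . " times<br />";
}
?>
```

With an index on trec this should finish in seconds rather than minutes, since the counting happens inside MySQL instead of round-tripping for every row.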


07-08-2005, 11:48 PM
Well, you said you have 16,000+ records; it probably takes a while to pull them all out, and it's going over the 30-second limit.

07-08-2005, 11:55 PM
Well, when it said it found a duplicate in the 18000s, I wondered how that could be. I double-checked: there are actually 19,500 records, or rows, whichever you prefer. But I have my timeout set to 3000 seconds, which is 50 minutes, and I think that is still a bit too long for this amount of data. Am I wrong in thinking that way, or should it take that long?