
View Full Version : Memory not being freed



XmisterIS
07-04-2012, 01:02 PM
My script is consuming ever increasing amounts of memory and I don't know why.

I've whittled it down to the following code, which causes the problem:



$titles = array('a', 'b', 'c', 'd'); //Arbitrary column names.

gc_enable();

foreach ($filenames as $filename) //Array of names of very large data files in CSV format.
{
    if ($fp = fopen($filename, 'r')) //fopen() needs a mode argument.
    {
        echo 'Reading file '.$filename.PHP_EOL;

        while (!feof($fp))
        {
            gc_collect_cycles();

            $line = fgetcsv($fp, 0, ',', '"');
            $assoc = array_combine($titles, $line);
            extract($assoc);

            $res = mysql_query("insert into mytable (a, b, c, d) values ($a, $b, $c, $d)", $conn);
            mysql_free_result($res);

            echo (memory_get_usage()/1024/1024).'MB'.PHP_EOL;
        }

        fclose($fp);
    }
    else
        echo 'Could not open file '.$filename.PHP_EOL;
}


Now, my (clearly naive) little brain tells me that the line which echoes the memory usage should, by and large, report much the same value each time.

BUT ... it doesn't. It reports an ever-increasing memory usage until the script hits the memory limit.

WTF???? :confused::confused::confused::confused:

Fou-Lu
07-05-2012, 01:18 AM
Do you see the same degradation of memory if you issue the insert using $line[0], $line[1], $line[2], and $line[3], and remove the use of the combine and extract?
Note that the use of mysql_free_result() there is incorrect. It only frees result sets, which an insert will not provide (mysql_query() returns TRUE or FALSE for an INSERT), so that call issues a warning on every iteration. I'd suggest removing it first and seeing if the same behaviour occurs.

Otherwise, it does not appear that a leak exists created by your code itself. It is possible another method within it is leaking though.

So what I would do:
Remove the free result call. If the behaviour persists:
Remove the extract call. Insert using $assoc['a'] through 'd'. If it persists:
Remove the array_combine. Use $line directly. If it persists:
Swap storage of DB to that of file. If this corrects the issue:
Attempt to write a bind with MySQLi or PDO and loop the bind/execute.
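The file-storage step in the list above can be exercised on its own: keep the same read loop but write each row to a flat file instead of MySQL. A minimal sketch (the function name and file paths here are placeholders, not from the thread):

```php
<?php
// Diagnostic variant of the loop: same fgetcsv() read path, but rows
// go to a flat file via fputcsv() instead of the database. If memory
// now stays level per iteration, the leak is in the MySQL layer
// rather than in the CSV handling.
function copy_csv(string $inPath, string $outPath): int
{
    $in  = fopen($inPath, 'r');
    $out = fopen($outPath, 'w');

    $rows = 0;
    while (($line = fgetcsv($in, 0, ',', '"')) !== false) {
        fputcsv($out, $line); // re-emit the row untouched
        $rows++;
    }

    fclose($in);
    fclose($out);
    return $rows;
}
```

Checking fgetcsv()'s return value against false also avoids the extra feof() pass that hands array_combine() a bogus last line.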

Memory usage will always go up; I've never successfully brought it back down to the initial level. But in the end you should be able to keep each iteration to a relatively stable block size.

XmisterIS
07-05-2012, 11:27 AM
Well, I tried your points in order, and the problem seems to have been fixed at point 3, though it also requires points 1 and 2 to be in place.

There is certainly what I would call memory "creep" (i.e. it's creeping up slightly with each iteration) - but by and large, it's ok.

I see from the docs that MySQLi and/or PDO is recommended nowadays - I'll look into it, thanks. It will be easy to upgrade my code because I use a wrapper class and interface for mysql.

Fou-Lu
07-05-2012, 03:24 PM
Just go straight from the $line array, and skip the combine and extract:


while (!feof($fp))
{
    gc_collect_cycles();
    $line = fgetcsv($fp, 0, ',', '"');
    $res = mysql_query("insert into mytable (a, b, c, d) values ({$line[0]}, {$line[1]}, {$line[2]}, {$line[3]})", $conn);
}

Although it's even easier with PDO or MySQLi, since you can just bind and execute on a prepared statement instead.
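A minimal sketch of that prepare-once, execute-per-row approach with PDO (the function name is made up for illustration; the table and columns are the thread's mytable a-d):

```php
<?php
// Prepare the statement once and execute it per row: PDO binds the
// values each time, so no per-row query strings are built and the
// per-iteration allocations stay flat. Quoting/escaping is handled
// by the driver, unlike the interpolated mysql_query() version.
function import_csv(PDO $pdo, string $csvPath): int
{
    $stmt = $pdo->prepare('INSERT INTO mytable (a, b, c, d) VALUES (?, ?, ?, ?)');

    $rows = 0;
    $fp = fopen($csvPath, 'r');
    while (($line = fgetcsv($fp, 0, ',', '"')) !== false) {
        $stmt->execute($line); // bind the four fields and insert one row
        $rows++;
    }
    fclose($fp);

    return $rows;
}
```

For MySQL the connection would be built from a DSN along the lines of `mysql:host=localhost;dbname=mydb` (hypothetical values); the same function runs unchanged against any PDO driver, e.g. SQLite.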



EZ Archive Ads Plugin for vBulletin Copyright 2006 Computer Help Forum