How can I increase the shell's (or specific application's) memory limit?

Daniel A. ldrada at gmail.com
Mon Apr 3 11:48:55 UTC 2006


On 4/3/06, Olaf Greve <o.greve at axis.nl> wrote:
> Hi,
>
> I've got a question which is probably pretty easy to answer: how can I
> assign more memory to a PHP script running in a shell and/or in a browser.
>
> Some more background info:
> I'm building a PHP script that has to retrieve pretty large sets of data
> from a remote MySQL database, then process it, and store the results to
> a local database.
>
> The issue:
> The script (surprise, surprise) quickly runs out of memory. Now, I have
> already tried to increase the memory limit in php.ini (followed by an
> Apache restart, of course), but even when setting the limit to something
> high like 384MB or so, the script still bails out with a memory limit
> error when retrieving as little as some 50MB of data...
>
> Now, of course I could rewrite my PHP script such that it will retrieve
> smaller batches of data, but being a programmer I'm lazy, and I'd rather
> simply assign more memory to the script (actually, it's not only due to
> laziness, but also due to the fact that the script has to aggregate data
> etc., and I'd rather have it do that in 1 run for a variety of reasons).
>
> It seems to me like setting the memory limit in php.ini above a value of
> 64MB (or so) doesn't seem to have any effect anymore. My assumption then
> is that the memory limit is somehow enforced elsewhere (the shell
> perhaps, and/or Apache?).
>
> Can anyone tell me how to adjust this such that I can successfully
> assign say 384MB of memory to PHP scripts run both from browsers (i.e.
> through Apache 2.2 and mod_php) and from the command line?
>
> Tnx in advance, and cheers,
> Olafo
>
> _______________________________________________
> freebsd-questions at freebsd.org mailing list
> http://lists.freebsd.org/mailman/listinfo/freebsd-questions
> To unsubscribe, send any mail to "freebsd-questions-unsubscribe at freebsd.org"
>
Hi Olaf,
Generally, I think it's bad programming practice to retrieve such a big
dataset in one go if it's at all possible to do otherwise.
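
A side note on the limit itself first: the command line php binary may
read a different php.ini than the one mod_php uses, so a value you raise
for Apache will not necessarily reach your shell runs, and an OS-level
resource limit (a shell ulimit/limits setting, or the login class limits
in /etc/login.conf) can cap the process no matter what php.ini says. A
quick way to check, with example values only, is to override the limit
inside the script and print back what actually took effect:

<?php
// Example values only: override the limit for this one script and
// confirm what PHP actually ended up with.
ini_set('memory_limit', '384M');
echo 'memory_limit is now ', ini_get('memory_limit'), "\n";
?>

From the shell you can pass the same override on the command line, e.g.
"php -d memory_limit=384M yourscript.php" (the script name is just a
placeholder), and "php -i | grep memory_limit" shows which value the CLI
binary is really using. That said, splitting the work up is still the
better fix.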

Consider this example:
I needed an app that would write a file of a set size. The most obvious
way to do this is to build one random string of X bytes and write it out,
but holding the whole thing in memory is not feasible, and it is very
slow besides. The solution is to pick a block size that is neither too
small nor too big, so that disk I/O and CPU time are balanced well.

<?php
$num = '';                  // buffer holding the current block
$bytes = 419430400;         // 400 megs, the target file size
$starttime = time();
$blocksize = 1048576;       // build and write the file in 1 meg blocks
$totalblocks = $bytes / $blocksize;
$filename = "400.megs";
for ($i = 0; $i < $totalblocks; $i++)
{
        // fill one block with random '0'/'1' characters (one byte each)
        for ($x = 0; $x < $blocksize; $x++)
        {
                $num .= rand(0, 1);
        }
        $elapsed = gmdate('H:i:s', time() - $starttime); // elapsed time, no timezone tricks
        $y = $i + 1;
        $pct = round(100 / $totalblocks * $y, 2);
        echo "[$elapsed] Writing block $y, $pct%\n";
        file_put_contents($filename, $num, FILE_APPEND);
        $num = '';          // drop the block so memory use stays around 1 meg
}
?>

Now, I suggest that you retrieve only a limited number of SQL rows at a
time, process them, and insert them into your local database; something
along the lines of the sketch below. This will also make your application
a lot faster, by the way.
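
If it helps, here is a rough sketch of what I mean, using the mysql_*
functions and LIMIT with an offset; the host, credentials, table and
column names are only placeholders for whatever your schema really looks
like:

<?php
// Rough sketch: fetch the remote rows in fixed-size batches instead of
// all at once.  Connection details and the query are placeholders.
$remote = mysql_connect('remote.example.com', 'user', 'pass')
        or die('remote connect failed: ' . mysql_error());
mysql_select_db('remotedb', $remote);

$batchsize = 5000;   // rows per round trip; tune to taste
$offset = 0;
do
{
        $result = mysql_query("SELECT id, payload FROM bigtable " .
                              "ORDER BY id LIMIT $offset, $batchsize", $remote)
                or die('query failed: ' . mysql_error());
        $rows = mysql_num_rows($result);
        while ($row = mysql_fetch_assoc($result))
        {
                // process/aggregate $row here, then INSERT it into the
                // local database
        }
        mysql_free_result($result);   // give memory back before the next batch
        $offset += $batchsize;
} while ($rows == $batchsize);
?>

Keeping an ORDER BY on a unique column keeps the batches stable between
queries, and freeing each result set before fetching the next keeps the
PHP side's memory use roughly constant no matter how big the table is.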

