Asked 2 years ago    Answers: 5    Viewed 85 times

I can find lots of tutorials on how to overcome the out-of-memory error. The solution is always the same: increase the memory limit in php.ini or in .htaccess - what a surprise...

I actually don't understand the error message:

Fatal error: Out of memory (allocated 32016932) (tried to allocate 25152 bytes)

"Allocated 32016932", means 32MB have been allocated as in - the PHP script is using 32MB? Tried to allocate 25152, means that another 25KB were tried to be allocated, but the script failed as the maximum (of ~ 32MB?) has been reached?

What can I actually tell from this error message, besides that I'm "out of memory"?

 Answers

3

I always interpreted it like this:

Fatal error: Out of memory ([currently] allocated 32016932) (tried to allocate [additional] 25152 bytes)

But it is a good question whether there is a bulletproof explanation.
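
For what it's worth, you can watch the "allocated" figure grow with memory_get_usage(). Here is a minimal sketch (the 32M limit is only there to reproduce a ceiling similar to the one above). Note that when memory_limit itself is the cap, PHP usually reports "Allowed memory size of N bytes exhausted (tried to allocate M bytes)"; the plain "Out of memory" wording generally means the allocation failed at the system level, i.e. the OS or web server refused to hand PHP more memory:

<?php
// Illustrative: cap the script at ~32 MB, then allocate until it fails.
ini_set('memory_limit', '32M');

$chunks = array();
while (true) {
    // memory_get_usage(true) roughly corresponds to the "allocated N" figure
    echo memory_get_usage(true), " bytes currently allocated\n";
    $chunks[] = str_repeat('x', 1024 * 1024);    // try to grab ~1 MB more
}
// ends with something like:
// Fatal error: Allowed memory size of 33554432 bytes exhausted
// (tried to allocate ... bytes)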

Friday, September 16, 2022
4

I have finally found the answer. The clue came from pcguru's answer beginning 'Since the server has only 1 GB of RAM...'.

On a hunch I looked to see whether Apache had memory limits of its own, as those were likely to affect PHP's ability to allocate memory. Right at the top of httpd.conf I found this directive:

RLimitMEM 204535125

This is put there by WHM/cPanel. According to the following web page, WHM/cPanel incorrectly calculates this value on a virtual server: http://forums.jaguarpc.com/vps-dedicated/17341-apache-memory-limit-rlimitmem.html

The script that runs out of memory gets most of the way through, so I increased RLimitMEM to 268435456 (256 MB) and reran the script. It completed its array merge and produced the CSV file for download.

ETA: After further reading about RLimitMEM and RLimitCPU I decided to remove them from httpd.conf. This allows ini_set('memory_limit','###M') to work, and I now give that particular script the extra memory it needs. I also doubled the RAM on that server.
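
For reference, a minimal sketch of that per-script override (the 256M value mirrors the figure above; the export logic is only hinted at):

<?php
// Only effective once Apache's RLimitMEM cap is out of the way.
ini_set('memory_limit', '256M');    // extra headroom for this script only

// ... the large array merge and CSV export would go here ...

echo ini_get('memory_limit');    // sanity check: should print 256M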

Thank you to everyone for your help in detecting this rather thorny issue, and especially to pcguru who came up with the vital clue that got me to the solution.

Monday, August 8, 2022
 
3

The most memory-hungry component in Symfony is the profiler. If you don't need the profiler in certain actions, you can disable it in code:

if ($this->container->has('profiler')) {
    // stop the profiler from collecting data for this request
    $this->container->get('profiler')->disable();
}

You can also set a global parameter in the config:

framework:
    profiler:
        collect: false

Wednesday, September 28, 2022
 
4

This is a least squares problem with 100,000 responses: bigmatrix is the response matrix, beta is the coefficient matrix, and w1 is the residual matrix.

bigmatrix, as well as w1, if formed explicitly, will each cost

(100,000 * 100,000 * 8) / (1024 ^ 3) = 74.5 GB

This is far too large.

Since the estimation for each response is independent, there is no need to form bigmatrix in one go and store it in RAM. We can form it tile by tile and use an iterative procedure: form a tile, use the tile, then discard it. For example, the code below uses tiles of dimension 100,000 * 2,000, each with memory size:

(100,000 * 2,000 * 8) / (1024 ^ 3) = 1.5 GB

With such an iterative procedure, memory usage is kept under control.

x1 <- rnorm(100000)
x2 <- rnorm(100000)
g <- cbind(x1, x2, x1^2, x2^2)
gg <- crossprod(g)    ## don't use `t(g) %*% g`
## we also don't explicitly form the inverse of `gg`

## initialize `beta` matrix (4 coefficients for each of 100,000 responses)
beta <- matrix(0, 4, 100000)

## we split 100,000 columns into 50 tiles, each with 2000 columns
for (i in 1:50) {
    start <- 2000 * (i - 1) + 1    ## tile start column
    end <- 2000 * i                ## tile end column
    bigmatrix <- outer(x1, x2[start:end], "<=")    ## form one tile
    Gw <- crossprod(g, bigmatrix)    ## don't use `t(g) %*% bigmatrix`
    beta[, start:end] <- solve(gg, Gw)    ## 4 x 2000 coefficient block
}

Note: don't try to compute the residual matrix w1 in full, as it will also cost 74.5 GB. If you need residuals in later work, break that computation into tiles as well and process them one by one (see the sketch below).

You don't need to worry about the loop here. The computation inside each iteration is costly enough to amortize looping overhead.
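
If the residuals are needed later, here is a minimal sketch of the same tiling idea. It reuses x1, x2, g and beta from above and keeps only a per-response summary; the choice of colSums(resid^2) is just an illustration of whatever statistic you actually need:

## process residuals tile by tile; never materialize the full `w1`
rss <- numeric(100000)    ## e.g. residual sum of squares per response
for (i in 1:50) {
    start <- 2000 * (i - 1) + 1
    end <- 2000 * i
    tile <- outer(x1, x2[start:end], "<=")     ## re-form one response tile
    resid <- tile - g %*% beta[, start:end]    ## residuals for this tile
    rss[start:end] <- colSums(resid^2)         ## keep a summary, drop the tile
}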

Wednesday, August 31, 2022
 
p4plus2
 
3

By investing huge amounts of effort you may be able to mitigate this for a while, but in the end you will run into trouble with any non-terminating PHP application.

Check out the article "PHP is meant to die". It discusses PHP's memory handling (among other things) and specifically focuses on why all long-running PHP processes eventually fail. Some excerpts:

There’s several issues that just make PHP the wrong tool for this. Remember, PHP will die, no matter how hard you try. First and foremost, there’s the issue of memory leaks. PHP never cared to free memory once it’s not used anymore, because everything will be freed at the end — by dying. In a continually-running process, that will slowly keep increasing the allocated memory (which is, in fact, wasted memory), until reaching PHP’s memory_limit value and killing your process without a warning. You did nothing wrong, except expecting the process to live forever. Under load, replace the “slowly” part for "pretty quickly".

There’s been improvements in the “don’t waste memory” front. Sadly, they’re not enough. As things get complex or the load increases, it’ll crash.
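
To see the failure mode concretely, here is a minimal sketch of such a "never-dying" worker. fetch_next_job() is a made-up stand-in for real work; any state that keeps accumulating references behaves like $seen here:

<?php
// Hypothetical long-running worker: state that is never released keeps
// growing until memory_limit finally kills the process.
function fetch_next_job() {
    return str_repeat('x', 1024);    // pretend each job carries ~1 KB of data
}

$seen = array();
while (true) {
    $seen[] = fetch_next_job();    // references pile up, nothing is freed
    if (count($seen) % 10000 === 0) {
        // watch this number climb steadily toward memory_limit
        echo memory_get_usage(true), " bytes in use\n";
    }
}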

Thursday, August 25, 2022
 
liberty
 