Allowed memory size of 33554432 bytes exhausted (tried to allocate 43148176 bytes) in php

Any suggestions?

+2  A: 

Increase your maximum memory limit to 64MB in your php.ini file.
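For example, the directive in php.ini looks like this (the location of php.ini varies by installation, and you'll need to restart your web server after changing it):

memory_limit = 64M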

Google search

But could I ask why you are trying to allocate that much memory? What line of code does it fail at?

nlaq
PHP can be very inefficient with memory usage; I have often seen simple datagrids blow well past 80 MB with a mere couple hundred records. This seems to happen especially when you go the OOP route.
TravisO
It's not necessarily a language problem - it's an algorithm problem, too. Too many PHP programmers do repeated actions on a whole dataset rather than doing all processing on one item at a time.
staticsan
PHP is efficient if you use it right. It is hard, though, to keep track of all of your objects due to the managed nature of the runtime - not unlike C#. But too many high-level programmers, period (including in C#), do not have an appreciation of how their code affects the resources it runs on.
nlaq
+6  A: 

Your script is using too much memory. This can often happen in PHP if you have a loop that has run out of control and you are creating objects or adding to arrays on each pass of the loop.

Check for infinite loops.

If that isn't the problem, try to help out PHP by destroying objects that you are finished with by setting them to null, e.g. $OldVar = null;

Check the code where the error actually happens as well. Would you expect that line to be allocating a massive amount of memory? If not, try and figure out what has gone wrong...
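A minimal sketch of that idea - the load_record() and process() names here are hypothetical placeholders, not from the question:

// Hypothetical sketch: process many records without keeping them all alive.
foreach ($ids as $id) {
    $record = load_record($id); // placeholder: fetch one item
    process($record);           // placeholder: your real work
    $record = null;             // drop the reference so the memory can be reclaimed
}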

rikh
A: 

If you are trying to read a file, that will take up memory in PHP. For instance, if you open and read an MP3 file (like, say, $data = file("http://mydomain.com/path/sample.mp3")), it is going to pull the whole thing into memory.

As Nelson suggests, you can work to increase your maximum memory limit if you actually need to be using this much memory.
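If you are unsure whether a file read is the culprit, PHP's built-in memory counters can confirm it (a minimal sketch, reusing the placeholder URL from above):

$before = memory_get_usage();
$data = file('http://mydomain.com/path/sample.mp3');
printf("held: %d bytes, peak: %d bytes\n", memory_get_usage() - $before, memory_get_peak_usage());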

Beau Simensen
+1  A: 

The first suggestion is the right one. Dynamically increasing the memory limit in the script is done via the function ini_set():

ini_set('memory_limit', '128M');
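One caveat: ini_set() returns false when the setting cannot be changed, so it is worth checking (a minimal sketch):

if (ini_set('memory_limit', '128M') === false) {
    // the host may have locked this setting; see what is actually in effect
    echo 'memory_limit is still ' . ini_get('memory_limit');
}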

mente
A: 

ini_set('memory_limit', '128M');

I used the same thing, but it's showing the same problem.

panidarapu
You might not have permissions to increase the memory limit - especially if you're on a shared host.
Kristian J.
+1  A: 

It is unfortunately easy to program in PHP in a way that consumes memory faster than you realise. Copying strings, arrays and objects instead of using references will do it, though PHP 5's copy-on-write is supposed to handle this more automatically than PHP 4 did. But dealing with your data set in its entirety over several steps is also wasteful compared to processing the smallest logical unit at a time. The classic example is working with large resultsets from a database: most programmers fetch the entire resultset into an array and then loop over it one or more times with foreach(). It is much more memory-efficient to use a while() loop to fetch and process one row at a time. The same thing applies to processing a file.
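A minimal sketch of the difference, assuming a mysqli connection in $db and a placeholder process() function:

// Wasteful: the entire resultset is materialised as one array first.
$rows = array();
$result = $db->query('SELECT * FROM items');
while ($row = $result->fetch_assoc()) {
    $rows[] = $row;
}
foreach ($rows as $row) {
    process($row);
}

// Leaner: fetch and process one row at a time.
$result = $db->query('SELECT * FROM items');
while ($row = $result->fetch_assoc()) {
    process($row);
}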

staticsan
+6  A: 

At last I found the answer:

ini_set('memory_limit', '-1');

It allows unlimited memory usage on the server, and it's working fine now.

Thanks for the suggestions, friends.

panidarapu
You should still check *why* the memory is exhausted. Maybe you don't need to read the whole file; maybe you can read it sequentially.
macbirdie
Worked great. I know _why_ memory was being exhausted - I'm using Zend-Feed to consume a rather large-ish Atom feed. I can't control the size of it or do anything other than swallow the whole feed in one pull, so bumping the memory limit for that one operation solved it. Cheers!
Don Jones
@panidarapu and @Don Jones: Depending on the amount of memory, and *how* this script is used, it could be dangerous to allow the change in memory usage in this way. Don, in your case, you can likely break the feed down into smaller chunks and parse what you need. Glad it works, but be careful.
anonymous coward
+3  A: 

If you want to read large files, you should read them bit by bit instead of all at once.
It's simple math: if you read a 1 MB file at once, then at least 1 MB of memory is needed at the same time to hold the data.

So you should read them bit by bit using fopen & fread.
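A minimal sketch of that pattern (the filename and chunk size are placeholders):

$handle = fopen('large-file.dat', 'rb');
if ($handle === false) {
    die('could not open the file');
}
while (!feof($handle)) {
    $chunk = fread($handle, 8192); // read 8 KB at a time
    // ... handle $chunk here ...
}
fclose($handle);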

Gumbo
A: 

I had a similar issue: basically, I had a method calling itself (a slight oversight on my behalf), but I thought it was worth mentioning.
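For anyone hunting the same bug, the shape of it is something like this (hypothetical names) - a method that accidentally calls itself with no base case will recurse until memory runs out:

class Importer {
    public function import($row) {
        // Oversight: this was meant to call a helper,
        // but it calls itself and never terminates.
        return $this->import($row);
    }
}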