views: 387
answers: 2

I'm not sure if memory is the culprit here. I am trying to instantiate a GD image from data in memory (it previously came from a database). I try a call like this:

my $image = GD::Image->new($image_data);

$image comes back as undef. The POD for GD says that the constructor will return undef for cases of insufficient memory, so that's why I suspect memory.

The image data is in PNG format. The same thing happens if I call newFromPngData.

This works for very small images, like under 30K. However, slightly larger images, like ~70K, will cause the problem. I wouldn't think that a 70K image should cause these problems, even after it is decompressed.

This script is running under CGI through Apache 2.0, on Mac OS X 10.4, if that matters at all.

Are there any memory limitations imposed by Apache by default? Can they be increased?

Thanks for any insight!

EDIT: For clarification, the GD::Image object never gets created, so clearing out the $image_data from memory isn't really an option.

A: 

I've run into the same problem a few times.

One of my solutions was simply to increase the amount of memory available to my scripts. The other was to free the source image buffer once I was done with it:

Original Script:

$src_img = imagecreatefromstring($userfile2);
imagecopyresampled($dst_img,$src_img,0,0,0,0,$thumb_width,$thumb_height,$origw,$origh);

Edited Script:

$src_img = imagecreatefromstring($userfile2);
imagecopyresampled($dst_img,$src_img,0,0,0,0,$thumb_width,$thumb_height,$origw,$origh);
imagedestroy($src_img);

By freeing the memory held by the first $src_img, it freed up enough to handle more processing.

jerebear
How can you increase the amount of memory available to the scripts? I don't think the other solution will work, since I never get past the creation of the first GD::Image object.
pkaeding
IIRC there is a setting in php.ini regarding script memory limits
DrJokepu
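
The php.ini setting DrJokepu is thinking of is memory_limit, which caps how much memory a single PHP script may allocate. An illustrative entry (the value shown is just an example, not a recommendation):

```ini
; php.ini -- per-script memory ceiling; raise it if GD allocations fail
memory_limit = 128M
```

Note this only applies to PHP; it has no effect on the Perl/CGI case in the question.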
@DrJokepu, is there a limit like that in Perl? From what I've read, there isn't by default, and a Perl script can take all the system memory if it wants to, but I may be wrong.
pkaeding
Oh, is that Perl? Sorry, I was talking rubbish then. Maybe I should learn to read at some point. Unfortunately, my Perl knowledge is even more rusty than my PHP knowledge.
DrJokepu
Are you on Linux? Linux traditionally limits your processes to 64MB of data. You can raise this with ulimit (ulimit -a will show you what your Perl script will be limited to, under 'data seg size'), e.g. ulimit -d 200000. HTH
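
A quick sketch of the ulimit approach aidan describes (the 200000 value is his example, not a tuned figure; values are in kilobytes, and the limit applies to the shell and any CGI scripts it spawns):

```shell
# Show the current 'data seg size' limit; a low value here can make
# GD's allocations fail, so GD::Image->new returns undef.
ulimit -d

# Raise (or set) the soft limit to roughly 200 MB for this shell
# and its children. Units are kilobytes.
ulimit -d 200000

# Confirm the new limit took effect.
ulimit -d
```

For an Apache CGI setup, this would need to go in the environment that launches Apache (e.g. its init script), not in the CGI script itself.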
aidan
+1  A: 

The GD library eats many bytes of memory per byte of compressed image size. It's well over a 10:1 ratio!

When a user uploads an image to our system, we start by checking the file size before loading it into a GD image. If it's over a threshold (1 Megabyte) we don't use it but instead report an error to the user.
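
The pre-check described above can be sketched in shell terms. The filename, the fake 2 MB file, and the 1-megabyte threshold are illustrative only:

```shell
# Refuse to hand anything over a 1 MB threshold to GD, as described above.
MAX_BYTES=$((1024 * 1024))
f=/tmp/fake_upload.png
head -c 2097152 /dev/zero > "$f"      # stand-in 2 MB "image" for demonstration

# Measure the file size in bytes (tr strips padding some wc builds emit).
size=$(wc -c < "$f" | tr -d ' ')
if [ "$size" -gt "$MAX_BYTES" ]; then
    echo "rejected: $size bytes exceeds $MAX_BYTES"
else
    echo "accepted: $size bytes"
fi
```

The same check is one stat/filesize call in Perl or PHP before the image is ever loaded, which is the point: the oversized data never reaches GD.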

If we really cared we could dump it to disk, use the command line "convert" tool to rescale it to a sane size, then load the output into the GD library and remove the temporary file.

convert -define jpeg:size=800x800 tmpfile.jpg -thumbnail '800x800' -

will scale the image so it fits within an 800 x 800 square. Its longest edge is now 800px, which should load safely. The command above sends the shrunk JPEG to STDOUT. The -define jpeg:size= option tells convert not to bother holding the huge image in memory, but only enough of it to scale down to 800x800.

Christopher Gutteridge
Or use Image::Magick?
MkV