I am using PHP's exec() to run a command that looks like this:
exec("pdftk xx.pdf fill_form xx.fdf output xx.pdf flatten");
The strangest thing is that when I log in via SSH and run the command manually, it works fine: it outputs a 224K PDF. But when I use exec(), only the first 36K of the file comes out. (I checked: the first 36K of the good file is identical to the bad file.)
Now here's the strange thing: this was working fine with exec() until I added some more variables to the FDF file, making it longer. I thought it was a problem with the FDF because of the new data, but then why would the same command run fine from SSH?
Update: I also tried running php -f test.php (which had just the one exec() line in it). That produced the entire file properly. But if I go to http://mydomain.com/test.php, I only get part of the file.
The script is not timing out, because I make it echo something after the exec() call and that output comes through fine.
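For reference, test.php is essentially just this (xx.pdf / xx.fdf are placeholders for the real file names):

    <?php
    // test.php - minimal reproduction: one exec() call, then an echo
    exec("pdftk xx.pdf fill_form xx.fdf output xx.pdf flatten");
    echo "done"; // this prints in the browser, so the script runs to completion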
It can't be a permissions issue (SSH logs in as root), because the web process is still able to write the file.
Also, when I try to get a return or exit value from exec() or passthru(), I get nothing; the return value is always 0.
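This is roughly what I've tried (the 2>&1 is just to pull stderr into the captured output, in case pdftk is complaining there):

    $output = array();
    $ret = -1;
    // the third argument receives the command's exit status
    exec("pdftk xx.pdf fill_form xx.fdf output xx.pdf flatten 2>&1", $output, $ret);
    var_dump($ret);   // always 0
    print_r($output); // nothing useful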
Update: in the Apache error logs, I am getting:
[Fri Sep 17 20:00:57 2010] [error] Unhandled Java Exception:
[Fri Sep 17 20:00:57 2010] [error] java.lang.OutOfMemoryError
[Fri Sep 17 20:00:57 2010] [error] <>
I changed memory_limit in php.ini from 32M to 64M and I still get it. Considering these are all tiny files, I don't think that's it. But would PHP even be able to limit the memory of a child process like that? Is there another setting for that somewhere?
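In case the web server itself is imposing a lower limit (I'm guessing at something like Apache's RLimitMEM directive, or a ulimit set in its startup script), this sketch would dump the limits that the exec()'d process actually inherits, to compare against running ulimit -a from the SSH shell:

    // ulimit is a shell builtin, and exec() runs commands through /bin/sh,
    // so this shows the limits pdftk inherits when launched from PHP
    exec('ulimit -a 2>&1', $limits);
    echo '<pre>' . htmlspecialchars(implode("\n", $limits)) . '</pre>';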
Help!