OK, I know this question has been asked before. But here's the thing:

  1. I'm already using ini_set('memory_limit', '400M');
  2. The file I'm trying to transfer (to Amazon S3) is 245MB.
  3. The error message is weird: it says the allowed memory of 400MB was exhausted when it tried to allocate 239MB. Isn't that the other way around?

The script I'm using is a third-party library for communicating with Amazon S3.

Help please!

EDIT
OK, here's the code. As you can see, I'm not doing much; it's all in the script I'm using, which is here: http://belgo.org/backup_and_restore_to_amazo.html

ini_set('memory_limit', '400M');
require 'lib/s3backup.php';
$bucket = 'thebucketname';
$bucket_dir = 'apts';
$local_dir = "/home/apartmen/public_html/transfer/t/tr";
$s3_backup = new S3_Backup;
$s3_backup->upload_dir( $bucket, $bucket_dir, $local_dir );
+4  A: 

"Allowed memory of 400MB exhausted when trying to allocate 239MB" means that PHP was trying to allocate an *additional* 239MB of memory which, when added to the memory the script had already allocated, pushed the total over the 400MB limit.
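To make the point above concrete: the memory_limit caps the script's *total* usage, while the fatal error reports only the size of the single allocation that failed. A small sketch (deliberately using tiny numbers so it actually runs) shows total usage climbing with each allocation:

```php
<?php
// Sketch: total usage, not any single allocation, is what
// memory_limit caps. Here we hold ~10MB and watch usage grow;
// in the question, ~160MB+ was already held when the script
// requested another 239MB, exceeding the 400M cap.
$before = memory_get_usage();
$buf = str_repeat('x', 10 * 1024 * 1024); // hold ~10MB in memory
$after = memory_get_usage();
printf("usage grew by ~%d MB\n", ($after - $before) / (1024 * 1024));
```

So a second large buffer can trip the limit even though the buffer on its own is smaller than the limit.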

Mark Baker
OK, now I made it 700M but got an "Internal Server Error". Just checked, and no, I didn't crash the server :P The file is only 245M, so why does it need all that memory?
Torrrd
I'd guess it needs that much memory because it's loading the entire file into memory rather than reading it in chunks, or using a pull parser if it's an XML file.
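For reference, a minimal sketch of the chunked approach mentioned above: copying a file 1MB at a time keeps peak memory near the chunk size rather than the file size. The function name and chunk size here are illustrative, not part of the S3 library in question.

```php
<?php
// Copy a file in fixed-size chunks so peak memory stays near
// the chunk size (1MB here), not the full file size.
function copy_in_chunks($src, $dst, $chunkSize = 1048576)
{
    $in  = fopen($src, 'rb');
    $out = fopen($dst, 'wb');
    while (!feof($in)) {
        fwrite($out, fread($in, $chunkSize));
    }
    fclose($in);
    fclose($out);
}
```

A library that streams uploads this way would never need the 245MB file resident in memory at once.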
Mark Baker
@Tor: If you got an "Internal Server Error", you might want to check your logs to see why...
ircmaxell
A: 

The AWS SDK for PHP has an AmazonS3 class that can stream a local file up to S3.

http://docs.amazonwebservices.com/AWSSDKforPHP/latest/#m=AmazonS3/create_object

The parameter you need to pay attention to is "fileUpload".
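A hedged sketch of what that might look like, based on the linked create_object documentation for the (older) AWS SDK for PHP; the bucket name, key, and file path below are placeholders, and the exact option set should be checked against the docs:

```php
<?php
// Sketch, assuming the AWS SDK for PHP 1.x API from the linked docs.
// 'fileUpload' tells the SDK to stream the file from disk instead of
// requiring its contents to be loaded into memory first.
require_once 'sdk.class.php';

$s3 = new AmazonS3();
$response = $s3->create_object('my-bucket', 'backups/big-file.tar.gz', array(
    'fileUpload' => '/path/to/big-file.tar.gz', // placeholder path
));
```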

Ryan Parman