Hi all,
My website allows users to upload photographs, which I store on Amazon S3. I store the original upload as well as an optimized image and a thumbnail. I want to allow users to export all of their original versions when their subscription expires. So I am thinking the following problems arise:
Could be a large volume ...
The following bucket policy is returning a malformed error:
{
  "Version": "2008-10-17",
  "Id": "S3Policy",
  "Statement": [
    {
      "Sid": "1",
      "Effect": "Allow",
      "Principal": {
        "AWS": ["AWSID"]
      },
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::BUCKETNAME/*"
    ]
  }
}
I'm trying to create a policy where all files within BUCKETNA...
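For what it's worth, in the policy as pasted the statement object opened after "Statement": [ is never closed before the array's closing ], which by itself would produce a malformed-policy error. A corrected version, keeping the AWSID and BUCKETNAME placeholders from the question (and noting that the principal normally needs to be an account ARN such as arn:aws:iam::ACCOUNT-ID:root), would look roughly like this:

{
  "Version": "2008-10-17",
  "Id": "S3Policy",
  "Statement": [
    {
      "Sid": "1",
      "Effect": "Allow",
      "Principal": {
        "AWS": ["AWSID"]
      },
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::BUCKETNAME/*"
    }
  ]
}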
I am using the PHP S3.php class to manage files on Amazon S3. I use the copyObject() function to copy files in my S3 bucket. All works great until I hit filenames that need to be urlencoded (I urlencode everything anyway). When a filename ends up with % characters in it, the copyObject() function fails.
For example, the filename...
Hi All,
So I'm having a bit of trouble figuring out why I'm getting a particular error. [NOTE: I've masked my AccessKey and Signature parameters]
The URL below returns valid XML for ONE product.
http://ecs.amazonaws.com/onca/xml?AWSAccessKeyId=[myAccessKey]&IdType=ASIN&ItemId=B002UD52WQ&Operation=ItemLookup&ResponseGroup=Me...
I'm using Texticle to do full-text search on Heroku. It's working great.
I'm now trying to set up nightly db backups to Amazon S3 using this script.
When I try heroku rake backups:backup I first get this error:
/disk1/home/slugs/245176_566b3d9_4845/mnt/.bundle/gems/bundler/gems/texticle-3a96c70a9fa60921197f0027204a23824435b142-ee972f...
Hi,
Just put our new site live, and we're having trouble with one of the SWF files playing.
We're using the Colorbox jQuery plugin throughout the site and it works fine.
However, a movie on http://www.learningassistant.com/qcf (the qcf engine movie) gives a repeated JavaScript 'Access is denied' error every time you close the Colorbox.
Perhaps...
One of my clients has a site which displays media that has been uploaded from a client application.
This application initially used FTP, but we're moving to S3 for various data storage and performance reasons.
What I would like to be able to do is have this client upload a file directly to our central S3 store (à la Dropbox/JungleDisk e...
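One common pattern for this, assuming you don't want to bake AWS credentials into the client application, is to have your own server hand the client a short-lived pre-signed URL that the client then PUTs the file to directly. A rough sketch with Python and boto (the bucket and key names are invented for illustration):

from boto.s3.connection import S3Connection

# Server-side credentials; the client never sees these
conn = S3Connection(AWS_ACCESS_KEY, AWS_SECRET_KEY)

# Pre-signed PUT URL, valid for one hour
upload_url = conn.generate_url(
    expires_in=3600,
    method='PUT',
    bucket='central-media-store',          # hypothetical bucket
    key='uploads/client-123/video.mp4')    # hypothetical key

# Hand upload_url to the client, which does a plain HTTP PUT of the
# file body to that URL and lands the object straight in S3.

S3 also supports browser-based POST uploads with a signed policy document, which is the other usual route for Dropbox-style direct uploads.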
I have an S3 bucket set up as the origin of a CloudFront streaming distribution. There are fairly sizable .flv files in there which I use to hook up JWPlayer using signed URLs.
After about a month of hosting these videos in S3 (and they have been watched by web site visitors several times), I just logged on using CloudBe...
I have static files located on Amazon S3, and am continually having issues with Amazon caching them. When I update/overwrite the static file, I'd love for it to automatically show the newest version rather than waiting...
Any ideas?
...
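If the stale copies are coming from browsers or a CDN in front of the bucket rather than S3 itself, the behaviour is usually driven by the Cache-Control/Expires headers stored with each object, so the common fixes are to upload with a short Cache-Control value or to version the filenames and reference the new name on each update. A sketch of setting the header at upload time with Python and boto (bucket and key names are invented):

from boto.s3.connection import S3Connection
from boto.s3.key import Key

conn = S3Connection(AWS_ACCESS_KEY, AWS_SECRET_KEY)
bucket = conn.get_bucket('my-static-files')     # hypothetical bucket

key = Key(bucket)
key.key = 'css/site.css'
# Ask downstream caches to revalidate after five minutes instead of
# holding the old copy indefinitely.
key.set_contents_from_filename(
    'site.css',
    headers={'Cache-Control': 'max-age=300'},
    policy='public-read')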
I have a bit of code like so which works fine if the file in question doesn't already exist.
if AWS::S3::S3Object.exists? file_name, bucket.name + path_to_images
  puts "file exists (deleting)"
  AWS::S3::S3Object.delete file_name, bucket.name + path_to_images, :force => true
end
AWS::S3::S3Object.store file_name,
  File.read(file_pa...
I'm trying to use S3 as an off-site file location for a database backup. On my home dev machine this works just fine; I just do a dump from MySQL and then
<cffile action = "copy"
source = "#backupPath##filename#"
destination = "s3://myID:myKey@myBucket/#filename#">
and all is good. However, the production server at work is behind...
Does anyone know of any problems serving gzipped HTML pages from Amazon S3? I need to minimize the file size of our HTML files (i.e. serve compressed HTML, CSS and JavaScript files), but am concerned that either:
Amazon S3 does not serve gzipped files correctly to the browser that requests them, or
Some browsers have tr...
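S3 itself won't compress on the fly or negotiate Content-Encoding; it simply returns whatever bytes and headers were stored with the object. The usual workaround is to gzip the files before uploading and store them with a Content-Encoding: gzip header, accepting that the rare client that doesn't send Accept-Encoding: gzip will receive compressed bytes it can't use. A sketch with Python and boto (the bucket name is invented):

import gzip
from StringIO import StringIO
from boto.s3.connection import S3Connection
from boto.s3.key import Key

conn = S3Connection(AWS_ACCESS_KEY, AWS_SECRET_KEY)
bucket = conn.get_bucket('my-site-assets')      # hypothetical bucket

# Compress the page in memory
buf = StringIO()
gz = gzip.GzipFile(fileobj=buf, mode='wb')
gz.write(open('index.html', 'rb').read())
gz.close()

key = Key(bucket)
key.key = 'index.html'
# The object is stored compressed; the Content-Encoding header tells
# browsers to inflate it transparently.
key.set_contents_from_string(
    buf.getvalue(),
    headers={'Content-Type': 'text/html',
             'Content-Encoding': 'gzip'})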
We're planning to set up a server to record videos and save them using Amazon's S3 service.
Subsequently we need to provide a mechanism for end users to view those videos online.
Is it possible to store files on S3 in such a way that they are streamed to users' browsers, similar to how video works on YouTube?
We haven't m...
I am right at the start of trying to write some PHP code, to run on a Linux box on an EC2 server, that will read files from my S3 bucket, zip them, then write the zip file back to the bucket.
I have instantly run into problems with even creating a simple zip archive from some images on the local disk of the EC2 instance. I am using a scri...
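The question is about PHP's zip handling, but for what it's worth the overall round trip (list the keys, download each object, add it to an archive, upload the archive) looks the same in any language. A rough sketch of that shape in Python with boto (bucket name, prefix and paths are invented):

import zipfile
from boto.s3.connection import S3Connection

conn = S3Connection(AWS_ACCESS_KEY, AWS_SECRET_KEY)
bucket = conn.get_bucket('my-bucket')           # hypothetical bucket

zip_path = '/tmp/images.zip'
archive = zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED)
for key in bucket.list(prefix='images/'):
    if key.name.endswith('/'):
        continue                                # skip folder placeholders
    local = '/tmp/' + key.name.split('/')[-1]
    key.get_contents_to_filename(local)         # download the object
    archive.write(local, arcname=key.name)      # add it to the zip
archive.close()

out = bucket.new_key('archives/images.zip')
out.set_contents_from_filename(zip_path)        # write the zip back to S3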
I am looking at the readme and there aren't any instructions on how to install it locally on my Ubuntu machine (out of curiosity, is it different on Mac OS?)
http://github.com/boto/boto/blob/master/README
...
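The README doesn't spell it out, but boto installs like any other Python package, and the steps are the same on Ubuntu and Mac OS. Assuming a checkout of that repository (or an unpacked release tarball), something along these lines usually works:

cd boto
sudo python setup.py install

# or, for a released version, without a checkout:
sudo easy_install boto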
I have a connection that works, as I can list buckets, but I'm having issues when trying to add an object.
conn = S3Connection(awskey, awssecret)
key = Key(mybucket)
key.key = p.sku
key.set_contents_from_filename(fullpathtofile)
I get the error:
AttributeError: 'str' object has no attribute 'connection'
the error is in the file:
/us...
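The usual cause of that particular message is that mybucket is the bucket name as a string rather than a boto Bucket object; Key expects the latter, so when it reaches for bucket.connection it fails with 'str' object has no attribute 'connection'. A sketch of the working pattern, keeping the names from the question (the bucket name string is assumed):

from boto.s3.connection import S3Connection
from boto.s3.key import Key

conn = S3Connection(awskey, awssecret)
bucket = conn.get_bucket('mybucket')    # a Bucket object, not just the name string

key = Key(bucket)
key.key = p.sku
key.set_contents_from_filename(fullpathtofile)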
Hello all,
I don't think I fully understand Amazon Web Services yet, which is why I'm asking this question. I want to know if AWS would be a good host for a CakePHP application, which of course runs on PHP and MySQL.
Would I have to change or add anything to my code if I used a service like EC2? I also noticed that Amazon has its own da...
I'm making a website where files are uploaded through the admin and this will then store them on Amazon S3. I'm using django-storages and boto for this, and it seems to be working just fine.
Thing is, I'm used to using easy_thumbnails (the new sorl.thumbnail) on the template side to create thumbnails on the fly. I prefer this approach,...
Is there a way to create the server side of something similar to S3, iDrive, Dropbox, etc. strictly in PHP? The end goal would be to allow users to map a drive or folder to our servers using an app that already exists for one of those popular services.
In other words, I don't really want to write the client on the OS side, but rather im...
I have a structure on Amazon like this:
(bucket name) MyImages
--- (key) general
---- 1.jpg
---- 2.jpg
I have created the key (general) using the S3 Firefox Organizer tool and set read permission for all. Now, when I upload the images inside this key from a Java program, I want to set the permission of each object...
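The upload in the question is from Java, but the mechanism is the same everywhere: pass a canned ACL such as public-read when each object is put, or set the ACL on the object afterwards; the Java S3 libraries expose equivalent canned-ACL options on their put calls. For illustration, here is the shape of it in Python with boto, using the bucket and key from the question (the filename is invented):

from boto.s3.connection import S3Connection
from boto.s3.key import Key

conn = S3Connection(AWS_ACCESS_KEY, AWS_SECRET_KEY)
bucket = conn.get_bucket('MyImages')

key = Key(bucket)
key.key = 'general/3.jpg'
# policy='public-read' applies the canned ACL as part of the upload,
# so every object goes in world-readable without a second request.
key.set_contents_from_filename('3.jpg', policy='public-read')

# Or fix up an object that is already there:
# bucket.get_key('general/1.jpg').set_acl('public-read')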