I have a bit of code like so, which works fine if the file in question doesn't already exist.

# if the object already exists, delete it before re-uploading (workaround attempt)
if AWS::S3::S3Object.exists? file_name, bucket.name + path_to_images
  puts "file exists (deleting)"
  AWS::S3::S3Object.delete file_name, bucket.name + path_to_images, :force => true
end

# upload the image and make it publicly readable
AWS::S3::S3Object.store file_name, 
   File.read(file_path), 
   bucket.name + path_to_images, 
   :content_type => 'image/png',
   :access => :public_read

# remove the local copy once it has been uploaded
`rm #{file_path}`

The problem I'm having is that if the file does exist, I want to overwrite it with a new copy. I'm not sure whether the overwrite itself is the problem, so I tried deleting the file first if it already exists, but that didn't seem to work either. So I assume it's either not actually deleting it, or the old copy is cached somewhere.

When displaying the image, the URL of course ends with ?123232, a random number to bust the cache. I even tried clearing the browser's cache just for kicks.

I'm sure there's something easy I'm missing, and probably a more succinct way to do this anyway.

Thanks

UPDATE: I think the problem must have something to do with CloudFront or regular S3 caching, because eventually it does update... but only after a day or so. And it is not my browser caching it, so it's probably this. Anyone know how to tell it to dump the cache?
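The closest thing I've found so far is CloudFront's invalidation API, though I haven't tried it yet. A rough sketch using the newer aws-sdk gem (not the aws-s3 gem above); the distribution ID is a placeholder and the option names are my best reading of the SDK docs, so double-check them:

require 'aws-sdk'

cf = AWS::CloudFront.new.client
cf.create_invalidation(
  :distribution_id    => 'YOUR_DISTRIBUTION_ID',   # placeholder: your CloudFront distribution ID
  :invalidation_batch => {
    :paths            => { :quantity => 1,
                           :items    => ["/#{path_to_images}#{file_name}"] },
    :caller_reference => Time.now.to_i.to_s        # must be unique per invalidation request
  }
)

Otherwise the object should just fall out of CloudFront's cache once its TTL expires, which might explain the day-long delay (I believe the default TTL is 24 hours).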

A: 

What about

File.open(file_path)

instead of File.read? This should do the trick :)
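In other words, keep the rest of your call the same and just hand store an open IO object rather than the file's contents as a string:

AWS::S3::S3Object.store file_name, 
   File.open(file_path), 
   bucket.name + path_to_images, 
   :content_type => 'image/png',
   :access => :public_read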

Petr

praethorian
That didn't seem to do anything. ;-(
holden
A: 

I've never experienced any issues with S3 caching or anything like that. More than likely, I'd still consider this a local cache issue, as the caching could be taking place in multiple places. To confirm, I'd recommend viewing the actual files in your Amazon bucket and seeing what you get after an update (I think Amazon now has this built into their S3 account site; otherwise I've had good luck with CloudBerry, http://cloudberrylab.com/). Alternatively, you could just try accessing the file from a different computer after you've updated it.
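If you'd rather check from code than install a client, I believe the same aws-s3 gem can show you what S3 is actually storing; something along these lines (untested, and I'm assuming the metadata keys are the lowercase header names):

about = AWS::S3::S3Object.about(file_name, bucket.name + path_to_images)
puts about['etag']           # should change if the new bytes were actually stored
puts about['last-modified']  # should roughly match the time of the latest upload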

Ryan