I am a novice at cloud computing, but I get the concept and am pretty good at following instructions. I'd like to run some simulations on my data, and each step takes several minutes. Given the hierarchy in my data, each full set takes several hours. I'd like to speed this up by running it on Amazon's EC2 cloud.

After reading this, I know how to launch an AMI, connect to it via the shell, and launch R at the command prompt.

What I'd like help with is copying data (.rdata files) and a script to the instance, and simply sourcing the script at the R command prompt. Then, once all the results are written to new .rdata files, I'd like to copy them back to my local machine.
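Concretely, the R side of what I have in mind is just load/source/save. A sketch (all file and object names here are invented) that writes out the batch script the instance would run:

```shell
# Write the R batch script (file names are placeholders; substitute your own):
cat > /tmp/run.R <<'EOF'
load("/mnt/input.rdata")                     # restore the uploaded objects
source("/mnt/simulate.R")                    # run the simulation script
save(results, file = "/mnt/results.rdata")   # persist the new objects
EOF
cat /tmp/run.R
```

Copying the files up and down would then presumably be a matter of `scp`, e.g. `scp -i your-key.pem input.rdata simulate.R ubuntu@<instance-dns>:/mnt/` before the run, and the reverse `scp` for `results.rdata` afterwards (key file, user name, and host are placeholders).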

How do I do this?

+1  A: 

I don't know much about R, but I do similar things with other languages, so what I suggest should at least give you some ideas.

  1. Set up an FTP server on your local machine.
  2. Create a "startup script" that you launch with your instance.
  3. Have the startup script download the R files from your local machine, initialize R and run the calculations, then upload the new files back to your machine.

Startup script:

#!/bin/bash
set -e -x
apt-get update && apt-get install -y curl  # plus any other packages you need
wget -O /mnt/data_old.R ftp://yourlocalmachine:21/r_files
# data_old.R is expected to save its results to /mnt/data_new.rdata
R CMD BATCH /mnt/data_old.R /mnt/data_old.Rout
/usr/bin/curl -T /mnt/data_new.rdata -u user:pass ftp://yourlocalmachine:21/new_r_files

Start the instance with the startup script:

ec2-run-instances --key KEYPAIR --user-data-file my_start_up_script ami-xxxxxx
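One more thing worth doing: once the upload step has finished you're still paying by the hour, so terminate the instance. A sketch with the same EC2 command-line tools (the instance id is a placeholder):

```shell
# List your instances to find the id and state of the one you launched:
ec2-describe-instances
# Terminate it once the results have been uploaded (id is a placeholder):
ec2-terminate-instances i-xxxxxxxx
```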
dropson
A: 

First, I'd use Amazon S3 for storing the files, both on the way up from your local machine and on the way back from the instance. As stated before, you can create startup scripts, or even bundle your own customized AMI with all the needed settings and run your instances from it. So: download the files from a bucket in S3, execute and process them, and finally upload the results back to the same (or a different) bucket in S3. Assuming the data is small (how big can scripts be?), S3's cost and usability make it very effective.
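A sketch of that flow using the s3cmd tool (bucket and file names are invented; s3cmd first needs your AWS keys, via `s3cmd --configure`):

```shell
# From your local machine: push the data and script to a bucket
s3cmd put input.rdata simulate.R s3://my-sim-bucket/
# On the instance: pull them down, run the job, and push the results back
s3cmd get s3://my-sim-bucket/input.rdata /mnt/input.rdata
s3cmd get s3://my-sim-bucket/simulate.R /mnt/simulate.R
R CMD BATCH /mnt/simulate.R
s3cmd put /mnt/results.rdata s3://my-sim-bucket/
```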

Adam