So I've finally figured out how to get my R scripts to run on the Amazon EC2 cloud. I've been using an AMI with 26 ECUs, 8 cores, and 69 GB of RAM.
I then divide up my code into multiple scripts and run each one in a separate instance of R. With a server of this size, I can easily run 20-40 scripts simultaneously, each running several thousand simulations.
What I would like to know is whether R is taking advantage of all this computing power natively. Should I install packages that specifically tell R to use the extra memory and multiple CPUs? I've seen this page, and some of the packages (at least from their descriptions) seem promising, but I am unable to figure out how to incorporate them into my code. Could anyone shed more light on this?
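
For context, here is a minimal sketch of what I imagine the parallel approach might look like using the base parallel package; run_one_simulation() is just a hypothetical stand-in for my real simulation code, and I'm not sure this is the right way to do it:

    library(parallel)

    # Hypothetical stand-in for a single simulation run; my real code is more involved
    run_one_simulation <- function(i) {
      x <- rnorm(1e4)        # simulate some data
      mean(x)                # return a summary statistic
    }

    n_sims  <- 1000
    n_cores <- detectCores() # should report 8 on this instance

    # Fork one worker per core and spread the simulations across them
    # (mclapply uses forking, so it works on a Linux AMI but not on Windows)
    results <- mclapply(seq_len(n_sims), run_one_simulation, mc.cores = n_cores)

    # Collapse the list of results into a plain vector
    results <- unlist(results)

Is something along these lines what those packages are meant for, or is my current approach of just launching many separate R processes effectively equivalent?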