Although loops are usually not ideal in R, they are sometimes necessary.

When writing large loops, doesn't

for (i in 1:large_number)

waste memory, since a vector of length large_number must be created first?

Would this make while loops the best choice for large, necessary loops?
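For concreteness, here is a minimal sketch of the while-loop alternative the question alludes to; large_number and the accumulation are placeholders:

    large_number <- 1e7
    i <- 1                      # only this scalar counter is kept in memory
    total <- 0
    while (i <= large_number) {
      total <- total + i        # placeholder loop body
      i <- i + 1
    }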

+6  A: 

First off, a lot of that 'loops are bad' chatter stems from the dark ages, when loops were in fact implemented less efficiently, in particular in some versions of S-Plus.

That said, and while your comment about the need for a large index object is correct, you could also use one of the following (a short sketch of the first two options appears after the list):

  • functions from the apply family, such as sapply, lapply, or tapply, to unroll your structures

  • the relatively new iterators package, which also avoids the large vector you mentioned as a memory constraint

  • the Ra 'accelerated R' variant and its jit package, which can significantly accelerate simple loops.
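A minimal sketch of the first two options, assuming a placeholder body f(i). icount() comes from the CRAN iterators package; foreach and %do% come from the foreach package, which is not named in the answer but is the usual way to consume such an iterator:

    f <- function(i) i^2                 # placeholder loop body

    ## Option one: sapply handles the loop bookkeeping internally.
    res1 <- sapply(1:10, f)

    ## Option two: the iterator yields indices one at a time, so no
    ## length-large_number index vector is ever materialised.
    library(iterators)
    library(foreach)
    res2 <- foreach(i = icount(10), .combine = c) %do% f(i)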

As an added bonus, options one and two give a path towards parallel execution of the loops on suitable systems, using tools from CRAN packages such as snow, multicore, or NWS, to name just a few.

Dirk Eddelbuettel
gappy
No added insight here, sorry. On the one hand you have the large user base of R and the need for backwards compatibility; on the other, the desire to experiment with new features. I think changes will only be evolutionary, not revolutionary.
Dirk Eddelbuettel