views: 120
answers: 2

Excuse my ignorance: I'm not a computer engineer; my background is in biology. I have become a great fan of pre-allocating objects (kudos to SO and The R Inferno by Patrick Burns) and would like to improve my coding habits. With that in mind, I've been thinking about writing more efficient functions and have the following question.

Is there any benefit to removing variables that will be overwritten at the start of the next loop iteration anyway, or is this just a waste of time? For the sake of argument, let's assume that the old and new variables are very similar or identical in size.
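
To make the question concrete, here is a minimal sketch of the kind of loop I mean (the object names and sizes are placeholders, not my real code):

```r
# Illustrative only: `tmp` is roughly the same size on every iteration
# and is overwritten at the top of the loop body.
results <- numeric(100)    # pre-allocated output

for (i in 1:100) {
  tmp <- rnorm(1e5)        # temporary object, overwritten each iteration
  results[i] <- mean(tmp)
  # rm(tmp)                # <- the explicit removal I'm asking about
}
```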

+2  A: 

I think it will really depend on the specifics of the case. In some circumstances, when the object is large, it might be a good idea to rm() it, especially if it is no longer needed and there is a lot of other work to do before it gets overwritten. But then again, it's not hard to conceive of circumstances where that strategy would itself be costly in computation time.

The only way to know whether it would really be worthwhile is to try both ways and check with system.time().
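
For example, a rough sketch of such a comparison (the object size and iteration count here are arbitrary and should be matched to the real workload):

```r
# Time the loop with and without an explicit rm() of the temporary object.
with_rm <- system.time({
  for (i in 1:200) {
    x <- rnorm(1e6)
    s <- sum(x)
    rm(x)
  }
})

without_rm <- system.time({
  for (i in 1:200) {
    x <- rnorm(1e6)
    s <- sum(x)
  }
})

rbind(with_rm, without_rm)   # compare user/system/elapsed times
```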

wkmor1
+1 for using `system.time()` to check the performance. You can also look at things like `Rprof` or `profr` for profiling the code. See this question for an example: http://stackoverflow.com/questions/2476946/creating-large-xml-trees-in-r.
Shane
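
For reference, a minimal sketch of base-R profiling with `Rprof()` (the output file name and the toy loop are made up):

```r
Rprof("loop-profile.out")          # start writing profiling samples to a file
for (i in 1:200) {
  x <- rnorm(1e6)
  s <- sum(x)
}
Rprof(NULL)                        # stop profiling
summaryRprof("loop-profile.out")   # summarise where the time went, by function
```
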
+1  A: 

No. Automatic garbage collection will take care of this just fine.

hadley
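
To illustrate the point, a small sketch (vector sizes are arbitrary) showing that the memory held by an overwritten object is reclaimed without any explicit rm():

```r
x <- rnorm(1e7)
gc()             # "Vcells" usage is high while the large vector is alive
x <- rnorm(10)   # overwrite; the old 10-million-element vector is now unreferenced
gc()             # usage drops back down -- no rm() needed
```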