tags:
views: 53
answers: 2

I'm wondering if anyone else has ever encountered this problem. I'm writing a fairly small amount of data to a CSV file: about 30 lines, 50 times.

I'm using a for loop to write data to the file.

It seems "finicky": sometimes the operation completes successfully, and other times it stops after the first ten writes (300 lines), or after 3, or 5... with the error

"cannot open connection".

I imagine it is some type of timeout. Is there a way to tell R to "slow down" when writing tables?

Before you ask: there's just too much code to provide an example here.

+2  A: 

Code would help, despite your objections. R has a fixed-size connection pool and I suspect you are running out of connections.

So make sure you follow the three-step pattern of:

  1. open the connection (and check for errors as a bonus)
  2. write using the connection
  3. close the connection
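A minimal, self-contained sketch of those three steps (writing to a temporary file rather than the asker's real target):

```r
# 1. open the connection, checking for errors
path <- tempfile(fileext = ".csv")
con <- tryCatch(file(path, open = "at"),
                error = function(e) stop("cannot open connection: ",
                                         conditionMessage(e)))

# 2. write using the connection
write.table(data.frame(x = 1:3), file = con,
            sep = ",", col.names = FALSE, row.names = FALSE)

# 3. close the connection so it is returned to the pool
close(con)
```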
Dirk Eddelbuettel
How do you manually close the connection to a table?
Brandon Bertelsen
See `help(connection)` -- a file is just one form of a connection. The limits still apply.
Dirk Eddelbuettel
+1  A: 

I can't reproduce it with R 2.11.1 32-bit on Windows 7 64-bit. For this kind of thing, please provide more info on your system (see e.g. ?R.version, ?Sys.info).

Memory is a lot faster than disk access. 1500 lines are easily manageable in memory and can be written to file in one go. If they are different sets of data, add an extra factor variable indicating the set (set1 to set50). All your data is then easily manageable in one dataframe, and you avoid having to access the disk many times.
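A sketch of that approach, with `rnorm(30)` standing in for whatever each of the 50 iterations actually produces:

```r
sets <- vector("list", 50)
for (i in 1:50) {
  myData <- data.frame(value = rnorm(30))  # stand-in for the real data
  myData$set <- paste0("set", i)           # factor marking which run it came from
  sets[[i]] <- myData
}
all_data <- do.call(rbind, sets)           # one dataframe, 1500 rows

# a single disk access instead of 50
write.table(all_data, file = tempfile(fileext = ".csv"),
            sep = ",", row.names = FALSE)
```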

In case you really do need 50 separate writes, this code illustrates Dirk's valuable advice:

for(i in 1:50){
    ...
    # open the connection in appending text mode
    ff <- file("C:/Mydir/Myfile.txt", open="at")
    # write using the connection
    write.table(myData, file=ff)
    # close the connection so it is released
    close(ff)
}

See also the help: ?file

EDIT: you should use open="at" instead of open="wt". "at" is appending mode; "wt" is writing mode, which overwrites the file each time. append=TRUE is the same as open="at".
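That equivalence can be checked directly; here both approaches append the same small data frame three times, to temporary files:

```r
f1 <- tempfile()
f2 <- tempfile()
df <- data.frame(x = 1:2)

for (i in 1:3) {
  # approach 1: let write.table append by file name
  write.table(df, file = f1, append = TRUE, col.names = FALSE)

  # approach 2: open a connection in appending text mode
  ff <- file(f2, open = "at")
  write.table(df, file = ff, col.names = FALSE)
  close(ff)
}

identical(readLines(f1), readLines(f2))
```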

Joris Meys
+1 and acceptance for the example!
Brandon Bertelsen