views: 2108
answers: 6

I'm opening lots of files with fopen() in VC++ but after a while it fails.

Is there a limit to the number of files you can open simultaneously?

+3  A: 

Yes, there are limits, depending on the access level you use when opening the files. You can use _getmaxstdio to find the current limit and _setmaxstdio to change it.
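For reference, a minimal sketch of querying and (attempting to) raise the CRT limit. This is MSVC-only, so it is wrapped in an #ifdef guard; the wrapper name and the fallback return value are just choices for this sketch, not part of any API:

```c
#include <stdio.h>

/* Query and (attempt to) raise the MSVC CRT stream limit.
   Returns the limit in effect afterwards, or 0 when built
   with a non-Microsoft compiler (where this API does not exist). */
int show_stdio_limits(int wanted)
{
#ifdef _MSC_VER
    printf("current limit: %d\n", _getmaxstdio());

    /* _setmaxstdio returns -1 on failure, so always check the
       result -- the CRT refuses values above its hard cap. */
    if (_setmaxstdio(wanted) == -1)
        printf("could not raise the limit to %d\n", wanted);

    return _getmaxstdio();
#else
    (void)wanted;   /* MSVC-only API; nothing to do on this platform */
    return 0;
#endif
}
```

Note that _setmaxstdio raises the limit on buffered FILE* streams; the low-level _open/_close descriptor limit is separate.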

Malcolm Post
True, but there are also limits in the OS.
Paulo Santos
+1  A: 

Yes, there is a limit.

The limit depends on the OS and the available memory.

In old DOS, the limit was 255 simultaneously open files.

In Windows XP, the limit is higher (I believe it's 2,048 as stated by MSDN).

Paulo Santos
+7  A: 

By default, the C run-time libraries allow at most 512 files to be open at any one time. Attempting to open more than the maximum number of file descriptors or file streams causes program failure. Use _setmaxstdio to change this number. More information about this can be read here

Also, you may need to check whether your version of Windows supports the upper limit you are trying to set with _setmaxstdio. For more information on _setmaxstdio check here

stack programmer
Interesting. Does this limit apply to the executable? Thread? Something else?
Les
Also: It's not possible to _setmaxstdio beyond 2048 open files, at least with the current Windows CRT. If you need more open files than that, you will have to use CreateFile (http://msdn.microsoft.com/en-us/library/aa363858.aspx) and related Win32 functions. However, a design which requires that many open files is probably wrong...
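To make the escape hatch concrete, here is a hedged sketch of opening a file for reading with CreateFile instead of fopen; handles returned by CreateFile are limited by kernel resources rather than the CRT's cap. The wrapper name is just for illustration, and the non-Windows stub exists only so the sketch compiles elsewhere:

```c
#include <stddef.h>

#ifdef _WIN32
#include <windows.h>

/* Open an existing file for shared reading via the Win32 API.
   Callers must compare the result against INVALID_HANDLE_VALUE,
   not NULL, and eventually call CloseHandle on success. */
HANDLE open_for_read(const char *path)
{
    return CreateFileA(path,
                       GENERIC_READ,
                       FILE_SHARE_READ,
                       NULL,                  /* default security  */
                       OPEN_EXISTING,
                       FILE_ATTRIBUTE_NORMAL,
                       NULL);                 /* no template file  */
}
#else
/* CreateFile does not exist off Windows; this stub only keeps the
   sketch compilable on other platforms and always returns NULL. */
typedef void *HANDLE;
HANDLE open_for_read(const char *path) { (void)path; return NULL; }
#endif
```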
ephemient
+2  A: 

I don't know where Paulo got that number from. In Windows NT-based operating systems, the number of file handles opened per process is basically limited by available physical memory; it's certainly in the hundreds of thousands.

Larry Osterman
It's different if you are using the CRT, as stated by the OP.
Joe
+1  A: 

In case anyone else is unclear as to what the limit applies to, I believe that this is a per-process limit and not system-wide.

I just wrote a small test program that opens files until it fails. It reached 2,045 files before failing (2,045 + STDIN + STDOUT + STDERR = 2,048), then I left that running and started another copy.

The second copy showed the same behaviour, meaning I had at least 4096 files open at once.
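A portable version of that experiment can be sketched with tmpfile(), which opens anonymous temporary streams until the library refuses. The function name and the 100,000 cap are arbitrary choices for this sketch; on Windows the count reflects the CRT limit discussed above, while on POSIX systems it is bounded by the process file-descriptor limit instead:

```c
#include <stdio.h>

#define MAX_TRY 100000

/* Open temporary streams until the library refuses, and return
   how many could be held open at once. All streams are closed
   before returning so later fopen calls still work. */
int count_open_streams(void)
{
    static FILE *streams[MAX_TRY];
    int n = 0;

    while (n < MAX_TRY) {
        FILE *f = tmpfile();   /* anonymous temp file, auto-deleted */
        if (f == NULL)
            break;             /* per-process limit reached */
        streams[n++] = f;
    }

    for (int i = 0; i < n; i++)
        fclose(streams[i]);

    return n;
}
```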

Drarok
A: 

It's not necessarily bad design to open a large number of files. There are very good reasons for doing so, and when you encounter one of them you'll understand.

I'm interested in the .NET architecture's limits. Anyone done any experiments or have any information in this area?

Queue