I have a multithreaded application running on Windows XP. At a certain stage one of the threads fails to open an existing file using the fopen function. _get_errno returns EMFILE, which means "Too many open files. No more file descriptors are available." FOPEN_MAX for my platform is 20; _getmaxstdio returns 512. I checked this with WinDbg and I see that about 100 files are open:

788 Handles
Type            Count
Event           201
Section         12
File            101
Port            3
Directory       3
Mutant          32
WindowStation   2
Semaphore       351
Key             12
Thread          63
Desktop         1
IoCompletion    6
KeyedEvent      1

What is the reason that fopen fails?


EDIT:

I wrote a simple single-threaded test application. This app can open 510 files. I don't understand why it can open more files than the multithreaded app. Could it be because of file handle leaks?

#include <cstdio>
#include <cassert>
#include <cerrno>

int main()
{
    int counter(0);

    while (true)
    {
        char buffer[256] = {0};
        sprintf(buffer, "C:\\temp\\abc\\abc%d.txt", counter++);
        FILE* hFile = fopen(buffer, "wb+");
        if (0 == hFile)
        {
            // fopen failed: check the error code and the CRT stream limit
            int err(0);
            errno_t ret = _get_errno(&err);
            assert(0 == ret);
            int maxAllowed = _getmaxstdio();
            (void)maxAllowed;
            assert(hFile); // stop here; counter holds the number of files opened
        }
    }
    return 0;
}
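
If the real cause is a FILE* leak in the multithreaded app, one way to rule it out would be to funnel every fopen/fclose through a small RAII guard so the stream is always closed. A rough sketch (FileGuard is just an illustrative name, not something from my real code):

#include <cstdio>

// Minimal RAII wrapper around FILE*: the stream is closed when the guard
// goes out of scope, even on early returns, so it cannot leak.
class FileGuard
{
public:
    FileGuard(const char* path, const char* mode)
        : file_(fopen(path, mode))
    {
    }

    ~FileGuard()
    {
        if (file_)
            fclose(file_);
    }

    FILE* get() const { return file_; }

private:
    // Non-copyable: copying would cause a double fclose.
    FileGuard(const FileGuard&);
    FileGuard& operator=(const FileGuard&);

    FILE* file_;
};

int main()
{
    // The file is closed automatically when guard goes out of scope.
    FileGuard guard("C:\\temp\\abc\\abc0.txt", "rb");
    if (!guard.get())
        return 1; // fopen failed

    char byte = 0;
    fread(&byte, 1, 1, guard.get());
    return 0;
}
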
+4  A: 

I guess this is a limitation of your operating system. It can depend on many things: the way the file descriptors are represented, the memory they consume, and so on.

And I suppose there isn't much you can do about it. Perhaps there is some parameter to tweak that limit.

The real question is, do you really need to open that many files simultaneously? I mean, even if you have 100+ threads trying to read 100+ different files, they probably won't be able to read them all at the same time, and you'll probably not get any better result than having, as an example, 50 threads.

It's difficult to be more accurate since we don't know what you're trying to achieve.
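
Just to illustrate the idea (a rough sketch, not tailored to your application; the cap of 8, and the openCapped/closeCapped names, are arbitrary), the number of files open at any moment could be limited with a counting semaphore:

#include <windows.h>
#include <cstdio>

// Arbitrary cap on how many files the process keeps open at once.
static const LONG kMaxOpenFiles = 8;
static HANDLE g_openSlots = NULL;

FILE* openCapped(const char* path, const char* mode)
{
    // Block until one of the kMaxOpenFiles slots is free, then open.
    WaitForSingleObject(g_openSlots, INFINITE);
    FILE* f = fopen(path, mode);
    if (!f)
        ReleaseSemaphore(g_openSlots, 1, NULL); // open failed, give the slot back
    return f;
}

void closeCapped(FILE* f)
{
    fclose(f);
    ReleaseSemaphore(g_openSlots, 1, NULL); // the slot is free again
}

int main()
{
    // Counting semaphore: each open file consumes one slot.
    g_openSlots = CreateSemaphore(NULL, kMaxOpenFiles, kMaxOpenFiles, NULL);

    FILE* f = openCapped("C:\\temp\\abc\\abc0.txt", "rb");
    if (f)
        closeCapped(f);

    CloseHandle(g_openSlots);
    return 0;
}
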

ereOn
Before I start fine-tuning the system I want to understand what causes the problem and why the maximum number of open files is not constant (it varies between 95 and 103). What other factors influence this, e.g. event, semaphore or directory handles?
tommyk
@tommyk: This is only a guess, I have no deep knowledge of your operating system (mostly because you didn't say whether you were on Windows, Linux or something else :D). I assume that under some systems, file descriptors are global and thus the count of available descriptors (sockets, files, mutexes, and so on) is limited by other processes and the OS itself.
ereOn
My platform is Windows XP (32-bit).
tommyk
+1  A: 

I think on Win32 all the CRT functions ultimately end up calling the Win32 API underneath, so in this case fopen is most probably using CreateFile/OpenFile. Now, the CreateFile/OpenFile API is not meant only for files (it also opens directories, communication ports, pipes, mailslots, drive volumes, etc.), so in a real application your maximum number of open files may vary depending on how many of these resources are in use. Since you have not described much about the application, this is my first guess. If time permits, go through this: http://blogs.technet.com/b/markrussinovich/archive/2009/09/29/3283844.aspx
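
To see whether the limit you are hitting lives in the CRT layer (the FILE streams) or in Win32 itself, you could repeat your test with CreateFile directly. A rough sketch, reusing the path pattern from your test program, with an arbitrary cap of 2048 attempts so it does not run away:

#include <windows.h>
#include <cstdio>

int main()
{
    // Same path pattern as the CRT test, but bypassing the CRT entirely.
    // The attempts are capped because the Win32 handle limit is far higher
    // than the CRT's 512-stream limit.
    const int kAttempts = 2048;
    int opened = 0;

    for (int i = 0; i < kAttempts; ++i)
    {
        char buffer[256] = {0};
        sprintf(buffer, "C:\\temp\\abc\\abc%d.txt", i);

        HANDLE h = CreateFileA(buffer, GENERIC_READ | GENERIC_WRITE,
                               0, NULL, CREATE_ALWAYS,
                               FILE_ATTRIBUTE_NORMAL, NULL);
        if (h == INVALID_HANDLE_VALUE)
        {
            printf("CreateFile failed after %d files, error %lu\n",
                   opened, GetLastError());
            return 1;
        }
        ++opened; // handle intentionally kept open so handles accumulate
    }

    printf("Opened %d files via CreateFile without hitting a limit\n", opened);
    return 0;
}
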

ferosekhanj