Our server application listens on a port, and after a period of time it stops accepting incoming connections. (While I'd love to solve that issue too, it's not what I'm asking about here. ;)
The strange thing is that when our app stops accepting connections on port 44044, so does IIS (on port 8080). Killing our app fixes everything: IIS starts responding again.
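For what it's worth, here's the kind of check I could run on the box when the hang occurs (just a sketch, not our production code) to see whether both ports still show up in the LISTEN state even while connections time out, using .NET 2.0's IPGlobalProperties:

    using System;
    using System.Net;
    using System.Net.NetworkInformation;

    class ListenerCheck
    {
        static void Main()
        {
            // The two ports from the symptom description: our app and IIS.
            int[] portsOfInterest = { 44044, 8080 };

            // Snapshot of every local TCP endpoint currently in the LISTEN state.
            IPEndPoint[] listeners =
                IPGlobalProperties.GetIPGlobalProperties().GetActiveTcpListeners();

            foreach (int port in portsOfInterest)
            {
                bool listening = false;
                foreach (IPEndPoint ep in listeners)
                {
                    if (ep.Port == port)
                    {
                        listening = true;
                        break;
                    }
                }
                Console.WriteLine("Port " + port + ": " +
                    (listening ? "still listening" : "no listener found"));
            }
        }
    }

If both ports were still listed as listening while clients time out, that would suggest incoming connections are never being completed (e.g. a full backlog) rather than the listeners having been torn down.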
So the question is, can an application mess up the entire TCP/IP stack? Or perhaps, how can an application do that?
Possibly useless detail: our app is written in C# on .NET 2.0, running on Windows XP SP2.
Clarification: IIS is not "refusing" the attempted connections; it never sees them at all. Clients get a "server did not respond in a timely manner" error (using the .NET TcpClient).
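For reference, the clients connect with code along these lines (a minimal sketch, not our actual client; the host name is a placeholder), and the message above comes back as a SocketException:

    using System;
    using System.Net.Sockets;

    class ConnectProbe
    {
        static void Main()
        {
            try
            {
                // Connect blocks until the TCP handshake completes or the
                // OS-level connect timeout expires.
                using (TcpClient client = new TcpClient())
                {
                    client.Connect("ourserver", 44044);  // placeholder host name
                    Console.WriteLine("Connected.");
                }
            }
            catch (SocketException ex)
            {
                // When the SYN goes unanswered, this surfaces as the
                // "did not properly respond after a period of time" timeout,
                // not a "target machine actively refused it" error.
                Console.WriteLine("Connect failed: " + ex.Message);
            }
        }
    }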