We have a "print engine" that picks predefined tasks to run against a file; these "tasks" are .NET 2.0-3.5 command-line applications written in C#.
The engine simply runs the tasks one after another. The applications we didn't develop in-house run quite quickly, typically 10-30 ms.
Our .NET applications, however, typically take 1-3 seconds each, and when several executables run per file, across several files, all executed synchronously, we end up with really long wait times.
Typically these applications do a small amount of database work and some extremely basic file modification (plain-text manipulation). We even stripped some applications down to almost nothing to see whether the overhead of the .NET Framework itself was slowing everything down, and everyone who has looked into it has concluded that
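(For reference, the stripped-down test case looked roughly like the following; this is a sketch, not our actual code. The point is that the app does no real work, so any remaining latency is process creation plus CLR startup plus JIT of `Main`.)

```csharp
// Minimal "do nothing" console app used to isolate CLR startup cost.
// Compile with: csc /optimize+ Noop.cs
class Noop
{
    static int Main(string[] args)
    {
        // No database work, no file I/O -- any measured launch time
        // is process creation + CLR load + JIT, not our logic.
        return 0;
    }
}
```

Even this near-empty app showed the same 1-3 second launch time in our environment, which is what led people to blame the framework itself.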
".NET is just slow and it's not made to perform in this manner."
I was wondering whether this is true, and what techniques I can use to either track the problem down or alleviate the lag. I've tried using profilers, but so far I haven't found one that will repeatedly execute a command-line .NET application, which is our usage pattern. Most just want to launch an executable once and attach to it for profiling.
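In the absence of such a profiler, the closest I've come is a crude harness that repeatedly launches the executable and records wall-clock time per run (a sketch; `TaskApp.exe` is a placeholder for one of our task executables):

```csharp
using System;
using System.Diagnostics;

class LaunchTimer
{
    static void Main()
    {
        // "TaskApp.exe" is a placeholder for the task app under test.
        ProcessStartInfo psi = new ProcessStartInfo("TaskApp.exe");
        psi.UseShellExecute = false;
        psi.CreateNoWindow = true;

        for (int i = 0; i < 10; i++)
        {
            Stopwatch sw = Stopwatch.StartNew();
            using (Process p = Process.Start(psi))
            {
                p.WaitForExit();
            }
            sw.Stop();
            // Wall-clock time from launch to exit, including CLR startup.
            Console.WriteLine("Run {0}: {1} ms", i + 1, sw.ElapsedMilliseconds);
        }
    }
}
```

This at least gives repeatable end-to-end numbers, but it can't tell me where inside the process the time goes.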
Ideally, we would like to drop the print engine entirely and develop our own, more efficient engine, but that's just not going to happen.