(This was meant to be a general hypothetical question, me whining that .NET was a pig and begging for reasons. It was not really meant to be a question about my specific app.)
Currently I am rewriting some old C++ code in C#; we are porting over all of our legacy applications. I have C++ applications that take 3% CPU at most, and usually close to none. I then take the code, copy and paste it, reformat it to C# syntax and .NET libraries, and BAM! 50% CPU. What's the reason for this? At first I thought it was the JIT, but even after every code path has been exercised and the whole thing has been JITted, the issue remains.
I have also noticed huge memory increases. Apps that took 9 MB under a full load now start at 10 MB and run at 50 MB. I realize hardware is cheap, but I want to understand what causes this. Is it a cause for alarm, or is .NET just that much of a pig?
Update 1 (answer to Skeet)
I am familiar with C#. I convert things to LINQ and so on, and I typically reduce the number of lines in the process. Could you give some more examples of what a C++ person might do wrong in .NET?
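For illustration, here is a made-up example of the sort of conversion I mean; none of this is my real code, just a sketch of a literal C++-style port next to the LINQ form I usually end up with:

```csharp
using System.Collections.Generic;
using System.Linq;

static class PortExample
{
    // Literal port of a C++ loop: explicit iteration into a result list.
    public static List<int> EvensLoop(List<int> values)
    {
        var result = new List<int>();
        foreach (var v in values)
        {
            if (v % 2 == 0)
            {
                result.Add(v);
            }
        }
        return result;
    }

    // The shorter LINQ form I typically rewrite it to.
    public static List<int> EvensLinq(List<int> values)
    {
        return values.Where(v => v % 2 == 0).ToList();
    }
}
```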
Update 2
This was meant to be a general question, but the specific app that has the issue is as follows.
It has a thread that uses an ODBC driver to get data from a Paradox DB, then uses LINQ to transform the data and post it to a SQL database. I have run it through the ANTS profiler, and it seems the DataSet filling takes the most time, followed by the LINQ posting. I know reflection usage is one of my hot spots, but I don't see how to do what I need without it. I plan to change my string concatenation to StringBuilders. Also, is there any difference between these two?
`(int)datarow["Index"]`

and

`Convert.ToInt32(datarow["Index"])`
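For context, those calls sit in a loop roughly like this; the table layout is simplified, and only the "Index" column name is real:

```csharp
using System;
using System.Data;

static class RowReadExample
{
    public static void ReadRows(DataTable table)
    {
        foreach (DataRow row in table.Rows)
        {
            // The cast unboxes the stored object directly; it throws an
            // InvalidCastException if the column is actually stored as a
            // different numeric type (short, decimal, ...).
            int viaCast = (int)row["Index"];

            // Convert.ToInt32 converts whatever IConvertible type is
            // stored, at the cost of an extra method call per row.
            int viaConvert = Convert.ToInt32(row["Index"]);
        }
    }
}
```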
I changed all string concatenation to format strings; that didn't reduce the overhead. Does anyone know the difference between a DataReader and a DataAdapter with DataSets?
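For reference, this is roughly what the two approaches look like side by side. The connection string and query are placeholders, not my real ones, and the DataReader version is just my understanding of the streaming alternative:

```csharp
using System.Data;
using System.Data.Odbc;

static class ReadStrategies
{
    // Placeholders: not my real connection string or query.
    const string ConnStr = "Driver={Microsoft Paradox Driver (*.db )};DBQ=C:\\data";
    const string Query = "SELECT ID, Name FROM SomeTable";

    // What I have now: the adapter buffers the whole result set into a DataSet.
    public static DataSet FillDataSet()
    {
        using (var conn = new OdbcConnection(ConnStr))
        using (var adapter = new OdbcDataAdapter(Query, conn))
        {
            var ds = new DataSet();
            adapter.Fill(ds); // opens the connection, pulls every row into memory
            return ds;
        }
    }

    // The alternative: a forward-only, read-only DataReader that streams
    // one row at a time instead of materializing everything up front.
    public static void StreamRows()
    {
        using (var conn = new OdbcConnection(ConnStr))
        using (var cmd = new OdbcCommand(Query, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);
                    string name = reader.GetString(1);
                    // ... transform and post each row as it arrives ...
                }
            }
        }
    }
}
```

If the profiler is right that the Fill is the hot spot, the streaming version would at least avoid holding the whole Paradox table in memory at once.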