Hello

I have a Windows Forms application with MS SQL Server 2005 as the back end. I have written code in the form to call a few stored procedures using SqlConnection and SqlCommand objects, and I properly dispose of everything.

I dispose of the SqlCommand object by calling

oSqlCommand.Dispose()

But I've noticed my application consuming a huge amount of memory. I basically pass large XML files as SqlParameters.

I finally decided to memory-profile it using the Red Gate memory profiler, and I noticed that the System.Data.SqlClient.SqlParameter objects are not disposed.

Any insights on this?

Thanks

NLV

+3  A: 

SqlCommand.Dispose does not dispose its parameters; it only disposes its internal SqlMetaData cache. It's normal, by the way, that the parameters are not disposed automatically, because you could pass in something that should not be disposed after disposing the command. Besides, SqlParameter does not implement Dispose either, because it holds no unmanaged resources.

Tim Mahy
How do I dispose of the SqlParameters then?
NLV
Why would you ever want to do that? Normally you only implement Dispose for unmanaged resources. How do you pass those large XML files, as strings or as XmlReaders?
Tim Mahy
Then I don't know why the SqlParameters aren't being cleaned up properly. I'm not using a 'using' statement; instead, I call Dispose explicitly. Anything wrong with that?
NLV
Yes; when possible, use the using keyword instead of calling Dispose yourself - using creates a nice try{}finally{} code fragment for you behind the scenes. How do you pass those large XML files, as strings or as XmlReaders? If the latter, you will have to dispose of the reader yourself.
Tim Mahy
Yes, I pass the XML using new XmlNodeReader(), and I dispose of it properly (after I found that it was causing a leak too! :)).
NLV
+9  A: 

You state this:

I properly dispose of everything.

and this:

I dispose of the SqlCommand object by calling oSqlCommand.Dispose()

However, those are mutually exclusive! If you call .Dispose() directly, you're doing it wrong. Specifically, you're leaving open the possibility that an exception will make the program skip over your call to the Dispose() method. The "proper" way to dispose of your command is to create it with a using block, like so:

using (SqlCommand cmd = new SqlCommand("sql string here"))
{
    // use the command here
} // compiler transforms your code to make sure .Dispose() is called here

Now, I gather from your question that this isn't your main problem at the moment, but it is a point worth driving home.

As for your question about the parameters: SqlParameters do not implement IDisposable. Therefore, you do not dispose of them directly. They are an entirely managed resource, which means they will be cleaned up for you by the garbage collector at some point after they are no longer reachable. You don't have to do anything to clean them up.

If you can seriously show that SqlParameter objects are hanging around long after they should, that means that you are holding a reference to them somewhere. For example, perhaps you are "caching" old SqlCommand objects somewhere, which in turn hold on to all their parameters. Don't do that. Find and eliminate whatever still references your SqlParameters, and the garbage collector will clean them up for you.
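
To make that concrete, here's a rough sketch of the difference (the SaveHistory, _cachedCommand and @HistoryXml names are placeholders, not anything from your code):

// Leak-prone: the command, its parameters and the large XML value stay
// reachable through the field for the lifetime of the form.
private SqlCommand _cachedCommand;

public void SaveHistoryLeaky(string xml)
{
    if (_cachedCommand == null)
    {
        _cachedCommand = new SqlCommand("dbo.SaveHistory");
        _cachedCommand.CommandType = CommandType.StoredProcedure;
        _cachedCommand.Parameters.Add("@HistoryXml", SqlDbType.Xml);
    }
    _cachedCommand.Parameters["@HistoryXml"].Value = xml; // xml is now rooted by the field
    // ... execute ...
}

// Better: everything is local, so once the method returns nothing references
// the command or its parameters, and the garbage collector can reclaim them.
public void SaveHistory(string connectionString, string xml)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("dbo.SaveHistory", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@HistoryXml", SqlDbType.Xml).Value = xml;
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}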

Update:

After re-reading your question, it sounds like your XML parameters are ending up on the Large Object Heap. The garbage collector in .NET is generational - it doesn't clean up everything every time it runs. As an object moves to a higher generation, it's more likely to hang around for a while. The Large Object Heap is basically the last generation, and it does not get cleaned up much at all. More than that, it never gets compacted, so over time it fragments, resulting in your program holding on to much more memory than it needs. What you need to do is find a way to avoid loading the entire XML data for a parameter into memory, so that it never makes it to the Large Object Heap. Use a file stream, or something similar, instead.
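
For example, if you can get the XML onto disk (or already have it there), you could stream it into the parameter instead of building one giant string; something along these lines, where the file path, procedure name and parameter name are all invented:

// needs System.Data, System.Data.SqlClient, System.Data.SqlTypes and System.IO
using (FileStream fs = new FileStream(@"C:\temp\history.xml", FileMode.Open, FileAccess.Read))
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("dbo.SaveHistory", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;

    // Hand the provider a stream over the file instead of materializing
    // the whole document as a single huge string in memory.
    cmd.Parameters.Add("@HistoryXml", SqlDbType.Xml).Value = new SqlXml(fs);

    conn.Open();
    cmd.ExecuteNonQuery();
}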

Joel Coehoorn
I'm trying it out by wrapping the command object in a using statement. Let me update you in a few minutes.
NLV
@NLV I added an update to my answer that you should read.
Joel Coehoorn
Yeah, I've read it, Joel. I have a huge DataSet with multiple tables that I need to save in the database as history, so I've broken it into multiple XML files and I'm saving them in multiple XML columns.
NLV
A: 

Without testing this, I can think of two things that might help you. With the SqlParameters, you could rely on the Finalize() method, which will free up resources. Also, are you running all of your SQL commands through a using block? If so, when the using block finishes, your resources should be reclaimed, and that should remove your memory leak issues.

Tim C
+4  A: 

Since SqlParameter is not IDisposable, this isn't a matter of disposing it; and normally there would be little benefit in tidying up the references etc., since it is still subject to the same GC.

It sounds like you have accidentally kept a reference to the SqlCommand. But if you are sure that you are done with it, you could try explicitly setting each parameter's .Value to null and calling Clear() on the parameters collection. But that is really just masking the fact that you are clinging on to a dead command.
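
In code, that tidy-up is nothing more than something like this (purely illustrative; command is whatever SqlCommand you just finished with):

foreach (SqlParameter p in command.Parameters)
{
    p.Value = null;            // drop the reference to the large XML value
}
command.Parameters.Clear();    // detach the parameters from the command

But again: if nothing else still referenced the command, none of this would be necessary.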

Marc Gravell
A: 

I have used this pattern in several projects without any issues:

public partial class StoredProcedures
{
    [SqlProcedure()]
    public static void InsertCurrency_CS(
        SqlString currencyCode, SqlString name)
    {
        using (SqlConnection conn = new SqlConnection("context connection=true"))
        {
            SqlParameter currencyCodeParam = new SqlParameter("@CurrencyCode", SqlDbType.NVarChar);
            SqlParameter nameParam = new SqlParameter("@Name", SqlDbType.NVarChar);

            currencyCodeParam.Value = currencyCode;
            nameParam.Value = name;

            using (SqlCommand InsertCurrencyCommand = new SqlCommand())
            {
                // attach the parameters and their values to the command
                InsertCurrencyCommand.Parameters.Add(currencyCodeParam);
                InsertCurrencyCommand.Parameters.Add(nameParam);

                InsertCurrencyCommand.CommandText =
                    "INSERT Sales.Currency (CurrencyCode, Name, ModifiedDate)" +
                    " VALUES(@CurrencyCode, @Name, GETDATE())";

                InsertCurrencyCommand.Connection = conn;

                conn.Open();
                InsertCurrencyCommand.ExecuteNonQuery();
                conn.Close();
            }
        }
    }
}

ref: http://msdn.microsoft.com/en-us/library/5czye81z%28VS.80%29.aspx

Raj Kaimal
A: 

There are several things you can do if you are positive that it's the SqlParameter that's keeping the last reference.

First, try to pass your XML as a string (and use OPENXML in a sproc to handle it) just to see if having a simple object and more control will help.

Second, create your own SqlParameters, keep them in a Dictionary, and then do something like:

foreach (SqlParameter param in parameters.Values)
    command.Parameters.Add(param);

Then, after the command has run, dispose of the command, close (if still open) and dispose of the connection, and go into your dictionary and explicitly assign null to each SqlParameter.Value (or, if SqlParameter.Value complains about a straight null, copy the string reference from .Value into a local variable, assign String.Empty to .Value, and then assign null to the local variable). Then assign null to each dictionary item (the reference to the SqlParameter), and finally assign null to the dictionary itself.

In a simpler case you can keep a reference to just that one critical SqlParameter and skip the dictionary. The key point is to keep explicitly assigning nulls: first to the last reference to the string, and then to the last reference to the SqlParameter that contained it.
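
Roughly, that tear-down looks like this (a sketch, assuming the parameters dictionary from above):

// after disposing the command and closing/disposing the connection
foreach (SqlParameter param in parameters.Values)
{
    param.Value = null;        // release the reference to the big XML string
}
parameters.Clear();
parameters = null;             // finally drop the dictionary itself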

Remember that there are several things involved: it starts with not parsing XML in the mid-tier at all - just sending it down to SQL Server - and ends with explicitly nullifying references. If your code is actually constructing that XML on the fly, try building it as one large plain string first.

If this alone doesn't lower the memory pressure, then you'll have to force explicit GC collections, but for that you should do a bit of reading and plan reasonable intervals, since GC costs a lot; if you start forcing collections after every request like a mad rabbit, you are going to pay a lot in CPU cycles.
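
If you do go that way, keep it coarse; purely as an illustration, something like:

// run this sparingly, e.g. after every N history saves, not after each request
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();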

Also, since you didn't say how big your data actually is, what kind of hardware you are running on, or whether your mid-tier runs under IIS, it's hard to speculate about further options, such as having IIS run multiple worker processes and just recycling them when they get too bloated. For really huge memory consumption and a genuine pass-through mid-tier (meaning no cache build-up), that can be faster than fiddling with the GC, but we are talking really huge data before you enter that territory.

ZXX