tags:

views:

2664

answers:

6

Hi,

I'm doing some tests with NHibernate and I'm modifying adonet.batch_size to get bulk inserts.

I'm using MSSQL 2005 and the Northwind database. I create 1,000 objects and insert them into the database. I've changed the value of batch_size from 5 to 100 but see no change in performance; it stays at around 300 ms. Using SQL Profiler, I see 1,000 individual INSERT statements on the server side. Please help.

app.config

 <property name="adonet.batch_size">10</property>
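
The batch size can also be set programmatically when the session factory is built; a rough sketch, assuming the factory is built from NHibernate.Cfg.Configuration:

    // Sketch: setting the ADO.NET batch size in code instead of app.config.
    var cfg = new NHibernate.Cfg.Configuration();
    cfg.Configure();                                // reads the config file / app.config section
    cfg.SetProperty("adonet.batch_size", "100");    // same key as the XML property above
    var sessionFactory = cfg.BuildSessionFactory();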

Code

    public bool MyTestAddition(IList<Supplier> supplierList)
    {
        var stopwatch = new Stopwatch();
        stopwatch.Start();

        using (ISession session = dataManager.OpenSession())
        using (ITransaction transaction = session.BeginTransaction())
        {
            // Save all suppliers, then commit once at the end.
            foreach (var supplier in supplierList)
            {
                session.Save(supplier);
            }
            transaction.Commit();
        }

        stopwatch.Stop();
        Console.WriteLine("{0} milliseconds. {1} items added",
                          stopwatch.ElapsedMilliseconds,
                          supplierList.Count);
        return true;
    }
A: 

I copied it wrongly into the app.config above. I've actually added the line <property name="adonet.batch_size">10</property> and modified the size from there, but there was no difference.

Edit your post, or make a comment on your post. Don't post non-answers - it's not a forum.
TheSoftwareJedi
A: 

A call to ITransaction.Commit will Flush your Session, effectively writing your changes to the database. You are calling Commit after every Save, so there will be an INSERT for each Supplier.

I'd try to call Commit after every 10 Suppliers or so, or maybe even at the end of your 1000 Suppliers!

Actually, the Commit() is outside of the foreach loop, so it is already only called once, at the end of the 1,000 inserts.
A: 

According to this nhusers post, seeing 1,000 inserts in SQL Profiler should not really matter, because the batching optimization happens at a different level. If you really see no gain in performance, trying the most recent version of NHibernate might help to resolve the issue.

hangy
A: 

I have tried similar things with NHibernate and never really got great performance. I remember settling on a Flush every 10 entries and a Commit every 50 entries to get a performance boost, because with each insertion the process got steadily slower. It really depends on the size of the objects, so you could play around with those numbers; maybe you can squeeze some performance out of it.

+1  A: 

The following is a great post on batch processing in Hibernate, which is what NHibernate is based upon and closely follows:

http://relation.to/Bloggers/BatchProcessingInHibernate

As you can see, the suggested approach is to set a reasonable batch size in the config, which you have done, and to also call session.Flush() and session.Clear() every 20 or so records.

We have employed this method ourselves and can now create and save 1000+ objects in seconds.
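
Roughly, the insert loop then looks something like this (a minimal sketch; the supplier list and session setup follow the code in the question, and it assumes adonet.batch_size is set to roughly the same value):

    // Sketch: batched insert with a periodic Flush/Clear,
    // assuming adonet.batch_size is around 20.
    using (ISession session = dataManager.OpenSession())
    using (ITransaction transaction = session.BeginTransaction())
    {
        int counter = 0;
        foreach (var supplier in supplierList)
        {
            session.Save(supplier);
            if (++counter % 20 == 0)
            {
                session.Flush();  // push the batched INSERTs to the database
                session.Clear();  // detach saved entities so the session stays small
            }
        }
        transaction.Commit();     // flushes the remainder and commits
    }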

ChrisAnnODell
+1  A: 

You could load the data for the target type into a DataTable and then use System.Data.SqlClient.SqlBulkCopy to bcp it into the target table.

This would allow processing of greater volumes.
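
A rough sketch of what that could look like; the connection string, table name, column list, and Supplier properties used here are assumptions and would need to match the real schema:

    // Sketch: bulk-loading suppliers with SqlBulkCopy instead of NHibernate.
    // Requires references to System.Data and System.Data.SqlClient.
    public void BulkInsertSuppliers(IList<Supplier> suppliers, string connectionString)
    {
        var table = new System.Data.DataTable();
        table.Columns.Add("CompanyName", typeof(string));   // assumed Northwind columns
        table.Columns.Add("ContactName", typeof(string));

        foreach (var supplier in suppliers)
        {
            table.Rows.Add(supplier.CompanyName, supplier.ContactName);
        }

        using (var connection = new System.Data.SqlClient.SqlConnection(connectionString))
        using (var bulkCopy = new System.Data.SqlClient.SqlBulkCopy(connection))
        {
            connection.Open();
            bulkCopy.DestinationTableName = "Suppliers";
            bulkCopy.WriteToServer(table);
        }
    }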