I've just started doing some real-world performance testing on my Fluent NHibernate / SQLite project, and am experiencing some serious delays when I Commit to the database. By serious, I mean taking 20-30 seconds to Commit 30 KB of data!

This delay seems to get worse as the database grows. When the SQLite DB file is empty, commits happen almost instantly, but by the time it grows to 10 MB, I see these huge delays.

The database has 16 tables, averaging 10 columns each.

One possible culprit is that I'm storing a dozen or so IList&lt;float&gt; members, though they are typically only about 200 elements long. Support for these is a recent addition to Fluent NHibernate automapping, and it stores each float in its own table row, so that may be part of the problem.

Any suggestions on how to track this down? I suspect SQLite is the culprit, but maybe it's NHibernate?

I don't have any experience with profilers, but am thinking of getting one. I'm aware of NHibernate Profiler - any recommendations for profilers that work well with SQLite?

Edit

Some more testing indicates that it's not the size of the database that causes the slowdown - it's dependent on how many Saves I've done since I started my program. Commit time for the first Save is about 300 ms, and goes to over 1000 ms by the 50th Save.

I'm keeping a single Session open all the time - maybe I need some explicit Flush logic?

Also, I downloaded the NHibernate Profiler. It gave me an alert about "Large number of individual writes" for my IList members. The alert description suggested turning on batching, but SQLite does not support that, as far as I can tell.

The alert also mentioned MultiQuery support, so I'm going to read up on that.

/Edit

Here's the method that saves the data - it's just a SaveOrUpdate call and a Commit, if you ignore all the error handling and debug logging.

    public static void SaveMeasurement(object measurement)
    {
        Debug.WriteLine("\r\n---SaveMeasurement---");

        // Get the application's database session
        var session = GetSession();
        using (var transaction = session.BeginTransaction())
        {
            try
            {
                session.Save(measurement);
            }
            catch (Exception e)
            {
                throw new ApplicationException(
                    "\r\n    SaveMeasurement->SaveOrUpdate failed\r\n\r\n", e);
            }

            try
            {
                Debug.WriteLine("\r\n---Commit---");
                transaction.Commit();
                Debug.WriteLine("\r\n---Commit Complete---");
            }
            catch (Exception e)
            {
                throw new ApplicationException(
                    "\r\n    SaveMeasurement->Commit failed\r\n\r\n", e);
            }
        }
    }
+2  A: 

You say that you have a list of measurement objects, but you are saving them one at a time. That would be fine if all of the saves were in one transaction, but you have each save wrapped in its own transaction. This will hurt your performance no matter how large the DB is.
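A sketch of what that looks like, reusing the question's GetSession() helper (the SaveMeasurements name and the measurements collection are assumptions for illustration):

```csharp
// Sketch: one transaction and one Commit around the whole batch,
// instead of a transaction per object.
public static void SaveMeasurements(IEnumerable<object> measurements)
{
    var session = GetSession();
    using (var transaction = session.BeginTransaction())
    {
        foreach (var measurement in measurements)
        {
            session.Save(measurement); // no per-object transaction
        }
        transaction.Commit(); // single Commit for the whole batch
    }
}
```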

Kevin Crowell
Thanks for the suggestion. I changed SaveOrUpdate to Save, but am still getting the delays. Currently, I'm just saving about 1000 Measurement objects to populate my DB - the ILists I mentioned are members of the Measurement class. But I'm using a DefaultCascade.All() convention to make the ILists save when I save the Measurements - maybe that is implicitly doing a SaveOrUpdate?
Tom Bushell
@Tom I'm sorry. I am not sure.
Kevin Crowell
1. Use one transaction for the whole batch.
2. Use Save to add a new object; you don't have to use explicit updates at all when you use one transaction.
Paco
So, you guys are saying I should: 1) do a single "BeginTransaction" 2) do a "Save" on each of my thousand measurements 3) do a single "Commit"? Is that correct?
Tom Bushell
That would give you the best performance. But unless you were committing each object 1 at a time, I doubt you will see much improvement from this.
Kevin Crowell
SaveOrUpdate does not check whether an unsaved object is already saved. Why would it?
James L
+4  A: 

Your transaction should be around the whole operation, not each save.

You would also then benefit from enabling ADO.NET batching (e.g. a batch size of 1000).
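If batching is available for your driver, it is enabled via the `adonet.batch_size` configuration property; a sketch of the relevant hibernate.cfg.xml fragment (the value 1000 is illustrative):

```xml
<!-- hibernate.cfg.xml (sketch): enable ADO.NET batching -->
<property name="adonet.batch_size">1000</property>
```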

James L
I tried to turn on batching, but without success. Google search turned up several hits saying that batching is not supported with SQLite, unless this has changed recently.
Tom Bushell
SQLite can most certainly write multiple things per transaction, and has been able to do this for as many years as I've known about it. Whether NHibernate can take advantage of this, I just don't know.
Donal Fellows
NHibernate has something called MultiQuery, which I looked at briefly, but am shying away from for now, because it seems to require hand written SQL. But it's good to know that SQLite has the capability - thanks!
Tom Bushell
+1  A: 

Are you using autoincrement primary keys? This will cause NHibernate to perform a select after each insert to populate the primary key property on the object. I'm not sure how it does this for SQLite, but your statement that the problem gets worse as the database grows points to this as a probable root cause.

Your plan to use a profiler is the best course of action. The NHibernate Profiler is excellent and has a trial period that will allow you to use it to troubleshoot this problem.
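If the identity round-trip turns out to be the cause, a client-generated key avoids it. A Fluent NHibernate mapping sketch (the Measurement class and its Id property are assumptions for illustration):

```csharp
using FluentNHibernate.Mapping;

// Sketch: a client-generated Guid key (GuidComb) lets NHibernate assign
// the id itself, with no post-insert SELECT; an Identity key forces
// that extra round-trip on every insert.
public class MeasurementMap : ClassMap<Measurement>
{
    public MeasurementMap()
    {
        Id(x => x.Id).GeneratedBy.GuidComb(); // vs. .GeneratedBy.Identity()
    }
}
```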

Jamie Ide
Yes, I am using autoincrement ints as primary keys. Switching to GUID keys might eliminate the problem?
Tom Bushell
Maybe. But I wouldn't take any action without profiling the app. first.
Jamie Ide
A: 

Edit After reading zoidbeck's answer, I've abandoned this approach. See my comment to his answer. /Edit

After some Googling, I came across some posts that implied that the NHibernate cache could actually slow things down considerably under certain circumstances. Apparently, NH can spend more time looking things up in the cache than doing the actual Commit.

I tried doing a session.Clear after every Commit, but that broke lazy loading.

The solution (for now) was to call:

session.Evict(measurement);

after each Commit, which deletes the measurement from the cache. Now, my commits take roughly the same time (about 800 ms, for 30 K of data), no matter how big the DB is.
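In code, the workaround looks like this (a sketch; session and measurement come from the surrounding method):

```csharp
// Sketch of the workaround: commit, then evict the saved object from the
// session's first-level cache so the cache doesn't grow with every Save.
using (var transaction = session.BeginTransaction())
{
    session.Save(measurement);
    transaction.Commit();
}
session.Evict(measurement); // remove it from the session cache
```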

The root problem seems to be my IList<float> members, which generate hundreds of SQL Inserts. Batching would probably solve this, but, alas, is only supported by SQL Server, not SQLite.

At some point, I'll have to optimize this - perhaps by storing them as a BLOB. But that's a question for another day.

Tom Bushell
+1  A: 

NHibernate is never going to be fast.

Some older versions of NHibernate aggressively release the connection to the database. This causes it to open and close the SQLite database file a lot (which is very slow). I believe newer versions include an option to keep the connection open longer. If you're using an in-memory database, this causes everything to be lost as soon as the connection closes.

http://stackoverflow.com/questions/189280/problem-using-sqlite-memory-with-nhibernate

Jay
I'm keeping the session open all the time, so don't think that's an issue. Up to now, speed has not been a problem. It seems that ILists of basic value types are not saved efficiently, if you store hundreds at a time like I do.
Tom Bushell
+3  A: 

I think using session.Evict() only covers the symptom. You haven't posted your GetSession() method, but the comment above and the remark that session.Clear() breaks your lazy loading make me guess you are using one session for the whole application.
This is extremely inefficient and will slow down your app the longer it runs. Creating a new session, on the other hand, is really cheap and will provide you with a clean, fast session handling only the objects you want it to.
IMO you should think about declarative transaction management. Personally, I prefer Spring's TX management, but there are also other clever solutions that aren't so method-based, such as Castle's ActiveRecord.
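The session-per-unit-of-work pattern looks roughly like this (a sketch; sessionFactory is assumed to be built once at startup from your Fluent NHibernate configuration):

```csharp
// Sketch: a short-lived session per unit of work, instead of one
// application-wide session that accumulates state.
using (var session = sessionFactory.OpenSession())
using (var transaction = session.BeginTransaction())
{
    session.Save(measurement);
    transaction.Commit();
} // the session and its first-level cache are discarded here
```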

zoidbeck
You are correct - I keep a single session open, because my app can be collecting new measurements and writing them to the database while a user is browsing through old measurements. If I open another session just to write a new measurement, SQLite tells me the DB is locked. Any suggestions on how to resolve this?
Tom Bushell
@zoidbeck - based on your input, I'm now keeping a session open all the time for lazy loading, and Disconnecting/Reconnecting when I Save, (using a new session for each save). This seems to be working better overall, but I worry I've just moved the problem to another session... :-(
Tom Bushell
Hi Tom, unfortunately there is no one-size-fits-all solution. If you don't have any problems, why change it? On the other hand, in a multi-user environment you may run into trouble with this. Usually I try not to use NH-persisted objects in my view; I use explicit client objects instead, which also decouples the view from the server. With this, in addition to Spring.NET, you can easily make stateless calls on your server and configure the needed transactions straightforwardly on the called method. Session management, multi-user, security: everything right out of the box.
zoidbeck