Hi,

I insert data into a SQL Server 2005 database using NHibernate. Before inserting an object, I check whether it already exists based on its primary key. Every 100,000 or so lookups this check throws an exception (running the same lookup directly in SQL takes ages as well). Defragmenting the hard disk seems to help, but it seems strange that I should have to do this. Is this normal?

BTW, I know that I could use bulk insert, but I had to opt for NHibernate for certain reasons ...

Thanks.

Christian

+1  A: 

Likely to be database file growth if it's 100k inserts. Do you:

  • shrink the database at all?
  • have a large growth increment?
  • not have instant file initialisation enabled?

Defragging the filesystem is only useful for generating I/O time-outs.
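
If file growth is the culprit, a quick way to confirm it is to look at the file sizes and autogrowth increments. A minimal T-SQL sketch, assuming a hypothetical database named MyAppDb with a logical data file MyAppDb_Data (substitute your own names):

```sql
-- Inspect the current size and autogrowth setting of each file (MyAppDb is hypothetical).
USE MyAppDb;
SELECT name,
       size * 8 / 1024 AS size_mb,
       CASE WHEN is_percent_growth = 1
            THEN CAST(growth AS varchar(10)) + ' %'
            ELSE CAST(growth * 8 / 1024 AS varchar(10)) + ' MB'
       END AS growth_setting
FROM sys.database_files;

-- Replace a percentage-based increment with a fixed, moderate one (e.g. 256 MB).
ALTER DATABASE MyAppDb
MODIFY FILE (NAME = MyAppDb_Data, FILEGROWTH = 256MB);
```

A fixed increment, combined with instant file initialisation for the data file, keeps each growth event short and predictable.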

gbn
Kimberly L. Tripp has an excellent article on instant file initialization [here](http://sqlskills.com/BLOGS/KIMBERLY/post/Instant-Initialization-What-Why-and-How.aspx).
Joe Stefanelli
A: 

It sounds very likely that the timeout you are seeing is due to database file growth; that would explain the consistent frequency of the slowness. I am not sure how often you load data or what the long-term plan for the table is, but you may wish to manually grow the file to a much larger size up front to avoid the repeated overhead of automatic growth.
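
A minimal sketch of the manual pre-grow, assuming a hypothetical database MyAppDb with logical file names MyAppDb_Data and MyAppDb_Log (use your own names and sizes):

```sql
-- Pre-size the files once, so the bulk insert runs don't keep triggering autogrowth.
ALTER DATABASE MyAppDb
MODIFY FILE (NAME = MyAppDb_Data, SIZE = 10GB);

ALTER DATABASE MyAppDb
MODIFY FILE (NAME = MyAppDb_Log, SIZE = 2GB);
```

Note that SIZE here is the new total file size, not an increment; SQL Server will reject a value smaller than the file's current size.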

Bryan Cooper