Hi,

I’m facing an issue with NHibernate performance; could you please suggest some optimizations? Below is a short summary of my application architecture.

I have a Windows service listening to a messaging bus. On receiving a message, the service creates an object (one of whose properties is the received xml snippet) and saves it to the DB using NH. There is a WPF UI with a read-only connection to the DB, and on refresh it displays the objects on screen. During a refresh, the UI retrieves the xml and deserializes it, and the object’s properties are derived from it and bound to the screen.

For example, assume an xml XXX is received by the service: it deserializes the xml, creates the book object, and saves it to the DB, where one property/column, SCHEMA, contains the xml snippet.

On refresh, the UI fetches all book objects by ID and recreates each book object from the saved xml (yes, the xml is the constructor param).

Now my issue is that the refresh takes more than 2 minutes to display, say, 50 book objects. I analyzed it with the NHibernate profiler and found that the time spent in the DB is negligible, while the time spent creating the entities is proportionally huge (10 ms : 1990 ms). I guess it’s due to the fairly large size of the xml snippet and its deserialization.

My question is: how can I improve the performance? I dispose of the session after every refresh and I am not lazy loading (again, the time spent in the DB is negligible). On any given refresh, all of the objects may have been updated by downstream systems, or maybe only one of them has. Can I implement some sort of caching mechanism in this case?

Thanks in advance for any suggestions.

Regards, -Mike

A: 

The entire list of 50 books could be kept in a singleton class meant for caching, i.e. a cache manager. You could also use, say, the Enterprise Library cache, but I would suggest a simple in-memory cache. When a book gets added, you update the cache. The cache would already hold the books, xml and all, so no deserialisation would need to happen on refresh. You could also update the DB on an asynchronous thread to reduce the time.
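A minimal sketch of what such a singleton cache manager could look like (BookCache is a hypothetical name; Book.Id comes from the mapping further down in the thread, and its type is assumed to be long here):

using System.Collections.Generic;
using System.Linq;

// Hypothetical in-memory singleton cache keyed by book id (illustration only).
public sealed class BookCache
{
    private static readonly BookCache instance = new BookCache();
    public static BookCache Instance { get { return instance; } }

    private readonly object sync = new object();
    private readonly Dictionary<long, Book> booksById = new Dictionary<long, Book>();

    private BookCache() { }

    // Called by the service (or via a bus notification) when a book is added or updated.
    public void Put(Book book)
    {
        lock (sync) { booksById[book.Id] = book; }
    }

    // The UI refresh reads from here instead of re-deserializing every xml snippet.
    public IList<Book> GetAll()
    {
        lock (sync) { return booksById.Values.ToList(); }
    }
}

Whatever the cache implementation, something still has to call Put when a downstream system changes a book, which is the invalidation concern raised in the comments below.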

Prashant
I'm a bit sceptical about caching due to the following:
- The UI gets refreshed only when the user hits the refresh button.
- Every time I do a refresh I use a new session.
- Every book object holds a book xml of around 20 KB.
I'm not sure whether NHibernate does dirty tracking between sessions?
Mike
A: 

Here is the pseudo code

On the service, whenever I receive a message

public void OnMessage(string message)
{
    // deserializes the message
    DeserializedObject schema = deserializationFactory.Deserialize(message);

    var book = new Book(schema, message);

    // saves the book using a new session
    repository.Save(book);
}

The book object:

public class Book
{
    private string xml;

    // NHibernate (and the chained constructors below) need a no-arg constructor.
    protected Book() { }

    public Book(DeserializedObject schema, string xml) : this(schema)
    {
        this.xml = xml;
    }

    public Book(DeserializedObject schema) : this()
    {
        Schema = schema;
    }

    // Public members are virtual so the lazy class proxy (LazyLoad in the mapping) works.
    public virtual DeserializedObject Schema { get; set; }

    public virtual string Xml
    {
        get { return xml; }
    }

    // Re-serializes the schema into an XmlDocument on every access.
    public virtual XmlDocument XmlSchema
    {
        get
        {
            var doc = new XmlDocument();
            if (Schema != null)
            {
                var serializer = new XmlSerializer(typeof(DeserializedObject));
                using (var stream = new MemoryStream())
                {
                    serializer.Serialize(stream, Schema);
                    stream.Position = 0;
                    doc.Load(stream);
                }
            }
            return doc;
        }
    }

    // Mapped to the SERIALIZED_SCHEMA clob; the getter serializes, the setter deserializes.
    public virtual string SerializedSchema
    {
        get { return XmlSchema.OuterXml; }
        set
        {
            if (value != null)
                Schema = value.Deserialize<DeserializedObject>();
        }
    }

    public virtual string Author
    {
        get { return Schema.Author; }
    }
}

Now the mapping for Book (uses FNH):

public class BookMap : ClassMap<Book>
{
    public BookMap()
    {
        LazyLoad();
        Table("Books");
        IdGenerator.Instance.GenerateId(this, "book_id_seq", book => book.Id);
        Map(book => book.SerializedSchema, "SERIALIZED_SCHEMA")
            .CustomSqlType("Clob")
            .CustomType("StringClob");
    }
}

On UI:

public void OnRefresh()
{
    // In reality the call to the DB runs on a background worker and the records
    // are bound to the grid after a context switch.

    // GetByCriteria creates a new session every time a refresh happens;
    // allBooksForToday is an ICriterion restricting the query to today's books.
    datagrid.DataContext = repository.GetByCriteria(allBooksForToday);
}
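As an aside on the dirty-tracking question from the comments: NHibernate only tracks changes on entities attached to an open session, so a brand-new session per refresh does no cross-session tracking; if the UI connection really is read-only, the load can say so explicitly. A rough sketch, assuming NHibernate 3.x and direct access to the ISessionFactory (the repository above may already hide this):

using (var session = sessionFactory.OpenSession())
{
    // Entities loaded by this session are read-only: no snapshots, no dirty checks,
    // and no re-serialization of SerializedSchema at flush time.
    session.DefaultReadOnly = true;

    var books = session.CreateCriteria<Book>()
                       .Add(allBooksForToday)   // the same ICriterion used in OnRefresh
                       .List<Book>();

    datagrid.DataContext = books;
}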

The important thing to note here is that the Book type is shared between the service and the UI. However, only the service writes to the DB; the UI can update the trade object (basically the xml) and send it over the messaging bus (again as xml), and the service, on receiving it, updates the DB.

The xml size is approximately 20 KB, so loading, say, 50 books means loading close to 1 MB of data.
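Given those numbers, one option (independent of caching) is to stop deserializing eagerly at load time and defer it until a bound property is first read. A rough sketch of how Book could be reshaped for that; it reuses DeserializedObject and the Deserialize&lt;T&gt;() string extension from the code above, but it is an illustration, not the existing code:

// Map this auto-property to SERIALIZED_SCHEMA instead of the serializing getter above,
// so loading 50 books only materializes ~1 MB of strings.
public virtual string SerializedSchema { get; set; }

private DeserializedObject schema;

// Deserialize on first access only, then keep the result for subsequent reads.
public virtual DeserializedObject Schema
{
    get
    {
        if (schema == null && SerializedSchema != null)
            schema = SerializedSchema.Deserialize<DeserializedObject>();
        return schema;
    }
    set { schema = value; }
}

public virtual string Author
{
    get { return Schema.Author; }   // pays the deserialization cost once, lazily
}

With this shape, the service would set SerializedSchema from the incoming xml (or reserialize the schema) before saving, the dirty check compares two strings instead of re-running the XmlSerializer, and books whose xml the UI never inspects are never deserialized at all.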

Thanks, -Mike

Mike