We currently have an application that retrieves data from the server through a web service and populates a DataSet. Users of the API then manipulate the data through objects that in turn update the DataSet. The changes are serialized, compressed, and sent back to the server to be applied.

However, I have begun using NHibernate in some projects and I really like the disconnected nature of the POCO objects. The problem we have now is that our objects are so tied to the internal DataSet that they cannot be used in many situations, and we end up creating duplicate POCO objects to pass back and forth.

Batch.GetBatch() -> calls the web service and populates an internal DataSet
Batch.SaveBatch() -> sends the DataSet's changes to the web service
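For context, the round trip above looks roughly like this. This is a simplified sketch, not our real code; `BatchService` and its methods are placeholders, and it assumes the standard `DataSet.GetChanges()`/`AcceptChanges()` change-tracking pattern:

```csharp
using System.Data;

public class Batch
{
    private DataSet data;

    public static Batch GetBatch(int batchId)
    {
        var batch = new Batch();
        // The web service returns a serialized DataSet (placeholder call).
        batch.data = BatchService.GetBatchData(batchId);
        return batch;
    }

    public void SaveBatch()
    {
        // GetChanges() returns a DataSet containing only added,
        // modified, and deleted rows, or null if nothing changed.
        DataSet changes = data.GetChanges();
        if (changes == null)
            return;

        // Serialize, compress, and send only the delta (placeholder call).
        BatchService.SaveBatchData(changes);

        // Mark the local rows as clean after a successful save.
        data.AcceptChanges();
    }
}
```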

Is there a way to achieve a model similar to the one we are using, in which all database access occurs through a web service, but with NHibernate?

Edit 1

I have a partial solution that is working and persisting through a web service, but it has two problems.

  1. I have to serialize and send my whole collection and not just changed items
  2. If I repopulate the collection from the returned objects, any references I was holding are lost.

Here is my example solution.

Client Side

public IList<Job> GetAll()
{
    return coreWebService
      .GetJobs()
      .BinaryDeserialize<IList<Job>>();
}

public IList<Job> Save(IList<Job> Jobs)
{
    return coreWebService
             .Save(Jobs.BinarySerialize())
             .BinaryDeserialize<IList<Job>>();
}

Server Side

[WebMethod]
public byte[] GetJobs()
{
    using (ISession session = NHibernateHelper.OpenSession())
    {
        return (from j in session.Linq<Job>()
                select j).ToList().BinarySerialize();
    }
}

[WebMethod]
public byte[] Save(byte[] JobBytes)
{
    var Jobs = JobBytes.BinaryDeserialize<IList<Job>>();

    using (ISession session = NHibernateHelper.OpenSession())
    using (ITransaction transaction = session.BeginTransaction())
    {
        foreach (var job in Jobs)
        {
            session.SaveOrUpdate(job);
        }
        transaction.Commit();
    }

    return Jobs.BinarySerialize();
}

As you can see, I am sending the whole collection to the server each time and returning the whole collection. But I get back a replaced collection instead of a merged/updated one. Not to mention that it seems highly inefficient to send all the data back and forth when only part of it may have changed.
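One variation that might address the "replaced instead of merged" symptom on the server side (a sketch, not something I have tested end to end): NHibernate's `ISession.Merge` copies the state of a detached instance onto the persistent instance and returns the attached object, so the service could return the merged entities rather than the raw deserialized ones:

```csharp
using System.Collections.Generic;
using NHibernate;

[WebMethod]
public byte[] Save(byte[] jobBytes)
{
    var jobs = jobBytes.BinaryDeserialize<IList<Job>>();
    var merged = new List<Job>(jobs.Count);

    using (ISession session = NHibernateHelper.OpenSession())
    using (ITransaction transaction = session.BeginTransaction())
    {
        foreach (var job in jobs)
        {
            // Merge copies the detached state onto the persistent
            // instance (loading it if necessary) and returns the
            // attached entity.
            merged.Add(session.Merge(job));
        }
        transaction.Commit();
    }

    return merged.BinarySerialize();
}
```

Note this does not fix the client-side reference problem on its own: deserializing the response still produces new object instances, so any references held before the call would still need to be rewired.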

Edit 2

I have seen several references on the web to almost-transparent persistence mechanisms. I'm not sure whether they would work, and most of them look highly experimental.

I'm having a hard time finding a replacement for the DataSet model we use today. The reason I want to get away from that model is that it takes a lot of work to tie every property of every class to a row/cell of a DataSet, and it tightly couples all of my classes.