The way I currently populate business objects is with something similar to the snippet below.

List<Ailias> tmpList = new List<Ailias>();

using (SqlConnection conn = new SqlConnection(Properties.Settings.Default.CDRDatabase))
{
    using (SqlCommand comm = new SqlCommand(SELECT, conn))
    {
        conn.Open();

        using (SqlDataReader r = comm.ExecuteReader(CommandBehavior.CloseConnection))
        {
            while (r.Read())
            {
                Ailias ailias = PopulateFromReader(r);
                tmpList.Add(ailias);
            }
        }
    }
}

private static Ailias PopulateFromReader(IDataReader reader)
{
    Ailias ailias = new Ailias();

    if (!reader.IsDBNull(reader.GetOrdinal("AiliasId")))
    {
        ailias.AiliasId = reader.GetInt32(reader.GetOrdinal("AiliasId"));
    }

    if (!reader.IsDBNull(reader.GetOrdinal("TenantId")))
    {
        ailias.TenantId = reader.GetInt32(reader.GetOrdinal("TenantId"));
    }

    if (!reader.IsDBNull(reader.GetOrdinal("Name")))
    {
        ailias.Name = reader.GetString(reader.GetOrdinal("Name"));
    }

    if (!reader.IsDBNull(reader.GetOrdinal("Extention")))
    {
        ailias.Extention = reader.GetString(reader.GetOrdinal("Extention"));
    }

    return ailias;
}

Does anyone have any suggestions of how to improve performance on something like this? Bear in mind that PopulateFromReader, for some types, contains more database look-ups in order to populate the object fully.

A: 

AFAIK, that is as fast as it gets. Perhaps the slowness is in the SQL query/server. Or somewhere else.

leppie
+6  A: 

One obvious change would be to replace this kind of statement:

ailias.AiliasId = reader.GetInt32(reader.GetOrdinal("AiliasId"));

with

ailias.AiliasId = reader.GetInt32(constAiliasId);

where constAiliasId is a constant holding the ordinal of the AiliasId field.

This avoids the ordinal lookups in each iteration of the loop.
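Applied to the loop in the question, caching the ordinals once before iterating might look like this (a sketch based on the posted code):

    using (SqlDataReader r = comm.ExecuteReader(CommandBehavior.CloseConnection))
    {
        // Resolve each ordinal once, before the loop, instead of once per row.
        int ordAiliasId = r.GetOrdinal("AiliasId");
        int ordTenantId = r.GetOrdinal("TenantId");
        int ordName = r.GetOrdinal("Name");
        int ordExtention = r.GetOrdinal("Extention");

        while (r.Read())
        {
            Ailias ailias = new Ailias();
            if (!r.IsDBNull(ordAiliasId)) ailias.AiliasId = r.GetInt32(ordAiliasId);
            if (!r.IsDBNull(ordTenantId)) ailias.TenantId = r.GetInt32(ordTenantId);
            if (!r.IsDBNull(ordName)) ailias.Name = r.GetString(ordName);
            if (!r.IsDBNull(ordExtention)) ailias.Extention = r.GetString(ordExtention);
            tmpList.Add(ailias);
        }
    }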

automatic
Nice catch. Many forget that GetOrdinal has to do extra work to get the right column.
David Robbins
My thoughts too. Get the ordinal position outside the read loop.
RichardOD
+4  A: 

If the data volume is high, then it can happen that the overhead of building a huge list can be a bottleneck; in which case, it can be more efficient to use a streaming object model; i.e.

public IEnumerable<YourType> SomeMethod(...args...) {
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand comm = new SqlCommand(query, conn)) {
        conn.Open();
        using (SqlDataReader reader = comm.ExecuteReader()) {
            while (reader.Read()) {
                yield return BuildObj(reader);
            }
        }
    }
}

The consuming code (via foreach etc) then only has a single object to deal with at a time. If the caller wants a list, it can build one (new List<YourType>(sequence), or in .NET 3.5: sequence.ToList()).

This involves a few more method calls (an additional MoveNext()/Current per sequence item, hidden behind the foreach), but you will never notice this when you have out-of-process data such as from a database.
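For example, consuming the streaming method might look like this (a sketch; Process stands in for whatever the caller does with each item):

    // Only one item is materialized at a time.
    foreach (YourType item in SomeMethod(...args...))
    {
        Process(item);
    }

    // Or, when a full list is genuinely needed:
    List<YourType> list = new List<YourType>(SomeMethod(...args...));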

Marc Gravell
+1  A: 

Your code looks almost identical to a lot of our business-object loading functions. When we suspect DAL performance issues, we look at a few things.

  1. How many times are we hopping out to the DB? Is there any way we can connect less often and bring back larger chunks of data via multiple result sets (we use stored procedures)? So, instead of each child object loading its own data, the parent fetches all data for itself and its children. You can end up with fragile SQL (sort orders that need to match, etc.) and tricky loops to walk over the DataReaders, but we have found it faster than making multiple DB trips.

  2. Fire up a packet sniffer/network monitor to see exactly how much data is being transmitted across the wire. You may be surprised to see how massive some of the result sets are. If they are, then you might think about alternate ways of approaching the issue. Like lazy/defer loading some child data.

  3. Make sure you actually use everything you ask for. For example, going from SELECT * FROM (with 30 fields being returned) to simply SELECT Id, Name FROM (if that is all you need) can make a large difference.
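For point 1, one way to fetch parent and child rows in a single round trip is SqlDataReader.NextResult(), which advances the reader to the next result set returned by the stored procedure. A sketch (the procedure name and Populate helpers here are hypothetical):

    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand comm = new SqlCommand("GetParentWithChildren", conn))
    {
        comm.CommandType = CommandType.StoredProcedure;
        conn.Open();

        using (SqlDataReader r = comm.ExecuteReader())
        {
            // First result set: the parent rows.
            while (r.Read())
            {
                parents.Add(PopulateParent(r));
            }

            // Second result set, same round trip: the child rows.
            r.NextResult();
            while (r.Read())
            {
                children.Add(PopulateChild(r));
            }
        }
    }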

ckittel
A: 

It's likely the real problem is the multiple, per-object lookups that you mention. Have you looked closely to see if they can all be put into a single stored procedure?