First, let me explain the current situation: I'm reading records from a database and putting them into an object for later use; today a question arose about converting (casting?) database types to C# types.

Let's see an example:

namespace Test
{
    using System;
    using System.Data;
    using System.Data.SqlClient;

    public enum MyEnum
    {
        FirstValue = 1,
        SecondValue = 2
    }

    public class MyObject
    {
        private String field_a;
        private Byte field_b;
        private MyEnum field_c;

        public MyObject(Int32 object_id)
        {
            using (SqlConnection connection = new SqlConnection("connection_string"))
            {
                connection.Open();

                using (SqlCommand command = connection.CreateCommand())
                {
                    command.CommandText = "sql_query";

                    using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SingleRow))
                    {
                        reader.Read();

                        this.field_a = reader["field_a"];
                        this.field_b = reader["field_b"];
                        this.field_c = reader["field_c"];
                    }
                }
            }
        }
    }
}

This is (obviously) failing to compile because the three this.field_x = reader["field_x"]; assignments produce the Cannot implicitly convert type 'object' to 'xxx'. An explicit conversion exists (are you missing a cast?) compiler error.

To correct this I currently know of two ways (let's use the field_b example): number one is this.field_b = (Byte) reader["field_b"]; and number two is this.field_b = Convert.ToByte(reader["field_b"]);.

The problem with option number one is that DBNull values make the cast throw an exception (even for types that can hold null, like String), and the problem with number two is that it doesn't preserve null values (Convert.ToString(DBNull.Value) yields String.Empty, not null), and it can't be used with enums either.

So, after a couple of lookups on the internet and here at StackOverflow, what I came up with is:

public static class Utilities
{
    public static T FromDatabase<T>(Object value) where T : IConvertible
    {
        if (typeof(T).IsEnum == false)
        {
            // NULL and DBNull both map to the default value of T
            if (value == null || Convert.IsDBNull(value) == true)
            {
                return default(T);
            }
            else
            {
                return (T) Convert.ChangeType(value, typeof(T));
            }
        }
        else
        {
            // Enums: reject raw values that don't map to a defined member
            if (Enum.IsDefined(typeof(T), value) == false)
            {
                throw new ArgumentOutOfRangeException();
            }

            return (T) Enum.ToObject(typeof(T), value);
        }
    }
}
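
Used from the constructor above, it becomes:

this.field_a = Utilities.FromDatabase<String>(reader["field_a"]);
this.field_b = Utilities.FromDatabase<Byte>(reader["field_b"]);
this.field_c = Utilities.FromDatabase<MyEnum>(reader["field_c"]);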

This way I should handle every case.

Question is: Am I missing something? Am I doing a WOMBAT (Waste Of Money, Brain And Time) because there's a quicker and cleaner way to do it? Is it all correct? Profit?

Thanks in advance, Andrea.

+1  A: 

Look at the various GetXXX methods of the data reader. Perhaps they are what you are looking for.

Chris Dunaway
+4  A: 

Don't you want to use the reader.Get* methods? The only annoying thing is that they take column ordinals, so you have to wrap the column name in a call to GetOrdinal():

using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SingleRow))
{
    reader.Read();

    this.field_a = reader.GetString(reader.GetOrdinal("field_a"));
    this.field_b = reader.GetByte(reader.GetOrdinal("field_b"));
    //etc
}
Sam Holder
I always avoided the `reader.GetX` methods because they take the column's number, which felt "bad" (what if, e.g., the underlying stored procedure starts returning some additional columns? Even if you don't care about them, they'll screw up the code), and I wasn't aware of the `reader.GetOrdinal` method, so I never even considered them.
kappa
I assume you could probably add a series of overloads via an extension method which would allow you to pass in the column name to the reader and get the various types back.
Sam Holder
+6  A: 

If a field allows nulls, don't use regular primitive types. Use the C# nullable type and the as keyword.

int? field_a = reader["field_a"] as int?;
string field_b = reader["field_b"] as string;

Adding a ? to any non-nullable value type makes it "nullable". Using the as keyword will attempt to cast an object to the specified type. If the cast fails (as it will when the value is DBNull), the operator returns null instead of throwing.

Note: Another small benefit of using as is that it is slightly faster than normal casting. Since it can also have some downsides, such as making it harder to track bugs if you try to cast as the wrong type, this shouldn't be considered a reason for always using as over traditional casting. Regular casting is already a fairly cheap operation.
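
Applied to the constructor in the question (assuming the fields are declared with nullable types, and that field_b and field_c come back from SQL as tinyint; both are assumptions on my part), it would look roughly like this:

this.field_a = reader["field_a"] as string;            // string can already hold null
this.field_b = reader["field_b"] as byte?;             // null when the column is DBNull
this.field_c = (MyEnum?)(reader["field_c"] as byte?);  // unbox to the underlying type first, then cast to the enum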

Dan Herbert
Cannot convert type 'System.DBNull' to 'int?' via a reference conversion, boxing conversion, unboxing conversion, wrapping conversion, or null type conversion.
Joel Mueller
Have you tried the code above? It will work. What you said is absolutely true and is why my answer works: http://msdn.microsoft.com/en-us/library/cscsdfbt.aspx
Dan Herbert
My mistake. I was testing `DBNull.Value as int?` rather than `(object)DBNull.Value as int?`.
Joel Mueller
Yeah, about the nullable types I forgot to add those damned question marks when writing the example, in the actual code they're there. About the `as` thing, until now I thought it was a VB-only keyword... Learned a new thing... The only "frightening" thing is what is said in the MSDN page you linked: *«The as operator is like a cast operation. However, if the conversion is not possible, as returns null instead of raising an exception.»* But the database type won't change every couple of days, will it? (After this Murphy's law is going to kick in instantly...)
kappa
Choosing between your answer and Bebop's one is only a matter of taste, I think, so I'll mark this as the chosen one, but both are valid.
kappa
+1  A: 

You can make a set of extension methods, one pair per data type:

    public static int? GetNullableInt32(this IDataRecord dr, string fieldName)
    {
        return GetNullableInt32(dr, dr.GetOrdinal(fieldName));
    }

    public static int? GetNullableInt32(this IDataRecord dr, int ordinal)
    {
        return dr.IsDBNull(ordinal) ? null : (int?)dr.GetInt32(ordinal);
    }
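
Usage (with a hypothetical nullable int column) is then just:

    int? quantity = reader.GetNullableInt32("quantity");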

This gets a bit tedious to implement, but it's pretty efficient. In System.Data.DataSetExtensions.dll, Microsoft solved the same problem for DataSets with a Field<T> method, which generically handles multiple data types, and can turn DBNull into a Nullable.

As an experiment, I once implemented an equivalent method for DataReaders, but I ended up using Reflector to borrow an internal class from DataSetExtensions (UnboxT) to do the actual type conversions efficiently. I'm not sure about the legality of distributing that borrowed class, so I probably shouldn't share the code, but it's pretty easy to look up for oneself.
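
A rough, unoptimized sketch of the same idea (without the borrowed UnboxT class, so it pays the Convert.ChangeType cost on every call) might look something like this:

    // Sketch only: a generic Field<T>-style helper for IDataRecord.
    // Goes in a static class; handles DBNull, Nullable<T> and enums,
    // but converts through Convert.ChangeType instead of UnboxT.
    public static T Field<T>(this IDataRecord dr, string fieldName)
    {
        object value = dr[fieldName];

        if (value == null || value is DBNull)
        {
            return default(T); // null for reference types and Nullable<T>
        }

        // Unwrap Nullable<T> so we convert to the underlying type
        Type target = Nullable.GetUnderlyingType(typeof(T)) ?? typeof(T);

        if (target.IsEnum)
        {
            return (T)Enum.ToObject(target, value);
        }

        return (T)Convert.ChangeType(value, target);
    }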

Joel Mueller
A: 

This is how I've dealt with it in the past:

    public static Nullable<T> GetNullableField<T>(this SqlDataReader reader, Int32 ordinal) where T : struct
    {
        var item = reader[ordinal];

        if (item == null)
        {
            return null;
        }

        if (item == DBNull.Value)
        {
            return null;
        }

        try
        {
            return (T)item;
        }
        catch (InvalidCastException ice)
        {
            throw new InvalidCastException("Data type of Database field does not match the IndexEntry type.", ice);
        }
    }

Usage:

int? myInt = reader.GetNullableField<int>(reader.GetOrdinal("myIntField"));
BFree
+1  A: 

The generic handling code posted here is cool, but since the question title includes the word 'efficiently' I will post my less generic but (I hope) more efficient answer.

I suggest you use the GetXXX methods that others have mentioned. To deal with the column-number problem that bebop talks about, I use an enum, like this:

enum ReaderFields { Id, Name, PhoneNumber, ... }
int id = sqlDataReader.GetInt32((int)ReaderFields.Id);

It's a little extra typing, but then you don't need to call GetOrdinal to find the index for each column. And, instead of worrying about column names, you worry about column positions.

To deal with nullable columns, you need to check for DBNull, and perhaps provide a default value:

string phoneNumber;
if (Convert.IsDBNull(sqlDataReader[(int)ReaderFields.PhoneNumber])) {
  phoneNumber = string.Empty;
}
else {
  phoneNumber = sqlDataReader.GetString((int)ReaderFields.PhoneNumber);
}
}
Ray
This doesn't solve the problem that was keeping me away from the `reader.GetX` methods: if the underlying recordset layout changes, particularly if additional columns are returned between the old ones, the code will break even if you don't care about the new columns, because their ordinal numbers change and your enum does not.
kappa
If your recordset changes, it can cause problems no matter what scheme you use. In Dan's accepted answer, if the field names change, the code is broken. Again, the reason I posted this answer was because you mentioned efficiency. Accessing fields by names uses a 'for' loop internally to step through all the fields in the results until the matching one is found. This is done for each field access, in each row. So, say, 20 fields and 50 records returned - you have 1,000 'for' loops. Plus 1,000 type casts and 1,000 nullable types (which are a little larger than standard types).
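If you'd rather keep the column names but pay the lookup cost only once, a rough sketch (using the same hypothetical columns as the enum above) is to cache the ordinals up front and read by position inside the loop:

// Look up each ordinal once, before the loop, then read by position per row.
int idOrdinal = sqlDataReader.GetOrdinal("Id");
int nameOrdinal = sqlDataReader.GetOrdinal("Name");
int phoneOrdinal = sqlDataReader.GetOrdinal("PhoneNumber");

while (sqlDataReader.Read())
{
    int id = sqlDataReader.GetInt32(idOrdinal);
    string name = sqlDataReader.GetString(nameOrdinal);
    string phoneNumber = sqlDataReader.IsDBNull(phoneOrdinal)
        ? string.Empty
        : sqlDataReader.GetString(phoneOrdinal);
}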
Ray