views: 993

answers: 1
I'm using a SqlDataReader to populate an entity in a WinForms app. The entity class has several foreign key attributes in the database that I want to set up as properties on the entity class. Each property will be of type 'Attribute', with an ID and a Description property.

Some of the users of the system are far from the database, so data access performance is a serious consideration.

I could either execute SqlCommand.ExecuteReader multiple times (once for each attribute) against a stored proc that returns a single result set, or I could execute SqlCommand.ExecuteReader once against a stored proc that returns multiple result sets and use SqlDataReader.NextResult to move through them and set up the attributes.
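
Here's roughly what the multiple-result-set version would look like (the proc name, the entity properties, and the Attribute constructor below are placeholders, not my actual code):

    using System.Data;
    using System.Data.SqlClient;

    // One round trip: the proc is assumed to return one result set per attribute (Id, Description).
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("usp_GetEntityAttributes", conn)) // placeholder proc name
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@EntityId", entityId);

        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            // First result set -> first attribute property
            if (reader.Read())
                entity.Status = new Attribute(reader.GetInt32(0), reader.GetString(1));

            // NextResult() advances to the second result set -> next attribute property
            if (reader.NextResult() && reader.Read())
                entity.Category = new Attribute(reader.GetInt32(0), reader.GetString(1));

            // ...and so on, one NextResult() per remaining attribute, in proc order
        }
    }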

If I loop through with .NextResult, I get into some issues with making sure the stored proc and the property assignment loop stay aligned. If the order of the SELECT statements in the proc changes, then the assignment order in the WinForms app would get messed up.

If the SqlDataReader goes back to the database for each read anyway, is there much time added by executing SqlCommand.ExecuteReader multiple times? Doing an ExecuteReader for each attribute would make things clearer on the assignment side.

Thanks!

+1  A: 

Using .NextResult, ADO.NET does not go back to the database for each result set. All of this logic is in the ADO.NET data provider, wherever that is running (in your data access layer, probably on your application server, or in your UI layer, wherever that is).

If you execute the data reader multiple times, however, you will be making one round trip to the database for each call. So if the layer where this code is executing is some distance (in round-trip time) away from the DB, you would be better off using the .NextResult technique.

You should also consider using a DataSet (with multiple DataTables) instead of a data reader; you might find this more performant.
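
Something along these lines, for example (the proc name and parameter are just illustrative):

    using System.Data;
    using System.Data.SqlClient;

    // Fill a DataSet in a single round trip; each result set returned by the
    // proc becomes its own DataTable in ds.Tables, in order.
    var ds = new DataSet();
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("usp_GetEntityAttributes", conn)) // illustrative proc name
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@EntityId", entityId);

        var adapter = new SqlDataAdapter(cmd);
        adapter.Fill(ds);   // Fill opens the connection if it's closed and closes it when done
    }

    // ds.Tables[0], ds.Tables[1], ... correspond to the proc's result sets, in order.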

Charles Bretana
I'm curious: when would using a DataSet (with a DataAdapter) be more "performant" than just using a DataReader? It's just that, since the DataAdapter uses a DataReader underneath, I'm wondering what mystical magic makes it run faster for the DataAdapter but not for the user's own code?
Colin Mackay
To be honest, I'm not exactly sure, but (and the following is a guess) it might be because the code in the ADO.NET provider that populates the DataSet is optimized (maybe written in C++/assembly?) to do it rapidly and then disconnect from the database, whereas when you use a DataReader you must write this code yourself, using calls to standard CLR framework classes, and your code cannot disconnect until it has finished reading and consuming the very last record in the reader...
Charles Bretana
And for any application that potentially creates multiple DB connections, what affects scalability the most is not what the connections do but how long they remain open.
Charles Bretana