As part of our unit tests, we restore a blank database when the test run starts. The unit tests then exercise the system by calling web services (hosted in the Visual Studio ASP.NET host).
This works fine the first time the unit tests are run. However, if they are re-run without restarting the web services, an exception is raised, because all the connections were killed as part of the restore.
The code below simulates what occurs:
static void Main(string[] args)
{
    DoDBStuff();

    // Kill all connections to the Test database, as the restore does.
    new Server("localhost").KillAllProcesses("Test");

    DoDBStuff();    // fails: the pool hands back a dead connection
}

private static void DoDBStuff()
{
    string constr = "Data Source=localhost;Initial Catalog=Test;Trusted_Connection=Yes";
    using (SqlConnection con = new SqlConnection(constr))
    {
        con.Open();
        using (SqlCommand cmd = new SqlCommand("SELECT TOP 1 id FROM sysobjects", con))
        {
            int? i = cmd.ExecuteScalar() as int?;
        }
    }
}
All the code except the KillAllProcesses call runs in the web service's process, while KillAllProcesses runs in the unit test process. The same effect could be achieved by restarting SQL Server.
The problem we face is that the web service doesn't know when its connections have been killed; it simply picks a "bad" connection from the connection pool. Further, the creation of the connection and the execution of the command happen several layers apart within the app.
How can we detect that a connection is "bad" before executing a command, without drastically affecting the performance of the application?
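One workaround we've considered is to catch the first failure, flush the pool, and retry once. This is only a sketch, not production code; the helper names are ours, and the assumption that a killed pooled connection surfaces as a `SqlException` on first use (rather than being detected up front) is exactly the behaviour we'd like a cleaner answer for:

```csharp
using System.Data.SqlClient;

static class RetryingDb
{
    // Sketch: on the first SqlException, clear the pool for this
    // connection string and retry once on a fresh connection.
    public static object ExecuteScalarWithRetry(string constr, string sql)
    {
        try
        {
            return ExecuteScalarOnce(constr, sql);
        }
        catch (SqlException)
        {
            using (SqlConnection con = new SqlConnection(constr))
            {
                // Discard the dead connections pooled for this string.
                SqlConnection.ClearPool(con);
            }
            return ExecuteScalarOnce(constr, sql);
        }
    }

    private static object ExecuteScalarOnce(string constr, string sql)
    {
        using (SqlConnection con = new SqlConnection(constr))
        {
            con.Open();
            using (SqlCommand cmd = new SqlCommand(sql, con))
            {
                return cmd.ExecuteScalar();
            }
        }
    }
}
```

The drawback is that every call site (or some shared data-access layer) has to adopt the wrapper, and a blanket catch of `SqlException` can mask genuine errors, which is why we'd prefer to detect the bad connection before executing the command.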