I need to unit test our data access layer, and the data access layer calls the database entirely through stored procedures. I plan to create a clean database just for unit testing, but I found that most of the stored procedures make remote calls to other databases. I have no idea how to unit test this kind of stored procedure. Should I create all of the related databases as well, or is there another solution?
Look into object mocking.
Rhino Mocks is one solution to this; its project site is a good place to start.
Mocking will allow you to simulate data access from your database without having to set up an actual 'test' database. There is a little bit of work involved in setting up the mocks, but it keeps your tests consistent.
In general, the more you can abstract your data access layer away from the rest of the application, the better you can unit test it. As Jayden wrote in another answer, Test Doubles such as dynamic mocks are a good solution.
However, as I understand your question, you explicitly want to unit test the data access layer, and that's okay too. In that case, dynamic mocks aren't going to help you, as you will be testing the lowest layer in your application - there's nothing left to mock out. Some people insist that this is not a unit test but rather an integration test, but I think that the important part is whether the test is automated or not, and not what we call it.
In any case, once you begin testing the data access layer, you more or less have to deal with it as it is. If it uses remote stored procedures then you will have to deal with that as well. To make setup a bit simpler, you may want to put the 'remote' database on the same box as the 'local' database.
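If the remote calls go through a linked server, one way to do that is to define a linked-server alias on the test box that points back at itself, so the 'remote' four-part names resolve locally. A sketch, assuming SQL Server and a hypothetical linked-server name `STAFFSERVER` that the stored procedures reference:

```sql
-- Hypothetical setup: the stored procedures reference a linked server
-- named STAFFSERVER. On the test box, create an alias with that name
-- that actually points at the local instance, so remote calls stay local.
EXEC master.dbo.sp_addlinkedserver
    @server     = N'STAFFSERVER',   -- the name the procedures expect
    @srvproduct = N'',
    @provider   = N'SQLNCLI',
    @datasrc    = N'localhost';     -- but resolve it to this box

-- Map the current login straight through to the 'remote' server.
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'STAFFSERVER',
    @useself    = N'True';
```

With that in place, the stored procedures run unchanged against local copies of the remote databases.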
A unit test is mainly a behavioral test, so that would be OK. There ought to be other types of tests (integration tests or system tests) that use a realistic setup with many distributed machines etc. to verify that security, networking etc. works as intended, but that should not be the main focus of a unit test.
You could consider structuring the database that references the others so that all remote tables are accessed via views. I normally follow a naming convention of:
vw_[DATABASENAME]_[TABLENAME]
The view consists of nothing more than:
select * from remotedatabase.dbo.tablename
All stored procedures access remote tables via the views rather than directly. For example, to access a Person table in a remote database called Staff:
create view vw_STAFF_Person
as
select * from Staff.dbo.Person
go
create procedure stp_Select_Staff
as
select * from vw_STAFF_Person
go
rather than
create procedure stp_Select_Staff
as
select * from Staff.dbo.Person
go
The rationale for this is that when you want to test your database, you will probably want to re-link all the remote databases to your 'test' remote databases. This is generally easier to do when the only objects that are accessing remote data are simple views, rather than the often more numerous and complex stored procedures.
I generally have a job set up that can rescript the views to the 'test' databases, so it is done automatically.
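Such a job can be as simple as issuing an ALTER VIEW per remote view. A minimal sketch, assuming the test copy of the Staff database is named Staff_Test (both names are illustrative):

```sql
-- Re-point the remote-table view at the 'test' copy of the database.
-- Staff_Test is a hypothetical name for the test copy of Staff.
ALTER VIEW vw_STAFF_Person
AS
SELECT * FROM Staff_Test.dbo.Person;
GO
```

Because the views are the only objects that reference remote data, this one script per view is the entire re-linking step.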
Additional to this, I also often have a job set up that restores production database backups to the 'test' environment so that testing prior to deployment can be done against systems that contain a copy of live data. Again this process is made easier with only having to re-link the views, rather than all stored proc references to remote systems.
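A restore job of that kind is only a few lines of T-SQL. A sketch, with hypothetical backup paths, logical file names and a test copy named Staff_Test:

```sql
-- Restore the latest production backup of Staff as the test copy.
-- The disk path and logical file names are placeholder assumptions;
-- check them with RESTORE FILELISTONLY against the actual backup.
RESTORE DATABASE Staff_Test
FROM DISK = N'\\backupserver\backups\Staff_full.bak'
WITH REPLACE,
     MOVE N'Staff_Data' TO N'D:\Data\Staff_Test.mdf',
     MOVE N'Staff_Log'  TO N'D:\Logs\Staff_Test.ldf';
GO
```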
In order to make testing against the security easier, I also always set the database security at the level of database roles, rather than specific users, as I find roles more portable across production, test and development environments than users.
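A role-based setup along those lines might look like the following sketch (the role name is illustrative; the procedure and view names are from the example above, and ALTER ROLE ... ADD MEMBER assumes SQL Server 2012 or later):

```sql
-- Grant access through a role, not to individual users, so the same
-- script works unchanged in production, test and development.
CREATE ROLE db_staff_reader;
GRANT EXECUTE ON stp_Select_Staff TO db_staff_reader;
GRANT SELECT  ON vw_STAFF_Person  TO db_staff_reader;

-- Only the membership line differs per environment.
ALTER ROLE db_staff_reader ADD MEMBER [DOMAIN\SomeTestUser];
```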
Hope that some of these tips help.