views: 348
answers: 7

We are in the initial phase of trying to implement TDD. I demo'd the Visual Studio Team System code coverage / TDD tools and the team is excited at the possibilities. Currently we use DevPartner for code coverage, but we want to eliminate it because it's expensive. We have very limited experience in TDD and want to make sure we don't go in the wrong direction. Currently we are using SourceSafe for source control but will be migrating to Team System in about a year.

I can tell you our applications are very data-centric. We have about 900 tables, 6000 stored procedures, and about 45 GB of data. We have lots of calculations that are based upon user data and different rates in the system. A lot of our code is also based upon time (e.g. calculating interest to the current date). Some of these calculations are very complex and very intensive (only a few people know the details for some of them).

We want to implement TDD to solve our QA issues. A lot of developers are forced to fix bugs in areas they are not familiar with and end up breaking something else. There are also areas that developers are almost afraid to touch because the code is used by everything in the system. We want to mitigate this problem.

Since our code is so data-centric, I'm afraid that implementing TDD might be a bit more complex than in most systems. I'm trying to come up with a game plan that I can present to management, but I want to avoid the common TDD beginner mistakes. Also, if tools/facilities in Team System make TDD more complete, that would be nice, but we don't want to wait for Team System to get started.

The first question we are asking is: should we just start with the tools in Visual Studio? I have read posts where people complain about the intrinsic tools in Visual Studio (you need to create a separate project to hold your tests), but the tools in Visual Studio are free and the integration is good. If we decide to go the other route and use something like xUnit, MbUnit, or NUnit, then we will most likely incur some possibly significant costs:

1) If we want IDE integration (I failed to mention that most of our code is VB.NET)
---TestDriven.Net or ReSharper or ?????

2) If we want code coverage
---NCover (Seems pretty pricey for its functionality)

Also, I've seen some pretty cool functionality demoed in Visual Studio 2010, like the ability to do input testing (data entered on a form) or to record what the user has done and then feed that into your unit test to reproduce a problem.

Also, although I don't quite grasp the mock object concept yet, I know a lot of people feel it's a must. The question is: can all the mocking frameworks plug into Visual Studio's version of TDD (MSTest)?

I have advised management that we should probably just add regression tests going forward (for new development or found bugs) but not try to go through all our existing code and add unit tests. That would be WAY too big a project.

Anyway, I would appreciate anyone's help.

+7  A: 

First thing to do is get this book:

Working Effectively with Legacy Code

For such a large project, read it and internalize it. TDD on a data-driven application is hard enough; on a legacy one you need some serious planning and effort. Worth it in my view, but it is still a steep curve.

Yishai
I would also recommend Roy Osherove's book on Unit Testing. I'm not convinced that TDD is the way to go on a project like this. It's a little bit like closing the barn door after the cows come home.
Robert Harvey
+3  A: 

1) I use TestDriven.Net, and I like it, so a +1 from me for that.
2) Code coverage is useful when thought about in the right frame of mind:
High code coverage does not necessarily mean high-quality unit tests, but...
High-quality unit tests do mean high code coverage

I've only used NCover, so can't recommend alternatives.

Regarding mocking - once you grasp what it is about and what it really means for you, you will see the benefits it has to offer. Namely, it means you're not dependent on the integration of the code you're testing with an outside dependency, and it also helps to keep test execution time down (e.g. mocking your data access layer prevents costly interaction with the DB). That is an important factor in my opinion, because if tests take a long time to run, people may stop bothering to run them rather than wait! I use NUnit, which has support for mocks built-in.
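To make that concrete, here is a minimal sketch of mocking out a data access dependency. The names (IRateRepository, InterestCalculator) are made up for illustration, and I'm using Moq as one example framework; a mock is just a plain object, so the same approach works under NUnit or MSTest alike:

    Imports Moq
    Imports NUnit.Framework

    ' Hypothetical dependency: the real implementation would hit the database.
    Public Interface IRateRepository
        Function GetRate(ByVal product As String) As Decimal
    End Interface

    Public Class InterestCalculator
        Private ReadOnly _rates As IRateRepository

        Public Sub New(ByVal rates As IRateRepository)
            _rates = rates
        End Sub

        Public Function InterestFor(ByVal product As String, ByVal principal As Decimal) As Decimal
            Return principal * _rates.GetRate(product)
        End Function
    End Class

    <TestFixture()>
    Public Class InterestCalculatorTests
        <Test()>
        Public Sub InterestFor_UsesRateFromRepository()
            ' The mock stands in for the database...
            Dim rates As New Mock(Of IRateRepository)()
            rates.Setup(Function(r) r.GetRate("savings")).Returns(0.05D)

            ' ...so the logic is verified in isolation, in milliseconds.
            Dim calc As New InterestCalculator(rates.Object)
            Assert.AreEqual(50D, calc.InterestFor("savings", 1000D))
        End Sub
    End Class

Because nothing touches the DB, the suite stays fast enough that nobody has an excuse not to run it.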

Tie a TDD approach into a continuous integration environment (e.g. CruiseControl.NET), and you have a very powerful and productive setup.

When starting out in TDD/unit testing, I'd always recommend writing tests for code written from "now" onwards and not focussing too much on writing tests for legacy/existing code. That is a lot harder to do in general and a lot more expensive time-wise, particularly if the code is old/not fresh in anyone's mind!

Update: To explain my last point a bit further, in response to Robert's comment...

When you are trying to get up and running with TDD/unit testing and gain momentum with the whole team, you want that to be as positive and productive as possible. Writing tests for old code that isn't being changed during this initial period is expensive compared to new code, because the code is not fresh and its exact intricacies more than likely have to be worked out again, not necessarily by the original programmer. On top of that, you may well find it difficult to justify to the business the time needed to revisit old code and write tests for it, instead of working on new functionality or fixing real bugs/issues.

It can become a negative experience: a developer who is tasked with writing tests for old code he knows/remembers little about will find it more difficult to do, and so his first experience is not a positive one. You also need to be careful in this situation, as you may end up with weak tests that give you false confidence. In my experience, it's absolutely critical that everyone gets off to a positive start with this, or else confidence/motivation in it fades and the end result is much worse.

I'm not actually saying you should not add tests for legacy code - I do it myself when I work in or around older code that doesn't have any tests, in order to improve the test coverage and quality bit by bit. The difference is, I'm already on board with the process, a "believer". It's the early stages that are key... hence my point about not focussing too much on the legacy code at the start.

AdaTheDev
Your comment about not unit testing the legacy code is absolutely backwards. You need to establish a good suite of unit tests on the legacy code base so that you can refactor safely, without worrying about breaking something.
Robert Harvey
@Robert - I don't think it's backwards at all. I've clarified my opinion above.
AdaTheDev
+3  A: 

Well, I would like to start by recommending that you bring in a consulting company that knows TDD to help your team get started. This is especially important if you don't have anyone on the team who is familiar with TDD, unit testing, mocking frameworks, etc. I am not sure how much buy-in you already have from management or the team, but you don't want your first attempt to fail because of mistakes that could have been prevented by hiring a specialist to help you take those first steps.

Anyway, I would recommend that you start small and pick a new project that isn't extremely large. Even a small subset of a larger project would work. Use this both as a place to get the team familiarized with TDD and as a way to show management that it is feasible. Then, once the team is more versed, you can take on larger projects. As for legacy code, I would recommend looking at this book:

Working Effectively with Legacy Code, Michael Feathers

I would also definitely recommend taking a look at this book:

The Art of Unit Testing, Roy Osherove

It might not be a TDD book, but it is a great book for learning about unit testing and mocking frameworks, and it even has a chapter on legacy code. It also has some recommendations on how to get team and management buy-in. It talks a little about TDD, integration tests, and organizing your codebase, and extensively about what makes a good unit test. All in all, a great read.

Hope this helps.

Waleed Al-Balooshi
+1  A: 

100 thumbs up for Working Effectively with Legacy Code, recommended by Yishai. I also recommend Pragmatic Unit Testing in C# with NUnit, as you're using .NET (although I'm assuming C#). It's been very useful for teaching the basics of unit testing, providing a solid foundation to work from.

Alex Angas
+5  A: 

Do not rush and for God's sake don't try to force developers to do TDD.

Otherwise you will get messy, low-quality 'tests' which will be deleted after a few months, and developers who never want to hear about TDD again.

The most important requirement for TDD is a good knowledge of how to write tests (it's pretty obvious why).

The second requirement is a good knowledge of the technologies used, because if it's hard to code anything, designing the code in your head will be impossible.

The tools you use are actually quite unimportant.

P.S. Tons of legacy code and a data-centric app => a recipe for disaster.

Arnis L.
A: 

For a couple of years I was aware of unit tests and dabbled in writing them. What allowed me to really engage with unit testing was when I started working on an open source project with impressive test coverage. It really struck me that the way I had been writing software was 'on the wrong side of history'.

You mentioned many tools whose advertised capabilities excite you, and were wondering how to put them together. I think these questions are addressed very well by "xUnit Test Patterns", from Addison-Wesley (http://xunitpatterns.com/). This book allowed me to pull together all those tools and techniques I'd read about in the past.

The book may best serve an audience that also appreciates other books like the Gang of Four's Design Patterns, Refactoring, and Refactoring to Patterns. Those books are also great, though they did not have as direct an effect on how I coded after I read them. xUnit Test Patterns reflects the style of those books: they can be hard to read at first, as they tend to cross-reference chapters in arbitrary directions, but I think they're very solid.

The GoF presented categories of patterns: creational, structural, and behavioral. These categories serve as a way to associate and contrast the patterns being explained. By associating test design patterns with the typical lifetime of a unit test, xUnit Test Patterns likewise weaves together a range of techniques available for unit testing. The same steps are also used to associate and contrast the various tools used for building unit tests.

It will help with the high-level view, and it also goes into actual implementation.

My only criticism of xUnit Test Patterns is how much text it spends criticizing NUnit. NUnit is a fine piece of programming, and it is to its authors' credit that NUnit is mentioned so prominently in what I think will become a classic book.

Frank Schwieterman
+2  A: 

With regard to getting started, I would also recommend reading Fowler's Refactoring. The first chapter gives a good feel for what it means to introduce tests and then safely introduce change (although the emphasis there is on behaviour-preserving change). Furthermore, this talk describes some practices which can help improve the testability of your code. Misko Hevery also has this guide to writing testable code, which summarises the talk.

From your description, it sounds as though you want to test the core parts of your system - the parts with a lot of dependencies where changes are scary. Depending on the degree to which data access is decoupled from business logic, you will probably need to refactor towards a state where the code is more testable: where it is easy and fast to instantiate sets of test data in order to verify the logic in isolation, as in the sketch below. This may be a big job, and it may not be worth the effort if changes here are infrequent and the code base is well proven.
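To illustrate that end state with a hypothetical example (none of these names come from the question): once a calculation takes plain values instead of fetching them itself, it can be verified with in-memory data and no database at all. A sketch using the built-in MSTest framework:

    Imports Microsoft.VisualStudio.TestTools.UnitTesting

    ' Hypothetical: pure calculation logic, decoupled from data access.
    Public Class LateFeeCalculator
        Public Function FeeFor(ByVal daysOverdue As Integer, ByVal dailyFee As Decimal, ByVal cap As Decimal) As Decimal
            ' No connection string, no stored procedure call - just logic.
            Return Math.Min(daysOverdue * dailyFee, cap)
        End Function
    End Class

    <TestClass()>
    Public Class LateFeeCalculatorTests
        <TestMethod()>
        Public Sub FeeIsCappedAtMaximum()
            ' In-memory test data: 30 days at 2.00/day would be 60.00,
            ' but the cap is 25.00.
            Dim calc As New LateFeeCalculator()
            Assert.AreEqual(25D, calc.FeeFor(30, 2D, 25D))
        End Sub
    End Class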

My advice would be to be pragmatic, and use the experience of the team to find areas where it is easiest to add tests that add value. I think having many focussed unit tests is the best way to drive quality, but it is probably easier to test the code at a higher level using integration or scenario tests, certainly in the beginning. This way you can detect big failures in your core systems early. Be clear on what your tests cover. Scenario tests will cover a lot of code, but probably won't surface subtle bugs.

Moving from SourceSafe to Team System is a big step; how big depends on how much of Team System you want to use. I think you can get a lot of value from using Visual Studio's built-in test framework. For example, as a first step you could implement some basic test suites for the core system/core use cases, along the lines of the sketch below. Developers can run these themselves in Visual Studio as they work and prior to check-in. These suites can be expanded gradually over time. Later, when you get TFS, you can look at running these suites on check-in and as part of an automated build process. You can follow a similar path regardless of the specific tools.
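A minimal sketch of what such a suite might look like with the built-in (MSTest) framework; the Finance module here is a made-up stand-in for one of your core calculation routines:

    Imports Microsoft.VisualStudio.TestTools.UnitTesting

    ' Made-up stand-in for a core calculation routine.
    Public Module Finance
        Public Function SimpleInterest(ByVal principal As Decimal, ByVal annualRate As Decimal, ByVal days As Integer) As Decimal
            Return principal * annualRate * days / 365D
        End Function
    End Module

    ' Lives in a separate test project (the complaint mentioned in the
    ' question), but runs directly from the IDE before check-in.
    <TestClass()>
    Public Class CoreCalculationTests
        <TestMethod()>
        Public Sub SimpleInterest_FullYear_ReturnsPrincipalTimesRate()
            Assert.AreEqual(50D, Finance.SimpleInterest(1000D, 0.05D, 365))
        End Sub
    End Class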

Be clear from the outset that there is overhead in maintaining test code, and that well-designed tests pay dividends. I have seen situations where tests are copy-pasted and then edited slightly, etc. Test code duplication like this can lead to an explosion in the number of lines of test code you need to maintain when effecting a small product code change. This kind of hassle can erode the perceived benefit of having the tests; one simple defence is sketched below.
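For instance, MSTest's TestInitialize attribute gives you one place to build a shared fixture, so a change to a constructor touches one method rather than every copy-pasted test (Account is a hypothetical type, defined here only to make the sketch self-contained):

    Imports Microsoft.VisualStudio.TestTools.UnitTesting

    ' Hypothetical production type used by the tests below.
    Public Class Account
        Private _balance As Decimal

        Public Sub New(ByVal openingBalance As Decimal)
            _balance = openingBalance
        End Sub

        Public ReadOnly Property Balance() As Decimal
            Get
                Return _balance
            End Get
        End Property

        Public Sub Deposit(ByVal amount As Decimal)
            _balance += amount
        End Sub
    End Class

    <TestClass()>
    Public Class AccountTests
        Private _account As Account

        ' Shared setup: if the Account constructor changes, only this
        ' method needs editing, not every test in the class.
        <TestInitialize()>
        Public Sub Setup()
            _account = New Account(100D)
        End Sub

        <TestMethod()>
        Public Sub Deposit_IncreasesBalance()
            _account.Deposit(25D)
            Assert.AreEqual(125D, _account.Balance)
        End Sub
    End Class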

Visual Studio 2008 will only show you block coverage, although its code analysis will also give you other metrics, like cyclomatic complexity per assembly/class/method. Getting high block coverage with your tests is certainly important, and it allows you to easily identify areas of the system that are totally untested.

However, it is important to remember that high block coverage is only a simple measurement of the effectiveness of your tests. For example, say you write a class to purge a file archive, keeping the 5 newest files. Then you write a test case which checks that if you start with 10 files and run the purger, you are left with 5. An implementation which passes that test could actually delete the newest files, yet still give 100% coverage. The test only verifies one of the requirements, as the sketch below demonstrates.
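Here is that example in code, hypothetical and using an in-memory list of file dates rather than a real file system. The purger keeps the oldest files instead of the newest, yet the count-only assertion passes with full block coverage; only the second assertion exposes the bug:

    Imports System.Collections.Generic
    Imports System.Linq
    Imports Microsoft.VisualStudio.TestTools.UnitTesting

    ' Hypothetical purger operating on an in-memory list of file dates.
    Public Class ArchivePurger
        ' Buggy: keeps the 5 OLDEST files instead of the 5 newest.
        Public Function Purge(ByVal fileDates As List(Of DateTime)) As List(Of DateTime)
            Return fileDates.OrderBy(Function(d) d).Take(5).ToList()
        End Function
    End Class

    <TestClass()>
    Public Class ArchivePurgerTests
        <TestMethod()>
        Public Sub Purge_TenFiles_LeavesFive()
            Dim files As List(Of DateTime) = _
                Enumerable.Range(1, 10).Select(Function(d) New DateTime(2009, 1, d)).ToList()
            Dim kept As List(Of DateTime) = New ArchivePurger().Purge(files)

            ' Count-only assertion: passes, with 100% block coverage,
            ' even though the wrong files were kept.
            Assert.AreEqual(5, kept.Count)

            ' Assertion on the other requirement: fails against the buggy
            ' implementation, because the newest file should have survived.
            Assert.IsTrue(kept.Contains(New DateTime(2009, 1, 10)))
        End Sub
    End Class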

Alex Peck