Hi,

The organization I work with, a large Indian outsourcing company, enforces documentation of code review comments. The rationale is that this gives the team insight into the quality of code development and an understanding of the patterns of problems encountered. This is checked during the regular project audits that we have.

The problem is that it takes a lot of effort and time to document the comments and track them to closure, especially for a complex project with multiple layers such as client, Web services, and database.

How does this fit with the principles of Agile methodology? What is your experience and opinion?

A: 

From an Agile perspective, I'd be interested to hear who is going to read the code review comments, why, and when. I would like to see a short feedback cycle, and the code reviews themselves can provide that. I'm not sure how you'd extract patterns of problems from free-text transcripts of code review comments. Also, you'd want causes, not effects. It is probably easier to just ask the developers (anonymously) about the problems. Much easier, much less work.

[edit] In general, you can collect information either to improve quality or to evaluate people. If you try to do both, the information will not be usable for either purpose. If you want the reviews to improve quality, you need to create an atmosphere of trust.

Stephan Eggermont
That is a good point you bring up! The way the process works is that a dev lead or a manager reviews the comments and provides feedback to the developer if there's a pattern. In fact, it is frequently said that if the review comments are not documented, management will assume that the review wasn't done.
Completely agree with the need to create an atmosphere of trust!
A: 

Most of the comments in our code end with the date. When we find comments that are getting a little stale, we delete them. Unit tests are intended to be our source of code documentation. The quality of our code is judged by code review, bug counts, and test coverage. We also use a continuous integration system, which gives us quick feedback from testing since we run several builds a day.
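To illustrate the "unit tests as code documentation" idea: instead of a dated prose comment that can go stale, the rule lives in a test that the build verifies on every run. This is a minimal hypothetical sketch (the function and rule are invented for illustration):

```python
# A dated comment like this tends to rot:
#   # 2009-04-01: discount capped at 50% per marketing
# Documenting the same rule as a unit test keeps it verified on every build.

def capped_discount(rate):
    """Return the discount rate, capped at 50%."""
    return min(rate, 0.5)

def test_discount_is_capped_at_fifty_percent():
    # The cap is enforced for rates above 50%...
    assert capped_discount(0.8) == 0.5
    # ...and rates below the cap pass through unchanged.
    assert capped_discount(0.2) == 0.2
```

With continuous integration running several builds a day, a test like this fails loudly the moment the documented behavior changes, unlike a stale comment.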

Guster_Q
Thank you Guster_Q. One requirement of our organization is that one should be able to analyze the patterns in comments, and therefore you can't put review comments in the code itself.
A: 

Disclaimer: I'm an employee of Atlassian, makers of the popular issue tracking and project management tool JIRA.

We also make a lightweight peer code review tool, Crucible, which is a great solution for distributed teams using agile methods. Please note that similar results can be achieved with other code review tools; I'll leave it to you to pick the best one, since I'm of course biased ;)

Creating documentation of your code reviews sucks. It takes time away from writing code, and it's a tough practice to maintain, especially as you approach a release date. Crucible lets developers create reviews based on commits, from their IDE, Web browser, or command line.

Reviewers can comment on code, and create defects which are automatically converted into JIRA issues and managed via your workflow. Perhaps the most valuable feature for the situation described here is that Crucible automatically creates a record of the code review, so you don't have to create documentation or add additional code comments.

Jesse Gibbs
Thank you Jesse. I'll check out your tool!
+2  A: 

What goals do you have for code reviews?


If, as you suggest in your reply to Stephan Eggermont, your goal is to prove to management that reviews were done or that defects were fixed, then I'd say it's not a very agile process. Agility hinges on trust.

If your goal is to prove to management that refactorings were done as a result of code reviews, then I'd say it's also not a very agile process. I'm not sure that management needs to care about refactorings of working code.


Where your question becomes really interesting to me is when either:

  • a. your goal is to ensure, in a repeatable manner, that defects found in code reviews are fixed
  • b. your goal is to improve working code and improve the development skills of the team

Point a. can be addressed by writing cards for defects, writing automated regression tests for those cards before each defect is fixed, and frequently running those regression tests as part of a regular test suite.
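The write-the-test-first step in point a. can be sketched as follows. This is a hypothetical example (the defect ID, function, and bug are invented): the regression test is written to reproduce the defect found in review, it fails until the fix lands, and it then stays in the regular suite as a permanent, auditable record that the defect was fixed.

```python
import unittest

def normalize_username(name):
    # Fix for the (hypothetical) review defect DEF-142: usernames were
    # previously compared case-sensitively and with stray whitespace.
    return name.strip().lower()

class TestDefect142Regression(unittest.TestCase):
    """Regression test written from the DEF-142 card before the fix,
    then run frequently as part of the regular test suite."""

    def test_username_is_trimmed_and_lowercased(self):
        self.assertEqual(normalize_username("  Alice "), "alice")

    def test_already_clean_username_is_unchanged(self):
        self.assertEqual(normalize_username("bob"), "bob")

if __name__ == "__main__":
    unittest.main()
```

A passing run of the suite then serves as repeatable evidence that the review finding was addressed, without any separate spreadsheet of comments.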

Point b. may need to be handled in an ad-hoc manner. If a review discovers only a few needed refactorings, an email to the team may suffice. If there are many, or if they are complex, something like a "brown bag" technical lunch meeting with the team may be needed to transmit the information fully.

JeffH
Thank you JeffH. We can't always write automated regression tests, but for complex changes we create tasks or defects in the task management system (we are using Rally). I agree with point b.: spending time writing review comments in, say, an Excel sheet (which is what we currently do) is unproductive, and the team often does this as a formality. But our project auditors need to see proof of review documentation. For now, I am going to categorize tasks/defects as originating from review, and I plan to keep minor review comments verbal.
Project auditors? This all sounds like waste... either the code works and passes its tests, or it doesn't.
Jeremy McGee