While cyclomatic complexity is a worthwhile metric, I tend to find it a poor tool for identifying difficult-to-maintain code. In particular, it tends to highlight only certain types of code (e.g. parsers), while missing difficult recursion, threading, and coupling problems, as well as many of the anti-patterns that have been defined.
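To make that blind spot concrete, here is a hypothetical sketch (not from any particular tool): two methods with identical cyclomatic complexity, where the second hides a concurrency problem the metric cannot see.

```java
// Both methods have the same cyclomatic complexity (2), yet the second is
// far harder to maintain: the metric cannot see the data race on the
// shared counter. Hypothetical illustration only.
public class ComplexityBlindSpot {
    private int counter = 0; // shared mutable state

    // CC = 2: a single if-branch; easy to reason about.
    public int safeIncrement(int value) {
        if (value < 0) {
            return 0;
        }
        return value + 1;
    }

    // Also CC = 2, but counter++ is an unsynchronized read-modify-write,
    // so concurrent callers can lose updates. Complexity metrics miss this.
    public int unsafeIncrement(boolean enabled) {
        if (enabled) {
            counter++;
        }
        return counter;
    }
}
```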

What other tools are available to identify problematic Java code?

Note, we already use PMD and FindBugs, which I believe are great for method-level problem identification.

A: 

The static analysis tools you already use are pretty standard. If you're using Eclipse, try looking here for more code analysis tools.

Emma provides analysis of code coverage, though this is really for testing.

pianoman
+3  A: 

Google Testability Explorer checks, for example, for singletons and other static constructs that are bad smells in design. Metrics is an Eclipse plugin that measures almost every code metric known to mankind. I have used both and can easily recommend them.
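As a hypothetical sketch of why such tools flag static singletons: the hard-wired dependency cannot be replaced in a test, whereas an injected one can. All class and method names below are made up for illustration.

```java
// Hypothetical example of the testability smell flagged by tools like
// Google Testability Explorer: a static singleton vs. injection.
public class SingletonSmell {

    static class Clock {
        private static final Clock INSTANCE = new Clock();
        static Clock getInstance() { return INSTANCE; }
        String now() { return "2009-01-01"; } // fixed value for the sketch
    }

    // Hard to test: the collaborator is fetched via a static singleton,
    // so a test cannot substitute a fake clock.
    static class ReportService {
        String render() {
            return Clock.getInstance().now() + ": report"; // hidden dependency
        }
    }

    // Testable alternative: the dependency is injected and replaceable.
    static class TestableReportService {
        private final Clock clock;
        TestableReportService(Clock clock) { this.clock = clock; }
        String render() { return clock.now() + ": report"; }
    }
}
```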

Marcin
+1 for Metrics...
Stu Thompson
+1  A: 

I have never used it, but I found this rather interesting and promising:

http://erik.doernenburg.com/2008/11/how-toxic-is-your-code/

And I have used this one and found it extremely helpful because of its nice visualization of dependencies:

http://www.headwaysoftware.com/products/structure101/index.php

Jens Schauder
+1  A: 

Sonar tries to identify "hot spots" of complexity and maintainability by combining the results of various open source tools (including PMD and FindBugs). It integrates well with Maven and CI servers (especially Hudson).
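As a sketch of that Maven integration (assuming a Maven project and a Sonar server already running at its default address), the analysis is triggered with the standard plugin goal:

```shell
# Build the project, then push the analysis to the Sonar server
# (assumes a Sonar server running on the default localhost:9000).
mvn clean install
mvn sonar:sonar
```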

EDIT by extraneon

There is a Sonar site available where a lot of open source projects are analyzed. I think it shows quite well which rules get applied and how deep the drill-down goes. You can, of course, also disable rules you don't find interesting.

Here is an explanation of the metrics.

Csaba_H
+3  A: 

My experience is that the most important metrics when looking at code maintainability are:

  • Cyclomatic Complexity, to identify large chunks of code that are probably hard to understand/modify.
  • Nesting depth, to find similar spots (high nesting depth automatically implies high CC, but not necessarily the other way around, so it is important to score on both).
  • Fan in/out, to get a better view of the relationships between methods/classes and the actual importance of individual methods.
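The asymmetry between the first two metrics can be sketched with a hypothetical pair of methods: both have cyclomatic complexity 5, but very different nesting depth, and it is the deeply nested one that is hard to modify.

```java
// Hypothetical illustration: equal cyclomatic complexity (5 in both
// methods), very different nesting depth.
public class NestingVsComplexity {

    // CC = 5 (four cases + default), nesting depth = 1: easy to scan.
    static String flatClassify(int code) {
        switch (code) {
            case 1: return "created";
            case 2: return "running";
            case 3: return "stopped";
            case 4: return "failed";
            default: return "unknown";
        }
    }

    // Also CC = 5, but nesting depth = 4: each branch depends on every
    // condition above it, which is what makes changes risky.
    static String nestedClassify(boolean a, boolean b, boolean c, boolean d) {
        if (a) {
            if (b) {
                if (c) {
                    if (d) {
                        return "all";
                    }
                    return "abc";
                }
                return "ab";
            }
            return "a";
        }
        return "none";
    }
}
```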

When examining code that was written by others, it is often useful to include dynamic techniques. Simply run common usage scenarios through a profiler/code coverage tool to discover:

  • Code that is actually executed a lot (the profiler is great for this; just ignore the timing info and look at the hit counts instead).
  • Code that is (almost) dead (code coverage is great for finding it), so you don't invest time in refactoring code that is rarely executed anyway.
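The "ignore timings, look at hit counts" idea can be sketched with a hand-rolled invocation counter standing in for a profiler's hit counts; the method names recorded below are hypothetical.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal sketch of hit-count profiling: count invocations per method name
// to see which code paths common usage scenarios actually exercise.
public class HitCounter {
    private static final Map<String, Integer> hits = new ConcurrentHashMap<>();

    static void record(String method) {
        hits.merge(method, 1, Integer::sum);
    }

    static int hitsFor(String method) {
        return hits.getOrDefault(method, 0);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            record("parseRequest");   // hot path: refactor this first
        }
        record("exportLegacyReport"); // nearly dead: low refactoring priority
        System.out.println("parseRequest hits: " + hitsFor("parseRequest"));
    }
}
```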

The usual suspects (any profiler, code coverage tool, or metrics tool) will usually help you gather the data required to make these assessments.

jvdbos
A: 

The tool NDepend, for .NET code, lets you analyze many dimensions of code complexity, including metrics such as cyclomatic complexity, nesting depth, lack of cohesion of methods, and coverage by tests.

It also includes dependency analysis and a language (Code Query Language) dedicated to asking what is complex in your code and to writing rules.

A while back, I wrote an article to summarize several dimensions of code complexity: Fighting Fabricated Complexity

Patrick Smacchia - NDepend dev
Thank you. It looks like a good product. Have you gotten any other vendors or teams to adopt CQL and build it for other languages?
Jim Rush