What can be alternative metrics to code coverage?

If you are looking for some useful metrics that tell you about the quality (or lack thereof) of your code, you should look into the following metrics:

  1. Cyclomatic Complexity
    • This is a measure of how complex a method is (see the annotated example after this list).
    • Usually 10 and lower is good, 11-25 is poor, higher is terrible.
  2. Nesting Depth
    • This is a measure of how many nested scopes are in a method.
    • Usually 4 and lower is good, 5-8 is poor, higher is terrible.
  3. Relational Cohesion
    • This is a measure of how well related the types in a package or assembly are.
    • Relational cohesion is somewhat of a relative metric, but useful nonetheless.
    • Acceptable levels depend on the formula. Given the following:
      • R: number of relationships in package/assembly
      • N: number of types in package/assembly
      • H: Cohesion of relationship between types
    • Formula: H = (R+1)/N
    • Given the above formula, the acceptable range is 1.5 - 4.0. For example, a package with 8 types and 11 internal relationships gives H = (11 + 1) / 8 = 1.5, at the low end of the range.
  4. Lack of Cohesion of Methods (LCOM)
    • This is a measure of how cohesive a class is.
    • Cohesion of a class is a measure of how many fields each method references.
    • Good indication of whether your class meets the Single Responsibility Principle.
    • Formula: LCOM = 1 - (sum(MF) / (M * F))
      • M: number of methods in class
      • F: number of instance fields in class
      • MF: number of methods in class accessing a particular instance field
      • sum(MF): the sum of MF over all instance fields
    • A class that is totally cohesive will have an LCOM of 0.
    • A class that is completely non-cohesive will have an LCOM of 1.
    • The closer to 0, the more cohesive and maintainable your class is (see the worked sketch after this list).
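
To make the first two metrics concrete, here is a small, hypothetical Java method with its branch points and nesting levels annotated by hand. Exact counting rules vary by tool; many count each if, loop, case, catch, and short-circuit boolean operator as a branch point. The domain types are invented purely for illustration:

```java
import java.util.List;

// Hypothetical domain types, just enough to make the example compile.
record LineItem(int quantity, double price, boolean isRefund) {}
record Order(List<LineItem> items) {}

public class OrderValidator {
    // Branch points: if + for + if + if + && = 5, so complexity = 1 + 5 = 6.
    // Deepest nesting: the ifs inside the for loop, i.e. depth 2.
    public boolean isValid(Order order) {
        if (order == null) {                            // +1, depth 1
            return false;
        }
        for (LineItem item : order.items()) {           // +1, depth 1
            if (item.quantity() <= 0) {                 // +1, depth 2
                return false;
            }
            if (item.price() < 0 && !item.isRefund()) { // +1 if, +1 &&, depth 2
                return false;
            }
        }
        return true;
    }
}
```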
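
And a minimal sketch of the LCOM formula at work, scored by hand. The class and its numbers are made up purely for illustration:

```java
// Hypothetical example: a class with two unrelated responsibilities,
// scored with LCOM = 1 - sum(MF) / (M * F).
public class ReportService {
    private String reportTitle;   // used only by the formatting methods
    private String smtpHost;      // used only by the mailing method

    public String formatHeader() { return "== " + reportTitle + " =="; }
    public String formatFooter() { return "-- end of " + reportTitle + " --"; }
    public void send(String body) {
        System.out.println("sending via " + smtpHost + ": " + body);
    }
}
// M = 3 methods, F = 2 fields.
// reportTitle is accessed by 2 methods, smtpHost by 1, so sum(MF) = 3.
// LCOM = 1 - 3 / (3 * 2) = 0.5 -- a hint that the formatting and
// mailing halves may belong in separate classes.
```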

These are just some of the key metrics that NDepend, a .NET metrics and dependency mapping utility, can provide for you. I recently did a lot of work with code metrics, and these four are the ones we found most useful. NDepend offers several other useful metrics, however, including Efferent & Afferent coupling and Abstractness & Instability, which combined provide a good measure of how maintainable your code will be (and whether or not you're in what NDepend calls the Zone of Pain or the Zone of Uselessness).

Even if you are not working with the .NET platform, I recommend taking a look at the NDepend metrics page. There is a lot of useful information there that you might be able to use to calculate these metrics on whatever platform you develop on.


What about watching the trend of code coverage during your project?

As is the case with many other metrics, a single number does not say very much.

For example, it is hard to tell whether there is a problem if "we have a Checkstyle rules compliance of 78.765432%". If yesterday's compliance was 100%, we are definitely in trouble. If it was 50% yesterday, we are probably doing a good job.

I always get nervous when code coverage has gotten lower and lower over time. There are cases when this is okay, so you cannot switch off your brain when looking at charts and numbers.

BTW, Sonar (http://sonar.codehaus.org/) is a great tool for watching trends.


Bug metrics are also important:

  • Number of bugs coming in
  • Number of bugs resolved

To detect, for instance, whether bugs are being resolved as fast as new ones come in.
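
A minimal sketch of that check, using made-up weekly counts; a backlog delta that stays positive means bugs arrive faster than they are resolved:

```java
// Hypothetical weekly bug counts, invented for illustration.
public class BugTrend {
    public static void main(String[] args) {
        int[] opened   = {12, 15, 14, 18};
        int[] resolved = {11, 10, 12, 13};
        int backlog = 0;
        for (int week = 0; week < opened.length; week++) {
            backlog += opened[week] - resolved[week];
            System.out.printf("week %d: %d opened, %d resolved, backlog %d%n",
                    week + 1, opened[week], resolved[week], backlog);
        }
    }
}
```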


Crap4j is one fairly good metric that I'm aware of...

It's a Java implementation of the Change Risk Analysis and Predictions software metric, which combines cyclomatic complexity and code coverage from automated tests.
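
For reference, a sketch of the commonly published form of the CRAP score, with coverage as a fraction in [0, 1]; the exact constants crap4j applies (such as its default "crappiness" threshold, usually cited as 30) are worth verifying against the tool itself:

```java
// CRAP(m) = comp(m)^2 * (1 - cov(m))^3 + comp(m)
// where comp is the method's cyclomatic complexity and cov its
// automated-test coverage as a fraction in [0, 1].
public class CrapScore {
    static double crap(int complexity, double coverage) {
        return Math.pow(complexity, 2) * Math.pow(1.0 - coverage, 3) + complexity;
    }

    public static void main(String[] args) {
        // A complex, untested method scores badly...
        System.out.println(crap(15, 0.0)); // 240.0
        // ...while full coverage reduces the score to the complexity itself.
        System.out.println(crap(15, 1.0)); // 15.0
    }
}
```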