A heavy-weight champion

Some weeks (well, months) ago I received a free license for NDepend and was kindly asked to use it and write about it.
Well, I have used it to some extent, so I can write something about it.

Second Impressions

I have known the tool since it was a beta and I have been a fan of its original author (Patrick Smacchia) since the old CodeBetter days.

I even evaluated the tool back when it was a beta and I actually had the time to toy around with beta products. My first impression was one of being overwhelmed. The tool had a lot of features and measured a lot of things that I did not actually understand: cyclomatic complexity, afferent coupling,...
Not that I fully understand them now, but the tool (and Patrick's blog entries) served me as a gateway to some advanced design concepts that have come in handy throughout the years.

Fast-forward some years and the tool is definitely waaay more polished from the UI perspective, but it still offers waaay too much for me to easily grasp. That will not deter me from trying again, though.

The UI

The tool comes in several flavors:

  • a stand-alone application that is zip-deployable
  • a console application to be run when automation is required
  • a Visual Studio Add-On for getting feedback when authoring code and when tight integration is wanted

I have only been able to try the stand-alone application, but I guess the VS version should be just as good-looking (although a bit too professional, if there is such a thing).
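
For reference, wiring the console flavor into a build should look roughly like this, going by the documentation (I have not run it myself): the runner is NDepend.Console.exe and it takes the full path to the project file as its first argument (the path here is made up).

    NDepend.Console.exe "C:\projects\NMoneys\NMoneys.ndproj"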

I had pointed out to Patrick that, when I first tried the tool on my old coding laptop, I had issues with the pop-ups getting in the way; but now that I have spent more time with it, I have not had much trouble (and I am not fond of overlays anyway, so they have been disabled).

Word of advice: have plenty of screen real estate for the tool.
I usually use my tools in split screen on my 27" monitor, but with NDepend I had to make an exception in order to visualize the huge amount of information it provides.

Quickstarting

The documentation is pretty good at getting you up and running and producing your first results.

However, as happens with most (if not all) code analysis tools, the results on a medium-sized project are totally overwhelming. Be prepared to be greeted by a lot of warning lights and even red flags. Do not freak out, although I myself had a hard time not to.

Oh, and if you are committed to the tool and ready to check in the .ndproj file, get ready to push almost half a megabyte of XML. And that is without any customization. 😮

Being ruled

To be completely honest, I have only scratched the surface of what the tool can offer: I ran the report on two of my OSS projects, SharpRomans and NMoneys, and focused on the results from the out-of-the-box rules.

NDepend has an edge over the other static analysis tools I know of, in the sense that all rules can be tweaked and customized by editing their queries in CQLinq (a custom query language focused on analyzing code), with syntax highlighting and auto-completion.
Very, very cool, but I have not had the time (nor the energy) to get to that point.
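
To give an idea, a rule is just a LINQ-looking query over the code model; here is a minimal sketch along the lines of the rules that ship with the tool (metric and threshold picked for illustration, I have not written my own yet):

    // <Name>Methods too complex (sketch)</Name>
    warnif count > 0
    from m in Application.Methods
    where m.CyclomaticComplexity > 20
    select new { m, m.CyclomaticComplexity }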

Code analysis tools tend to be picky, warning you about the minor thing that might lead you to solve a major problem you were not aware of; or, most of the time, pointing at nothing and being easily dismissable.
The challenge is committing enough time to ponder whether the hint makes any sense in the first place, then whether it applies to your scenario, and then whether it even makes sense to do something about it. This is nothing against NDepend specifically, but applies to all static analysis tools I know of: they need a skilled human to intervene on the feedback.

Some wins

The tool does a good job of suggesting easy fixes and I strongly believe that, after acting upon some of the warnings, the design of my projects is slightly better.

I fixed potential issues with member visibility and immutability, turned some reference types into value types, and applied some other "minor" fixes. The tool does a wonderful job of telling you the exact point where the fix applies, the cause of the warning, and meaningful potential solutions.
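
As an example of the kind of change involved (a made-up sample, not the actual SharpRomans/NMoneys code):

    // before (mutable field, visible everywhere):
    //   public class Figure { public int Value; }

    // after: narrower visibility and immutable state,
    // the kind of change the rules nudged me towards
    internal sealed class Figure
    {
        public Figure(int value) { Value = value; }

        public int Value { get; private set; }
    }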

Some dead ends

Of course, not everything is rainbows, unicorns and well-invested free time...

  • There was a case in which a warning said that method overrides should call the base method (base.Method()), but I could not find an instance the warning applied to in the flagged code (the pattern the rule targets is sketched after this list).

  • Also, a rule suggested nesting a class into a parent type, but since that class contained extension methods, doing so would lead to compilation errors: extension methods must be declared in a non-nested static class (sketched after this list).

  • A rule suggested turning a class into a struct, but the class had custom logic in its default (parameterless) constructor, which would be close to impossible to keep if it were a value type: C# does not allow structs to declare an explicit parameterless constructor (sketched after this list).

  • There were a lot of boxing warnings, but the majority of those boxing/unboxing issues cannot be fixed, since there is no generic API to use instead. There were a ton of warnings for calls to string.Format() and for the canonical implementation of .Equals() in value types (sketched after this list).

  • This one can be a tough and debatable one (I imagine): a rule suggested not using members marked as Obsolete (great advice), but it failed to grasp that the single usage was wrapped in #pragma directives, meaning that ignoring the warning was a deliberate decision on the developer's part (sketched after this list).

  • Another violation that got me scratching my head was the advice against marker interfaces (interfaces without members), which can be useful. However, in my case it was a "lesser marker" interface: even though it did not have members of its own, it did aggregate two different interfaces which, in my opinion, justifies its existence (sketched after this list).
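
For the overrides rule, the pattern it looks for is something like this (a made-up sample; I could not find it in my actual code):

    public class Repository
    {
        public virtual void Save() { /* persists state */ }
    }

    public class AuditedRepository : Repository
    {
        // flagged: the override never invokes base.Save(),
        // so the base behavior is silently lost
        public override void Save() { /* audits only */ }
    }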
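
As for the nesting suggestion, the compiler itself rules it out when extension methods are involved (hypothetical names):

    public static class StringExtensions
    {
        // fine: extension methods live in a top-level static class
        public static bool IsBlank(this string s) { return string.IsNullOrWhiteSpace(s); }
    }

    public class Outer
    {
        public static class Nested
        {
            // error CS1109: extension methods must be defined in a top-level static class
            // public static bool IsBlank(this string s) { ... }
        }
    }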
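
The class-to-struct suggestion runs into a similar wall; a simplified stand-in for my class:

    public class Percentage
    {
        // custom defaulting logic in the parameterless constructor
        public Percentage() { Value = 100m; }

        public decimal Value { get; private set; }
    }

    // the struct equivalent does not compile:
    // error CS0568: structs cannot contain explicit parameterless constructors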
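
The boxing warnings boil down to APIs that only speak object; an illustrative snippet (not my actual code):

    public struct Money
    {
        public decimal Amount;

        // the canonical override receives an object, so the value-type argument gets boxed
        public override bool Equals(object obj)
        {
            return obj is Money && ((Money)obj).Amount == Amount;
        }

        public override int GetHashCode() { return Amount.GetHashCode(); }
    }

    public static class Program
    {
        public static void Main()
        {
            int amount = 42;
            // string.Format() takes object parameters, so the int gets boxed
            System.Console.WriteLine(string.Format("{0}", amount));
        }
    }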
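
And the Obsolete one: the single usage was explicitly fenced off, which to me reads as a deliberate decision (illustrative):

    using System;

    public static class Api
    {
        [Obsolete("Use NewCall() instead")]
        public static void OldCall() { }

        public static void NewCall() { }

        public static void Caller()
        {
    #pragma warning disable 618 // CS0618: the member is obsolete; silenced on purpose
            OldCall();
    #pragma warning restore 618
        }
    }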
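
Finally, my "lesser marker" interface: no members of its own, but it demands two contracts at once, which is precisely its point (names made up):

    using System;

    // empty on its own, but aggregating both interfaces lets callers
    // require "comparable and formattable" with a single type
    public interface IOrderedFormattable : IComparable, IFormattable { }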

I am pretty confident the majority of these rules could have been tweaked for my projects but, again, that is neither a task for a first-timer nor a trivial one.

The end result: one of my projects is red-flagged, and the things pointed at as show-stoppers (quality gate conditions) are, in my opinion, definitely not red flags in this particular case.

Final words

Do not let my "dead ends" with the analysis rules fool you. It IS a very good tool, but boy isn't it huge!

I have dipped my pinkies in it by checking the results of the rule execution, but you can easily get waist-deep: dependency graphs, dependency matrices, all sorts of heat-maps, code diffs, trend analysis, execution from the build process,...

A powerful monster it is (wrapped in an effective UI), but definitely not one to be left in the hands of unskilled developers.
Nor is it a monster to be left alone to roam your coding lands without human supervision and dedication; left to itself, it will become more of a trouble than a problem revealer.
