I have spent a week heavy on the refactoring side. It’s been a while since I spent so much time on a “virgin” codebase, growing it test first. And I want to share with you, guys and girls, a piece of refactoring that ended up being pretty cool.
Triangulate, always triangulate
I was happily developing one class at a time. But since the first class I felt a tingle. Like those lines drawn around Spider-Man’s head. The tingle became an itch when, while developing a second class, I found myself doing pretty much the same thing. And the itch became a rash when, for the third time, I did the same.
The thing in question was performing one of two actions depending on the presence or absence of a value in a pair of variables.
But everyone knows that you do not scratch when you have a rash. Right? RIGHT!? One does not go around blindly hurting oneself. In the case of code, one does not embark on refactoring quests without a set of tests that make you confident that things behave as they did before and that no new defects have been introduced. Right? RIGHT!?
In my case, I was going test first, so I had a battery of tests for the classes that provoked the rash.
The moral of the story is: one should put up with a tingle, could put up with an itch, and must act on a rash. Bringing out “the big guns” for a tingle does not make sense and can be counter-productive.
Primitives are evil, value objects FTW
In reality, they are not evil. They are very useful indeed, but their main feature (their generality) is usually their biggest disadvantage.
There are millions of examples where primitive types are used out of sheer laziness when, in reality, a custom value object would communicate the concept way better.
For example: can one really perform every string operation on a name? Does that number represent meters or inches? Probabilities bigger than one? These and other aberrations can (and do) happen when a concept is represented with a primitive type (a string, an integer or a floating-point number).
In the case that occupies us, the two operations have a value object as their subject, although those value objects (which happen to be nullable) are created out of Nullable<T> primitive types.
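To make it concrete, here is a rough sketch of the starting point (the value-object names A and B come from the journey recap at the end; the primitive types and member names are my own invention):

```csharp
// Value objects wrapping the primitives (a reconstruction; names and types are assumptions).
public class A
{
    public A(int value) { Value = value; }
    public int Value { get; private set; }
}

public class B
{
    public B(decimal value) { Value = value; }
    public decimal Value { get; private set; }
}

// The rash: the same conditional dance, repeated across three different classes.
public class OneOfTheThreeClasses
{
    public void Handle(int? a, decimal? b)
    {
        if (a.HasValue)
            FirstAction(new A(a.Value));
        else if (b.HasValue)
            SecondAction(new B(b.Value));
    }

    private void FirstAction(A a) { /* first of the two actions */ }
    private void SecondAction(B b) { /* second of the two actions */ }
}
```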
The Choice
This Choice value object has multiple but cohesive responsibilities (sketched below):
- ensure that only one of the possibilities is active (not two, not zero)
- build the value object out of the primitive that carries a value and nullify the other
- perform one out of two actions
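A minimal sketch of that abstraction, reusing the hypothetical A and B from above and the name AorB from the journey recap (the constructor shape is an assumption of mine):

```csharp
using System;

public class AorB
{
    private readonly A _a;
    private readonly B _b;

    public AorB(int? a, decimal? b)
    {
        // only one of the possibilities may be active: not two, not zero
        if (a.HasValue == b.HasValue)
            throw new ArgumentException("Exactly one of the two values must be present.");

        // build the value object out of the primitive that has a value; the other stays null
        if (a.HasValue)
            _a = new A(a.Value);
        else
            _b = new B(b.Value);
    }

    // perform one out of two actions, depending on which value object was built
    public void Do(Action<A> actionOnA, Action<B> actionOnB)
    {
        if (_a != null)
            actionOnA(_a);
        else
            actionOnB(_b);
    }
}
```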
He chose… poorly
Or we could, instead.
One abstraction introduced, one abstraction that can be tested in a fraction of the time it takes to deploy the thing, fire up the complete application, exercise the feature, see it burn because you suck at boolean logic, attach the debugger,… you get the picture, right? RIGHT!?
Let’s check that the correct function is executed according to the presence of the value, and that the other function is not executed.
One can walk the anonymous-method path, but one will find it noisy with all those curly braces:
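Something along these lines (an NUnit-flavoured sketch; the fixture and test names are assumptions):

```csharp
using NUnit.Framework;

[TestFixture]
public class AorBTests
{
    [Test]
    public void executes_the_action_for_A_when_A_carries_the_value()
    {
        var subject = new AorB(42, null);
        var aExecuted = false;
        var bExecuted = false;

        subject.Do(
            delegate(A a) { aExecuted = true; },
            delegate(B b) { bExecuted = true; });

        Assert.IsTrue(aExecuted);
        Assert.IsFalse(bExecuted);
    }
}
```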
One can take a step to remove the noise by using named methods as the functions, but we can see duplication coming:
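Roughly (same hypothetical fixture):

```csharp
private bool executed;

[Test]
public void executes_the_action_for_A_when_A_carries_the_value()
{
    var subject = new AorB(42, null);

    subject.Do(executedA, notExecutedB);

    Assert.IsTrue(executed);
}

private void executedA(A a) { executed = true; }

private void notExecutedB(B b)
{
    Assert.Fail("the action for B should not have been executed");
}

// ...and the mirror test for B needs executedB and notExecutedA: duplication ahead.
```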
Duplication? Couldn’t that type of duplication be solved by using generics? I heard they are not only for collections.
But… wasn’t this about noise? Are you kidding me? Anonymous delegate syntax in 2012? Yeah, but if I showed you the goodies from the beginning, where would the fun be?
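A single generic helper replaces the per-type not-executed methods, but written with anonymous delegates the noise is back (again, a sketch under the same assumptions):

```csharp
// one generic helper instead of notExecutedA / notExecutedB
private void notExecuted<T>(T subject)
{
    Assert.Fail("this action should not have been executed");
}

[Test]
public void executes_the_action_for_B_when_B_carries_the_value()
{
    var subject = new AorB(null, 3.14m);
    var bExecuted = false;

    subject.Do(
        delegate(A a) { notExecuted(a); },
        delegate(B b) { bExecuted = true; });

    Assert.IsTrue(bExecuted);
}
```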
And here we are: no duplication for the notExecuted() function and syntax as clean as a whistle: subject.Do(notExecuted, executed).
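Thanks to type inference on method group conversions, both generic helpers can be passed directly, with no delegates and no braces; a sketch:

```csharp
private bool actionExecuted;

// type inference picks T from the Action<A> / Action<B> parameters of Do
private void executed<T>(T subject)
{
    actionExecuted = true;
}

private void notExecuted<T>(T subject)
{
    Assert.Fail("this action should not have been executed");
}

[Test]
public void executes_the_action_for_B_when_B_carries_the_value()
{
    var subject = new AorB(null, 3.14m);

    subject.Do(notExecuted, executed);

    Assert.IsTrue(actionExecuted);
}
```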
It may not seem like much, but do I have to remind you that refactoring is an art of evolution without revolution?
The journey
- tingly code
- itchy code
- rash-y code
- rash-y code with tests
- introduce value objects A and B
- new abstraction AorB
- test abstraction
- use generics and type inference to test simple abstraction
- be amazed
- replace rash-y code with tested abstraction
- green lights everywhere
- party!