DaedTech

Stories about Software

Manual Code Review Anti-Patterns

Editorial Note: I originally wrote this post for the SubMain blog.  You can check out the original here, at their site. While you’re there, take a look around at some of the other posts and at their offerings.

Today, I’d like to offer a somewhat lighthearted treatment to a serious topic.  I generally find that this tends to offer catharsis to the frustrated.  And the topic of code review tends to lead to lots of frustration.

When talking about code review, I always make sure to offer a specific distinction.  We can divide code reviews into two mutually exclusive buckets: automated and manual.  At first, this distinction might sound strange.  Most people reading this probably think of code reviews as activities with exclusively human actors.  But I tend to disagree.  Any static analyzer (including the compiler) offers feedback.  And some tools, like CodeIt.Right, specifically regard their suggestions and automated fixes as an automation of the code review process.

I would argue that automated code review should definitely factor into your code review strategy.  It takes the simple things out of the equation and lets the humans involved focus on more complex, nuanced topics.  That said, I want to ignore the idea of automated review for the rest of the post.  Instead, I’ll talk exclusively about manual code reviews and, more specifically, where they tend to get ugly.

You should absolutely do manual code reviews.  Full stop.  But you should also know that they can easily go wrong and devolve into useless or even toxic activities.  To make them effective, you need to exercise vigilance with them.  And, toward that end, I’ll talk about some manual code review anti-patterns.

Read More

Static Analysis to Hide My Ignorance about Global Concerns

Editorial Note: I originally wrote this post for the SubMain blog.  You can check out the original here, at their site.  While you’re there, take a look at CodeIt.Right to help you automate elements of your code reviews.

“You never concatenate strings.  Instead, always use a StringBuilder.”

I feel pretty confident that any C# developer that has ever worked in a group has heard this admonition at least once.  This represents one of those bits of developer wisdom that the world expects you to just memorize.  Over the course of your career, these add up.  And once they do, grizzled veterans engage in a sort of comparative jousting for rank.  The internet encourages them and eggs them on.

“How can you call yourself a senior C# developer and not know how to serialize objects to XML?!”

With two evenly matched veterans swinging language swords at one another, this volley may continue for a while.  Eventually, though, one falters and pecking order is established.
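
For the record, the first bit of wisdom looks something like this in practice.  Consider it a minimal sketch with made-up values, not a prescription from any particular style guide.

    using System;
    using System.Text;

    string[] lines = { "alpha", "beta", "gamma" };

    // Repeated concatenation creates a brand new string on every pass.
    string concatenated = "";
    foreach (string line in lines)
    {
        concatenated += line;
    }

    // A StringBuilder appends into an internal buffer and produces one string at the end.
    var builder = new StringBuilder();
    foreach (string line in lines)
    {
        builder.Append(line);
    }
    string built = builder.ToString();

    Console.WriteLine(concatenated == built); // True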

Static Analyzers to the Rescue

I must confess.  I tend to do horribly at this sort of thing.  Despite having relatively good memory retention in theory, I have a critical Achilles’ heel in this regard.  Specifically, I can only retain information that interests me.  And building up a massive arsenal of programming language “how-could-yous” for dueling purposes just doesn’t interest me.  It doesn’t solve any problem that I have.

And, really, why should it?  Early in my career, I figured out the joy of static analyzers in pretty short order.  Just as the ubiquity of search engines means I don’t need to memorize algorithms, the presence of static analyzers saves me from cognitively carrying around giant checklists of programming sins to avoid.  I rejoiced in this discovery.  Suddenly, I could solve interesting problems and trust the equivalent of programmer spell check to take care of the boring stuff.

Oh, don’t get me wrong.  After the analyzers slapped me, I internalized the lessons.  But I never bothered to go out of my way to do so.  I learned only in response to an actual, immediate problem.  “I don’t like seeing warnings, so let me figure out the issue and subsequently avoid it.”

Read More

CodeIt.Right Rules, Explained Part 3

Editorial Note: I originally wrote this post for the SubMain blog.  You can check out the original here, at their site.  While you’re there, take a look at CodeIt.Right and see how you can automate checks for and enforcement of these rules.

In what has become a series of posts, I have been explaining some CodeIt.Right rules in depth.  As with the last post in the series, I’ll start off by citing two rules that I, personally, follow when it comes to static code analysis.

  • Never implement a suggested fix without knowing what makes it a fix.
  • Never ignore a suggested fix without understanding what makes it a fix.

It may seem as though I’m playing rhetorical games here.  After all, I could simply say, “learn the reasoning behind all suggested fixes.”  But I want to underscore the decision you face when confronted with static analysis feedback.  In all cases, you must actively choose to ignore the feedback or address it.  And for both options, you need to understand the logic behind the suggestion.

In that spirit, I’m going to offer up explanations for three more CodeIt.Right rules today.

Read More

If You Automate Your Tests, Automate Your Code Review

Editorial Note: I originally wrote this post for the SubMain blog.  You can check out the original here, at their site.  While you’re there, have a look at CodeIt.Right.

For years, I can remember fighting the good fight for unit testing.  When I started that fight, I understood a simple premise.  We, as programmers, automate things.  So, why not automate testing?

Of all things, a grad school course in software engineering introduced me to the concept back in 2005.  It hooked me immediately, and I began applying the lessons to my work at the time.  A few years and a new job later, I came to a group that had not yet discovered the wonders of automated testing.  No worries, I figured, I can introduce the concept!

Except, it turns out that people stuck in their ways kind of like those ways.  Imagine my surprise to discover that people turned up their noses at the practice.  Over the course of time, I learned to plead my case, both in technical and business terms.  But it often felt like wading upstream against a fast-moving current.

Years later, I have fought that fight over and over again.  In fact, I’ve produced training materials, courses, videos, blog posts, and books on the subject.  I’ve brought people around to see the benefits and then subsequently realize those benefits following adoption.  This has brought me satisfaction.

But I don’t do this in a vacuum.  The industry as a whole has followed the same trajectory, using the same logic.  I count myself just another advocate among a chorus of voices.  And so our profession has generally come to accept unit testing as a vital tool.

Widespread Acceptance of Automated Regression Tests

In fact, I might go so far as to call acceptance and adoption quite widespread.  This figure only increases if you include shops that totally mean to and will definitely get around to it like sometime in the next six months or something.  In other words, if you count both shops that have adopted the practice and shops that feel as though they should, acceptance figures certainly span a plurality.

Major enterprises bring me in to help them teach their developers to do it.  Still other companies consult and ask questions about it.  Just about everyone wants to understand how to realize the unit testing value proposition of higher quality, more stability, and fewer bugs.

This takes a simple form.  We talk about unit testing and other forms of testing, and sometimes this may blur the lines.  But let’s get specific here.  A holistic testing strategy includes tests at a variety of granularities.  These comprise what some call “the test pyramid.”  Unit tests address individual components (e.g. classes), while service tests drive at the way the components of your application work together.  GUI tests, the least granular of all, exercise the whole thing.

Taken together, these comprise your regression test suite.  It stands guard against the category of bugs known as “regressions,” or defects where something that used to work stops working.  For a parallel example in the “real world,” think of the warning lights on your car’s dashboard.  The “low battery” light comes on because the battery, which used to work, has stopped working.
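
To make the most granular layer concrete, here is roughly what a unit test looks like.  The Calculator class and the xUnit framework are stand-ins of my own choosing; the point is simply a small, automated check against a single component.

    using Xunit;

    public class Calculator
    {
        public int Add(int x, int y) => x + y;
    }

    public class CalculatorTests
    {
        [Fact]
        public void Add_Returns_Sum_Of_Operands()
        {
            var calculator = new Calculator();

            int result = calculator.Add(2, 3);

            // If a later change breaks Add, this test fails and flags the regression.
            Assert.Equal(5, result);
        }
    }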

Read More

CodeIt.Right Rules Explained, Part 2

Editorial Note: I originally wrote this post for the SubMain blog.  You can check out the original here, at their site.  While you’re there, have a look at CodeIt.Right to help with automated code review.

A little while back, I started a post series explaining some of the CodeIt.Right rules.  I led into the post with a narrative, which I won’t retell.  But I will reiterate the two rules that I follow when it comes to static analysis tooling.

  • Never implement a suggested fix without knowing what makes it a fix.
  • Never ignore a suggested fix without understanding what makes it a fix.

Because I follow these two rules, I find myself researching every fix suggested to me by my tooling.  And, since I’ve gone to the trouble of doing so, I’ll save you that same trouble by explaining some of those rules.  Specifically, I’ll examine three more CodeIt.Right rules today and explain the rationale behind them.

Mark assemblies CLSCompliant

If you develop in .NET, you’ve no doubt run across this particular warning at some point in your career.  Before we get into the details, let’s stop and define the acronyms.  “CLS” stands for “Common Language Specification,” so the warning informs you that you need to mark your assemblies “Common Language Specification Compliant” (or non-compliant, if applicable).

Okay, but what does that mean?  Well, you can easily forget that many programming languages besides your language of choice target the .NET runtime.  CLS compliance indicates that any language targeting the runtime can use your assembly.  You can write language-specific code, incompatible with other framework languages.  CLS compliance means you haven’t.

Want an example?  Let’s say that you write C# code and that you decide to get cute.  You have a class with a “DoStuff” method, and you want to add a slight variation on it.  Because the new method adds improved functionality, you decide to call it “DOSTUFF” in all caps to indicate its awesomeness.  No problem, says the C# compiler.

And yet, if you try to do the same thing in Visual Basic, a case-insensitive language, you will encounter a compiler error.  You have written C# code that VB code cannot use.  Thus you have written non-CLS-compliant code.  The CodeIt.Right rule exists to inform you that you have not specified your assembly’s compliance or non-compliance.
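
Sketched out from that description (the class name is mine), the offending code looks something like this:

    // Perfectly legal C#: the two methods differ only by casing.
    // A Visual Basic caller, being case insensitive, cannot tell them apart,
    // so the shape of this class is not CLS compliant.
    public class StuffDoer
    {
        public void DoStuff()
        {
            // original behavior
        }

        public void DOSTUFF()
        {
            // the "awesome" variation
        }
    }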

To fix, go specify.  Ideally, go into the project’s AssemblyInfo.cs file and add the following to call it a day.
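
The attribute lives in the System namespace, so the fix amounts to a single assembly-level declaration:

    using System;

    // Declares that the publicly visible surface of this assembly plays
    // nicely with any CLS-compliant .NET language.
    [assembly: CLSCompliant(true)]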

But you can also specify non-compliance for the assembly to avoid a warning.  Of course, you can do better by marking the assembly compliant on the whole and then hunting down and flagging non-compliant methods with the attribute.
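
That hybrid approach looks something like the sketch below.  The uint overload is my own example of a non-compliant member, since unsigned types do not exist in the Common Language Specification; it opts out individually while the assembly as a whole stays compliant.

    using System;

    [assembly: CLSCompliant(true)]

    public class StuffDoer
    {
        public void DoStuff()
        {
        }

        // uint is not a CLS-compliant type, so flag just this member
        // rather than declaring the entire assembly non-compliant.
        [CLSCompliant(false)]
        public void DoStuff(uint repetitions)
        {
        }
    }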

Specify IFormatProvider

Next up, consider a warning to “specify IFormatProvider.”  When you encounter this for the first time, it might leave you scratching your head.  After all, “IFormatProvider” seems a bit… technician-like.  A more newbie-friendly name for this warning might have been, “you have a localization problem.”

For example, consider a situation in which some external party supplies a date.  Except, they supply the date as a string and you have the task of converting it to a proper DateTime so that you can perform operations on it.  No problem, right?
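
Concretely, the no-problem version of that conversion looks something like this (the date string is just an example value):

    using System;

    string input = "03/02/1995";

    // Parses according to the current thread's culture, whatever that happens to be.
    DateTime parsed = DateTime.Parse(input);

    Console.WriteLine(parsed);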

That should work, provided provincial concerns do not intervene.  For those of you in the US, “03/02/1995” corresponds to March 2nd, 1995.  Of course, should you live in Iraq, that date string would correspond to February 3rd, 1995.  Oops.

Consider a nightmare scenario wherein you write some code with this parsing mechanism.  Based in the US and with most of your customers in the US, this works for years.  Eventually, though, your sales group starts making inroads elsewhere.  Years after the fact, you wind up with a strange bug in code you haven’t touched for years.  Yikes.

By specifying a format provider, you can avoid this scenario.
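
A sketch of the remedied version, assuming you really do intend the month-first format:

    using System;
    using System.Globalization;

    string input = "03/02/1995";

    // An explicit IFormatProvider removes any dependence on the machine's culture settings.
    DateTime parsed = DateTime.Parse(input, CultureInfo.InvariantCulture);

    // Or go a step further and pin down the exact format you expect.
    DateTime exact = DateTime.ParseExact(input, "MM/dd/yyyy", CultureInfo.InvariantCulture);

    Console.WriteLine(parsed == exact); // True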

Nested types should not be visible

Unlike the previous rule, this one’s name suffices for description.  If you declare a type within another type (say a class within a class), you should not make the nested type visible outside of the outer type.  So, the following code triggers the warning.
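
Here the type names are mine, but any publicly visible nested type fits the bill:

    public class Order
    {
        // A publicly visible type nested inside another public type
        // is what trips the warning.
        public class LineItem
        {
            public string Description { get; set; }
            public decimal Price { get; set; }
        }
    }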

To understand the issue here, consider the object oriented principle of encapsulation.  In short, hiding implementation details from outsiders gives you more freedom to vary those details later, at your discretion.  This thinking drives the rote instinct for OOP programmers to declare private fields and expose them via public accessors/mutators/properties.

To some degree, the same reasoning applies here.  If you declare a class or struct inside of another one, then presumably only the containing type needs the nested one.  In that case, why make it public?  On the other hand, if another type does, in fact, need the nested one, why scope it within a parent type and not just the same namespace?

You may have some reason for doing this — something specific to your code and your implementation.  But understand that this is weird, and will tend to create awkward, hard-to-discover code.  For this reason, your static analysis tool flags your code.
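
Either escape hatch from the reasoning above resolves the warning; sketched with the same hypothetical types:

    // Option 1: keep the type nested, but treat it as an implementation detail.
    public class Order
    {
        private class LineItem
        {
            public string Description { get; set; }
            public decimal Price { get; set; }
        }
    }

    // Option 2: if other types genuinely need it, promote it to the namespace level.
    public class LineItem
    {
        public string Description { get; set; }
        public decimal Price { get; set; }
    }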

Until Next Time

As I said last time, you can extract a ton of value from understanding code analysis rules.  This goes beyond just understanding your tooling and accepted best practice.  Specifically, it gets you in the habit of researching and understanding your code and applications at a deep, philosophical level.

In this post alone, we’ve discussed language interoperability, geographic maintenance concerns, and object oriented design.  You can, all too easily, dismiss analysis rules as perfectionism.  They aren’t; they have very real, very important applications.

Stay tuned for more posts in this series, aimed at helping you understand your tooling.