DaedTech

Stories about Software

Are You Aggressively Trying to Automate Code Review?

Editorial note: I originally wrote this post for the SubMain blog.  You can check out the original here, at their site.  While you’re there, have a look at their automated analysis and documentation tooling.

Before I talk in detail about trying to automate code review, do a mental exercise.  Close your eyes and picture the epitome of a soul-crushing code review.

You probably sit in a stuffy conference room with several other people.  With your IDE open and laptop plugged into the projector via VGA cable, you begin.  Or, rather, they begin.  After all, your shop does code review more like thesis defense than collaboration.  So the other participants commence grilling you about your code as if it oozed your incompetence from every line.

This likely goes on for hours.  No nit remains unpicked, however trivial.  You’ve even taken to keeping a spreadsheet full of things to check ahead of code reviews so as not to make the same mistake twice.  That spreadsheet now has hundreds of lines.  And some of those lines directly contradict one another.

When the criticism-a-thon ends, you feel tired, depressed, and hungry.  But, looking on the bright side, you realize that this is the furthest you’ll ever be from the next code review.

It probably sounds like I speak from experience because I do.  I’ve seen this play out in software development shops and even written a blog post about it in the past.  But let’s look past the depressing human element of this and understand how it proves bad for business.
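Before reading on, it’s worth making “automate code review” concrete.  That spreadsheet of recurring nitpicks is exactly the kind of thing a machine should check instead of a human.  Here’s a minimal sketch in Python of the idea; the rules and the length threshold are hypothetical stand-ins for whatever your team’s list actually says, and a real shop would reach for an off-the-shelf linter or static analyzer instead.

# Minimal sketch: encode recurring review nitpicks as an automated check.
# The rules below are hypothetical examples, not anyone's real standard.
import sys
from pathlib import Path

MAX_LINE_LENGTH = 120  # assumed team convention

def check_file(path: Path) -> list:
    """Return (line_number, complaint) tuples for one source file."""
    complaints = []
    for number, line in enumerate(path.read_text().splitlines(), start=1):
        if len(line) > MAX_LINE_LENGTH:
            complaints.append((number, f"line exceeds {MAX_LINE_LENGTH} characters"))
        if line != line.rstrip():
            complaints.append((number, "trailing whitespace"))
        if "TODO" in line:
            complaints.append((number, "unresolved TODO"))
    return complaints

if __name__ == "__main__":
    failed = False
    for file_name in sys.argv[1:]:
        for number, complaint in check_file(Path(file_name)):
            print(f"{file_name}:{number}: {complaint}")
            failed = True
    sys.exit(1 if failed else 0)

Wire something like that into a pre-commit hook or a CI step, and the soul-crushing meeting loses its ammunition for the trivial stuff.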

Read More

Fixing Your Snarled Dependency Graph

Editorial note: I originally wrote this post for the NDepend blog.  You can check out the original here, at their site.  While you’re there, have a look at NDepend’s features for helping you visualize your codebase.

I’ve written before about making use of NDepend’s dependency graph.  Well, indirectly, anyway.  In that post, I talked about the phenomenon of actual software architecture not matching the pretty diagrams people draw in Visio.  It reminds me of Helmuth von Moltke’s wisdom that no battle plan survives contact with the enemy.

Typically, architects conceive of wondrous, clean, and decoupled systems.  Then they immortalize this pristine architecture in Visio.  Naturally, printouts go up on the wall, and everyone knows what the system should look like.  But somehow, it never actually winds up looking like that.

Architectures of Despair

I think we all know what it winds up looking like.  Or, at least, what it can look like.  Sometimes the actual architecture only misses the mark by a little, around the edges.  But other times, it goes sailing off in the wrong direction, like a disastrous misfire at the archery range.

When this happens, we have metaphors for the result.  Work in the industry long enough, and you’ll hold your nose and describe a codebase as a big ball of mud.  You might also hear descriptors involving tangled Christmas tree lights or spaghetti code.  Maybe you’ll hear about a bramble bush or something.

The specific image varies, but the properties do not.  All of them describe something snarled, difficult to separate, and unpleasant to work with.  They indicate complexity without intent or purpose.  And when that happens, deadlines slip and defects proliferate.  Oh, and the people working in the codebase become miserable, now regarding those Visio diagrams as cruel jokes.

All of this stems from a core problem: a tangled dependency graph.
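To make “tangled” concrete, the defining symptom is a cycle: module A depends on B, which depends, perhaps indirectly, back on A.  Here is a minimal sketch in Python of hunting for such cycles with a depth-first search; the module names and edges are hypothetical, for illustration only.

# Minimal sketch: find a cycle in a module dependency graph via DFS.
# Module names and edges below are hypothetical.
from typing import Dict, List

# Each key depends on the modules in its list.
dependencies: Dict[str, List[str]] = {
    "Web": ["Services"],
    "Services": ["Domain", "Persistence"],
    "Persistence": ["Domain"],
    "Domain": ["Services"],  # the snarl: Domain reaches back up into Services
}

def find_cycle(graph: Dict[str, List[str]]) -> List[str]:
    """Return one dependency cycle as a list of modules, or [] if none."""
    visiting: set = set()
    visited: set = set()

    def dfs(node: str, path: List[str]) -> List[str]:
        visiting.add(node)
        path.append(node)
        for dep in graph.get(node, []):
            if dep in visiting:  # back edge: we have looped around
                return path[path.index(dep):] + [dep]
            if dep not in visited:
                cycle = dfs(dep, path)
                if cycle:
                    return cycle
        visiting.discard(node)
        visited.add(node)
        path.pop()
        return []

    for node in graph:
        if node not in visited:
            cycle = dfs(node, [])
            if cycle:
                return cycle
    return []

print(find_cycle(dependencies))  # e.g. ['Services', 'Domain', 'Services']

Tools like NDepend run this kind of analysis (and far more) against a real codebase, but even the toy version shows the problem: once a cycle exists, no module in it can be understood, tested, or extracted in isolation.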

Read More

Deploying Guerrilla Tactics to Combat Stupid Tech Interviews

I’ve realized something about my situation.  I work for myself, building businesses and still consulting from time to time.  But of course that’s not news to me.  Nor is the fact that I’ve moved out to a quiet, remote place where I wear T-shirts exclusively, fish a lot, work when I feel like it from a room in my house, and often cook dinners over a fire in my backyard.  The realization came from marinating in that lifestyle for a while and then noticing that I have absolutely no reason to pull any punches with my opinions.  No affiliations, no politics, no optics to manage.  So why not have some fun expressing those opinions, provocative or not, as DaedTech posts?

Today, I’d like to take on the subject of tech interviews.  Of course, talking about the deeply flawed hiring process isn’t new for this blog.  But I’m going to take it a step further by suggesting how we, as individuals, can try to fight back against Big Tech Interview.

The seed for this came from an idle internet clicking sequence that brought me to a blog.  The company to which the blog belongs, Byte by Byte, offers the motto, “your one stop shop for acing your coding interview.”  Below that, it says, “master the coding interview game” (emphasis mine).  It struck me then.  Yes, of course.  It really, truly is a game, and a stupid one at that.  But let me come back to the cottage industry of Princeton Review for tech companies later.

The History of the Job Interview

For this history, I’ll offer an excerpt from my book, Developer Hegemony, describing the history of the job interview in general.

In 1921, tired of hiring college graduates that didn’t know as much as he did, Thomas Edison made up a giant trivia questionnaire to administer to inbound applicants. According to Mental Floss, questions included “Who invented logarithms?” and “Why is cast iron called Pig Iron?” If you look at the sorts of questions that modern day tech companies seem to think they’re cute for asking, courtesy of cio.com, they include such profundities as “Why is the Earth round?” and “How much do you charge to wash every window in Seattle?” If you mixed Edison’s and tech companies’ questions together, you’d be hard pressed to tell the difference.

To summarize, almost 100 years ago, an aging, eccentric, and incredibly brilliant inventor decided one day that he didn’t like hiring kids that weren’t his equals in knowledge. He devised a scheme off the cuff to indulge his preference and we’re still doing that exact thing about a century later. But was it at least effective in Edison’s day? Evidently not. According to the Albert Einstein archives, Albert Einstein would not have made the cut. So the biggest, trendiest, most forward thinking tech companies are using a scheme that was dreamed up on a whim and was dead on arrival in terms of effectiveness.

But surely it’s evolved somehow. Right? Well, no, at least not in any meaningful way. In this piece from Business Insider about the “evolution” of the job interview, we can see that what’s actually changed is the media for asking dumb trivia questions. In Edison’s day, interviewers had to get cute face to face. Now they can do it over the phone, through a computer screen or even via a mobile app. Who knows what the future will hold for the job interview; they may be able to beam the stupid directly into your cerebral cortex!

Google Looks Critically at Tech Interviews

In the book, I cover a lot more ground than I can or will here.  I lay out a case for how uniquely pernicious this interview process is for tech.  It artificially depresses software developers’ wages and manufactures job scarcity in a market where demand for our labor is absolutely incredible.  But let’s seize on a different point for this particular post.

I have specific styles of modern tech interviews in my sights as worse than others.  Specifically, the whiteboard interview, the trivia/brain-teaser interview, and the “Knuth Fanatic,” algorithm-obsessed interview.  These serve mainly to make the interviewer feel smart, rather than to reveal anything about candidates.  But don’t take it from me.  Laszlo Bock, former head of Google HR, said this:

On the hiring side, we found that brainteasers are a complete waste of time. How many golf balls can you fit into an airplane? How many gas stations in Manhattan? A complete waste of time. They don’t predict anything. They serve primarily to make the interviewer feel smart.

And also this:

Years ago, we did a study to determine whether anyone at Google is particularly good at hiring. We looked at tens of thousands of interviews, and everyone who had done the interviews and what they scored the candidate, and how that person ultimately performed in their job. We found zero relationship. It’s a complete random mess.

Read More

The Decline of the Enterprise Architect

Happy reader question Monday, everybody!  Unlike last week (exigent circumstances) and the week before (US holiday), I can actually bring you one on Monday.  I’m trying hard not to pull a muscle patting myself on the back.

Anyway, today’s reader question has to do with the enterprise architect position.  Specifically, what do I think of it?

It’d be great to hear your thoughts on [the enterprise architect], I’m sure there’s more than an article’s worth on that subject.

Simple enough.  So let’s talk enterprise architect.

Prior Art on Enterprise Architects and Regular Architects

I have, in fact, talked about the architect position before.  It’s hard not to when you’ve blogged for as long as I have and as much as I have.  The architect role is almost as ubiquitous as the software developer role.

I once talked about the architect role as a pension plan for developers.  The pyramid-shaped corporation creates a stigma that staying “just” a programmer means failure in a career development context.  So even as organizations (mercifully) move away from the “software as construction” metaphor, the concept of architect persists.  It persists because it gives companies an organizationally meaningless way to let someone be “more” than “just a developer.”

I’ve also made posts about the needless division between reasoning about software at a holistic versus granular level and about moving beyond this distinction and the construction metaphor.  These probably weren’t quite as provocative, and they didn’t dive into the toxic role of the pyramid-shaped corporation in knowledge work.

But it appears I’ve never specifically talked about the enterprise architect.  Well, let’s do that now.

The Impressive Enterprise Architect

What is an enterprise architect, anyway?  Well, presumably, it’s someone who trades in enterprise architecture, which is like architecture, but more enterprise-y.  Let’s ask Wikipedia.

Enterprise architecture (EA) is “a well-defined practice for conducting enterprise analysis, design, planning, and implementation, using a comprehensive approach at all times, for the successful development and execution of strategy. Enterprise architecture applies architecture principles and practices to guide organizations through the business, information, process, and technology changes necessary to execute their strategies. These practices utilize the various aspects of an enterprise to identify, motivate, and achieve these changes.”

Wow.  Pasting that into the WYSIWYG made my Flesch Reading Ease score plummet by 20% as I typed it.  That alone probably makes it enterprise-y.  Plus, it plugs in “utilize” as a synonym for “use,” so you know it’s official.

If we unpack — ugh, who am I kidding?  There’s no unpacking that rhetorical peacock.  Enterprise architecture is, truly, using a comprehensive approach to practice conducting analysis, design, planning, and implementation to develop and execute architectural patterns and practices that guide organizations through changes related to business, information, process, and technology, utilizing various aspects of the enterprise to identify, motivate, and achieve said changes.

Crap, there goes another 15% off of my readability.  Now only people with tattoos of Gantt charts can read this post.

So what is enterprise architecture, and what, then, does the enterprise architect do?  Well, whatever else they do, they apparently seek to make sure no one ever asks them what they do again.

Read More

Addressing Malware Detection from the Outside In

Editorial note: I originally wrote this post for the Monitis blog.  You can check out the original here, at their site.  While you’re there, have a look around at the different types of production monitoring that they offer.

I worked as a software engineer for almost the entire decade of the 2000s.  While I was earning a living this way, computers were making their way from CS student dorm rooms to Grandma’s den.  Like so many other programmers of the time, I thus acquired the role of unofficial tech support for computer-illiterate friends and family.  I do not miss those days in which malware detection became an involuntary hobby.

Back in 2005, everyone had computers with Windows XP or Windows 98.  And every computer with Windows XP or Windows 98 seemed to attract malware like flies to flypaper.  So I found myself sitting in front of CRT monitors next to dusty towers, figuring out why nothing worked.

I still kind of remember the playbook.  For instance, Lavasoft had a product called Ad-Aware.  I also seem to recall something called Malwarebytes.  I favored these because I could download them for free.  At least, I could download them for free, assuming the victim’s computer was even capable of doing that much.  With those tools in place, and with a heaping helping of googling from my own laptop, I would painstakingly scan, sweep, remove, tweak, and repeat.  Eventually, I won.  Usually.

It seems strange to think about now.  Ten or twelve years ago, consumers compared brands of antivirus software the way we compare music apps on our phones.  Malware detection dominated our computing consciousness, even for casual users.  But today?  I can’t remember the last time I ran an antivirus scan on my laptops.  I suspect you can’t either.  So what happened to all of this malware?  Did it simply disappear?

The Silencing of Malware

Well, no.  Malware didn’t disappear.  The criminals and spammers of the world didn’t just one day decide to do something better with their lives.  In fact, you might argue that they became more effective.

Most of the pieces of malware from my younger days had lots of bark and little bite.  They’d install themselves on your computer and hijack your browser with obnoxious graphics or spew error messages until your machine crashed.  Some came from would-be vandals, while others tried unsuccessfully to do things sneakily.

But causing some computer neophyte to say “this doesn’t seem right” and call up a young me to fix things — well, it hardly constitutes successful sneaking.  It always seemed, in those days, that malware authors sought mainly to annoy.  And malware detection and removal sought to fix the inconvenience.

In more recent years, however, the consumer annoyance factor has mostly disappeared.  Why?  Because there’s no profit in it.  Today’s malware instead aims to help its authors and users make money.  It does this by quietly gathering data, sending out spam, gaming search engines, and stealing information.  And it does all of this under the radar.

Read More