Doing the wrong thing for the right reason

I’ve written before about how we can’t afford to be religious about our science.  I’m seeing that situation in a new light based on some experiences while working on our Next Big Thing here in Ionix Storage Resource Management.

We’ve staffed this team with people who have worked on successful, shipping products.  Many of them “grew up” in a world where certain things (like, say, continuous integration, or test-driven development) were unheard of.  There are inevitable growing pains as we push for certain behaviors and results.  Sometimes there’s a failure in the software pipeline and the answer from the responsible party is “This wouldn’t have happened if we had XYZ in place,” where XYZ is a practice from some product they used to work on.

As a technical lead, it’s my responsibility to try to educate people and raise the quality of everyone’s work.  I’m supposed to push for the best practices we all need to understand and employ.  But I also need to make sure a product goes out the door.  So when I’m in a closed-door meeting with senior management and they ask, “Do you think we should put XYZ in place?” I have to stop and ponder.

The answer, in my heart, is no.  Take away the crutch and let the team fail a few times.  People learn from failure.  In a few months they will adapt and be better for it (or they will have gotten fed up and left).

The answer, in my brain, is yes.  I know that without this crutch, they will be less productive for a few months.  And I know where we’re supposed to be in a few months, and I’m not sure we can get there without the crutch.

I tell management both of these answers.  They ask me what we should do.

And this is why our jobs are hard.

(I usually end up leaning towards the practical side here, and hope that we can educate in parallel.  But I’m always left wondering if I should be more extremist….)