I have to admit that I've become something of a process geek. Maybe even an anti-process geek, but that sounds negative, and it wouldn't be entirely accurate. The process of software development has changed a great deal, and part of the reason is that we rely far less on shrink-wrapped products and more on the Web. Even your mom is comfortable calling the thing going on in her Web browser an application these days.
Back in the day, you wrote software, put it on some kind of media (floppy disks!), and then put it in a box that was shipped to a store. The Internet was something most people had never heard of, though really fancy people had modems connected to their phone line so they could call another computer. The point is that the stakes were very high for developing software. If you got it wrong, fixing a problem would not only be costly, it could tank your business.
To combat this risk, development was an enormous process that involved a ton of meetings, documents, a rigorous QA process, and worst of all, enough up front design to make any kind of change incredibly costly. It was really hard to be innovative this way, even in the days when computers were far less powerful and there was only so much memory you could fit your code into anyway.
The unfortunate thing is that a lot of organizations still build software this way. It's unfortunate because the rules have changed so dramatically. Most software truly does run on a server somewhere, and your interaction with it happens in a browser. What this means is that you, as a developer, are no longer bound to the high risk world of disks in boxes that go to thousands of people.
The problems with the old way of developing software really boil down to two things. The first is that everything you do is based on assumptions. You assume that it will take a certain number of days to develop some component. You assume that your user wants to do a certain thing. You assume your business model will be readily adopted. You assume that your design is the right one. You know what they say about assuming stuff.
The second problem is that all of this planning is really just guessing. Because you've made so many assumptions, the planning you do hasn't been vetted against the real world. There will be problems you couldn't predict. You haven't received any feedback from users based on a real product. You don't know how the market will respond. You certainly can't know whether your design, be it an architecture or a user interface, is ideal, given the lack of feedback. The only thing you can safely guess is that stuff will change.
And yet, I've watched this unfold time and time again. Development organizations will put months, even years, into designing the crap out of everything before they write a single line of code. I worked in a place where analysts would produce huge documents outlining a use case, a single action for a single feature, which would then be further analyzed by a committee. A developer wouldn't even see it until it was "approved," by which time the agreement the document was supposed to solidify was already tainted, because no one, from the developers to the users, had given any feedback on it. The assumptions had already made it obsolete.
Massive attention to up front design is bad, and here's why. A proponent of up front design will argue that you need to spec things out in detail, in part to build consensus about what you're doing, and in part because you don't want to leave developers and others to interpret things for themselves. Fair enough, but you're still doing all of that design work based on assumptions. It doesn't matter how visionary you are, because your assumptions could still be wrong. If they are, all of the time spent on that design, and the subsequent work to build it out, is wasted, and you're still no closer to delivering value to your customer.
Now let me tell you how I would do it. I would describe the smallest thing in the simplest terms while keeping the big picture in mind, and work with developers to do it. None of that throw-it-over-the-wall nonsense. We'd break it down into tasks that take only a few hours each and prioritize them. We'd act on the items with the highest priority, re-prioritize weekly, and after four weeks, "ship" what we have. Maybe that means getting it in front of customers, maybe it means using it ourselves; whatever the case, the point is we have something real that we can act on. If we're getting it wrong, we can correct our course and move toward the right thing in as little as a few weeks, incorporating feedback, challenging assumptions, and delivering value quickly.
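If you like to think in code, here's a minimal sketch of that loop: a prioritized backlog, a little work pulled off the top each week, a re-prioritization pass when feedback comes in, and a "ship" point after four weeks. Everything in it (the task names, the priorities, the two-tasks-per-week capacity) is an illustrative assumption, not a real tool or the exact process I'd run; in practice a human adjusts the priorities based on what they just learned.

    import heapq

    # (priority, task) pairs; a lower number means more important
    backlog = [(2, "welcome email"), (1, "sign-up form"), (3, "password reset"),
               (1, "login"), (2, "profile page"), (3, "avatar upload")]
    heapq.heapify(backlog)

    TASKS_PER_WEEK = 2  # assumed capacity, since each task is only a few hours

    for week in range(1, 5):  # a four-week cycle
        done = [heapq.heappop(backlog)[1] for _ in range(TASKS_PER_WEEK) if backlog]
        print(f"week {week}: finished {done}")
        # end of week: feedback arrives, so the remaining priorities get adjusted;
        # re-heapifying here is just a stand-in for that human judgment call
        heapq.heapify(backlog)

    print("week 4: ship what we have and go get real feedback")

The data structure is the point: the backlog is just a priority queue, and nothing in it is so precious that a week of new information can't reorder it.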
The proponent of up front design will say that you've just moved the design work to a different stage of the pipeline (and maybe suggest that it's bad to "allow" other stakeholders to affect the design, but that's a different cultural problem). Yes, I did move the design, but what's so great about this is that I didn't invest a lot of time in it. I didn't waste time building consensus or creating documents that no one will read; I just went and built something. The price of getting it wrong is much lower, and the integration of feedback happens much faster. In other words, I'm building stuff that delivers value faster, and when I get it wrong, I can correct quickly. The risk, and therefore the cost, is lower. I've managed out assumptions as a source of risk and cost.
This is still a hard cultural nut to crack. If you're a big agile fan with a great deal of success under your belt, how do you convince successful waterfall process folks that they're doing it wrong? It's not that they're doing it wrong, really; it's that they're doing it slower. When you move too slowly, your competition kicks your ass. Unfortunately, that's the part that goes with "u" and "me."