Thursday, September 10, 2009

Rapid releases and rethinking software engineering

I have a new post up at blog@CACM, "Frequent releases change software engineering", on why software companies should consider deploying software much more frequently than they usually do.

Here is an excerpt, the last two paragraphs, as a teaser:
Frequent releases are desirable because of the changes they force in software engineering. They discourage risky, expensive, large projects. They encourage experimentation, innovation, and rapid iteration. They reduce the cost of failure while also minimizing the risk of failure. This is a better way to build software.

The constraints on software deployment have changed. Our old assumptions on the cost, consistency, and speed of software deployments no longer hold. It is time to rethink how we do software engineering.
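The cheap, reversible deployments the excerpt assumes are often implemented with percentage rollouts behind feature flags. As an illustrative sketch only (the function and feature names are hypothetical, not anything from the CACM article), a user can be hashed to a stable bucket so a new code path can be dialed up gradually, release by release:

```python
import hashlib

def rollout_bucket(user_id: str, feature: str) -> float:
    """Hash user and feature to a stable value in [0, 1)."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    return int(digest[:8], 16) / 2**32

def is_enabled(user_id: str, feature: str, percent: float) -> bool:
    """True for roughly `percent` percent of users, stable per user."""
    return rollout_bucket(user_id, feature) < percent / 100.0

# Ship the new code path dark, then raise `percent` with each release;
# rolling back becomes a configuration change, not a redeploy.
```

Because the assignment is deterministic, each user sees a consistent experience across visits while the rollout percentage changes.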
This CACM article expands on some of the discussion in an earlier post on this blog, "The culture at Netflix", and in the comments to that post. By the way, if you have not yet seen Reed Hastings' slides on the culture at Netflix, they are worth a look.

7 comments:

Alex Hawkinson said...

We're definitely aligned with that model. We use a modified scrum/agile process and push updates to our staging environment usually twice per week, and to production at least once every two weeks. That gives us the opportunity to improve rapidly and to incorporate new feedback from customers, partners, and the industry, which helps keep us relevant in solving real-world problems.

This also forces a discipline of demonstrable daily progress. It keeps us from resting, and from falling into the kind of out-of-sync process across parts of the development team that I've seen eat away at productivity in past projects and companies.

To address a common criticism: this doesn't mean there are no major pieces that take a long time to build. It just means that, if you challenge yourself, you can usually evolve toward them in bite-sized chunks.

Iamreddave said...

How about something like antivirus software, where in-the-field tweaks and responses to current threats are released several times a day, while changes to the engine stay on a more traditional timetable? Could this be the future of software releases?

jeremy said...

Frequent releases force changes in how an organization develops software.

Yes.

These changes ultimately reduce risk, speed development, and improve the product.

Maybe.

I would say that it depends on the type of software you are developing, and who you are developing it for. Certainly what you say is true for some software -- online bookstores, search engines, and so on. But is it true for all software?

Take games, for example. Back when I had more free time than I do now, I used to play a lot more games. And there were two types of games that I played -- console (Xbox, PlayStation, etc.) and computer (PC, Mac).

And I will say that, without question, the games that were less "risky" and overall much better crafted were the slow, careful console games, not the "rapid release" PC games.

Why did console games become the better games? Because it was impossible to update them. Once you burned that master disc, that was the software that shipped, and that was the software users played. It had to work the very first time. It had to be a finished product. It had to be fun and playable. You got no second chance.

With PC games, on the other hand, the user is connected to the internet, which means patches, fixes, and improvements can always be made. In fact, I remember a point in 2000-2001 when it became the de facto behavior of every PC game to connect to the internet and download the latest update during installation, and sometimes even after you'd played it a few times.

And what I noticed starting to happen is that this gave the PC software developers a psychological license to ship unfinished products. PC games started (imo) going down in quality, because (imo) the attitude that "oh, we can just fix it after we ship" had unintended consequences. Those consequences were to allow formation of the attitude that game software developers didn't have to get it right the first time. What started out as a nice, reasonable way of being able to patch games after the fact ("rapid releases") soon turned into a crutch for not coming up with a great product to begin with.

I began to notice a marked difference between PC and console games, and console games began edging ahead.

So maybe what you say is true for certain types of software development (bookstores, etc.). But it certainly isn't true for all software. Indeed, I would say to game developers who have fallen into this "rapid release" mode of thinking that it is time to rethink rapid release, and to get it right the first time.

Greg Linden said...

Games are an interesting example, Jeremy. Like movies and books, games often involve a bold creative vision in their storyline, one that has to be delivered complete to be maximally effective.

And it also is a good point that games seem to be released buggy these days. I am not happy about that either.

There are two spots where I disagree though. First, with games, we do see many games offering extensions of the game in later releases, sometimes even community-generated extensions, once the developers learn more about how people enjoy the game. I think this is a form of iterated development post release that is appropriate for creative entertainment.

Second, with software in general, I think you have an assumption in your last line -- "get it right the first time" -- that most organizations are capable of running multi-year projects starting from an initial vision and getting a high quality product in the end. The track record says otherwise; most large software projects fail. Limited information and daunting complexity make big projects risky.

jeremy said...

There are two spots where I disagree though. First, with games, we do see many games offering extensions of the game in later releases, sometimes even community-generated extensions, once the developers learn more about how people enjoy the game.

But extensions don't go back and fundamentally change the mechanics and interactions of the initial game. They just add more maps/levels/weapons/spells etc. They're more like sequels (to a book, movie, etc.) than they are iterations on the original game.

I think this is a form of iterated development post release that is appropriate for creative entertainment.

Is it iterated development... on the original software? Or is it more like learning the "Hollywood formula", and producing more content that follows a similar, successful pattern?

Take World of Warcraft, for example. It has this whole notion of experience points and leveling. People love that; people love working toward that type of goal. Expansions to the game give people more powerful items and the ability to advance to even higher maximum levels -- level 80 rather than level 60 (or whatever; I don't actually play WoW). But they don't fundamentally change the game itself, or how its basic mechanics work. It's not as if massive, data-driven experimentation is going on. Blizzard isn't trying to figure out whether they should make the new maximum level 100 (A) or 110 (B) based on user interactions. They already know that users like leveling up. Expansions just give users more of the same.
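(For readers unfamiliar with the A/B mechanics being invoked here: the usual approach is to hash each user into one experiment arm deterministically, then compare outcome metrics per arm. This is a generic illustrative sketch, not anything Blizzard actually does; the function and experiment names are hypothetical.)

```python
import hashlib

def ab_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to one arm of an experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest[:8], 16) % len(variants)]

# Hypothetically: arm "A" caps the level at 100, arm "B" at 110, and the
# developer compares retention or playtime between the two populations.
```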

Second, with software in general, I think you have an assumption in your last line -- "get it right the first time" -- that most organizations are capable of running multi-year projects starting from an initial vision and getting a high quality product in the end.

So the solution is instead to force users to be A/B game testers, so that more companies can enter the marketplace? That just doesn't fly with me. If a company is incapable of delivering a complete, bold product, then that company shouldn't be producing a game. It's a disservice to the consumers to expect each user to be a beta tester.

All I'm saying here is that not all software is created (pun not intended) equally. When deciding whether the A/B, data-driven, fast-iteration methodology is appropriate, you have to consider the end user's perspective and how the software will be used -- not just the capabilities of the company.

The rapid release approach to software engineering is not a panacea, right? It's good for many things, just not all things. That's all I'm saying.

Greg Linden said...

Absolutely, I think you are right that rapid releases are not a panacea. But, primarily because software updates can now be distributed easily and quickly, I think it is worth asking whether more software projects would benefit from organizing around rapid releases.

I think where we might still disagree is the implication that releasing software more frequently exposes people to more bugs and worse bugs. I don't see the evidence for that.

In fact, I'm claiming that the bugs are worse and the quality of software lower under what is the widespread norm in our industry: working on a big software project for a couple of years, flailing badly at project management, blowing deadlines, and then finally shoveling crapware out the door more out of frustration than out of completeness.

The complexity of large software projects is bewildering. Managing that complexity is an expensive effort that, the data shows, typically fails.

What I am arguing is that rapid releases reduce the complexity of software projects. They make it easier and cheaper for a company to produce something useful to customers.

jeremy said...

I think where we might still disagree is the implication that releasing software more frequently exposes people to more bugs and worse bugs. I don't see the evidence for that.

I was actually trying to make a different point. As I wrote above:

And what I noticed starting to happen is that this gave the PC software developers a psychological license to ship unfinished products. PC games started (imo) going down in quality, because (imo) the attitude that "oh, we can just fix it after we ship" had unintended consequences. Those consequences were to allow formation of the attitude that game software developers didn't have to get it right the first time. What started out as a nice, reasonable way of being able to patch games after the fact ("rapid releases") soon turned into a crutch for not coming up with a great product to begin with.

I was probably not very clear, but what I meant by "getting it right" wasn't about bugs; it was about user experience and overall product quality. The snappiness of the gameplay. The difficulty of the game (e.g., the number of monsters or health potions that appear -- too many or too few makes for a game that is either too hard or too easy, and no fun either way). The relative balance between various units. Things like that. That's what I meant by "unfinished" products.

If you know you can fix those things in post, you'll psychologically allow yourself to ship something of lower quality in the first iteration. Whereas if you know you won't be able to fix it after the fact, you'll get it right by the time you ship.

I do get the feeling that knowing you can iterate has led to lower-quality products (again, in experience, not bugs) shipping. I don't have any hard and fast evidence I can point you to, though; I'll keep my eyes peeled and report back if I find anything. It's just the impression I'm left with, having used countless pieces of software over the years.