At a conference, I was listening to a talk by a “famous” speaker. I was rather unimpressed - he was switching topics in a confusing way, and the talk was also a bit boring. But it was OK(ish).

But then he based one of his arguments on the Chaos Report. Now, you know, I am really skeptical when someone uses the Chaos Report - especially data from before 2015 - to back up their arguments, especially when they talk about agile software development.

Let me tell you why. And I also want to tell you why the data from 2015 (and hopefully onward) might represent modern software development better - but still may have some problems.

Chaos Report

The Chaos Report measures whether projects delivered the original scope on time and within the original budget. In most years, they report very large failure rates: around 60-70% of all projects are either “challenged” or “failed”.

The (pre-2015) Chaos Report defined “success” as “delivered the original scope on time and within budget”.

There are a lot of things one could criticize about this report. For example, it excludes a large number of organizations and project types by design. The statistics are also flawed: organizations self-select to participate in the survey, and no sample size is reported.

But my biggest problem with this report is how they measured “challenged” projects - at least up until 2014. Challenged projects are projects that did deliver, but were late, over budget, or missing parts of the original scope [1].

A measure of success where Flickr is “challenged” (it did not deliver the original scope, which was an online game) and Microsoft Bob might be a success (I don’t know whether it was delivered on time, but it might have been) is just not useful [2].

The Problem With "On Time"

When we talk about “on time” in this context, we probably mean an estimated delivery date that was based on the original scope. So this excludes the “fixed deadline, deliver whatever is ready” scenario and also the “we deliver when it is ready” scenario. Neither makes sense for what the Chaos Report measures: “original scope delivered on time and within budget”.

So “on time” depends on a delivery date derived from an estimate (i.e. a guess!) of the original scope. And that estimate has to be made very early in the project, when we have little actual data and experience. This means that either the estimate is heavily padded, or the project will not be on time.

And “on time” also depends on how well we know the project scope at the beginning. To give a meaningful estimate at the very beginning of a (longer-running) project, we need to define the requirements in detail. Some high-level goals will probably not be enough to estimate an end date. But defining detailed requirements early has some real disadvantages compared to working with high-level goals (for example, we lose a lot of flexibility).

The Problem With "In Budget"

Here we have almost the same problem as with “on time” above: with a stable team, the budget of a software development project is just the elapsed time multiplied by some factor (the team's cost per unit of time). So, with a stable team, “in budget” and “on time” are basically the same.
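
A quick back-of-the-envelope calculation illustrates this. All numbers are invented, and the linear cost model is of course a simplification (it ignores licenses, hardware, and so on):

```python
# With a stable team, budget is a linear function of time.
# All numbers here are invented, purely for illustration.

TEAM_SIZE = 5            # developers, constant over the whole project
MONTHLY_RATE = 10_000    # cost per developer per month (hypothetical)

def budget(months: float) -> float:
    """Total cost of a stable team working for `months`."""
    return TEAM_SIZE * MONTHLY_RATE * months

planned = budget(12)     # the original 12-month estimate
actual = budget(14.4)    # the same team, 20% late

print(f"planned: {planned:,.0f}, actual: {actual:,.0f}")
print(f"budget overrun: {actual / planned - 1:.0%}")  # exactly 20%
```

A 20% schedule overrun translates directly into a 20% budget overrun - you cannot miss one criterion without missing the other.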

But, you might think, I could add some people to the project team, so that we deliver on time but slightly over budget… Well, this often backfires.

“Adding manpower to a late software project makes it later” - Fred Brooks wrote this in 1975. Yet, even now, in 2015, many managers and teams grossly underestimate the impact of new team members. In the end, the project will not only be over budget (as expected), but also late.
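
One reason, also from Brooks: the number of pairwise communication paths in a team grows quadratically with team size, as n(n-1)/2. A small sketch:

```python
# Brooks points out that a team of n people has n * (n - 1) / 2
# pairwise communication paths. Adding people late adds this
# quadratic communication overhead on top of the time spent
# onboarding the newcomers.

def communication_paths(n: int) -> int:
    return n * (n - 1) // 2

for size in (3, 5, 8, 12):
    print(f"team of {size:2d}: {communication_paths(size):2d} paths")
# team of  3:  3 paths
# team of  5: 10 paths
# team of  8: 28 paths
# team of 12: 66 paths
```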

The Problem With "Original Scope"

Around 27% of the original requirements will change within a year. So, if you try to deliver the original scope in a long-running project, you will deliver outdated software in the end.
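
To see how quickly this adds up, here is a rough sketch - assuming, purely for illustration, that the churn applies at the same rate year after year:

```python
# Rough illustration: if ~27% of the remaining original
# requirements change per year (an assumption - real churn is
# unlikely to be this uniform), the share of the original scope
# that is still valid shrinks quickly.

ANNUAL_CHURN = 0.27

for years in range(1, 5):
    still_valid = (1 - ANNUAL_CHURN) ** years
    print(f"after {years} year(s): {still_valid:.0%} of the original scope is still valid")
# after 1 year: 73%, after 2: 53%, after 3: 39%, after 4: 28%
```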

But what if your project is quite short and your requirements are really stable? Then this measure would not be so bad, right?

Probably… But you still lose a lot of flexibility when you define the requirements in minute detail at the beginning of the project. So you reduce your chance to deliver on time and in budget. And you lose a lot of time in the beginning defining the requirements - time that you could use to deliver working software and get real feedback from real users!

To get that flexibility and that early feedback, you want to start with only some high-level goals. But can you then really measure whether you delivered the original scope?

A New Measure of Success

Starting with the 2015 report, the Standish Group added three more factors to their success criteria: strategic corporate goal, value delivered, and satisfaction [3] [4]. The reason why they did this:

“However, we have seen many projects that have met the Triple Constraints and did not return value to the organization or the users and executive sponsor were unsatisfied.” - Jim Johnson

Now, this is a big step in the right direction. But I think the reasoning is backward. My problem with the Chaos Report is not about projects that were counted as a success but did not return value. Sure, those projects are a big problem. But they don’t explain the skewed numbers in the Chaos Report.

I see a bigger problem with projects that failed in the first three categories, but still provided value and satisfaction for their users and achieved their strategic goal. Those must be counted as successes!

I hope the Standish Group weights the factors so that when the new three are met, the first three become unimportant. Then, maybe, we’ll get interesting numbers from the Chaos Report in the future. But for now, I remain skeptical.

Conclusion

All the above problems boil down to this: it does not matter whether you deliver the original scope on time and in budget. When people are using your software and get real value from it, it is a success, even if it was late or did not deliver the original scope. On the other hand, “But we delivered on time and on budget - the requirements were just wrong” is no excuse when nobody uses your software. It is a failure.

The only thing that matters in the end is: Does the software you created deliver more value than it cost? Did your team and your company maximize the return on investment?
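
If you want to put a rough number on that, the standard return-on-investment formula is enough. The figures below are invented, purely for illustration:

```python
# Standard ROI formula: (value delivered - cost) / cost.
# The figures are invented, purely for illustration.

def roi(value_delivered: float, cost: float) -> float:
    return (value_delivered - cost) / cost

cost = 780_000               # say, 30% over the original budget...
value_delivered = 2_000_000  # ...but heavily used, delivering real value

print(f"ROI: {roi(value_delivered, cost):.0%}")  # ~156% - a success
```

By this measure, a late, over-budget project can still be a clear success - and an on-time, on-budget project that nobody uses is a clear failure.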

The new measure of success captures this reality better, but it still contains “on time”, “on target”, and “within budget”, and most of the problems I wrote about still apply. It is a step in the right direction, but I will continue to be skeptical when someone uses the Chaos Report to back up their arguments about agile software development.

[1] Actually, the Standish Group acknowledges this and warns people not to lump “challenged” and “failed” together (Interview: Jim Johnson of the Standish Group). Still, “challenged” sounds like there is a problem to be solved - and that is simply not always true: if you work in a truly agile way, your project automatically ends up in the “challenged” area!

[2] See The Non-Existent Software Crisis: Debunking the Chaos Report

[3] Success Redefined

[4] Standish Group 2015 Chaos Report - Q&A with Jennifer Lynch

Thanks to Glen Alleman for his feedback and input after reading a draft of this article on my newsletter.
