"Our goals can only be reached through a vehicle of a plan, in which we must fervently believe, and upon which we must vigorously act. There is no other route to success." - Pablo Picasso
It is an old story: about 30% of IT application projects succeed, roughly 45% are "challenged," and the remaining quarter fail altogether. That has been the consistent finding, year after year, of the Standish Group study of project outcomes. Jorge Dominguez, here, displays a chart of the remarkably similar results since 1994. Not a pretty picture, is it? Some question the validity of the Standish studies, but Scott Ambler tells a parallel story in a recent Dr. Dobb's column called "Lies, Great Lies, and Software Development Project Plans," which itemizes the strategies IT project managers commonly use to "stay out of trouble" when schedule/budget results don't match initial estimates. For example, "18% change the original schedule to reflect the actual results".
The frequent reaction to stats like these is to scapegoat the IT folks by finding fault with their tools, processes, or skills. If we just had a more efficient methodology, a slicker development suite, a more highly skilled team, a better project process, more agility, or whatever, then application projects would come in on time, the resulting systems would be faultless, and we could drive down outrageous IT costs.
Mr. Ambler plays out the IT perspective in this column about Agile project estimation. Here's how I paraphrase the gist of his logic: you can't estimate accurately early on because of the normal uncertainty in high-level requirements, and new requirements will arrive that expand scope. If you offer a range of predicted outcomes, management will hold you to the most optimistic one, which will turn out to be a gross underestimate. Your best strategy is to use an accurate estimation method (net or average velocity, described in the article), and be straight with management even if they "don't like what they are hearing".
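To make the velocity idea concrete, here is a minimal sketch of average-velocity estimation as I understand the general technique; it is my own illustration, not Mr. Ambler's exact formulas, and the function names and sample numbers are hypothetical:

```python
import math

def average_velocity(completed_points_per_iteration):
    """Average story points the team has completed per iteration to date."""
    return sum(completed_points_per_iteration) / len(completed_points_per_iteration)

def iterations_remaining(remaining_points, completed_points_per_iteration):
    """Estimate iterations left: remaining scope divided by average velocity,
    rounded up since a partial iteration still costs a whole one."""
    return math.ceil(remaining_points / average_velocity(completed_points_per_iteration))

# Example: 120 points of scope remain; the last three iterations
# delivered 18, 22, and 20 points.
velocities = [18, 22, 20]
print(average_velocity(velocities))            # 20.0
print(iterations_remaining(120, velocities))   # 6
```

The point of basing the estimate on measured velocity rather than up-front guesses is that each completed iteration feeds real data back into the projection, so the estimate improves as the project proceeds.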
Based on my experience, that sequence of events can be distressingly true to life. But I've been on the other kind of IT project as well: the kind that ends on time, stays on budget, and satisfies the business. The scope of Mr. Ambler's article doesn't include asking why the requirements change so drastically, but if it had, maybe he would have asked questions like these:
- Does the business unit involved have defined objectives and processes?
- Are strategic and tactical business objectives of the project defined, and do they support the business unit's objectives?
- Are the critical business success factors of the project defined?
- Does the new project change business processes, and if so, have the new ones been defined yet?
- Are there documented business requirements and a process for evaluating/approving changes?
Based on what I've seen, requirements shifts that nearly double the cost of an IT project, like the ones Mr. Ambler describes, result from inadequate business definition and business requirements analysis. It is true, of course, that the "initial high-level requirements and architecture envisioning" early in a typical project only enables an estimate "in the +/- 30% range". But effective requirements and architecture definition, when grounded in effective business definition, can enable a much closer estimate. Further, it is possible to estimate accurately how long it will take to produce that more accurate estimate.
In some cases the business definition prerequisites clearly don't yet exist at the outset of an application development effort. One approach I've seen work in such cases is to divide the effort into two separate projects: one to closely define the work and, if needed, make up for any missing business definition; and another to build, test, and install the application. Between the two projects, management has the opportunity to review the costs and benefits and decide whether to continue. This post describes one project that successfully used that strategy.
Please don't read this as a repudiation of Agile methods or as advocacy of "big requirements up front". Agile methods work well whether or not business prerequisites are defined in advance, but they seem not to work well when (1) the project's goals shift as the business definition evolves and (2) the plan and budget aren't adjusted accordingly.
Business definition, completed as part of a discrete requirements phase that leads to a management decision on whether to continue, gives the team the opportunity to build on a solid business foundation. It also gives management a reasonable estimate and a chance to bail out if the project isn't worth it.