Do IT projects ever succeed? Not if you look at the headlines. NHS Connecting for Health, FBI case management, any number of ERP implementations - stories of large IT project problems abound.
Nor if you look at the analyst reports. The best-known analysis of IT project success is published each year by the Standish Group and bears the appropriate title of Chaos reports.
The message: more than 80 per cent of IT projects fail, and the figure has not improved much since 1994.
But look to the vendors and you will get a different story. Every leading system integrator will point to its best-of-breed project management methods and its adherence to international standards for quality management. It will proclaim its record of on-time, on-budget delivery.
Likewise, the smaller integrators will proclaim their personal touch and client focus as the reasons they have a perfect record of successful delivery. All those projects are failing, yet no one ever works on a failed project? What's really going on here?
Well, for a start, most of those people who report high failure rates are selling something. For newspapers, a long list of projects that succeeded last month hardly amounts to compelling copy.
And, worse, the analysts and consultants need failures so that they can sell their failure-avoidance remedies. So they have manipulated the definition of "failure" to inflate the perceived failure rate.
It is very easy to do: you set up a six-month project with a budget of £1 million (€1.45 million) and projected benefits of £2 million.
After a month of detailed design, the team comes back to you and says: "We've found a way to generate £4.5 million of benefits. Unfortunately, the project to do this is going to cost £1.5 million and take nine more months."
They then go away and deliver exactly this. By Standish's definition, this is a failure - 50 per cent over the original budget and hugely over time.
But from your perspective, this project has delivered. It might even have been better to double or even triple your initial investment, depending on how time-critical the end result was and what cash you had available to invest.
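To see how the same project can be counted as both a "failure" and a success, here is a minimal sketch of the arithmetic, in Python, using only the figures from the example above (the ten-month duration is implied by one month of design plus nine more):

    # Figures from the scenario above
    original_budget = 1_000_000      # £1m approved budget
    original_benefit = 2_000_000     # £2m projected benefits
    revised_cost = 1_500_000         # £1.5m actual cost after the redesign
    revised_benefit = 4_500_000      # £4.5m benefits actually delivered
    planned_months, actual_months = 6, 10

    # Standish-style view: overruns against the original plan
    budget_overrun = (revised_cost - original_budget) / original_budget    # 0.50
    schedule_overrun = (actual_months - planned_months) / planned_months   # ~0.67

    # Sponsor's view: net value delivered
    planned_net = original_benefit - original_budget    # £1m
    delivered_net = revised_benefit - revised_cost      # £3m

    print(f"Overrun: {budget_overrun:.0%} on budget, {schedule_overrun:.0%} on schedule")
    print(f"Net value: £{planned_net:,} planned, £{delivered_net:,} delivered")

Measured against the original plan, the project is 50 per cent over budget and two-thirds over schedule; measured as an investment, it delivers three times the net value originally promised.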
Of course, the vendors are selling something too. But consider this: the evidence is mounting in the US and elsewhere that investment in IT is driving up overall economic growth through increased productivity. Someone is getting it right somewhere.
Here are three things to remember when you are caught in the middle of this crossfire of illusory statistics:
• Project failure can be okay. As Charles Dickens wrote in Little Dorrit: "Every failure teaches a man something, if he will learn." If avoiding failure means avoiding learning, then it is not the way to prosper in a knowledge economy.
The way to prosper is to fail fast, fail often, learn and move on. (This is, of course, the antithesis of what some projects do - they cover up their failures, aggregating them into one big disaster at the end. That sort of failure is not okay.)
• Innovation and exploration are processes of controlled failure. According to the UK Offshore Operators' Association, only one in eight exploration wells drilled in the North Sea discovered economic deposits of oil or gas. That is a failure rate comparable with that reported by the Standish Group. Was the North Sea a failure?
• Success comes from avoiding unnecessary failure. It is the avoidable failures that organisations need to be concerned about: the ones where they make the same mistakes again. The unavoidable failures that come from taking calculated risks offer opportunities for learning, not recriminations.
And how do you separate unnecessary failures from calculated risks?
By keeping in touch with reality. Most avoidable failures happen because people create a dream world for themselves. They curtail discussion of what is actually possible. They ignore available information about what is really going on. They redefine terms to suit their preconceptions.
This is what is happening now with Connecting for Health, the IT program for the UK's National Health Service, for example. Different groups have made up their own definitions of success, based on their preconceptions of how the program should be doing.
In among all this noise, it is getting increasingly difficult to unravel just how the program is actually doing. (Probably, like any real program, well in some areas and not so well in others.)
In this shadowy world, any independent view is at a premium. External perspectives help bring reality back into focus, so you can deal with it before it blows up.
And this focus on reality gives you a solid grounding of validated information from which to learn if a failure does affect your project.
IT project failure? It happens. It is what you do with it that counts.