What is the most amount of money you have ever spent on implementing a single piece of software? And what is the most you would be prepared to spend?

In this article we’ll take a look at how an eye-wateringly expensive software project can be worth every penny – or be the costliest mistake you’ll ever make.

The real cost of software

Purchasing and developing software can cost a great deal of money.

Still, any seasoned IT Manager will tell you that the real cost of software can’t be found on an application’s price tag or licence invoice. The big expenses come into play when you look at the time and money required to plan, implement and integrate the new software, particularly in a large organisation with complex operational processes.

Cost vs value

We can never discuss cost without also discussing value. In the software world, value comes from what the tools enable you to achieve faster, better or cheaper. The more value a system brings to your business over time, the more likely you are to consider a large up-front investment. However, value over time is never guaranteed – and there are plenty of things that can go wrong in the meantime.
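To make that trade-off concrete, here is a minimal break-even sketch. All the figures and the function itself are illustrative assumptions, not data from any real project:

```python
# Hypothetical payback calculation: every number here is an illustrative
# assumption, not a figure from any real implementation.
def payback_period(upfront_cost, annual_value, annual_running_cost):
    """Return the number of whole years until cumulative net value covers
    the up-front investment, or None if it never does."""
    net_per_year = annual_value - annual_running_cost
    if net_per_year <= 0:
        return None  # running costs eat the value: the system never pays off
    years = 0
    recovered = 0
    while recovered < upfront_cost:
        recovered += net_per_year
        years += 1
    return years

# Example: a 500k implementation delivering 150k/year in value
# with 30k/year in running costs breaks even after 5 years.
print(payback_period(500_000, 150_000, 30_000))
```

Even a back-of-the-envelope calculation like this forces the question the horror stories below all skipped: how many years of realistic net value does this price tag assume?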

Software horror stories

We regularly hear horror stories about software development projects running away with budgets. Companies get sucked into never-ending projects that don’t deliver the promised goods, or end up incurring such high implementation and training costs that the actual business value is diminished.

These are a few examples of failure and overspend in software projects:

  • Taurus
    Back in the 1980s the London Stock Exchange tried to implement an electronic share settlement system named Taurus. However, as functional requirements kept changing, the project overran every revised budget. It is estimated that the London Stock Exchange lost £75M and additional stakeholders lost in the region of £400M, but the full final cost was not disclosed and the project was never completed.
  • Bolit
    In 1997 Sweden’s Patent and Registration Office attempted to implement a customer service and finance administration system called Bolit. It ended up running $35M over budget while remaining too complicated and poorly functioning. The project was eventually scrapped, leaving the agency without a working IT system.
  • NHS Connecting for Health
    In 2002, the NHS commissioned a system for electronic care records called NHS Connecting for Health. Costs gradually ballooned without the system delivering the expected functionality, and the project was later described by MPs as “one of the worst and most expensive contracting fiascos ever”. Officials have stated the final cost to be close to £20bn. The project was officially disbanded in 2011.
  • Canada Central Government website
    In 2013 Canada’s Central Government launched a project to consolidate 1,500 websites into a single portal on one platform. However, more than three years later, only 10,000 web pages of a total of 17 million had been successfully migrated. The anticipated cost of $9.4M quickly grew by an additional $28M – and the costs are still mounting.

Lessons to learn

What lessons can we learn from these cases, to prevent us from making the same mistakes? Well, we need to be aware of the key influences impacting software spend:

  • The fallacy of sunk costs
    Sometimes cancelling an implementation because it is proving too expensive can be the hardest decision to make. There will always be fingers pointing at the amount already invested – and there is a temptation to try to make the most of that investment no matter what happens. However, the IT Director needs to take a pragmatic view of where the company’s funds are best spent for long-term profitability.
  • Scope creep
    Sometimes the initial budget gets gradually pushed up, as new features are added throughout the process. This scenario is often referred to as “scope creep”, and it is what happened with Taurus and many other similar implementations. This can be mitigated by running Agile, dynamic software projects where continuous feedback from the client is encouraged.
  • Technical issues
    Other implementations run into trouble due to integration issues between large systems. In the case of the Canadian website project, the technical problems may or may not have been fully predictable, but there was most likely a lack of pre-emptive analysis before the project launched.
  • Obsolete systems
    In some cases, a product can be deemed obsolete before it is even fully delivered – as new developments simply overtake the technology. This happened with the BBC’s Digital Media Initiative, which was judged to be obsolete as cheaper off-the-shelf options came into existence before the project was eventually scrapped.
    The progress of technology cannot always be predicted, but development teams should stay on top of emerging IT in order to recommend the best solutions for any project.