Monday, December 21, 2009

Why New Systems Don't Work

In response to my previous post Passive Adoption, which suggested that real adoption was always optional, @oscarberg replied that adoption is optional in theory only: if you resist adopting a new ERP system, you will likely lose your job.

Here's how I rescue my suggestion from being disproved by Oscar's counterexample. In many organizations, I agree that individuals can't openly rebel or resist the official adoption of some corporate system such as ERP. However, despite the absence of visible resistance, the organization somehow frustrates the purposes of the ERP system.

How does this happen? Over-simplifying enormously, let's say the business case (top-down purpose) of adopting ERP across the organization is based on the cost-saving from eliminating surplus stock. But local managers like to have surplus stock, because it gives them more flexibility to achieve their targets. So we can observe the local managers diligently complying with the demands of the ERP system, and yet for some mysterious reason the surplus stock doesn't disappear. In other words, people whose departmental interests may be slightly at odds with the overall corporate interest, or who may feel their autonomy challenged by a centralized ERP system, may somehow manage to mislead the ERP system in order to preserve their local surpluses.
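As a toy illustration of this gaming dynamic (the `erp_reorder_target` function and all the numbers here are invented for the sketch, not taken from any real ERP system), a manager need only pad the demand figures fed to the central system to quietly preserve a local buffer:

```python
def erp_reorder_target(reported_demand, safety_factor=1.1):
    """Central ERP sets a stock target from demand figures reported upward."""
    return round(reported_demand * safety_factor)

actual_demand = 100

# What headquarters believes the target should be, given honest reporting
honest_target = erp_reorder_target(actual_demand)

# A local manager pads reported demand by 30% to protect flexibility
padded_report = int(actual_demand * 1.3)
gamed_target = erp_reorder_target(padded_report)

# The surplus the ERP business case promised to eliminate quietly survives,
# even though every transaction in the system looks fully compliant
surplus_preserved = gamed_target - honest_target
print(surplus_preserved)
```

The point of the sketch is that nothing in the system's own data reveals the subversion: every record is internally consistent, and only a whole-system view (comparing reported demand with actual consumption) would expose the gap.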

So in what sense has an organization in this position actually adopted ERP at all? Sure, the software has been installed, but the people in the organization are subverting the operation of the system so that it fails to do what it was designed to do. So this isn't true ERP after all.

There are many other examples of this kind of failure. We have seen many examples of target-based systems that are intended to improve customer service and value-for-money, yet achieve the exact opposite. (Followers of Deming and John Seddon are constantly railing against this kind of folly.) So-called knowledge management systems perennially fail to make any serious inroads into corporate knowledge. And so on, and so on. (Please feel free to add more examples in the comments.)

Oscar's conclusion from this is that whereas the old approach has been to design apps without involving users and then force adoption from the top down, optional software requires even more focus on people in order to be adopted.

I agree with this, but I'd go a lot further. Yes, we should design all software recognizing that its correct adoption is always kind-of optional: however compulsory it may seem, covert subversion is always possible. But it's not just about software design. When we forget to look at the human organization as a joined-up system, using systems thinking lenses, we are likely to design software that is highly vulnerable to this kind of failure.

New systems can work. But we need to understand why new systems and attempted innovations often don't work, why "best practice" in systems design often fails to produce the desired outcomes, and how innovation benefits from a whole-system perspective.


  1. "the people in the organization are subverting..."

    FOUL! It was the freakin' system that showed up -- all so badly designed -- that subverted the 'reality' of the work that was already being done.

  2. I certainly agree with the force of Paula's comment. But the larger problem is that the scope of "design" is wrong. "System" designers often are only given the software to design, and are encouraged to consult "users" (how inappropriate that word is by the way - it would often be more accurate to use the word "victims" instead) on trivial things like visual layout. What is neglected is the design of the whole-system, including the conflicting values and trust issues. (Plug for VPEC-T here.)

  3. Surely "the system that showed up ... all badly designed" failed because it wasn't the whole system. As you say in your reply, there was a scope error.

    I looked at this recently for Privacy-Enhancing Technologies... an area where (from a left-brained perspective) all the elements seem to be available to create a technical solution - but where adoption still seems a very distant goal.

    The conclusion I came to was that the technical elements were doomed to failure unless they formed an effective part of an "eco-system", including privacy-enhancing legislation, governance, processes and culture.

    Under that broad heading of "culture" I would include those steps which might mitigate the risk of stakeholders seeking ways to look as though they are complying with the design aims of the new system while actually pursuing their own goals to its detriment.

    I had the opportunity to use that analogy in Moscow in 1996, and the idea of individuals subverting an externally-imposed plan reduced the audience to helpless giggling.

  4. Richard: indeed, few see that as anything but normal. There's not only a discipline labeled erroneously as engineering but a whole ill-named department to go with it. Information technology is still weak on the data, let alone having graduated to 'inform'.

  5. It's those darn customers that get in the way of our great new systems. We keep having to actually provide them service and then this technology gets in the way.

    Think of the waste on all these new systems. Strategic plans, project plans, milestones, cost-benefit analysis, etc. We wind up having the de facto purpose of a hit date vs. a system that works. Amazing that technology has exceeded our ability to use it intelligently.

    Regards, Tripp Babbitt