NOW AVAILABLE: The draft of my book on Organizational Intelligence is now available on LeanPub. Please support this development by subscribing and commenting. Thanks.

Wednesday, June 25, 2014

Defensive denial

... often follows what Freud called "kettle logic".

"The problem doesn't exist, and anyway it isn't a problem for us, and anyway we're already dealing with it."

Imagine we ask a manager whether her organization experiences any of the Symptoms of Organizational Stupidity. Suppose she denies it, or says it doesn't matter. Guess what - defensive denial is one of the symptoms on the list.

In 2008, Blockbuster CEO Jim Keyes expressed some bewilderment at his competitor’s success.

"I’ve been frankly confused by this fascination that everybody has with Netflix. ... Netflix doesn’t really have or do anything that we can’t or don’t already do ourselves." 

Denial is a key phase in the hype curve for new ideas (especially but not exclusively technological ones). A concept is rejected as meaningless, dangerous and/or unnecessary, while simultaneously being bundled together with earlier concepts.

"That concept doesn't make sense, and even if it did it wouldn't be technologically feasible, and anyway we already have a perfectly good word for it and lots of people are already doing it so we don't need a new word."

Brian Klapper, When Corporations Cannot Adapt (aka Fear the Kid in the Black t-shirt) (January 2013)

Related Posts: The Dynamics of Hype (Feb 2013)

Sunday, May 18, 2014

National Decision Model

@antlerboy asks: does anyone know the theoretical underpinning of the (rather good) police decision-making model?

The National Decision Model (NDM) is a risk assessment framework, or decision making process, that is used by police forces across the UK. It replaces the former Conflict Management Model. Some sources refer to it as the National Decision-Making Model.

Looking around the Internet, I have found two kinds of description of the model - top-down and bottom-up.

The top-down (abstract) description was published by the Association of Chief Police Officers (ACPO) sometime in 2012, and has been replicated in various items of police training material including a page on the College of Policing website. It is fairly abstract, and provides five different stages that officers can follow when making any type of decision - not just conflict management.

Some early responses from within the police regarded the NDM as an idealized model, only weakly connected to the practical reality of decision-making on the ground. See for example The NDM and decision making – what’s the reality? (Inspector Juliet Bravo, April 2012).

In contrast, the bottom-up (context-specific) description emerges when serving police officers discuss using the NDM. According to Mr Google, this discussion tends to focus on one decision in particular - to Taser or not to Taser.

"For me the Taser is a very important link in the National Decision Making Model. It bridges that gap between the baton and the normal firearm that has an almost certain risk of death when used." (The Peel Blog, July 2012). See also Use of Force - Decision Making (Police Geek, July 2012).

ACPO itself adopts this context-specific perspective in its Questions and Answers on Taser (February 2012, updated July 2013), where it is stated that Taser may be deployed and used as one of a number of tactical options only after application of the National Decision Model (NDM).

Of course, the fact that Taser-related decisions have a high Google ranking doesn't imply that these decisions represent the most active use of the National Decision Model. The most we can infer from this is that these are the decisions that police and others are most interested in discussing.

(Argyris and Schön introduced the distinction between Espoused Theory and Theory-In-Use. Perhaps we need a third category to refer to what people imagine to be the central or canonical examples of the theory. We might call it Theory-in-View or Theory-in-Gaze.)

In a conflict situation, a police officer often has to decide how much force to use. The officer needs to have a range of tools at his disposal and the ability to select the appropriate tool - in policing, this is known as a use-of-force continuum. More generally, it is an application of the principle of Requisite Variety.

In a particular conflict situation, the police can only use the tools they have at their disposal. The decision to use a Taser can only be taken if the police have the Taser and the training to use it properly; in that case, the operational decision must follow the NDM.

More strategic decisions operate on a longer timescale - whether to equip police with certain equipment and training, what rules of engagement to apply, and so on. A truly abstract decision-making model would provide guidance for these strategic decisions as well as the operational decisions.

And that's exactly what the top-down description of NDM asserts. "It can be applied to spontaneous incidents or planned operations by an individual or team of people, and to both operational and non-operational situations."

Senior police officers have described the use of the NDM for non-conflict situations. For example, Adrian Lee (Chief Constable of Northants) gave a presentation on the Implications of NDM for Roads Policing (January 2012).

The NDM has also been adapted for use in other organizations. For example, Paul Macfarlane (ex Strathclyde Police) has used the NDM to produce a model aimed at Business Continuity and Risk Management, which he calls Defensive Decision-Making.

How does the NDM relate to other decision-making models? According to Adrian Lee's presentation, the NDM is based on three earlier models:

  • The Conflict Management Model (CMM). For a discussion from 2011, see Police Oracle.
  • The SARA model (Scanning, Analysis, Response, Assessment) - which appears to be similar to the OODA loop.
  • Something called the PLANE model. (I tried Googling this, and I just got lots of Lego kits. If anyone has a link, please send.)

There is considerable discussion in the USA about the relevance of the OODA loop to policing, and this again focuses on conflict management situations (the "Active Shooter"). There are two important differences between combat (the canonical use of OODA) and conflict management. Firstly, the preferred outcome is not to kill the offender but to disarm him (either physically or psychologically). This means that you sometimes need to give the offender time to calm down and orient himself towards making the right decision. So it's not just about having a faster OODA loop than the other guy (although clearly some American cops think this is important). Secondly, there is a lot of talk about situation awareness and anticipation. For example, Dr. Mike Asken, a State Police psychologist, has developed a model called AAADA (Anticipating, Alerting, Assessing, Deciding and Acting). There is also a Cognitive OODA model I need to look into.

However, I interpret @antlerboy's request for theoretical underpinning as not just a historical question (what theories of decision-making were the creators of NDM consciously following) but a methodological question (what theories of decision-making would be relevant to NDM and any other decision models). But this post is already long enough, and the sun is shining outside, so I shall return to this topic another day.


Michael J. Asken, From OODA to AAADA ― A cycle for surviving violent police encounters (Dec 2010)

Adrian Lee, Implications of NDM for Roads Policing (January 2012).

Erik P. Blasch et al, User Information Fusion Decision-Making Analysis with the C-OODA Model (Jan 2011)

National Decision Model (ACPO, 2012?)

National Decision Model (College of Policing, 2013)

SARA model (Center for Problem-Oriented Policing)

Updated 19 May 2014

Saturday, April 26, 2014

On the true nature of knowledge

@pickover suggests that these two books, in theory, contain the sum total of all human knowledge. "The Joy of Logic", he remarks (via @DavidFCox).

"What they teach you at Harvard Business School" + "What they don't teach you at Harvard Business School"

Why is this wrong? Because knowledge doesn't follow the laws of elementary arithmetic. Adding two lots of knowledge together doesn't give you twice as much knowledge. (Does anyone really think that teaching children creationism as well as evolution will double their education?)

Knowledge is like light. When you add two light beams together, you may sometimes get more light. But you may also get puzzling patches of darkness. This is called interference. In high-school physics we learn that this is because light is a wave. If the two waves are out of phase, they cancel each other out.
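The interference claim can be checked numerically. Here is a minimal sketch (my illustration, not from the physics texts cited): two identical waves half a cycle out of phase sum to nothing, while in-phase waves reinforce each other.

```python
import numpy as np

t = np.linspace(0.0, 2.0 * np.pi, 1000)

in_phase = np.sin(t) + np.sin(t)              # crest meets crest: amplitude doubles
out_of_phase = np.sin(t) + np.sin(t + np.pi)  # crest meets trough: cancellation

print(np.abs(in_phase).max())      # ~2.0
print(np.abs(out_of_phase).max())  # ~0.0 (up to floating-point error)
```

So "adding more light" can genuinely produce darkness, just as the post claims adding a second book can subtract from the first.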

(Curiously, uncertainty is also like light. When you add two pieces of uncertainty together, you may get less uncertainty. This is called hedging. Works best when the uncertainty is out of phase.)
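The hedging remark can be made concrete in the same way. A minimal sketch (illustrative only): combine two strongly negatively correlated ("out of phase") sources of uncertainty, each with variance 1, and the blended result has far less variance than either source alone — which is exactly how a financial hedge works.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "out of phase" sources of uncertainty: correlation -0.9, variance 1 each.
cov = [[1.0, -0.9],
       [-0.9, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=100_000).T

# In theory Var((x + y) / 2) = (1 + 1 - 2*0.9) / 4 = 0.05.
print(x.var())              # close to 1
print(((x + y) / 2).var())  # close to 0.05: the uncertainties largely cancel
```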

Obviously these two books are out of phase.

Related posts

Does Big Data Release Information Energy? (April 2014)

Thursday, April 18, 2013

We Ought to Know the Difference

Is systems thinking really possible? Here's one reason why it might not be.

One of the concerns of systems thinking is the need to avoid the so-called environmental fallacy - the blunder of ignoring or not understanding the effects of the environment of a system. This is why, when systems thinkers are asked to tackle a concrete situation in detail, they often hesitate, insisting that it is wrong to look at the detail before understanding the context.

The trouble with this is that there is always a larger context, so this hesitation leads to an infinite regress and inability to formulate practical inroads into a complex situation. Many years ago, I read a brilliant essay by J.P. Eberhard called "We Ought to Know the Difference", which contains a widely quoted example of a doorknob. As I recall, Eberhard's central question is a practical one - how do we know when to expand the scope of the problem, and how do we know when to stop.

C West Churchman went more deeply into this question. In his book The Systems Approach and its Enemies (1979), he presents an ironic picture of the systems thinker as hero.

If the intellect is to engage in the heroic adventure of securing improvement in the human condition, it cannot rely on “approaches,” like politics and morality, which attempt to tackle problems head-on, within the narrow scope. Attempts to address problems in such a manner simply lead to other problems, to an amplification of difficulty away from real improvement. Thus the key to success in the hero’s attempt seems to be comprehensiveness. Never allow the temptation to be clear, or to use reliable data, or to “come up to the standards of excellence,” divert you from the relevant, even though the relevant may be elusive, weakly supported by data, and requiring loose methods.

Like Eberhard, Churchman seeks to reconcile the heroic stance of the systems thinker with the practical stance of other approaches. But we ought to know the difference.

This is an extract from my eBook on Next Practice Enterprise Architecture. Draft available from LeanPub.

John P. Eberhard, "We Ought to Know the Difference," Emerging Methods in Environmental Design and Planning, Gary T. Moore, ed. (MIT Press, 1970) pp 364-365

See extract here - The Warning of the Doorknob. The same extract can be found in many places, including Ed Yourdon's Modern Structured Analysis (first published 1989).

See also

Nicholas Berente, C West Churchman: Champion of the Systems Approach

Jeff Lindsay, Avoiding environmental fallacy with systems thinking (December 2012)

Updated May 14 2013

Saturday, March 30, 2013

From Enabling Prejudices to Sedimented Principles

In my post From Sedimented Principles to Enabling Prejudices (March 2013) I distinguished the category of design heuristics from other kinds of principle. Following Peter Rowe, I call these Enabling Prejudices.

Rowe also uses the concept of Sedimented Principles, which he attributes to the French philosopher Maurice Merleau-Ponty, one of the key figures of phenomenology. As far as I can make out, Merleau-Ponty never used the exact term "sedimented principles", but he does talk a great deal about "sedimentation".
In phenomenology, the word "sedimentation" generally refers to cultural habitations that settle out of awareness into prereflective practices. Something like the "unconscious". (Professor James Morley, personal communication)
"On the basis of past experience, I have learned that doorknobs are to be turned. This ‘knowledge’ has sedimentated into my habitual body. While learning to play the piano, or to dance, I am intensely focused on what I am doing, and subsequently, this ability to play or to dance sedimentates into an habitual disposition." (Stanford Encyclopedia of Philosophy: Merleau-Ponty)

This relates to some notions of tacit knowledge, which is attributed to Michael Polanyi. There are two models used in the knowledge management world that talk about tacit/explicit knowledge, and present two slightly different notions of internalization.

Some critics (notably Wilson) regard the SECI model as flawed, because Nonaka has confused Polanyi's notion of tacit knowledge with the much weaker concept of implicit knowledge. There are some deep notions of "unconscious" here, which may produce conceptual traps for the unwary.

Conceptual quibbles aside, there are several important points here. Firstly, enabling prejudices may start as consciously learned patterns, but can gradually become internalized, and perhaps not just implicit and habitual but tacit and unconscious. (The key difference here is how easily the practitioner can explain and articulate the reasoning behind some design decision.)

Secondly, to the extent that these learned patterns are regarded as "best practices", it may be necessary to bring them back into full consciousness (whatever that means) so they can be replaced by "next practices".

Bryan Lawson, How Designers Think (1980, 4th edition 2005)

Peter Rowe, Design Thinking (MIT Press 1987)

Wilson, T.D. (2002) "The nonsense of 'knowledge management'" Information Research, 8(1), paper no. 144

Thanks to my friend Professor James Morley for help with Merleau-Ponty and sedimentation.

Thursday, February 28, 2013

Intelligence and Governance

Katy Steward of @TheKingsFund asks What Makes a Board Effective? (Feb 2013). She's looking specifically at the role of the Board in the National Health Service, but there is much that can be generalized to other contexts. She asks some key questions for any given board.

  • Are its members individually effective and do they communicate effectively – for example, do they challenge themselves and others?
  • Do they use energetic presentations and have insightful conversations?
  • Do they support their colleagues and have good decision-making skills?

In this post, I want to develop this line of thinking further by exploring what the concept of organizational intelligence implies for boards.

1. Boards need to know what is going on.

  • Multiple and diverse sources of information - both quantitative and qualitative
  • Understanding how information is filtered, and a willingness to view unfiltered information when necessary. 
  • Ability to identify areas of concern, and initiate detailed investigation 

2. Boards need to make sense of what is going on.

  • Ability to see things from different perspectives - patient quality, professional excellence, financial accountability, social accountability. 
  • Ability to see the detail as well as the big picture. 
  • Courage to investigate and explore any discrepancies, and not to be satisfied with easy denial.

3. Boards need to ensure that all decisions, policies and procedures are guided by both vision and reality. This includes decisions taken by the board itself, as well as decisions taken at all levels of management.

  • Decisions and actions are informed by values and priorities, and reinforce these values. (People both inside and outside the organization will infer your true values not from your words but from your actions.) 
  • Decisions and actions are guided by evidence wherever possible. Ongoing decisions and policies are open to revision according to the outcomes they yield.
  • Decision-making by consent (Robertson)

4. Boards need to encourage learning.

  • Effective feedback loops are established, monitoring outcomes and revising decisions and policies where necessary. 
  • Courage to experiment. Ability to tolerate temporary reduction in productivity during problem-solving and learning curve. Supporting people and teams when they are out of their comfort zone. 
  • Willingness to learn lessons from anywhere, not just a narrow set of approved exemplars.

5. Boards need to encourage knowledge-sharing.

  • All kinds of experience and expertise may be relevant 
  • Overcoming the "silos" and cultural differences 
  • The collective memory should be strong and coherent enough to support the organization's values, but not so strong as to inhibit change.

6. Boards work as a team, and collaborate with other teams.

  • Effective communication and collaboration within the board - don't expect each board member to do everything. 
  • Effective communication and collaboration with other groups and organizations.
  • Circle Organization (Robertson)

Note: The six points I've discussed here correspond to the six core capabilities of organizational intelligence, as described in my Organizational Intelligence eBook and my Organizational Intelligence workshop.

See also

Brian Robertson, The Sociocratic Method. A Dutch model of corporate governance harnesses self-organization to provide agility and a voice to all participants (Strategy+Business Aug 2006)

Steve Waddell, Wicked Problems, Governance as Learning Systems (Feb 2013)

Updated 1 March 2013

Tuesday, February 26, 2013

Developing cultures of high-quality care

#kfleadership Excellent lecture at @TheKingsFund this evening by Professor Michael West. Here are some of my notes.

When he left college, West was short of money, so he took a job in the coal mines. Productivity was important to everyone, and the pay at the end of the week depended on the quantity of coal extracted. But there was one thing more important than productivity, namely safety.

In many organizations this would just be lip service. But in the coal mines, safety was taken very seriously, and management actions were completely congruent with this.

West argued that the same should apply in the Health Service. Of course productivity is fundamentally important, but the number one priority should not be productivity but high-quality and safe patient care.

Valuing patients and staff turns out to be good management. West's argument is not merely based on rhetoric, but is supported by data. Patient outcomes and patient satisfaction are highly correlated with staff satisfaction and morale, and these in turn are correlated with staff engagement, which West defined in terms of three things: pride, intrinsic engagement and involvement in decisions. Ultimately this links back to improved productivity.

Someone in the audience objected that productivity must always be the top priority, otherwise you risk running out of money to pay for patient care. West replied that productivity follows from good people management. He agreed that the NHS has a great deal to learn from the private sector, and expressed a hope that private sector expertise (including non-executive board members) would not be limited to the Marketing and Finance perspectives.

West affirmed that the NHS is full of intelligent and highly motivated people, and said that the traditional command and control mode of leadership was such a waste of resource. The key role of leaders is to learn from staff, and to realize the potential of the people.

People at all levels require courage to accept challenging targets - in other words, to strive for things that they won't always achieve. The organization must accept and learn from failure to reach these targets. Blaming people for failure to excel is not only stupid and unfair, it is also counter-productive, because it makes people risk-averse and inhibits them from striving for anything that isn't guaranteed in advance.

Leadership includes the courage to seek unwelcome information - for example feedback that indicates things not going well.

After the lecture, I was chatting to a group from a London teaching hospital about accountability. As I see it, accountability doesn't only mean taking responsibility for the consequences of one's decisions (such as short-sighted cost-cutting) but also taking responsibility for what one chooses to pay attention to. One of the classic examples in Moral Philosophy concerns a ship owner who sends a ship to sea without bothering to check whether the ship was seaworthy. Some argue that the ship owner cannot be held responsible for the deaths of the sailors, because he didn't actually know that the ship would sink. I think most people would see the ship owner as having a moral duty of diligence, and would regard him as accountable for neglecting this duty.

In the current climate, the NHS leadership has a duty to achieve high quality patient care and productivity, and the evidence from Professor West is that this can best be achieved by engaging staff at all levels. Executive boards must surely be held accountable if they neglect to do this.

See also Culture of Fear (Storify, 27 Feb 2013)

The ship-owner example can be found in an essay called "The Ethics of Belief" (1877) by W.K. Clifford, in which he states that "it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence".

Updated 28 Feb 2013