
Thursday, April 18, 2013

We Ought to Know the Difference

Is systems thinking really possible? Here's one reason why it might not be.

One of the concerns of systems thinking is the need to avoid the so-called environmental fallacy - the blunder of ignoring or not understanding the effects of the environment of a system. This is why, when systems thinkers are asked to tackle a concrete situation in detail, they often hesitate, insisting that it is wrong to look at the detail before understanding the context.

The trouble with this is that there is always a larger context, so this hesitation leads to an infinite regress and an inability to formulate practical inroads into a complex situation. Many years ago, I read a brilliant essay by J.P. Eberhard called "We Ought to Know the Difference", which contains a widely quoted example of a doorknob. As I recall, Eberhard's central question is a practical one - how do we know when to expand the scope of the problem, and how do we know when to stop?

C West Churchman went more deeply into this question. In his book The Systems Approach and its Enemies (1979), he presents an ironic picture of the systems thinker as hero.

If the intellect is to engage in the heroic adventure of securing improvement in the human condition, it cannot rely on “approaches,” like politics and morality, which attempt to tackle problems head-on, within the narrow scope. Attempts to address problems in such a manner simply lead to other problems, to an amplification of difficulty away from real improvement. Thus the key to success in the hero’s attempt seems to be comprehensiveness. Never allow the temptation to be clear, or to use reliable data, or to “come up to the standards of excellence,” divert you from the relevant, even though the relevant may be elusive, weakly supported by data, and requiring loose methods.

Like Eberhard, Churchman seeks to reconcile the heroic stance of the systems thinker with the practical stance of other approaches. But we ought to know the difference.

This is an extract from my eBook on Next Practice Enterprise Architecture. Draft available from LeanPub.

John P. Eberhard, "We Ought to Know the Difference," Emerging Methods in Environmental Design and Planning, Gary T. Moore, ed. (MIT Press, 1970) pp 364-365

See extract here - The Warning of the Doorknob. The same extract can be found in many places, including Ed Yourdon's Modern Structured Analysis (first published 1989).

See also

Nicholas Berente, C West Churchman: Champion of the Systems Approach

Jeff Lindsay, Avoiding environmental fallacy with systems thinking (December 2012)

Updated May 14 2013

Saturday, March 30, 2013

From Enabling Prejudices to Sedimented Principles

In my post From Sedimented Principles to Enabling Prejudices (March 2013) I distinguished the category of design heuristics from other kinds of principle. Following Peter Rowe, I call these Enabling Prejudices.

Rowe also uses the concept of Sedimented Principles, which he attributes to the French philosopher Maurice Merleau-Ponty, one of the key figures of phenomenology. As far as I can make out, Merleau-Ponty never used the exact term "sedimented principles", but he does talk a great deal about "sedimentation".
In phenomenology, the word "sedimentation" generally refers to cultural habituations that settle out of awareness into prereflective practices. Something like the "unconscious". (Professor James Morley, personal communication)
"On the basis of past experience, I have learned that doorknobs are to be turned. This ‘knowledge’ has sedimentated into my habitual body. While learning to play the piano, or to dance, I am intensely focused on what I am doing, and subsequently, this ability to play or to dance sedimentates into an habitual disposition." (Stanford Encyclopedia of Philosophy: Merleau-Ponty)

This relates to some notions of tacit knowledge, usually attributed to Michael Polanyi. Two models used in the knowledge management world deal with tacit/explicit knowledge, and they present slightly different notions of internalization.

Some critics (notably Wilson) regard the SECI model as flawed, because Nonaka has confused Polanyi's notion of tacit knowledge with the much weaker concept of implicit knowledge. There are some deep notions of the "unconscious" here, which may produce conceptual traps for the unwary.

Conceptual quibbles aside, there are several important points here. Firstly, enabling prejudices may start as consciously learned patterns, but can gradually become internalized, and perhaps not just implicit and habitual but tacit and unconscious. (The key difference here is how easily the practitioner can explain and articulate the reasoning behind some design decision.)

Secondly, to the extent that these learned patterns are regarded as "best practices", it may be necessary to bring them back into full consciousness (whatever that means) so they can be replaced by "next practices".

Bryan Lawson, How Designers Think (1980, 4th edition 2005)

Peter Rowe, Design Thinking (MIT Press 1987)

Wilson, T.D. (2002) "The nonsense of 'knowledge management'" Information Research, 8(1), paper no. 144

Thanks to my friend Professor James Morley for help with Merleau-Ponty and sedimentation.

Thursday, February 28, 2013

Intelligence and Governance

Katy Steward of @TheKingsFund asks What Makes a Board Effective? (Feb 2013). She's looking specifically at the role of the Board in the National Health Service, but there is much that can be generalized to other contexts. She asks some key questions for any given board.

  • Are its members individually effective and do they communicate effectively – for example, do they challenge themselves and others?
  • Do they use energetic presentations and have insightful conversations?
  • Do they support their colleagues and have good decision-making skills?

In this post, I want to develop this line of thinking further by exploring what the concept of organizational intelligence implies for boards.

1. Boards need to know what is going on.

  • Multiple and diverse sources of information - both quantitative and qualitative
  • Understanding how information is filtered, and a willingness to view unfiltered information as necessary. 
  • Ability to identify areas of concern, and initiate detailed investigation 

2. Boards need to make sense of what is going on.

  • Ability to see things from different perspectives - patient quality, professional excellence, financial accountability, social accountability. 
  • Ability to see the detail as well as the big picture. 
  • Courage to investigate and explore any discrepancies, and not to be satisfied with easy denial.

3. Boards need to ensure that all decisions, policies and procedures are guided by both vision and reality. This includes decisions taken by the board itself, as well as decisions taken at all levels of management.

  • Decisions and actions are informed by values and priorities, and reinforce these values. (People both inside and outside the organization will infer your true values not from your words but from your actions.) 
  • Decisions and actions are guided by evidence wherever possible. Ongoing decisions and policies are open to revision according to the outcomes they yield.
  • Decision-making by consent (Robertson)

4. Boards need to encourage learning.

  • Effective feedback loops are established, monitoring outcomes and revising decisions and policies where necessary. 
  • Courage to experiment. Ability to tolerate temporary reduction in productivity during problem-solving and learning curve. Supporting people and teams when they are out of their comfort zone. 
  • Willingness to learn lessons from anywhere, not just a narrow set of approved exemplars.

5. Boards need to encourage knowledge-sharing.

  • All kinds of experience and expertise may be relevant 
  • Overcoming the "silos" and cultural differences 
  • The collective memory should be strong and coherent enough to support the organization's values, but not so strong as to inhibit change.

6. Boards need to work as a team, and to collaborate with other teams.

  • Effective communication and collaboration within the board - don't expect each board member to do everything. 
  • Effective communication and collaboration with other groups and organizations.
  • Circle Organization (Robertson)

Note: The six points I've discussed here correspond to the six core capabilities of organizational intelligence, as described in my Organizational Intelligence eBook and my Organizational Intelligence workshop.

See also

Brian Robertson, The Sociocratic Method. A Dutch model of corporate governance harnesses self-organization to provide agility and a voice to all participants (Strategy+Business Aug 2006)

Steve Waddell, Wicked Problems, Governance as Learning Systems (Feb 2013)

Updated 1 March 2013

Tuesday, February 26, 2013

Developing cultures of high-quality care

#kfleadership Excellent lecture at @TheKingsFund this evening by Professor Michael West. Here are some of my notes.

When he left college West was short of money, so he took a job in the coal mines. Productivity was important to everyone, and the pay at the end of the week depended on the quantity of coal extracted. But there was one thing more important than productivity, namely safety.

In many organizations this would just be lip service. But in the coal mines, safety was taken very seriously, and management actions were completely congruent with this.

West argued that the same should apply in the Health Service. Of course productivity is fundamentally important, but the number one priority should not be productivity but high-quality and safe patient care.

Valuing patients and staff turns out to be good management. West's argument is not merely based on rhetoric, but is supported by data. Patient outcomes and patient satisfaction are highly correlated with staff satisfaction and morale, and these in turn are correlated with staff engagement, which West defined in terms of three things: pride, intrinsic engagement and involvement in decisions. Ultimately this links back to improved productivity.

Someone in the audience objected that productivity must always be the top priority, otherwise you risk running out of money to pay for patient care. West replied that productivity follows from good people management. He agreed that the NHS has a great deal to learn from the private sector, and expressed a hope that private sector expertise (including non-executive board members) would not be limited to the Marketing and Finance perspectives.

West affirmed that the NHS is full of intelligent and highly motivated people, and said that the traditional command and control mode of leadership was such a waste of resource. The key role of leaders is to learn from staff, and to realize the potential of the people.

People at all levels require courage to accept challenging targets - in other words, to strive for things that they won't always achieve. The organization must accept and learn from failure to reach these targets. Blaming people for failure to excel is not only stupid and unfair, it is also counter-productive, because it makes people risk-averse and inhibits them from striving for anything that isn't guaranteed in advance.

Leadership includes the courage to seek unwelcome information - for example feedback that indicates things not going well.

After the lecture, I was chatting to a group from a London teaching hospital about accountability. As I see it, accountability doesn't only mean taking responsibility for the consequences of one's decisions (such as short-sighted cost-cutting) but also taking responsibility for what one chooses to pay attention to. One of the classic examples in Moral Philosophy concerns a ship owner who sends a ship to sea without bothering to check whether it is seaworthy. Some argue that the ship owner cannot be held responsible for the deaths of the sailors, because he didn't actually know that the ship would sink. I think most people would see the ship owner as having a moral duty of diligence, and would regard him as accountable for neglecting this duty.

In the current climate, the NHS leadership has a duty to achieve high quality patient care and productivity, and the evidence from Professor West is that this can best be achieved by engaging staff at all levels. Executive boards must surely be held accountable if they neglect to do this.

See also Culture of Fear (Storify, 27 Feb 2013)

The ship-owner example can be found in an essay called "The Ethics of Belief" (1877) by W.K. Clifford, in which he states that "it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence".

Updated 28 Feb 2013

Tuesday, February 19, 2013

Cybernetic Entropy

The pioneers of cybernetics borrowed the concept of entropy from thermodynamics: the tendency of systems to become less organized over time. They regarded structure and information as ways of halting or reversing entropy, and information is sometimes defined as negative entropy (negentropy).
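The intuition that information is negative entropy can be made concrete with Shannon's formula, H = -Σ p·log₂(p): a predictable, highly organized system has low entropy, while a disorganized system has high entropy. A minimal sketch in Python (the function name and the example distributions are my own, purely for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A highly organized system: one state is almost certain.
ordered = [0.97, 0.01, 0.01, 0.01]

# A disorganized system: all four states equally likely.
disordered = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(ordered))     # low entropy (well under 1 bit)
print(shannon_entropy(disordered))  # maximum for 4 equally likely states: 2 bits
```

For four states, entropy runs from 0 bits (one state certain) up to 2 bits (all states equally likely); information that makes the system more predictable reduces this number, which is the sense in which information counts as negative entropy.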

In the past few days, I have seen a few examples of what appears to be entropy at a higher level - over time, rules becoming less effective or even counterproductive.

We keep hearing stories about large corporations paying practically no tax. As we heard on BBC Radio 4 recently, (File on Four: Taxing Questions), new tax rules are created with the participation of interested parties, including large corporations (HSBC, Vodafone) and accountancy firms (KPMG). Having advised on the creation of loopholes, the accountants then make huge amounts of money selling knowledge of these loopholes to their clients. Sadly, even this valuable knowledge degrades over time, and new tax laws must be created with new and more obscure loopholes.

Within a sceptical article about the so-called Robin Hood tax (Algorithm and Blues), @TimHarford mentioned Myron's Law - the theory that taxes collect diminishing amounts of revenue over time, as people work out legal ways to avoid paying.

Meanwhile, @CyberSal has tweeted a couple of links to articles about Payment by Results. Since Deming, systems thinkers have understood that targets and incentives often don't (and perhaps cannot) achieve the intended results. Instead, they stimulate various forms of devious behaviour, known as gaming the system.

I think the interesting point here is not just that these mechanisms don't work, but that they get worse over time. To start with, people may make a genuine attempt to do things properly, and some professionals may be reluctant to game the system, but they gradually get worn down. Those who don't quit altogether become stressed, depressed and cynical. For example, if teachers don't teach to the test, and if the head teachers don't bully them into playing the game, then the school will slip down the league tables and become non-viable. But this degradation takes time, which is why I think it makes sense to think of this as another form of entropy.
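The decay pattern can be sketched as a toy simulation. Everything below is my own illustration, not taken from any of the sources above - the function name, the 30% learning rate, and the ten periods are all invented for the sketch. The point it captures: the measured figure stays flat while genuine effectiveness declines geometrically, as a fixed share of the remaining honest agents learns to game the metric each period.

```python
def simulate(periods=10, learn_rate=0.3):
    """Toy model of a rule degrading as agents learn to game its metric."""
    gaming = 0.0  # fraction of agents gaming the system
    history = []
    for _ in range(periods):
        real = 1.0 - gaming   # genuine effectiveness of the rule
        measured = 1.0        # gamed output keeps the metric looking healthy
        history.append((real, measured))
        # each period, a fixed share of the remaining honest agents gives in
        gaming += learn_rate * (1.0 - gaming)
    return history

for period, (real, measured) in enumerate(simulate()):
    print(f"period {period}: measured={measured:.2f} real={real:.2f}")
```

On these assumptions, real effectiveness falls below 5% of its starting value within ten periods even though the metric never moves - the gap between the two curves is the entropy the rule has accumulated.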

How then might this entropy be halted or reversed?


Saturday, February 9, 2013

Agility and Fear

Frank Furedi argues that human thought and action are being stifled by a regime of uncertainty: The only thing we have to fear is the ‘culture of fear’ itself (April 2007).

Douglas McGregor introduced the distinction between Theory X and Theory Y, referring to different beliefs about the behaviour and motivation of workers, which may be embedded in management practices and organization culture. William Ouchi argued that McGregor's distinction doesn't work for all cultures, and identified a third theory, Theory Z, which he used to explain the behaviour of most Japanese companies and some Western companies.

Theory X refers to a set of beliefs in which workers are lazy, require constant supervision, and are motivated only by financial rewards and penalties.

Theory Y refers to a set of beliefs in which workers can be trusted to pursue the interests of the firm without constant supervision, and respond to a range of motivators.

Theory Z refers to a set of beliefs about lifetime commitment between employers and employees.

If we frame fear in terms of Theory X, then it becomes fear-and-blame and we can all go tut-tut. But isn't there also a way of framing fear in terms of Theory Y, without yoking it to blame? Performing artists may experience some stage-fright prior to producing an outstanding performance, and while excessive stage-fright may be debilitating, some degree of anxiety may be a positive stimulus. Are we to ban all forms of anxiety and uncertainty from the organization, so that everyone can feel cosy and safe?

And what about Theory Z? If an organization is under existential threat, then the members collectively need to focus all their energy and creativity on restoring the viability of the organization, and it would be perfectly normal for them to be emotionally as well as intellectually engaged in this task. Necessity, as they say, is the mother of invention.

All I'm saying is that there are different types of fear, which may have different effects on organizational behaviour. Fear-and-blame is one particular type of fear, but there are other types.

Many workers rightly feel responsible for their work. In most organizations, employees or contractors are ultimately vulnerable to loss of status or loss of earnings if they fail to perform satisfactorily. A completely fear-free organization would be disengaged from its customers and environment, and therefore ethically problematic.

However, a caring organization may be able to attenuate some of this feeling of vulnerability, and provide some kind of safety net that allows people to take reasonable risks without too much fear of failure. Whereas an uncaring organization either fails to provide proper boundaries, or amplifies the sense of vulnerability by capricious and unjust management practices.

How Offices Make People Stupid

@benhammersley at #RSAwork talks about the future of office work, and identifies some of the ways that organizations make themselves stupid. The irony is that a lot of these mechanisms were supposed to make offices more productive and efficient, and to promote collaboration and creativity. As Ben puts it

"We have optimized being on top of things rather than getting to the bottom of things."

Let's start with open plan offices. As Ben tells the story, these were introduced in an ideological attempt (supposedly originating in Northern California) to flatten the office hierarchy, to remove barriers between people, and to encourage people and technology to work together in perfect harmony. There are various dysfunctional versions of this Californian Ideology - see my post All Chewed Over By Machines (May 2011).

In practice, various interesting forms of behaviour emerge in open plan offices. Ben notes the widespread practice of more powerful workers grabbing the desks near to the wall, leaving juniors huddled in the middle in a state of permanent anxiety, as if they were antelope anticipating the lion's pounce.

Many offices are designed as semi-open plan, with people huddled in cubicles, but with the constant chance of someone popping a head over the partition.

In some offices, there is a deliberate policy to move people around - sometimes called hot-desking. One of the supposed benefits of this policy is that it encourages workers to constantly develop new relationships with their transient neighbours. For companies whose workers don't spend all their time in the office, this policy also reduces the amount of office space required. However, the uncertainty and anxiety of getting any desk, let alone a decent desk near the wall and away from the more irritating co-workers, might be regarded as a negative factor.

Putting aside the economics, culture, and psychological impact of open plan offices, the essential justification is that they promote communication and collaboration. These elements are necessary but not sufficient for productivity and innovation in a knowledge-based organization: not sufficient, because productivity and innovation also depend on concentrated hard work.