My previous post Where does learning take place? was prompted by a Twitter discussion in which some of the participants denied that organizational learning was possible or meaningful. Some argued that any organizational behaviour or intention could be reduced to the behaviours and intentions of individual humans. Others argued that organizations and other systems were merely social constructions, and therefore didn't really exist at all.
In a comment below my previous post, Sally Bean presented an example of collective learning being greater than the sum of individual learning. Although she came away from the reported experience having learnt some things, the organization as a whole appears to have learnt some larger things that no single individual may be fully aware of.
And the Kihbernetics Institute (I don't know if this is a person or an organization) offered a general definition of learning that would include collective as well as individual learning.
If you understand #learning is the process of acquiring #knowledge which is a measure of an individual's "fitness" in performing a given task, you can use the same system model on both humans and organizations.— The Kihbernetics Institute (@Kihbernetics) January 3, 2022
I think that's fairly close to my own notion of learning. However, some of the participants in the Twitter thread appear to prefer a much narrower definition of learning, in some cases specifying that it could only happen inside an individual human brain. Such a narrow definition of learning would not only exclude organizational learning, but also animals and plants, as well as AI and machine learning.
As it happens, there are differing views among botanists about how to talk about plant intelligence. Some argue that the concept of plant neurobiology is based on "superficial analogies and questionable extrapolations" (Alpi et al).
But in this post, I want to look specifically at machines and organizations, because there are some common questions in terms of how we should talk about both of them, and some common ideas about how they may be governed. Norbert Wiener, the father of cybernetics, saw strong parallels between machines and human organizations, and this is also the first of Gareth Morgan's eight Images of Organization.
Margaret Heffernan talks about the view that organisations are like "machines that will run well with the right components – so you design job descriptions and golden targets and KPIs, manage it by measurement, tweak it and run it with extrinsic rewards to keep the engines running". She calls this old-fashioned management theory.
Meanwhile, Jonnie Penn notes how artificial intelligence follows Herbert Simon's notion of (corporate) decision-making.
Many contemporary AI systems do not so much mimic human thinking as they do the less imaginative minds of bureaucratic institutions; our machine-learning techniques are often programmed to achieve superhuman scale, speed and accuracy at the expense of human-level originality, ambition or morals.
The philosopher Gilbert Simondon observed two contrasting attitudes to machines.
First, a reduction of machines to the status of simple devices or assemblages of matter that are constantly used but granted neither significance nor sense; second, and as a kind of response to the first attitude, there emerges an almost unlimited admiration for machines. (Schmidgen)
On the one hand, machines are merely instruments, ready-to-hand as Heidegger puts it, entirely at the disposal of their users. On the other hand, they may appear to have a life of their own. Is this not like organizations or other human systems?
Amedeo Alpi et al, Plant neurobiology: no brain, no gain? (Trends in Plant Science, Vol 12 Issue 4, pp 135-136, April 2007)
Eric D. Brenner et al, Response to Alpi et al.: Plant neurobiology: the gain is more than the pain (Trends in Plant Science, Vol 12 Issue 7, pp 285-286, July 2007)
Anthea Lipsett, Interview with Margaret Heffernan: 'The more academics compete, the fewer ideas they share' (Guardian, 29 November 2018)
Gareth Morgan, Images of Organization (3rd edition, Sage 2006)
Jonnie Penn, AI thinks like a corporation—and that’s worrying (Economist, 26 November 2018)
Henning Schmidgen, Inside the Black Box: Simondon's Politics of Technology (SubStance, Vol 41 No 3, Issue 129, pp 16-31, 2012)
Geoffrey Vickers, Human Systems are Different (Harper and Row, 1983)
Related post: Where does learning take place? (January 2022)