Tuesday, October 9, 2012
Delusion and Diversity
Of course organizations are self-delusional - some more than others. The interesting challenge here is to distinguish between evidence-based policy (which forms policy based on the evidence) and policy-based evidence (which collects evidence to support a given policy).
"Yes", says @CoCreatr, "organizationally the difference between fundamentalism and curiosity". He points to a presentation by Seth Godin on Curiosity.
Without curiosity, people in organizations tend to use research (both qualitative and quantitative) to promote their own delusions and/or to undermine the delusions of others. So research serves a dual purpose.
This is why diversity helps. If there is a broad variety of delusion, then there is at least the possibility of enough evidence emerging to create sufficient cognitive dissonance for cherished delusions to be questioned. Whereas if everyone shares the same delusions (commonly known as "groupthink"), then cognitive dissonance seems much less likely to occur.
By the way, the "diversity" agenda usually focuses on achieving a fair distribution of age, gender, race, class background, etc., as well as avoiding various forms of unfair discrimination. Sometimes this is justified not only on ethical grounds, but also because it is thought to increase the likelihood of cognitive diversity - different attitudes, personality styles and belief systems.
But the correlation between externally visible diversity and cognitive diversity looks more tenuous nowadays than it might have done in the past. If you assume attitude, personality style and belief system depend solely on the colour of your skin and the number of X chromosomes, then you might exaggerate the differences between Hillary Clinton and Barack Obama. (As the popular song goes, I wonder who's Kissinger now? See my post Relationships Based on Self-Interest from January 2009).
Talking of presidents and would-be presidents ... In his 1967 book on Organizational Intelligence, Harold Wilensky praised President Roosevelt for maintaining a state of creative tension in the US administration. Wilensky reckoned that this enabled FDR to get a more accurate and rounded account of what was going on, and gave him some protection against the self-delusion of each department. Roosevelt could still get a degree of cognitive diversity from a bunch of white men with similar education. Think what he could achieve today.
Related Blogs
Relationships built on self-interest (January 2009)
What is the Purpose of Diversity? (January 2010)
Organizational Intelligence and Gender (October 2010)
Delusion and Diversity (October 2010)
More on the Purpose of Diversity (December 2014)
Links added 12 December 2014
Friday, March 6, 2009
Negative Evidence
People who should know better (so-called scientists) make this problem worse by using the phrase "no scientific evidence". For example "no scientific evidence that eating infected meat carries any risk to humans" or "no scientific evidence that mobile phones cause headaches".
This creates the impression that there may actually be lots of evidence, but we can safely ignore it because it hasn't been collected or approved by somebody in a white coat.
Just as some kinds of evidence are inadmissible in a court of law, so some kinds of evidence are inadmissible in a scientific journal. Among other things, this leads to publication bias: people perform calculations based only on the data that have passed through some publication filter, and those data are systematically incomplete.
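Publication bias is easy to demonstrate with a toy simulation (a sketch with invented parameters, not real data): simulate many small studies of an effect that is truly zero, let only the "statistically significant" results through the filter, and the published record shows a spurious effect.

```python
import random
import statistics

random.seed(0)

def run_study(true_effect=0.0, n=30):
    """Simulate one small study: a noisy estimate of the true effect."""
    sample = [random.gauss(true_effect, 1.0) for _ in range(n)]
    mean = statistics.fmean(sample)
    se = statistics.stdev(sample) / n ** 0.5
    return mean, abs(mean / se) > 1.96  # crude two-sided significance test

all_results, published = [], []
for _ in range(2000):
    estimate, significant = run_study()
    all_results.append(estimate)
    if significant:                 # the publication filter
        published.append(abs(estimate))

# The true effect is zero; the unfiltered record says so, the filtered one doesn't.
print(f"mean effect, all studies:      {statistics.fmean(all_results):+.3f}")
print(f"mean |effect|, published only: {statistics.fmean(published):+.3f}")
```

Run it with different seeds: the filtered average stays well above zero even though every study measured a null effect.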
See my comment to Science isn't about Checklists
Tuesday, June 17, 2008
Memory and the Law
"Many experts are challenging the view that eyewitnesses recounting what they saw is the best way of tapping their memory. Some think brain scans could be the way forward." [Memory Mixup, BBC News Magazine, 17 June 2008]
We already have technology that is supposed to detect discrepancies between what the witness remembers and what the witness says - it's called a polygraph or lie detector. Now we apparently need another technology that detects discrepancies between what the witness consciously remembers and what is buried in the witness's unconscious.
The lie detector has been controversial ever since its invention, and features in a Chesterton story called "The Mistake of the Machine". (Of course it is not the machine that makes the mistake, as Chesterton's hero Father Brown points out, but the people using the machine who misinterpret its output.)
- Chesterton and Friends: Lie Detectors
- David Wallace-Wells: The Big Lie (Washington Monthly, April 2007), via AntiPolygraph.org
Of course humans sometimes lie, and sometimes this can be detected by the polygraph, but that doesn't make the polygraph an instrument of truth. (For that matter, people sometimes blurt out secrets under the influence of alcohol or torture, or get artistic inspiration under the influence of mind-bending drugs, but none of these are reliable instruments of truth either.)
And human memory is sometimes unreliable, but that doesn't make the brain scan an instrument of truth either. Constructing evidence from the unconscious contents of a brain is no more reliable than constructing history from an archaeological sift through a mediaeval rubbish tip. It may be possible, and may yield some intriguing results, but the results are always speculative and uncertain.
Meanwhile, our "common sense" understanding of the brain and its contents is probably less accurate and less coherent than our understanding of mediaeval waste disposal. That's why psychoanalysts make more money than archaeologists. They do, don't they?
Update
"India has become the first country to convict someone of a crime relying on evidence from this controversial machine." [Source: New York Times, via Bruce Schneier]
Related posts: The Dashboard Never Lies (February 2020), Lie Detectors at Airports (April 2022)
See also
Sarah Scoles, Polygraphs Aren’t Very Accurate. Are There Better Options? (Undark, 25 March 2026)
Sunday, July 30, 2006
Security Trends
F-Secure does seem to have some evidence for the growing sophistication of malware attacks, and a plausible explanation for the fact that these attacks are less visible. But explanation is not evidence.
Point One. Even if visible attacks are decreasing, this doesn't provide conclusive evidence that invisible attacks are decreasing.
Point Two. The lack of evidence that invisible attacks are decreasing does not imply any evidence that invisible attacks are increasing.
But that's not quite what F-Secure says. F-Secure avers that the reduction in visible attacks provides evidence that invisible attacks could be increasing.
But this is rubbish. A mere possibility requires no evidence. What we want to know, and what F-Secure avoids telling us, is some measure of what is actually going on. And F-Secure offers no evidence relevant to that question.
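The logical point can be made with a toy Bayesian calculation (every number here is invented for illustration): if a drop in visible attacks is roughly equally likely whether or not invisible attacks are rising, then observing that drop moves our belief nowhere.

```python
# Toy Bayes update (all numbers invented): does "visible attacks are down"
# tell us anything about invisible attacks?
prior_up = 0.5      # start agnostic about whether invisible attacks are rising

# How likely is the observation (fewer visible attacks) under each hypothesis?
# If the world would look the same either way, the two likelihoods are equal:
p_obs_if_up = 0.6
p_obs_if_not_up = 0.6

posterior_up = (p_obs_if_up * prior_up) / (
    p_obs_if_up * prior_up + p_obs_if_not_up * (1 - prior_up)
)
print(f"posterior = {posterior_up:.2f}")  # 0.50: the observation changes nothing
```

Only if the reduction in visible attacks were more probable under one hypothesis than the other would it count as evidence, and that is precisely the measurement F-Secure has not supplied.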
Technorati Tags: evidence-based risk security trust
Tuesday, November 15, 2005
Science and Scientists
In his second excursion into Intelligent Design, he reiterates his point about science and scientists. Although he is inclined to believe in evolution, he is uncomfortable about accepting the authority of the men and women in white coats.
"I’d be surprised if 90%+ of scientists are wrong about the evidence for Darwinism. But if you think it’s impossible, you’ve lived a sheltered life."

The public has often been disappointed by the men and women in white coats. Most scientists get funding from political or commercial sources, and may be subject to political or commercial pressure. We have frequently been assured of the absolute safety of various things, and told that there is "no scientific evidence" of any risk, only to discover later that this reassurance was at best incomplete. No wonder many intelligent non-scientists reserve judgement.
"Let me say very clearly here that I’m not denying the EXISTENCE of slam-dunk credible evidence for evolution. What I’m denying is the existence of credible PEOPLE to inform me of this evidence. The people who purport to have evidence of evolution do a spectacular job of making themselves non-credible."
Meanwhile, scientists themselves are indignant at such suggestions, and strongly resist the idea that scientific truth might be regarded as a social construction. Scientists are taught to present their findings using a dry and impersonal third-person rhetoric, as if to emphasize an idealized independence from worldly matters, and an absolute trustworthiness.
Dilbert has clearly touched a raw nerve. Why else would so much energy and emotion be invested in arguing with a cartoonist?
Previous post: Dilbert on Intelligent Design (November 2005)
Thursday, September 8, 2005
Shifting Standards
Bruce suggests athletes need escrow blood samples, to defend themselves against retrospective charges.
Retrospective tests introduce new modes of risk and uncertainty for athletes. Not only new tests as Bruce indicates, but possibly also new interpretations of the regulations, based on shifting expert opinion from scientists and lawyers. (Does chemical substance X2 invented at time T2 and detected in a blood sample taken at time T3 count as banned under a regulation enacted at earlier time T1, which refers to an almost identical substance X1, if a reliable test to discriminate between X1 and X2 wasn't perfected until a later time T4, and X2 itself wasn't explicitly banned until T5? Complicated, huh?)
An older and long-established athlete may have access to more expensive ways of cheating, as well as a greater incentive to cheat as athletic prowess starts to wane. Meanwhile the direct cost of being detected (in terms of suspension from further competition) reduces to zero as the athletic career comes to an end. Retrospective testing may increase the chance of detection, but it also increases the average delay between transgression and detection. So we might expect retrospective testing to amplify the age imbalance: greater compliance from younger athletes, and lesser compliance from older ones.
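That incentive argument can be sketched as a toy expected-cost model (every number and function name here is invented for illustration): a sanction only deters if it is likely to land before retirement, so delay in detection erodes the deterrent fastest for athletes near the end of their careers.

```python
def expected_sanction_cost(years_remaining, p_detect_per_year=0.2,
                           detection_delay=2, cost_per_lost_year=1.0):
    """Toy model: a doping sanction only bites if it lands before retirement.

    If detection takes `detection_delay` years on average (retrospective
    testing lengthens this delay), only the career years remaining after
    the sanction lands are at stake.
    """
    years_at_risk = max(0, years_remaining - detection_delay)
    return p_detect_per_year * years_at_risk * cost_per_lost_year

# Younger athlete (10 years left) vs older athlete (2 years left):
for years in (10, 2):
    immediate = expected_sanction_cost(years, detection_delay=0)
    retro = expected_sanction_cost(years, detection_delay=2)
    print(f"{years} yrs left: immediate={immediate:.2f}  retrospective={retro:.2f}")
```

Under these invented numbers the deterrent for the athlete with two years left falls to zero once detection lags by two years, while the younger athlete still faces most of the cost: exactly the amplified age imbalance described above.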
Meanwhile, crime writer Patricia Cornwell has a theory (Portrait of a Killer) about the identity of Jack the Ripper, based partly on DNA evidence. (Nineteenth century crime subject to twentieth century forensics.) So do we also need to think of escrow DNA, to deal with posthumous allegations of crime or paternity?
[Update] Bruce Schneier adds some useful game theoretic analysis to his earlier post, prompted by dope-testing of Floyd Landis.
Technorati Tags: escrow forensic regulation risk uncertainty
Wednesday, February 2, 2005
Corrupting Evidence
A draft chapter on Corrupt Techniques is currently available online, in which he discusses several modes of presenting misleading or corrupt evidence.
| Technique | Description |
| --- | --- |
| Effects without causes | Beware of presenters using the passive voice or bullet lists. |
| Cherry-picking | Presenters pick and choose, select and reveal only the evidence that advances their favored point of view. |
| Punning | Puns enable over-reaching, as sharply focused ideas tend to sprawl, grow mushy and collapse into vague metaphors when applied to content outside their original domain. |
| Chartjunk | Chartjunk develops from the premise that audiences can be charmed, manipulated or fooled by means of content-free misdirection: garish colors, generic decoration, phoney dimensionality, corny clip-art. |
There is nothing wrong with trying to persuade people - within a framework of trust and mutual respect. Corrupt persuasion is an abuse of trust.
Corrupt persuasion also carries a metacommunication, for those able to pick it up. Tufte writes: "For consumers of presentations, gratuitous and cartoonish decoration of statistical graphics has the redeeming virtue of providing insight into the presenter's integrity and analytical skills: no integrity, no analytical skills." In other words, it ain't what you say, it's the way that you say it.
Update (August 2005): See also review by Graham Shevlin of Tufte's essay The Cognitive Style of PowerPoint.
Related Posts: The PowerPoint Collection