More precisely, he argues that it is unethical to believe things without proper evidence. (He is particularly interested in beliefs about product and software development, but the argument applies more generally.)
As far as I can see, there are three steps in this argument.
1. People are ethically responsible for their beliefs. (According to Bob, this was the basis for a controversial paper presented to the Metaphysical Society by William Kingdon Clifford in 1876.)
2. An unfounded belief is unethical.
3. A person who holds unfounded beliefs is unethical.
Let's look at step 1 first. This appears to entail an ethical obligation to subject one's beliefs to some kind of "due diligence". However, most of our beliefs are based not on evidence we have personally collected and analysed, but at least partly on evidence filtered through other sources. We may have reasons to trust some sources more than others, but if it is unethical to believe things without proper evidence, it would surely also be unethical to trust sources without proper evidence. So we may accept an ethical obligation to subject our beliefs to "due diligence", but this is normally a collective obligation rather than an individual one.
Step 2 asserts that any failure to ground beliefs in proper evidence is an ethical failure. People are rightly held accountable for failing to act in certain circumstances (for example, failing to save someone from drowning), but ethical censure generally assumes both awareness (knowing that someone needed rescuing) and capability (being able to swim). The problem with step 2 is that the more complex the beliefs, the greater the intellectual power (intelligence) required to appreciate and thoroughly investigate them. If the management team isn't individually or collectively intelligent enough to understand what proper evidence would look like, then believing things without proper evidence is a consequence of insufficient intelligence.
Does being stupid count as an ethical failure? (Being deliberately or avoidably stupid might, but most instances of stupidity are not deliberate.) Appointing people and teams who lack the necessary intelligence might be unethical, but only if the appointment was itself deliberate or avoidable; and so on along the responsibility chain, until we find someone who should have known better.
Step 3 assumes that we can categorize people as ethical or unethical based on instances of ethical or unethical behaviour. Once we have a hard-and-fast concept of sin, we can define a sinner as a person who has committed (and not yet purged) at least one sin. The trouble with this is that if we are all sinners, the category of "sinner" ceases to have much value except for the purposes of hellfire rhetoric. Labelling all executives as unethical (and why stop at executives, by the way?) becomes merely a rhetorical gesture.
So where does this leave the virtues of diligence, responsibility and probity? Firstly, I hold that these are collective virtues - executives display moral character in a particular organizational setting, and we may not know how their ethics would stand up in a different setting.
Secondly, I think character and intelligence are distinct virtues. We should not automatically suppose that intelligent people are more ethical than less intelligent people, and therefore we should not define "ethical" in a way that only very intelligent or highly educated people can live up to.
Thirdly, there is a widespread belief (especially among consultants) in the value of knowledge (although I don't know exactly what would count as proper evidence for this belief - if executives are unethical, I dread to think where this leaves consultants). If we define knowledge as justified true belief, then knowledge is degraded to the extent that it is unjustified or untrue, or for that matter disbelieved. If it is unethical to believe something without proper evidence, it may sometimes also be unethical to disbelieve something. Sometimes excessive scepticism shades into cynicism and negativity, and maybe this can be just as unethical as unjustified optimism.
Related posts: Intelligence and Governance (February 2013), Ethics and Uncertainty (March 2019)