Post partly based on Chapter 2 of Organizational Intelligence by Harold Wilensky, published in 1967.
In his book, Wilensky presents a detailed account of the history of Allied strategic bombing during the Second World War, which demonstrates the longevity of preconceptions in the face of evidence.
British and American leaders strongly believed that strategic bombing would win the war, and that large-scale use of ground forces would be unnecessary. Their faith in the efficacy of bombing was reinforced by selectively listening to those advisers who were most strongly committed to the doctrine (Churchill, for example, relied upon Lord Cherwell and "Bomber" Harris), while ignoring those scientists who voiced dissenting views. Although there was growing evidence during the war that bombing was failing to achieve its strategic goals, and was diverting resources from more effective tactics, it was not until after the war had ended that there was a general willingness to examine the evidence more open-mindedly.
As it happens, the British and Americans held slightly different versions of this doctrine. The British preferred area bombing, which they wrongly expected to damage the German economy as well as undermining German morale. They dismissed as irrelevant the fact that German bombing had not significantly affected the British economy or morale: perhaps they couldn't imagine that the German population might be equal to the British in initiative, resilience, and resolve.
Meanwhile, the Americans preferred precision bombing, which they wrongly expected to disrupt the German production of armaments. Interestingly, the Germans shared this expectation, for example in fearing the effects of American attacks on German ball bearing production, and had underestimated their own ability to find spare capacity as well as to relocate production away from the factories that had been bombed.
Although there was some early systems thinking within the intelligence communities, and some intelligence analysts might have been able to think about the urban area or the armaments supply chain as a system, this kind of thinking did not significantly influence the dominant narrative.
In any case, the so-called precision bombing wasn't at all precise, so in practice it differed little from area bombing; thus the apparent difference of doctrine between the British and the Americans was more imaginary than real. A difference of opinion at one level concealed a fundamental agreement at a deeper level.
Writing during the Vietnam War, Wilensky notes how these beliefs, which had been discredited by military analysts after the Second World War, had mysteriously reemerged a generation later. Even if the decision-makers were aware of the evidence that strategic bombing had been an expensive failure during the Second World War, they may have thought that this evidence was no longer relevant, because they now had much more sophisticated technology: the B-52.
And of course the same beliefs have continued to reemerge at regular intervals since, such as during the conflicts in the Balkans and in Iraq and Afghanistan. Faith in strategic bombing seems to be based on a combination of blind trust in expensive technology ("We've spent billions of dollars on this hardware, so it's gotta be good.") and wishful thinking ("We sure don't want to use ground troops here.").
More generally, we can identify any number of powerful preconceptions across the public and private sectors. There is typically a great deal invested in this kind of false doctrine - a combination of ego, political credibility, and hard cash - as well as powerful vested interests who will profit financially or politically from the continuation of the doctrine. Surely one of the primary responsibilities of a critical intelligence is to challenge preconceptions of all kinds.
If your organization or industry is driven by preconceptions and false doctrines, I'd love to hear from you.
Since writing this post, I have read Margaret Heffernan's excellent book Wilful Blindness, which explores the power of wishful thinking and intelligence suppression in organizations. Strongly recommended.
Updated 5 November 2014