Here is the Mindsystems strategy for detecting weak signals:
- Do something to cut down the background noise
- Be on the alert for the smoke screen that ‘Conventional Wisdom’ can throw up
- Develop techniques to “see the emerging patterns” in the chaos which is information overload
- Look for and expect the unexpected.
- Adjust our attitude to seek success in the unusual and the marginalised ideas and opportunities
As far as I can see, this strategy only works if you already know how to separate the weak signals from the background noise, can see through the smoke screen, and so on. In which case you don't need a strategy at all. This sounds more like wishful thinking than real theory: weak signal-theory rather than weak-signal theory. As for expecting the unexpected, this is of course perfect nonsense, as Oscar Wilde knew perfectly well.
For some reason, this so-called strategy is attributed to Bryan S Coffman, who wrote a paper on Weak Signal® Research in 1997. However, Coffman's position on background noise seems much more realistic and practical: "In order to uncover a full picture of the situation, a great deal of noise must be processed. Note that the noise is frequently eliminated only by understanding it." I believe the true author is likely to be one John England, Executive Director of Mindsystems, who claims credit for the Mindsystems blog on his LinkedIn entry. (Update: this attribution has now been corrected.)
As far as I can see, Bryan Coffman's original material is a lot more intelligent and useful, and there is also some interesting work on Weak Signal Analysis by Dale Coffman (some relation perhaps), but this all seems to fall short of the pioneering work of Igor Ansoff. So we have a curious process of Chinese Whispers here, with ideas and theories being attenuated in transmission. An interesting form of metacommunication perhaps?
In a further comment, John England tries to explain the difference between the expected and the unexpected, using an example based on controlling the power grid. He regards a variation in TV transmission times as "expected", and an outage in a power source as "unexpected". However, I presume that both of these classes of event are anticipated by the designers of the control dashboard, and fully covered by the training received by the managing engineer, so they can hardly be regarded as weak signals.
In fact the outage of a power source is clearly an example of a strong signal, which Holger Nauheimer describes thus: "Strong signals about things going the wrong way are easy to notice in an organization - or even to measure by numbers. Strong signals will show up anyway and everybody will be concerned about them. When we notice strong signals we know that something needs to be done." In comparison, the variation in TV transmission times might once have been disregarded by power engineers, because it doesn't seem relevant until we make the connection between TV viewing and kettles (and for that matter toilets); so it might formerly have been a weak signal. However, the monitoring of TV transmission times is now embedded in standard working practices for the control of various networks, including the power grid and the water supply, so it is no longer a weak signal relative to current knowledge and practices.
If that is the kind of thing that Mr England means by "expecting the unexpected", then it doesn't entail anything more than "being prepared for known problems to occur at any time". And that is not going to be much help in detecting genuinely weak signals. However, Mr England defines "weak signals" as merely "variations from the norm" and recommends "systems to constantly monitor specific bodies of knowledge", which suggests that he is talking about something rather different from the rest of us, and certainly different from the notion of weak signal introduced by Ansoff in the mid-1970s.
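Indeed, "variations from the norm" is the easy case to mechanise, which is precisely why it falls short of Ansoff. A minimal sketch (my own hypothetical illustration, not anything England or Mindsystems publish) of that kind of monitoring might look like this:

```python
# Hypothetical sketch: flagging "variations from the norm" in a monitored
# reading (say, grid load) using a rolling mean and standard deviation.
# Anything this flags is by definition a variation we already know how to
# measure - a strong signal in Nauheimer's sense, not a weak one.
from statistics import mean, stdev

def flag_variations(readings, window=5, threshold=2.0):
    """Return indices of readings that deviate from the rolling norm."""
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        # Flag a reading more than `threshold` standard deviations from
        # the recent mean (skip flat stretches where sigma is zero).
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

load = [50, 51, 49, 50, 52, 51, 50, 80, 51, 50]  # made-up load figures
print(flag_variations(load))  # flags the spike at index 7
```

The point of the sketch is negative: a weak signal in Ansoff's sense is precisely the event for which no such monitor has yet been built, because nobody has yet recognised the variable as worth watching.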
Meanwhile, for most people including Oscar Wilde, the phrase "expect the unexpected" has a paradoxical air, so it sounds more like a Zen koan than a simple usable guideline. As for "approaching every situation with an open mind and leaving your baggage at the door" - this is something we might all want to achieve; we might even imagine we are good at it, while noting how often other people fail to do this. Simply telling people to be open-minded is useless, because everyone already imagines himself or herself to be pretty open-minded. The point is to construct a social process (this is part of what I call organizational intelligence) that allows preconceptions and expectations to be exposed and challenged, and allows weak signals to be detected and reasoned about.
We may not all have the kind of software that is used by the FBI and Homeland Security (which Mr England describes as a "luxury"), but there are many organizations that are a lot better than these agencies at detecting and dealing with weak signals, so there is no need to regard the processing of weak signals as outside the capabilities of any organization.