I have received several copies of an email inviting me to download some Big Brother software, in order to keep tabs on my loved ones. (Is my spouse cheating online? Are my kids talking to dangerous people on instant messenger?)
At one level, this product claims to provide me with a mechanism to invade another person's privacy - in other words, to breach the security of some system. It is therefore selling (or at least promising) a security threat. At another level, it is sold as a way of protecting me and my family from various threats - my spouse or kids innocently (or not so innocently) making contact with dubious characters via the internet. There is an interesting tension between these two levels.
But perhaps the real implied danger (apart from running up excessive phone/ISP bills) arises when the contacts cease to be mediated by the Internet - e.g. online cheating leads to offline cheating. Online cheating (whatever that may be) then becomes not the primary offence/risk, but a clue to some other offence/risk.
This illustrates a general point: there is a temptation (encouraged by technology) to measure and monitor what is easy to measure and monitor, even if this provides at best an indirect indication of what's really at issue.
The general point, applied to trust and security, is that security monitoring typically measures the wrong things. It may start from a valid observation: that there is a close correlation between X (the real threat) and Y (something that is easy to measure). So by measuring Y, we get an indication of X. But this is vulnerable in two ways.
Firstly, the fact that X-threats are monitored via Y may leak out and become public knowledge - whereupon the monitoring becomes useless. Secondly, a determined investigator or hacker may be able to infer the internal connection between X and Y by observing (and testing) the behaviour of the system from the outside. The forced coupling between X and Y represents a simplification in behaviour, a reduction in requisite variety.
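The first vulnerability can be made concrete with a small simulation (purely illustrative; the variables X and Y follow the text, and the probabilities are invented for the example). A monitor flags events via the easy-to-measure signal Y, which normally accompanies the real threat X. Once an adversary learns of the coupling, they suppress Y while continuing X, and detection collapses.

```python
import random

random.seed(0)  # deterministic run for the example

def simulate(adversary_knows_proxy: bool, n: int = 1000) -> float:
    """Return the fraction of real threats (X) that the monitor flags via proxy Y."""
    detected = threats = 0
    for _ in range(n):
        x = random.random() < 0.1  # X, the real threat, occurs 10% of the time
        if x:
            threats += 1
            # Y normally accompanies X 90% of the time; an informed
            # adversary simply suppresses Y while carrying on with X.
            y = False if adversary_knows_proxy else (random.random() < 0.9)
            if y:  # the monitor only ever sees Y, never X itself
                detected += 1
    return detected / threats

naive = simulate(adversary_knows_proxy=False)
informed = simulate(adversary_knows_proxy=True)
print(f"detection rate, naive adversary:    {naive:.2f}")
print(f"detection rate, informed adversary: {informed:.2f}")
```

The point is not the particular numbers but the shape of the failure: nothing about X has changed, yet the monitor goes blind the moment the X-Y coupling is known.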
Note that indirect measurement is commonplace in quality management systems, provided you have appropriate mechanisms to calibrate and control the metrics. Among other things, an indirect measurement may give earlier warning of an impending problem than waiting for the direct measurement would. But the reasonable precautions that may be necessary and sufficient for quality management systems are grossly inadequate for security systems.
And this is not just a technological point, but a sociological one. People generally use weak and unreliable signals to make significant trust/mistrust decisions, and may flip catastrophically from blind trust to unremitting suspicion - even in relation to their loved ones (as Shakespeare showed in Othello).