Tuesday, October 8, 2019

Ethics of Transparency and Concealment

Last week I was in Berlin at the invitation of the IEEE to help develop standards for responsible technology (P7000). One of the working groups (P7001) is looking at transparency, especially in relation to autonomous and semi-autonomous systems. In this blogpost, I want to discuss some more general ideas about transparency.

In 1986 I wrote an article for Human Systems Management promoting the importance of visibility. There were two reasons I preferred this word. Firstly, "transparency" is a contronym - it has two opposite senses. When something is transparent, this can mean either that you don't see it at all, merely seeing straight through it, or that you can see it clearly. And secondly, transparency appears to be merely a property of an object, whereas visibility is about the relationship between the object and the viewer - visibility to whom?

(P7001 addresses this by defining transparency requirements in relation to different stakeholder groups.)

Although I wasn't aware of this when I wrote the original article, my concept of visibility shares something with Heidegger's concept of Unconcealment (Unverborgenheit). Heidegger's work seems a good starting point for thinking about the ethics of transparency.

Technology generally makes certain things available while concealing other things. (This is related to what Albert Borgmann, a student of Heidegger, calls the Device Paradigm.)
In our time, things are not even regarded as objects, because their only important quality has become their readiness for use. Today all things are being swept together into a vast network in which their only meaning lies in their being available to serve some end that will itself also be directed towards getting everything under control. Lovitt
Goods that are available to us enrich our lives and, if they are technologically available, they do so without imposing burdens on us. Something is available in this sense if it has been rendered instantaneous, ubiquitous, safe, and easy. Borgmann
I referred above to the two opposite meanings of the word "transparent". For Heidegger and his followers, the word "transparent" often refers to tools that can be used without conscious thought, or what Heidegger called ready-to-hand (zuhanden). In technology ethics, on the other hand, the word "transparent" generally refers to something (product, process or organization) being open to scrutiny, and I shall stick to this meaning for the remainder of this blogpost.

We are surrounded by technology; we rarely have much idea how most of it works, and usually cannot be bothered to find out. Thus when technological devices are designed to conceal their inner workings, this is often exactly what the users want. How then can we object to concealment?

The ethical problems of concealment depend on what is concealed by whom and from whom, why it is concealed, and whether, when and how it can be unconcealed.

Let's start with the why. Sometimes people deliberately hide things from us, for dishonest or devious reasons. This category includes so-called defeat devices that are intended to cheat regulations. Less clear-cut is when people hide things to avoid the trouble of explaining or justifying them.

(If something is not visible, then we may not be aware that there is something that needs to be explained. So even if we want to maintain a distinction between transparency and explainability, the two concepts are interdependent.)

People may also hide things for aesthetic reasons. The Italian civil engineer Riccardo Morandi designed bridges with the steel cables concealed, which made them difficult to inspect and maintain. The Morandi Bridge in Genoa collapsed in August 2018, killing 43 people.

And sometimes things are just hidden, not as a deliberate act but because nobody has thought it necessary to make them visible. (This is one of the reasons why a standard could be useful.)

We also need to consider the who. For whose benefit are things being hidden? In particular, who is pulling the strings, where is the funding coming from, and where are the profits going - follow the money. In technology ethics, the key question is Whom Does The Technology Serve?

In many contexts, therefore, the main focus of unconcealment is not understanding exactly how something works but being aware of the things that people might be trying to hide from you, for whatever reason. This might include being selective about the available evidence, or presenting the most common or convenient examples and ignoring the outliers. It might also include failing to declare potential conflicts of interest.

For example, the #AllTrials campaign for clinical trial transparency demands that drug companies declare all clinical trials in advance, rather than waiting until the trials are complete and then deciding which ones to publish.

Now let's look at the possibility of unconcealment. Concealment doesn't always mean making inconvenient facts impossible to discover, but may mean making them so obscure and inaccessible that most people don't bother, or creating distractions that divert people's attention elsewhere. So transparency doesn't just entail possibility, it requires a reasonable level of accessibility.

Sometimes too much information can also serve to conceal the truth. Onora O'Neill talks about the "cult of transparency" that fails to produce real trust.
Transparency can produce a flood of unsorted information and misinformation that provides little but confusion unless it can be sorted and assessed. It may add to uncertainty rather than to trust. Transparency can even encourage people to be less honest, so increasing deception and reducing reasons for trust. O'Neill
Sometimes this can be inadvertent. However, as Chesterton pointed out in one of his stories, this can be a useful tactic for those who have something to hide.
Where would a wise man hide a leaf? In the forest. If there were no forest, he would make a forest. And if he wished to hide a dead leaf, he would make a dead forest. And if a man had to hide a dead body, he would make a field of dead bodies to hide it in. Chesterton
Stohl et al. call this strategic opacity (via Ananny and Crawford).

Another philosopher who talks about the "cult of transparency" is Shannon Vallor. However, what she calls the "Technological Transparency Paradox" seems to be merely a form of asymmetry: we are open and transparent to the social media giants, but they are not open and transparent to us.

In the absence of transparency, we are forced to trust people and organizations - not only for their honesty but also for their competence and diligence. Under certain conditions, we may trust independent regulators, certification agencies and other institutions to verify these attributes on our behalf, but this in turn depends on our confidence in their ability to detect malfeasance and enforce compliance, as well as believing them to be truly independent. (So how transparent are these institutions themselves?) And trusting products and services typically means trusting the organizations and supply chains that produce them, in addition to any inspection, certification and official monitoring that these products and services have undergone.

Instead of seeing transparency as a simple binary (either something is visible or it isn't), it makes sense to discuss degrees of transparency, depending on stakeholder and context. For example, regulators, certification bodies and accident investigators may need higher levels of transparency than regular users. And regular users may be allowed to choose whether to make things visible or invisible. (Thomas Wendt discusses how Heideggerian thinking affects UX design.)
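To make the point concrete, here is a minimal sketch (in Python) of what stakeholder-dependent transparency might look like as a requirement model. The levels and stakeholder groups here are hypothetical illustrations, not the scale defined by P7001; the point is simply that transparency is an ordinal relationship between a system and a stakeholder, not a binary property of the system alone.

```python
from enum import IntEnum

class TransparencyLevel(IntEnum):
    # Hypothetical ordinal scale for illustration; not the P7001 scale.
    OPAQUE = 0           # inner workings hidden from this stakeholder
    SUMMARY = 1          # plain-language account of what the system does
    INSPECTABLE = 2      # logs, designs and evidence available on request
    FULLY_AUDITABLE = 3  # source, data and decision traces fully open

# Hypothetical stakeholder groups and the minimum level each one needs.
REQUIRED_LEVEL = {
    "regular_user": TransparencyLevel.SUMMARY,
    "certification_body": TransparencyLevel.INSPECTABLE,
    "accident_investigator": TransparencyLevel.FULLY_AUDITABLE,
}

def meets_requirement(provided: TransparencyLevel, stakeholder: str) -> bool:
    """Does the transparency a system provides to this stakeholder
    reach the level that stakeholder group needs?"""
    return provided >= REQUIRED_LEVEL[stakeholder]

# A system that only publishes a plain-language summary satisfies
# regular users but not accident investigators.
print(meets_requirement(TransparencyLevel.SUMMARY, "regular_user"))          # True
print(meets_requirement(TransparencyLevel.SUMMARY, "accident_investigator")) # False
```

The asymmetry in the table is the whole argument in miniature: the same system can be adequately transparent to one stakeholder and unacceptably opaque to another.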

Finally, it's worth noting that people don't only conceal things from others, they also conceal things from themselves, which leads us to the notion of self-transparency. In the personal world this can be seen as a form of authenticity; in the corporate world, it translates into ideas of responsibility, due diligence, and a constant effort to overcome wilful blindness.

If transparency and openness are promoted as virtues, then people and organizations can make their virtue apparent by being transparent and open, and this may make us more inclined to trust them. We should perhaps be wary of organizations that demand or assume that we trust them, without providing good evidence of their trustworthiness. (The original confidence trickster asked strangers to trust him with their valuables.) The relationship between trust and trustworthiness is complicated.



UK Department of Health and Social Care, Response to the House of Commons Science and Technology Committee report on research integrity: clinical trials transparency (UK Government Policy Paper, 22 February 2019) via AllTrials

Mike Ananny and Kate Crawford, Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability (New Media & Society, 2016) pp 1-17

Albert Borgmann, Technology and the Character of Contemporary Life (University of Chicago Press, 1984)

G.K. Chesterton, The Sign of the Broken Sword (The Saturday Evening Post, 7 January 1911)

Martin Heidegger, The Question Concerning Technology (Harper 1977) translated and with an introduction by William Lovitt

Onora O'Neill, Trust is the first casualty of the cult of transparency (Telegraph, 24 April 2002)

Cynthia Stohl, Michael Stohl and P.M. Leonardi, Managing opacity: Information visibility and the paradox of transparency in the digital age (International Journal of Communication 10, January 2016) pp 123-137

Richard Veryard, The Role of Visibility in Systems (Human Systems Management 6, 1986) pp 167-175 (this version includes some further notes dated 1999)

Thomas Wendt, Designing for Transparency and the Myth of the Modern Interface (UX Magazine, 26 August 2013)

Stanford Encyclopedia of Philosophy: Heidegger, Technological Transparency Paradox

Wikipedia: Confidence Trick, Follow The Money, Ponte Morandi, Regulatory Capture, Willful Blindness


Related posts: Defeating the Device Paradigm (October 2015), Transparency of Algorithms (October 2016), Pax Technica (November 2017), Responsible Transparency (April 2019), Whom Does The Technology Serve (May 2019)
