Sunday, April 28, 2019

Responsible Transparency

It is difficult to see how we can achieve an ethical technology without some kind of transparency, although we are still trying to work out how this can be achieved in an effective yet responsible manner. Several concerns are thought to conflict with transparency, including commercial advantage, security, privacy, and the risk of a device being misused or "gamed" by adversaries. There is a good summary of these issues in Mittelstadt et al. (2016).

An important area where demands for transparency conflict with demands for confidentiality is with embedded software that serves the interests of the manufacturer rather than the consumer or the public. For example, a few years ago we learned about a "defeat device" that VW had built in order to cheat the emissions regulations; similar devices have been discovered in televisions to falsify energy consumption ratings.
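
To make the idea concrete, here is a deliberately simplified sketch of the logic of a defeat device. VW's actual engine software has never been published, so every signal, name and threshold below is invented for illustration; the point is simply that the software infers when it is being tested and behaves differently.

```python
# Illustrative sketch only: the general logic of an emissions "defeat
# device". All signals, names and thresholds here are invented; real
# engine control units are far more complex, and VW's code is not public.

from dataclasses import dataclass

@dataclass
class SensorReadings:
    steering_variance: float   # near zero on a test rig (wheels held straight)
    rear_wheel_speed: float    # zero on a two-wheel dynamometer
    ambient_temp_c: float      # official test cycles run in a narrow band

def looks_like_test_cycle(s: SensorReadings) -> bool:
    """Heuristically infer that the car is on an emissions test rig."""
    return (s.steering_variance < 0.01
            and s.rear_wheel_speed == 0.0
            and 20.0 <= s.ambient_temp_c <= 30.0)

def select_engine_map(s: SensorReadings) -> str:
    # The deception: full emissions controls only while being watched.
    if looks_like_test_cycle(s):
        return "low_nox_calibration"   # compliant, lower performance
    return "road_calibration"          # better performance, higher NOx
```

Nothing in the car's behaviour during the test distinguishes this from an honest calibration, which is why black-box testing failed to catch it for years, and why scrutiny of the software itself matters.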

Even when the manufacturers aren't actually breaking the law, they have a strong commercial interest in concealing the purpose and design of these systems, and they use Digital Rights Management (DRM) and the US Digital Millennium Copyright Act (DMCA) to prevent independent scrutiny. In what appears to be an example of regulatory capture, car manufacturers were abetted by the US EPA, which was persuaded to inhibit transparency of engine software on the grounds that this would enable drivers to cheat the emissions regulations.

Defending the EPA, David Golumbia sees a choice between two trust models, which he calls democratic and cyberlibertarian. For him, the democratic model "puts trust in bodies specifically chartered and licensed to enforce regulations and laws", such as the EPA, whereas in the cyberlibertarian model it is the users themselves who get the transparency and can scrutinize how something works. In other words, it trusts the wisdom of crowds, or what he patronizingly calls "ordinary citizen security researchers".

(In their book Trust and Mistrust, Aidan Ward and John Smith describe four types of trust. Golumbia's democratic model involves top-down trust, based on the central authority of the regulator, while the cyberlibertarian model involves decentralized network trust.)

Golumbia argues that the cyberlibertarian position is incoherent. 
"It says, on the one hand, we should not trust manufacturers like Volkswagen to follow the law. We shouldn’t trust them because people, when they have self-interest at heart, will pursue that self-interest even when the rules tell them not to. But then it says we should trust an even larger group of people, among whom many are no less self-interested, and who have fewer formal accountability obligations, to follow the law."
One problem with this argument is that it appears to confuse scrutiny with compliance. Cyberlibertarians may be strongly in favour of deregulation, but increased transparency isn't advocated only by cyberlibertarians and doesn't necessarily imply deregulation. It can instead rest on a recognition that regulatory scrutiny and citizen scrutiny are complementary, for two reasons. Firstly, however powerful the tools at their disposal, regulators don't always spot everything; secondly, regulators are sometimes subject to improper influence from the companies they are supposed to be regulating (so-called regulatory capture). Having independent scrutiny as well as central regulation therefore increases the likelihood that problems will be discovered and dealt with, including algorithmic bias and previously unidentified hazards, vulnerabilities or malpractice.

Another small problem with his argument is that the defeat device had already hoodwinked the EPA and other regulators for many years.

Golumbia claims that "what the cyberlibertarians want, even demand, is for everyone to have the power to read and modify the emissions software in their cars" and complains that "the more we put law into the hands of those not specifically entrusted to follow it, the more unethical behavior we will have". It is certainly true that some advocates of open source also advocate "right to repair" and customization rights. But there were two separate requests for exemptions to the DMCA: one for testing and one for modification. And the researchers quoted by Kyle Wiens, who were disadvantaged by the EPA's failure to mandate a specific DMCA exemption allowing safety and security tests, were not casual libertarians or "ordinary citizens" but researchers at the International Council on Clean Transportation and West Virginia University.

It ought to be possible for regulators and academic researchers to collaborate productively in scrutinizing an industry, provided that clear rules, protocols and working practices are established for responsible scrutiny. Perhaps researchers might gain some protection from regulatory action or litigation by notifying a regulator in advance, or by prompt notification of any discovered issues. For example, the UK Data Protection Act 2018 (section 172) defines what it calls "effectiveness testing conditions", under which researchers can legitimately attempt to crack the anonymity of deidentified personal data. Among other things, a successful attempt must be notified to the Information Commissioner within 72 hours.
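
As a concrete illustration, the simplest form such an effectiveness test might take is a linkage attack: joining a supposedly anonymised dataset to auxiliary public data on shared quasi-identifiers. The sketch below is hypothetical (all field names and records are invented); the Act's conditions, such as notifying the Information Commissioner within 72 hours of success, are procedural obligations that sit outside the code.

```python
# Illustrative sketch of an anonymity "effectiveness test": a simple
# linkage attack joining deidentified records to an auxiliary dataset
# on quasi-identifiers. All field names and data here are invented.

deidentified = [  # e.g. a published "anonymised" health extract
    {"postcode_prefix": "SW1A", "birth_year": 1972, "sex": "F", "diagnosis": "X"},
    {"postcode_prefix": "M1",   "birth_year": 1985, "sex": "M", "diagnosis": "Y"},
]

auxiliary = [  # e.g. scraped from a public register, with names attached
    {"name": "A. Example", "postcode_prefix": "SW1A", "birth_year": 1972, "sex": "F"},
]

QUASI_IDENTIFIERS = ("postcode_prefix", "birth_year", "sex")

def reidentify(deidentified, auxiliary):
    """Return (name, record) pairs where the quasi-identifiers match uniquely."""
    matches = []
    for record in deidentified:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        candidates = [a for a in auxiliary
                      if tuple(a[q] for q in QUASI_IDENTIFIERS) == key]
        if len(candidates) == 1:  # a unique match = a successful re-identification
            matches.append((candidates[0]["name"], record))
    return matches

hits = reidentify(deidentified, auxiliary)
print(hits)
# Under section 172, a successful attempt like this must be notified to
# the Information Commissioner within 72 hours.
```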

Meanwhile, in the cybersecurity world there are fairly well-established protocols for responsible disclosure of vulnerabilities, and in some cases rewards are paid to the researchers who find them, provided they are disclosed responsibly. Although not all of us have the expertise to understand the technical detail, the existence of this kind of independent scrutiny should make us all feel more confident about the safety, reliability and general trustworthiness of the products in question.
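
There is no single standard, but the widely used protocols share a common shape: a private report to the vendor, an embargo period, and publication once a patch ships or the deadline passes. Here is a minimal sketch, with the 90-day deadline chosen as a typical convention (as in Google Project Zero's policy) rather than any official rule.

```python
# Illustrative sketch of a coordinated ("responsible") disclosure
# timeline. The 90-day embargo is a common convention, not a standard.

from datetime import date, timedelta
from typing import Optional

EMBARGO = timedelta(days=90)

def publication_date(reported: date, patched: Optional[date] = None) -> date:
    """Earliest date the researcher publishes full details."""
    deadline = reported + EMBARGO
    if patched is not None and patched < deadline:
        return patched   # fixed early: publish once users can apply the patch
    return deadline      # otherwise publish at the embargo deadline

# Example: a flaw reported on 1 March and patched on 10 April may be
# published on 10 April; an unpatched flaw goes public after 90 days.
print(publication_date(date(2019, 3, 1), patched=date(2019, 4, 10)))
print(publication_date(date(2019, 3, 1)))
```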

David Golumbia, The Volkswagen Scandal: The DMCA Is Not the Problem and Open Source Is Not the Solution (6 October 2015)

Brent Mittelstadt et al, The ethics of algorithms: Mapping the debate (Big Data & Society, July–December 2016)

Jonathan Trull, Responsible Disclosure: Cyber Security Ethics (CSO Cyber Security Pulse, 26 February 2015)

Aidan Ward and John Smith, Trust and Mistrust (Wiley 2003)

Kyle Wiens, Opinion: The EPA shot itself in the foot by opposing rules that could've exposed VW (The Verge, 25 September 2015)


Related posts: Four Types of Trust (July 2004), Defeating the Device Paradigm (October 2015)
