Much of the discussion was about the threats posed to public reason by electronically mediated speech acts, and the challenges of regulating social media. However, although the tech giants and regulators have an important role, the primary question in the event billing was not about Them but about Us: how do *we* communicate ethically in an increasingly digital age?
I don't claim to know as much about ethics as the three professors, but I do know a bit about communication and digital technology, so here is my take on the subject from that perspective.
The kind of communication we are talking about involves at least four different players - the speaker, the spoken-to, the spoken-about, and the medium / mediator. Communication can be punctuated into a series of atomic speech acts, but it is often the cumulative effects (on public reason or public decency) that worry us.
So let me look at each of the elements of this communication in turn.
First the speech act itself. O'Neill quoted Plato, who complained that the technology of writing served to decouple the writer from the text. On social media, the authorship of speech acts becomes more problematic still. This is not just because many of the speakers are anonymous, and we may not know whether they are bots or people. It is also because the dissemination mechanisms offered by the social media platforms allow people to dissociate themselves from the contents that they may "like" or "retweet". Thus people may disseminate nasty material while perceiving themselves not as the authors of this material but as merely mediators of it, and therefore not holding themselves personally responsible for the truth or decency of the material.
Indeed, some people act online as if they believed that the online world was entirely disconnected from the real physical world, as if online banter could never have real-world consequences, and the online alter ego was an entirely different person.
Did I say truth? At the event, the three philosophers devoted a lot of time to the relationship between ethics and epistemology (questions of truth and verifiability on the Internet). But even propositional speech acts are not always easily sorted into truths and lies, and many of the speech acts that pollute the internet are not propositions at all but other rhetorical gestures - for example, the endless repetition of "what about her emails?" and "lock her up", which is designed to frame public discourse to accord with the rhetorical goals of the speaker. (I'll come back to the question of framing later.)
The popular social media platforms offer to punctuate our speech into discrete units - the tweet, the post, the YouTube video, or whatever. Each unit is then measured separately, and the speaker may be rewarded (financially or psychologically) when a unit becomes popular (or "goes viral"). We tend to take this punctuation at face value, but systems thinkers including Bateson and Maturana have drawn attention to the relationship between punctuation and epistemology.
(Note to self - add something here about metacommunication, which is a concept Bateson took from Benjamin Lee Whorf.)
Full communication requires a listener (the spoken-to) as well as a speaker. Much of the digital literacy agenda is about coaching people to interpret and evaluate material found on the internet, enabling them to work out who is actually speaking, and whether there is a hidden commercial or political agenda.
One of the challenges of the digital age is that I don't know who else is being spoken to. Am I part of an undifferentiated crowd (unlikely) or a filter bubble (probably)? The digital platforms have developed sophisticated mechanisms for targeting people who may be particularly receptive to particular messages or content. So why have I been selected for this message, why exactly does Twitter or Facebook think this would be of interest to me? This is a fundamental divergence from older forms of mass communication - the public meeting, the newspaper, the broadcast.
And sometimes a person can be targeted with violent threats and other unpleasantries. Harassment and trolling techniques developed as part of the #GamerGate campaign are now widely used across the internet, and may often be successful in intimidating and silencing the recipients.
The third (and often unwilling) party to communication is the person or community spoken about. Where this is an individual, there may be issues around privacy as well as avoidance of libel or slander. It is sometimes thought that people in the public eye (such as Hillary Clinton or George Soros) are somehow "fair game" for any criticism or disparagement that is thrown in their direction, whereas other people (especially children) deserve some protection. The gutter press has always pushed the boundaries of this, and the Internet undoubtedly amplifies this phenomenon.
What I find even more interesting here is the way recent political debate has focused on scapegoating certain groups. Geoff Shullenberger attributes some of this to Peter Thiel.
"Peter Thiel, whose support for Trump earned him a place on the transition team, is a former student of the most significant theorist of scapegoating, the late literary scholar and anthropologist of religion René Girard. Girard built an ambitious theory around the claim that scapegoating pervades social life in an occluded form and plays a foundational role in religion and politics. For Girard, the task of modern thought is to reveal and overcome the scapegoat mechanism–to defuse its immense potency by explaining its operation. Conversely, Thiel’s political agenda and successful career setting up the new pillars of our social world bear the unmistakable traces of someone who believes in the salvationary power of scapegoating as a positive project."
Clearly there are some ethical issues here to be addressed.
Fourthly we come onto the role of the medium / mediator. O'Neill talked about disintermediation, as if the Internet allowed people to talk directly to people without having to pass through gatekeepers such as newspaper editors and government censors. But as Rae Langton pointed out, this is not true disintermediation, as these mediators are merely being replaced by others - often amateur curators. Furthermore, the new mediators can't be expected to have the same establishment standards as the old mediators. (This may or may not be a good thing, but in a later article, Sophia Ignatidou argues that "removing regulated, accountable and experienced journalists from the equation can only be deleterious to the public interest".)
Even the old mediators can't be relied upon to maintain the old standards. The BBC is often accused of bias, and its response to these accusations appears to be to hide behind a perverse notion of "balance" and "objectivity" that requires it to provide a platform for climate change denial and other farragoes.
Obviously the tech giants have a commercial agenda, linked to the Attention Economy. As Zeynep Tufekci and others have pointed out, people can be presented with increasingly extreme content in order to keep them on the platform, and this appears to be a significant force behind the emergence of radical groups, as well as a substantial shift in the Overton window. There appears to be some correlation between Facebook usage and attacks on migrants, although it may be difficult to establish the direction of causality.
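Tufekci's point about escalating recommendations can be illustrated with a toy simulation. This is entirely my own sketch, not any platform's actual algorithm: it simply assumes (hypothetically) that predicted engagement is highest for content slightly more extreme than the viewer's current taste, and shows how a greedy "maximise engagement" recommender then ratchets the viewer towards the extremes.

```python
# Toy model of engagement-maximising recommendation (my own illustration,
# not any real platform's code). Items sit on an extremity scale 0..1.

def recommend_next(items, user_position):
    """Greedily pick the item with the highest predicted engagement."""
    # Hypothetical engagement model: users engage most with content
    # slightly more extreme than their current position.
    def predicted_engagement(item):
        return -abs(item["extremity"] - (user_position + 0.1))
    return max(items, key=predicted_engagement)

def simulate(steps=10):
    # A catalogue of items ranging from mild (0.0) to extreme (1.0).
    items = [{"id": i, "extremity": i / 20} for i in range(21)]
    position = 0.0  # the user starts with mild preferences
    history = []
    for _ in range(steps):
        item = recommend_next(items, position)
        history.append(item["extremity"])
        position = item["extremity"]  # consuming content shifts taste
    return history

print(simulate())  # extremity ratchets upward step by step
```

Under these (admittedly crude) assumptions, each recommendation is individually "optimal" yet the cumulative trajectory drifts steadily towards the extreme end of the catalogue - the shift in the Overton window is an emergent effect, not anyone's stated goal.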
But the platforms themselves are also subject to political influence - not only the weaponization of social media described by John Naughton but also old-fashioned coercion. Around Easter 2016, people were wondering whether Facebook would swing the American election against Trump. In May 2016, a posse of right-wing politicians had a meeting with Zuckerberg, who then bent over backwards to avoid anyone thinking that Facebook would give Clinton an unfair advantage. (Spoiler: it didn't.)
So if there is a role for regulation here, it is not only to protect consumers from the commercial interests of the tech giants, but also to protect the tech giants themselves from improper influence.
Finally, I want to emphasize Framing, which is one of the most important ways people can influence public reason. For example, hashtags provide a simple and powerful framing mechanism, which can work to positive effect (#MeToo) or negative (#GamerGate).
President Trump is of course a master of framing - constantly moving the terms of the debate, so his opponents are always forced to debate on these terms. His frequent invocation of #FakeNews enables him to preempt and negate inconvenient facts, and his rhetorical playbook also includes antisemitic tropes (Hadley Freeman) and kettle logic (Jeet Heer). (But there are many examples of framing devices used by earlier presidents, and it is hard to delineate precisely what is new or objectionable about Trump's performance.)
In other words Rhetoric eats Epistemology for breakfast. (Perhaps that will give my philosopher friends something to chew on?)
J. L. Austin, How to Do Things with Words (Oxford University Press, 1962)
Anthony Cuthbertson, Facebook use linked to attacks on refugees, says study (Independent, 22 August 2018)
Paul F. Dell, Understanding Bateson and Maturana: Toward a Biological Foundation for the Social Sciences (Journal of Marital and Family Therapy, 1985, Vol. 11, No. 1, 1-20). (Note: even though I have both Bateson and Maturana on my bookshelf, the lazy way to get a reference is to use Google, which points me towards secondary sources like this. When I have time, I'll put the original references in.)
Sophia Ignatidou, The weaponisation of information is mutating at alarming speed (Guardian, 19 August 2019)
Alex Johnson and Matthew DeLuca, Facebook's Mark Zuckerberg Meets Conservatives Amid 'Trending' Furor (NBC News, 19 May 2016)
Robinson Meyer, How Facebook Could Tilt the 2016 Election (Atlantic, 18 April 2016)
Paul Lewis, 'Fiction is outperforming reality': how YouTube's algorithm distorts truth (Guardian, 2 Feb 2018)
John Naughton, Mark Zuckerberg’s dilemma - what to do with the monster he has created? (Open Democracy, 29 October 2018)
Geoff Shullenberger, The Scapegoating Machine (The New Inquiry, 30 November 2016)
Zeynep Tufekci, YouTube, the Great Radicalizer (New York Times, 10 March 2018)
Wikipedia: Attention Economy, Disintermediation, Framing, Gamergate Controversy, Metacommunication, Overton Window
Related posts: Security is downstream from strategy (March 2018), YouTube Growth Hacking (November 2018), Polarization (November 2018), The Future of Political Campaigning (November 2018)
Updated 19 August 2019