We publish the detailed report of the speakers' talks at the policy workshop "Elections at the time of social media. European elections, disinformation, micro-targeting: what to do?", which took place on 14 May in Rome within the ESVEI project promoted by OBC Transeuropa/CCI.
In a recent TED talk in Vancouver, Carole Cadwalladr, the journalist who exposed the Cambridge Analytica case, turned the spotlight on the role of social platforms in the Brexit campaign, illuminating the themes of personalised political propaganda and online disinformation. These same themes were the focus of the policy workshop "Elections at the time of social media", held a few days before the vote to elect the new European Parliament (a vote marked by fears of interference and external influence through social media) and attended by about forty insiders. Questioning the potential impact of the "Internet giants" on our democracies is in fact an indispensable step towards identifying concrete measures to protect the integrity of public discourse and of our democratic processes.
Cognitive bias and echo chambers
Illustrating the results of several studies on how users consume information online, Walter Quattrociocchi and Fabiana Zollo of Ca’ Foscari University of Venice explained that the spread of online disinformation depends on cognitive and psychological factors before technological ones. The so-called "confirmation bias" – our brain's tendency to give more credit to views consistent with our own belief system and to neglect or reject information that contradicts it – and human beings' tendency to be influenced by the opinions of the surrounding group ("conformism") favour the emergence of "echo chambers": closed information bubbles built around narratives shared by a certain group of people who, holding similar opinions, reinforce each other's beliefs. How can we break out of these closed structures of users and news sources that do not communicate with each other?
Lower polarisation to defuse disinformation
By observing the behaviour of a large number of users around a certain polarising theme, it is possible to identify in advance the potential targets of disinformation campaigns within the next 24 hours. It is therefore important to act preemptively, before the spread of disinformation around issues susceptible to polarisation of opinions is triggered. "The question we are trying to answer now", added Fabiana Zollo, "is how to communicate polarising issues more carefully, paying particular attention to creating welcoming narrative frames and lowering polarisation".
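To make the idea concrete, here is a minimal sketch, with entirely hypothetical data, names, and thresholds, of the kind of polarisation measure used in the echo-chamber studies mentioned above: a user's "leaning" is the share of their interactions devoted to one of two conflicting narratives, and users whose leaning sits near either extreme are the likeliest targets of a disinformation campaign on that topic. This is an illustration of the general approach, not the researchers' actual method.

```python
def leaning(likes_narrative_a: int, likes_narrative_b: int) -> float:
    """Fraction of a user's interactions devoted to narrative A (0..1)."""
    total = likes_narrative_a + likes_narrative_b
    if total == 0:
        raise ValueError("user has no interactions on this topic")
    return likes_narrative_a / total

def is_polarised(l: float, threshold: float = 0.95) -> bool:
    """A user counts as 'polarised' if almost all activity targets one side."""
    return l >= threshold or l <= 1 - threshold

# Hypothetical engagement counts: (likes on narrative A, likes on narrative B)
users = {"u1": (19, 1), "u2": (6, 5), "u3": (0, 40)}

# Strongly one-sided users are flagged as potential disinformation targets.
targets = [u for u, (a, b) in users.items() if is_polarised(leaning(a, b))]
print(targets)  # ['u1', 'u3']
```

In practice such scores would be computed over millions of interactions and combined with network structure, but even this toy version shows why polarised communities can be identified, and therefore addressed, before a disinformation wave hits them.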
Complexity of the information system and virality
Nicola Bruno, Dataninja journalist and fact-checker, talked about the propagation mechanisms of disinformation, with a focus on sources. Online disinformation networks are often a galaxy of multiple sources, contents, and actors with different motivations, so recognising the sources we are dealing with becomes complex. It is necessary to move beyond the vision of fact-checking as the verification of a single piece of content.
The importance of sources also for a new media literacy
"Removing content and shutting down pages is perhaps not very useful and could be harmful. Rather, we need to shift our attention to the entire information chain – who the actors involved are, which audience is targeted, how the intermediaries (the social media platforms and their algorithms) work – in order to act on mechanisms and understand what the incentives are. Are they economic or political in nature? How recursive are they? How much do they change?", pointed out Bruno, stressing the importance of focusing on sources in educational fact-checking and media literacy work.
Transparency and intelligibility to regain control as users
Giorgio Comai, researcher at OBC Transeuropa, illustrated the preliminary results of a collaboration between OBCT and Dataninja, offering some concrete examples of online information networks. There are networks of sites and Facebook pages, not visibly connected to each other, that promote each other's content, making it increasingly difficult for users to understand why they are exposed to certain information. Instead, "it is important for all of us to be able not only to understand why we are faced with a certain source of information rather than another, but also to actually be able to determine the content offered to us by digital platforms", stressed Comai, insisting on the importance of demanding more transparency from the mechanisms that distribute information on the web. According to the researcher, the poor control users have over these aspects contributes to the climate of distrust and confusion in which we currently find ourselves.
"Regulating the Internet does not mean eliminating all the ills of society from the Internet"
According to Fabio Chiusi, researcher at the Nexa Center for Internet & Society and contributor to Valigia Blu, we should be greatly concerned by the policy approach that threatens to prevail, according to which "regulating the Internet would mean eliminating all the ills of society from the Internet: hate, lies, propaganda, and a whole series of other problems". For example, the White Paper proposed in Britain under Prime Minister Theresa May holds that platforms must be made responsible for the content they host, and adopts the logic of preventive filtering operated by the platform itself through artificial intelligence and algorithms. The risks inherent in these approaches are enormous: they endanger freedom of expression, promote censorship, and expose our societies to dangerous authoritarian tendencies. "We need to regulate platforms, i.e. the containers – for example in relation to online political advertising – but avoid laws on content, about what is true and what is false".
More research, more journalistic competence, more dialogue between stakeholders
For political choices to be correctly informed, the debate on disinformation should set ideology aside and return to scientific research, where it belongs. "We need both more funding for research and more journalism capable of dealing with these issues competently and clearly", stated Chiusi, arguing that creating informal venues where these issues can be discussed among multiple stakeholders (researchers, journalists, technology companies, etc.) would help build consensus around less toxic policy proposals.
Data Protection Authority: "it's time to join forces"
Online political communication is still a lawless "Wild West", confirmed Riccardo Acciai, executive at the Italian Data Protection Authority, reconstructing the investigation opened against Facebook in Italy in relation to the Cambridge Analytica case. "We've been distracted for about ten years and we've allowed these platforms to expand dramatically and acquire huge revenues". Now that we recognise the problem it is difficult to do everything, but something can be done: first of all, demanding that Facebook (and the other technological giants) respect the principle of purpose limitation, since the sanctions that can be imposed for unlawful processing of data reach up to 4% of turnover. Furthermore, "we must join forces against the opacity of funding, both by using the transnational cooperation mechanisms introduced by EU regulations and by strengthening collaboration between the national regulatory authorities (Antitrust, Data Protection Authority, AGCOM)".
Online electoral propaganda: self-regulation is not enough, a law is needed
According to Daniele De Bernardin of Openpolis, in the face of the many problems touched on by the debate – from disinformation to privacy violations – it is crucial to focus on the specific problem of political propaganda on social media during election periods and to fix what he defined as "a very serious regulatory gap, with important implications for the quality of our democracy". In fact, while there are stringent rules for traditional election campaigns, now that the Internet is the main source of information there is a total absence of regulation of online political campaigns, which operate pervasively and in total opacity. Attempts by civil society to address the problem of (the lack of) platform transparency no longer suffice. According to De Bernardin and other experts, the European Commission's attempt to make up for the states' inaction by relying on platform self-regulation through the Code of Practice on Disinformation has too many flaws: "monitoring does not work: it has to show that steps forward are being made, even though in fact they are very small".
Transparency obligations: towards an independent archive of online political advertising
The expectation that private companies will take steps to protect the public interest is probably unrealistic in the first place. The Italian Parliament must take action by establishing, by law, transparency obligations for online sponsored content: a public archive where all citizens can monitor political advertising, including the various profiling criteria used by advertisers (parties and candidates first and foremost). However, the issue has not yet reached Parliament, and political parties appear uninterested.
Cambridge Analytica: the tip of the iceberg
In the last speech of the session, Antonella Napolitano, advocacy officer at Privacy International, framed the issue of privacy in a broader context, interpreting the right to privacy in connection with all the rights it enables (dignity, freedom of choice and expression, etc.), and offered an overview of cases around the world where the violation of sensitive personal data opens up very concerning scenarios (manipulation, commercial exploitation, the shaping of public opinion, etc.). Since self-regulation has shown its limits, Napolitano urged stakeholders (civil society, protection authorities, political parties) to act for a change of practices, each within its own sphere of competence, "as we are running out of time".
Multidisciplinarity and strategic litigation
The debate, concluded Tommaso Scannicchio of CILD, exposed a clear regulatory gap regarding the transparency of online advertising; closing it, however, requires a political will that is currently hard to see. What to do, then? It is necessary to make the most of existing regulatory instruments and to encourage dialogue between different disciplinary profiles – data analysts, jurists, computer scientists, cognitive psychologists, perhaps even experts in military tactics to address a possible "infowar" dimension. It would also be desirable for the Italian Data Protection Authority and civil society to join forces, as is already happening in Austria, Germany, Belgium, Poland, and Hungary among others, both through communication campaigns and through the tools of strategic litigation: "the sanctioning approach is what we have. We come from years of deregulation, of space entirely granted to the market, freed from rules: perhaps the time has come to reverse the trend", concluded Scannicchio.
This publication has been produced within the project ESVEI, supported in part by a grant from the Foundation Open Society Institute in cooperation with the OSIFE of the Open Society Foundations. The contents of this publication are the sole responsibility of Osservatorio Balcani e Caucaso Transeuropa.