The HJS Virtual Event Series: ‘COVID-19 and Social Media – Meeting Challenges using Lessons Learned from Countering Terrorism’
3rd June 2020 @ 3:00 pm - 4:00 pm
Following the extensive use of social media platforms by extremist groups and terrorist organisations for propaganda and recruitment, technology companies have taken important proactive policy decisions to remove material and ban users from their sites. This discussion will focus on whether some of the same techniques and lessons learned can be applied to the new challenges arising from the COVID-19 pandemic. Where possible, can counter-narratives be employed to address conspiracy theories put forward by extremist organisations and actors seeking to explain the causes of the pandemic? And which aspects of the current crisis – health misinformation, for example – require a unique approach?
The Henry Jackson Society is delighted to welcome representatives from Facebook to discuss some key research questions and current solutions being fostered in the online space to counter challenges posed by COVID-19.
Dr Erin Saltman is Facebook’s Policy Head of Counterterrorism and Dangerous Organisations for Europe, the Middle East and Africa. Her background and expertise include processes of radicalisation within a range of regional and socio-political contexts. Her research and publications have focused on the evolving nature of online extremism and terrorism, gender dynamics within violent extremist organisations and youth radicalisation. She also manages Facebook’s work with the Global Internet Forum to Counter Terrorism (GIFCT).
Jessica Zucker is a Product Policy Manager at Facebook leading misinformation policy in Europe, the Middle East and Africa. Ms Zucker joined Facebook in 2019 after three years at Microsoft, where she worked in the US and in Europe covering cyber-security, election integrity and misinformation. Prior to working in the tech sector, Ms Zucker worked in the nonprofit and government sectors, including at the US Department of State’s Cyber Policy Office and the US Department of Defense’s Southern Command, and as a Fulbright Scholar in South Korea, where she co-founded a nonprofit. A San Diego native, she holds a Bachelor’s degree in Political Science and Economics from the University of Miami and a Master’s in Public Policy from Harvard University.
Nikita Malik is the Director of the Centre on Radicalisation and Terrorism (CRT) at the Henry Jackson Society. She is an internationally recognised expert on countering violent extremism, terrorism, and hate-based violence, with a focus on youth deradicalisation. In her role, she has worked with key policy makers and government departments in the UK and globally. A key component of Nikita’s work focuses on the propagation of extremist material online, including on social media platforms and the Darknet. Her research has put forward a number of solutions to foster engagement between UK government policymakers and technology companies.
Event Summary:
In ‘Covid-19 and Social Media – Meeting Challenges using Lessons Learned from Countering Terrorism’, Nikita Malik chaired a panel comprising Dr Erin Saltman, Facebook’s Policy Head of Counterterrorism and Dangerous Organisations for Europe, the Middle East, and Africa, and Jessica Zucker, Facebook’s Product Policy Manager leading misinformation policy in Europe, the Middle East, and Africa. The panellists discussed the challenges Facebook has faced around misinformation and extremism, particularly during the Covid-19 period, and how the company has innovated to ensure Facebook remains a safe and secure platform.
Ms Zucker began by explaining Facebook’s policy approach to misinformation. Central to this is Facebook’s coordination with a network of 60 independent fact-checking organisations and its work connecting people with information from authoritative sources. During Covid-19, this has meant connecting people to local health authorities such as the NHS or to international authorities like the WHO. Ms Zucker went on to say that Facebook has a policy of removing misinformation that could lead to imminent physical harm. Part and parcel of enforcing this policy has been consulting experts, such as health professionals during Covid-19, so that Facebook can better understand the causal link between misinformation and harm and enforce the policy accordingly. In other cases, when a piece of information has been rated false by fact-checkers but is not going to cause imminent harm, Facebook will create friction and limit the visibility of the information. This is coupled with placing fact-checking labels on the content and providing a fact-checking article so that users have access to more reliable information. Ms Zucker offered that the results of this strategy have so far been promising.
Dr Erin Saltman noted that the panellists’ paths often do not cross in terms of their policy oversight. However, in relation to dangerous organisations and terrorism, Facebook has taken a similar approach of working with experts to create policies that are enforceable. Accordingly, Facebook is one of the only tech companies to have a definition of terrorism, which it enforces through policy. Driving the implementation of this strategy is a machine-human partnership, with artificial intelligence handling the scale of extremism online and humans handling its scope. Facebook itself proactively finds and removes 98-99 percent of the content it classifies as terrorism, but relies on academic experts, NGOs, and other partnerships to manage a space that is constantly shifting.
The panellists further outlined that Facebook has a rigorous appeals process for decisions, from the account level down to the content level. While certain areas are not eligible for appeal, once a user has appealed a decision it is presented to two re-reviewers, with the possibility of further review if agreement is not reached. The necessity for an appeals process is born of the difficulty of creating policies suitable for 2 billion users globally, spanning many countries and legal frameworks. A similar process is in place for misinformation, with the fact-checking organisations themselves constantly being certified and recertified by the International Fact-Checking Network.
The panellists concluded by addressing several questions regarding Facebook’s approach to the anti-vax community, given its pertinence to Covid-19. The panellists agreed that this is a delicate area because misinformation within the anti-vax community is a ‘many-headed beast’, though it does not always appear to cause imminent harm. Accordingly, Facebook’s strategy begins with content labelling and providing further information. However, Ms Zucker clarified that anti-vax sentiment is not always the same as vaccine misinformation, a distinction that Facebook’s policies take into account. Facebook plans to prepare for the likely increase in vaccine misinformation by staying in close contact with vaccine trials so that the company can understand how best to enforce its misinformation policy. Through partnering with experts, Facebook believes its policies can keep pace with these evolving challenges.