DATE: 4 March @ 2:00pm – 3:00pm
EVENT TRANSCRIPT: Disinformation During COVID-19: The US-UK Experience
SPEAKERS: Damian Collins MP, Nina Jankowicz
EVENT MODERATOR: Dr Danny Steed
Dr Danny Steed 00:01
Good afternoon, ladies and gentlemen. Welcome back to the Centre on Cybersecurity and Online Threats here at the Henry Jackson Society. We have a really exciting event this afternoon: disinformation during COVID-19. I think we’re all pretty clear by now about what we’re going through, and I don’t think there’s anybody who hasn’t had a barrage of confusion, misinformation and disinformation land on them somehow in this. The panellists we have today are really quite exciting. We have Damian Collins MP, and Nina Jankowicz from the Wilson Center, and they’re going to give us their view on the US-UK trends we’re going through. The real questions we put to our guests were: what types of trends are we seeing with disinformation? What actions are being taken in the US and the UK to try and combat this type of disinformation? And what lessons are tumbling out of this? So, first to introduce my first guest, Damian Collins. Damian is the MP for Folkestone and Hythe, was chair of the DCMS Sub-Committee on Disinformation, and is the creator of the organisation Infotagion, which was designed specifically to combat COVID disinformation. My second guest, Nina Jankowicz, is the Disinformation Fellow at the Wilson Center and author of the book “How to Lose the Information War”. Nina was just telling me it’s definitely having a paperback run later in the summer, so make sure you get it. Damian, please, would you like to go ahead with your remarks?
Damian Collins MP 01:51
Yes, of course, and thank you so much. It’s great to be with you and Nina for this event this afternoon, and on such an important topic. When I was chair of the Digital, Culture, Media and Sport Select Committee, we started an inquiry into disinformation and fake news; we began the investigation in 2017 and produced our final report in 2019, so it was quite a long inquiry. It was really born out of a growing feeling that there was more and more fake news on social media in particular, covering a variety of topics, and there was particular interest in whether the Russians had sought to directly influence the presidential election in America in 2016; some initial research was being published about that at the time. But when we look at the situation today, three years later, I think we see something quite different, and COVID is one of the principal factors. Disinformation has gone from being something that was discussed largely in the political sphere, sometimes to do with fraud or spam content on social media, to something with a wider impact on society. There may have been people in the past who would have said, “well, nothing I’ve seen on Facebook would have changed how I vote in an election”. But what we’ve seen in the last 12 months is people’s attitudes towards something new being influenced and shaped by what they’re seeing on social media. In particular, it’s not just affecting what people think; it’s affecting how they act as well. And we see that very clearly with something like coronavirus, which is new. So we have conspiracy theories around how COVID started and who caused it.
We’ve seen disinformation around the vaccine in particular, anti-vaccine conspiracy theories trying to persuade people not to take it, and we’ve seen the consequences of that. We saw stories last summer in the UK about people who attacked 5G mobile phone masts because they believed 5G signals were spreading coronavirus, so people were attacking and setting fire to those masts as a consequence. We saw the film ‘Plandemic’ spread around the world last summer, again spreading medical conspiracy theories. On Infotagion we saw a series of extraordinary stories, not just about how to treat COVID or certain miracle cures that would supposedly protect you from it. For example, there was a fake voice message circulating on WhatsApp, supposedly from a nurse in a London hospital, saying that there would be no emergency service cover, that no ambulances would be called out over the Easter weekend, because there was no capacity due to coronavirus. We saw photographs, supposedly from a London hospital, spread across the internet last summer, purportedly showing bodies all over the place, a hospital totally unable to cope with what was going on. In fact, those photographs were identified as having been taken as part of a disaster relief operation in South America several years before, but they were nevertheless shared as if they were real. So we’ve seen, in the context of COVID, how disinformation can spread. And I think what we need to do when we combat that is think not just about how we rebut that content, but about how we give people alternative sources of information they can go to, to check whether something is accurate or not. When we launched Infotagion, that’s what we tried to do: to say, if you see something that doesn’t look right, send it to us, we will fact-check it, and we will post the results for other people to see as well.
So, we want people to be questioning what they see. But we have to understand the context within which people see things. People increasingly get their news and information from social media. The latest Ofcom Media Nations report for the UK, published last summer, shows that about 35% of people in the UK get at least some of their news from Facebook. So they’re not going to a curated news site; they’re seeing articles as they’re shared by other people through the Facebook news feed, or Facebook is recommending content to them because it thinks they’re interested in it. And the Facebook business model is based around engagement, so the more you engage with certain topics, the more likely you are to see them. If you engage with conspiracy theories, you will see more conspiracy theories. If people share that content with you and you engage with it, you’ll see more of the same. Therefore, the context within which people see information, and the role of social media companies in recommending content to people, has become really important. And I would say that when we think about how to combat harmful misinformation and disinformation, the sort that causes someone to take a substance that might be harmful for them because they think it’s a cure-all for a virus, or causes people to attack a mobile phone mast because they think it’s the cause of the virus, the context within which they see that information, and the recommendation of that information to them, is as important as asking: what do we do about the people posting it? Why can’t we take down more harmful content? How do we stop people posting it in the first place? Or how do we educate people not to believe it, or to question what they see?
And therefore, when we think about the response to that, what we’re seeing this year in the UK is the government moving towards publishing its Online Harms Bill, and what we’re seeing there is a requirement to create a duty of care for social media companies to act against known sources of harmful content. That might also include not recommending and promoting that content, so not recommending material that we know can cause harm. It also raises a really important debate going forward: when we consider what harmful content is, what do we mean? Companies like Facebook, unsurprisingly, have a view on harmful content that is based largely on American law; as an American company, you’d expect nothing else. There, people have a right to speak freely, and the only inhibitor on that right, according to the Supreme Court, is if what you’re saying is going to cause imminent illegality: it’s going to provoke an action from another person that will be illegal and harmful, and is seen as an imminent and credible threat. In response to things like the 5G mobile phone masts being burned down, Facebook removed those conspiracy theories. The conspiracy theorists then migrated to large Russian platforms to continue doing the same, but at least they were being marginalised. So companies like Facebook recognise that imminent threat. But what do you do where someone is posting content that isn’t an individual story that in itself could be a trigger causing someone to harm themselves or others, but is instead a constant drumbeat of half-truth, partial truth, disinformation and conspiracy theories mixed in, [inaudible], and that’s largely what people see? How does that affect what people think and believe? In the last year, I think we saw some evidence of that effect.
With COVID, we saw it in polling data across Europe showing a declining likelihood that people would take a COVID vaccine if it was offered. In France, it got to the point where only about half of people said they would take it. Even in the UK, we saw a rising number of people, a considerable percentage of 30% or more, saying they wouldn’t take it. Now, that trend has reversed in the UK as we’ve actually started vaccinating people; the demonstration effect of people saying “I’ve been vaccinated and I’m safe” makes it hard to turn that trend back. But you can see how the preponderance of disinformation around the COVID vaccine changed people’s view as to whether they were likely to take it or not. We’ve seen the same in a political context in America with the views of Republican voters as to whether they believe the election was fraudulently stolen from Donald Trump, which, from a standing start in the polling, reached 75% of Republican voters just before the inauguration. So there’s this other problem: not necessarily the most egregious forms of disinformation, which we know are out there, which should be removed and which could cause imminent harm, but the constant drumbeat which can shape and change people’s worldview and shift them towards believing a conspiracy theory, or mistaking conspiracy for fact. I think we should see that as a really important issue that we have to think about and respond to. And again, it goes back to two things: the role of social media companies in allowing their systems to amplify or even recommend that sort of content, and then the role of bad-faith actors as well. Here I think we’ve seen a sea change too. Going back to the Russians in 2016, they were creating content and targeting American voters with that content using ad tools, to try and get them to act in a certain way.
Now, I think bad-faith actors don’t have to do that quite as much [inaudible] original content. What they can do instead is look for conspiracy theories and divisions of opinion, and seek to amplify those divisions. They can look for content being created by people within this country, or within the United States, and amplify it, creating a bigger audience for that content rather than having to produce it themselves. But again, that goes back to this network effect, and to understanding that people now consume media and information not necessarily from mainstream broadcasters, who decide what they’re going to amplify, what they’re going to schedule and what order things go in, but instead through an algorithmic feed, designed around them and the things they’ve seen before, where they’re increasingly less likely to see a breadth of opinion and more likely to get a constant drumbeat based on the information they’ve sought out in the past. That, I think, is the challenge of COVID disinformation and the challenge we will face as a society in thinking about how we confront this in the next few years. Now, for me, a big part of that is creating a regulatory environment which helps the social media companies make better decisions about what they allow on their platforms and what they choose to promote. And I think that is as important as user education: giving people the tools they need to identify for themselves what is likely to be true or false, harmful or safe, and what they should rely on. So those, for me, have been the lessons of the last year.
Dr Danny Steed 12:03
Damian, fascinating, thank you for your remarks. Everybody, just before I hand over to Nina, please do get your questions into the Q&A chat channel. Sean and I will be triaging them and going straight into Q&A shortly. So please keep them coming; we want to make sure we have some pretty lively questioning today. Nina, over to you.
Nina Jankowicz 12:31
Thanks, Danny, and very happy to be here with you and Damian this morning. It’s interesting that we’re looking at this from a comparative perspective, because for me, as somebody who mostly focuses on disinformation in Central and Eastern Europe, this whole phenomenon of COVID disinformation has really highlighted how many similarities there are between most of the countries dealing with it. We’re all dealing with a perfect climate for disinformation: so much uncertainty, fear, and a lot of unknowns in this equation, all of which contribute to the ability of malign actors, whether those are foreign governments, domestic disinformers, or folks looking to make a profit, as we’ve seen with snake-oil salesmen and hucksters all over the information space since last March, to exploit that perfect climate, put really manipulative narratives out there, and achieve their goals. A lot of what I’m going to say supports what Damian has already laid out for us. I’m going to talk about some trends in the information space beyond specific narratives, changes in the infrastructure that I’ve noticed over the past year in particular. The first is conspiracy convergence, and Damian alluded to this a little bit. What we’re seeing is a way in for many of the folks who are spreading disinformation, through the infrastructure of social media, through different fissures or vulnerabilities in our society.
So I’ll give you one example, a piece that I reported for the Atlantic in May or June of last year, looking at a gym owner in my home state, and actually my hometown of Hillsborough, New Jersey, a graduate of my high school. A couple of months into the COVID pandemic and the shutdowns in the state of New Jersey, he wanted to reopen his gym and felt that he could do so safely with enough spacing and distancing, and he was really making an economically based argument for reopening his business. He decided to post his call for reopening, and his decision to reopen in the face of the governor’s order in New Jersey, in a Facebook group called Reopen New Jersey. His cause became a cause célèbre in the reopen movement, which had attracted a lot of different people. Because he was coming from the health and fitness community, there were people in the alternative health or medical freedom community who glommed on to his cause. There were even QAnon supporters who showed up at the rally when he reopened his business. He became a conservative firebrand, essentially, even though when I interviewed him, he said, “You know, this isn’t about politics for me, it’s about supporting my family, I have three young kids, I feel I can do this safely, I’m going to have workouts outside.” And that is actually the agreement he reached with the local police after a while. He was embarrassed and confused about why people showed up to his reopening with QAnon flags, and he actually asked them to leave. But this just shows, especially through the infrastructure of things like Facebook groups, which I’ll get to in a second, how these conspiracies can converge, how there are openings within these specific vulnerabilities and disinformation vectors online. And that brings me to the second trend: groups and closed spaces, and how they really amplify and draw in new adherents to many of these theories.
So, I’ve been tracking groups for a number of years. What’s interesting is that since Facebook’s pivot to privacy after the Cambridge Analytica scandal, groups have really become a big part of Facebook’s engagement metrics. You might notice this; it happens to me every couple of days now: you open up your Facebook app, confetti pops up from the bottom, and it says, here’s where you can access your groups more easily. Why is Facebook doing that? Why are they incentivising you to go to your groups? Why do they give you special notifications when people post in groups? It’s because people told Facebook: we want to feel like we’re talking to our friends and family online, we want a more tailored experience. And people are joining groups. There was a famous Facebook ad at the Super Bowl last year that said, are you ready to rock? And it showed geologists and rock climbers and rock musicians: you can find your tribe in Facebook groups. On the positive side of the spectrum, you might see dog pictures or baby pictures, or you might be in a group of yoga moms. But there are also the conspiracy theorists and folks who are spreading disinformation. And as I said, I’ve been tracking this for a number of years; it has only gotten worse as Facebook has incentivised people to join and engage in these groups. Disinformers have recognised this as a perfect vector for amplifying disinformation. Basically, when you look at a group, this is a group of people who are already vulnerable to some degree, especially when we’re talking about fringy ideas like medical freedom, or a lot of the anti-vaccination groups that existed even before the COVID pandemic.
And they see the trust that exists in these groups, a community where people trust the sources of information, because again, it’s a smaller universe that they’re operating in; in many cases, they’ve operated there for the last two to three years. And this is a great place to just drop a link. You don’t even need to pay for advertising or micro-targeting, and it gains its own kind of legs, because Facebook again is incentivising people to engage with it. Damian touched on how, on Facebook, the most engaging content is often the most enraging and emotional content, and that’s very true of conspiracy theories. So, if something is doing really well in a group, and people are getting a lot of notifications for it, and there are a lot of comments going on, people are more likely to engage with it, because it’s going to be bumped up in their feeds; they might even get a special notification about it. This has really become a problem during the COVID-19 pandemic. It’s why we’ve seen so many groups shut down, everything from the Plandemic video to the Stop the Steal movement. And when you’re in these political groups, it’s not just that they’re talking about the insurrection at the Capitol and how they love Donald Trump, and that’s it. No, again, we have this conspiracy convergence, like spokes of a wheel: the core is that they’re all united by this trust in the group. You know, Susie from down the block added them to the group, and they trust Susie, so therefore everyone there is trustworthy. And the information there isn’t scrutinised as much as the information that might be on the regular news feed or in the mainstream media. So, what actions have been taken? Far too few, in my opinion. Obviously, we were in a unique situation here with a change of administrations and the transition to the Biden administration.
I think we’re seeing a little bit more attention being paid to disinformation right now. But obviously, this issue has been extremely polarised in the United States, and every time we have a hearing with social media executives on Capitol Hill, it becomes a place for sound bites rather than a place for a really full discussion of solutions and an understanding of how these platforms work. So we’ve not seen much progress in the regulatory environment, although I am hopeful that we will start to see some moves in that direction now that Congress is controlled entirely by one party. We’ve seen the social media companies take a few steps, in my opinion too little, too late; political advertising is back on as of today in the United States. We’ve seen Facebook just recently decide to remove some more COVID disinformation, where before it was just content related to the sale of medical equipment and PPE, which is crazy; you would have thought, a year into this, that we wouldn’t still be making these decisions now. And many of the other social media companies are not doing much better. But what has been inspiring for me is the citizen-based debunking efforts that have been going on. I love watching videos of doctors on TikTok debunking conspiracy theories that they see there, in a very compelling and down-to-earth fashion, bringing citizen science to the people. So we see civil society filling in some of the gaps. Obviously, we can’t rely on individual overworked doctors to debunk conspiracy theories that are being spread at an industrial scale, in fact many of them by our own elected officials, but that has given me a little bit of hope. What lessons can we learn?
The first and most important lesson of the past year for me, and this is one I have known for a long time, having worked in Ukraine, where the consequences of disinformation are on the streets every day when you’re walking around and seeing news from the front in the east, is that online harms have offline consequences. I think in the United States, we have been very reluctant to accept that fact. I did a hearing with the House Intelligence Committee in October, right before the election; not a single Republican showed up. And even among the Democrats who were gathered there to question me and the other witnesses, there was some incredulity about the idea that silly memes on the internet could cause offline harm. I think that now, with COVID hopefully starting to be in the rear-view mirror, and with what happened at the Capitol on January 6th, hopefully that is more real for many of our representatives: this isn’t just about silly memes; those silly memes do inspire offline action very, very often, more often than I think many of them want to believe. And that brings us to the importance of clear and transparent communication from our leaders, which is where I think the UK and the United States have diverged. I know there have been misgivings about how the government in the UK has handled some of the comms related to COVID, for instance how they communicated the tier system, but in the United States we’re talking about a clear divergence: the debate here has been over whether COVID is real or not, and we’ve had people in elected positions really giving rise to, endorsing, and in some ways laundering those conspiracy theories, giving them legitimacy. And I think that’s a real shame. We cannot fight disinformation, whether it’s coming from abroad or from inside the house, when our elected officials have decided to dispense with the importance of truth and facts.
And the third lesson is the importance of clear and responsible reporting. Even among reputable mainstream outlets, we have seen some really misleading headlines, given the nuance of this situation and the fear and uncertainty surrounding it. I think the media is learning a lot of lessons. Just the other day, for example, when President Biden announced that there would be enough doses of vaccine for all American adults by the end of May, a lot of people were celebrating, thinking, “Oh, we will be vaccinated by the end of May”. No, it’s purely a question of supply, and the headlines needed to be tweaked to reflect that. So I think there’s going to be some introspection among the journalistic community as well. And then finally, talking about regulation, I’ll end exactly where Damian did. You may be aware of Section 230 of the Communications Decency Act here in the United States. This has become a real lightning rod among folks talking about social media regulation. Essentially, it’s 26 words of a bill passed in the 1990s that gives internet platforms limited liability for the content that their users post. People on both sides of the political spectrum have been calling for either the amendment or the repeal of Section 230, so that we can hold social media platforms liable for the content that their users post. And as someone with a background in democracy promotion who has studied nascent attempts at internet regulation around the world, I am here to say that I do not support such a move; we could talk about amendment, but definitely not repeal. Essentially, what Section 230 does is allow users to express themselves. If Section 230 goes away, we will see a lot more clamping down on freedom of speech from the social media platforms.
It’s essentially what happened in Germany with the NetzDG law, where platforms are held liable for potentially illegal content that their users post, and in response platforms just started taking down content en masse in order to avoid fines. So, I think we need to think a little more creatively about this. Rather than trying to fit a square peg into a round hole, we need to think about new regulations that fit the internet era, rather than applying old telecoms principles to something that is entirely different, moving at a much faster speed, and posing far greater challenges than we faced before. So, I’m hoping to see some creativity from Congress; that’s not always what they’re known for. But one thing I think is certain is that we need to look at transparency and oversight as a first step. That’s what Damian and his committee did so well, and we learned so much from their investigations, everyone around the world did. And I’m hoping to see that sort of bipartisan effort coming out of Congress in the next couple of months and years. So, I’ll stop there. Thanks so much.
Dr Danny Steed 26:13
I’ve been frantically coordinating with Sean, because there’s been quite an avalanche of questions, Nina, while you were speaking. So, I want to get straight into this. Sean, can we bring Jim Clark in, please?
Jim Clark 26:34
Hi, there. Thank you for letting me ask a question. Really interesting talks. My two questions are: should Emmanuel Macron and the other EU leaders who disputed the vaccine’s efficacy be flagged for disinformation the same way Trump was? And, I’ve noticed that QAnon followers are hoping that Trump is going to be made president again today; what will they do when they are disappointed?
Nina Jankowicz 27:09
Damian, do you want to take the first one, and I’ll do the second?
Damian Collins MP 27:13
Sure. On the second, I think they’ll be as disappointed as they were when JFK Jr. didn’t come back from the dead to be Trump’s running mate in the election. They seemed to move on from that reasonably swiftly, but there was a huge campaign asserting it was about to happen; there were even Trump-JFK yard signs and things all over the place. With regards to what other leaders say, firstly, we must remember that Trump wasn’t the only world leader who got his content flagged; President Bolsonaro in Brazil was as well. And the challenge here, I think, is not just a leader saying something you disagree with; I don’t think it’s the job of a social media company to give an opinion on every controversial thing a leader says. The question is whether that leader is using their position to assert something that is not only plainly untrue, but that, by being said, could cause direct harm to their people. On the debate in Europe on AstraZeneca, I think what people like Macron and other EU leaders have said has been proven wrong by the success of the AstraZeneca vaccine in the UK, but they’ve given an opinion on it, and I don’t think that opinion itself is likely to cause direct harm. But I think any leader anywhere in the world should be prepared to see their content flagged by a social media company if what they’re saying is both untrue and likely to lead to harm.
Nina Jankowicz 28:38
I definitely agree with that. For the past three years, I’ve basically been clamouring for social media companies to implement their terms of service equitably across the board, because some people, members of Congress, world leaders, are getting away with spreading potentially extremely harmful disinformation with a large megaphone. Who’s to say that influencers, and even the little guys who then get retweeted or endorsed or shared by these folks, aren’t going to continue to do the same? We need to set those standards and expect that they’re going to be enforced. And unfortunately, across the board, whether we’re talking about disinformation, harassment, abuse or other online harms, we’re not seeing that right now. So, I come down in the same spot as Damian. Regarding QAnon, obviously, yes, they’re going to be disappointed again today, even though Washington has gone on high alert, thanks to some online threats and our 20-20 hindsight about physical threats against the Capitol and Congress in particular. One thing I’ve been talking about a lot, and have been disappointed by in the discourse surrounding QAnon: there’s a lot of desire among Americans in particular to point the finger and laugh at these folks. Something I really try to bring forward in my work is that there’s a reason people buy into conspiracy theories. Obviously, we’re in an extremely uncertain, fearful time; that’s part of it. They want to make sense of something that’s extremely complicated; that’s another part of it. But they might also feel, in many cases, left behind by politics and the media, and that’s why they found this community. Again, a lot of them are friends; they exist in these groups together. So, I’m really pushing for a lot more empathy when these events happen, as they probably will continue to do.
I’m pushing for us to reach out as people. If you have QAnon adherents in your life, don’t taunt them, but rather listen to them and understand what made them find QAnon in the first place, what they found appealing about it. One thing that seems to have been useful for people trying to bring their QAnon friends and family back from the brink is talking about that real-world harm: the instances in which QAnon supporters have perpetrated or nearly perpetrated acts of violence; the instances where these prophecies have not come true, over and over and over again; the financial motivation behind those who are involved with QAnon, which the Atlantic in particular has done some really great reporting on. All of this means coming to people not with debunking and fact-checking, but on a human level, and understanding that it’s going to take a long time; it’s not going to be instantaneous. It is like getting somebody out of a cult or an extremist situation, not just fact-checking them on disinformation. Approaching the situation with empathy is critical, and I hope that everyone who is considering having such a conversation does that.
Dr Danny Steed 32:01
Thank you. Thank you both. Can we get Robert Blakebrough next please?
Robert Blakebrough 32:10
Oh, yes. Thanks again for the opportunity. I can obviously understand what's being said here, and it is extremely worrying because of the power of all this new media. So, as my question says, will the panellists comment on how we can get a more reliable and broader range of opinion into the media and onto social media platforms? Easy question; very difficult answer, I suspect. Thank you.
Nina Jankowicz 33:01
Sure. A big part of countering disinformation is building up a better, more reliable, more robust, more trustworthy mainstream media environment. I very much admire the BBC system in the UK, and I think here in the United States we are woefully behind in terms of our public media. In my book, I cite a statistic, and I think it's a bit old now, but at one point in the past two to three years, I think 68% of Britons said they would trust the BBC in a time of crisis. That's an extraordinary statistic for any American to hear. Even though PBS and NPR, our public broadcasters, are among the most trusted media in the United States, they don't come close to 68%. And we only spend $1.35 per person per year on our public broadcasters. So, one thing that I'm clamouring for, in order to bring more nuanced perspectives to Americans, is much broader funding for public broadcasters, especially because in the United States they cover news deserts. You might be in South Dakota, where there might not be a local news station, but your PBS and NPR stations will bring you that perspective and cover what's going on in places that are far away but have a huge impact on the average South Dakotan's life. So that's extraordinarily important. In terms of how to bring more diverse perspectives to social media, we hear a lot from the tech executives that that's not really something they're interested in doing, based on their business model. Again, as Damian pointed out, you're going to see content that you're more likely to interact with, and anyone who's been on TikTok will know that, based on what you like and what you watch through to the end, you'll have either a feed full of cute animals, which is what I have, or a feed full of conspiracy theories, or something in between. It really is able to predict what you're going to see and interact with.
It's just not the business model of these companies to be feeding us diverse perspectives. And I'm not sure we should expect that of them; it would be a very different experience to be on social media platforms where you are constantly forced to interact with content that is very, very different from your own worldview, and I'm not sure that's necessarily productive. But what I do think is productive, and what the companies should be investing more in, whether through a tax, as has been floated in the UK, or purely out of the goodness of their hearts and the fact that they are multibillion-dollar corporations, is media literacy and civics. We often overlook the human component of all of these campaigns: the idea of building a healthy news diet and understanding how different parts of society are talking about these issues. It's something that I try to do, consuming a wide variety of sources. Obviously, not everyone has the time or studies these things like I do, but teaching your average person how to build that healthy diet when they're being bombarded with the sources they see on social media, understanding that their feed is created and curated in such a way as to feed their interest and emotion, and encouraging them to seek different perspectives elsewhere, is, I think, the most democratic way we could go about this, without asking the social media companies to curate feeds in a way that I personally would not like to see them do, because I don't think they'd make good choices based on what we've seen so far.
Damian Collins MP 36:48
I think one of the problems is that we've got systems that weren't necessarily designed for sharing news and political ideas, but which are now used for exactly that. If my experience of Facebook was principally taking an interest in Manchester United Football Club, engaging with other fans of that club, looking at goals scored by that club and buying the merchandise of that club, you wouldn't expect Facebook's response to be: "You seem to be very fixated on this one football club, and we think you should have more of a plurality of other clubs so you can determine whether there are other teams that are better than the team you support." That would be a really bizarre thing to do, and not in the interests of the user. In an environment like that, or with the "up next" feature on YouTube, if you're interested in a particular genre of music, being served more similar music may help you discover music you didn't know about, and all of that seems good. But when you put political ideas, disinformation and conspiracy theories into a system like that, which is principally designed to give you more of what you want to keep you on there longer, that creates a different sort of challenge. I think the issue here is with known bad content: how good are the companies at identifying information that has already been recognised as harmful and removed, and stopping it being reposted in the same context? How good are they at learning that if they're serving content to preteen girls that leads to self-harm, their systems shouldn't be actively recommending that content to similar users? Now, I believe they have got the data capability to do that, because this is basically using the same technology that's used to hold people's attention to recognise when harmful material is being directed at them as well.
Even if all they could do was say, "we may not be able to read in real time everything anyone posts on our systems, but when something starts to reach a certain audience or a certain scale, we can be good at making sure we're not actively recommending content that is clearly harmful, and we're removing material we've already identified as harmful from the system", that would matter. So would treating people who have YouTube channels with very large numbers of subscribers, or people who are the moderators of groups with hundreds of thousands of members, to a slightly higher standard, with a higher level of interest from the platforms in what goes on there. I think that's what we could see with the online harms system of regulation: a bit of a sliding scale, where a bigger intervention is expected in the places where the greatest amount of harm could be caused, either by a group with a massive following or by the platform actively amplifying something that is known to be harmful. That's where we should surely expect to see better intervention as well, and I think we can reasonably expect the companies to have the information they need to be more responsive there too. But the problem is that they didn't necessarily think it was their responsibility to do that, and that's the kind of mindset change that we need.
Dr Danny Steed 39:50
Thank you, Damian. I think we’re going to Debbie Abrahams next.
Debbie Abrahams 40:03
Thanks very much, this is very, very interesting as always. I wondered, first of all, if you wanted to comment on the work that's been going on around greater global coordination, so for example around research and monitoring. But also, and this is particularly to Damian: recognising that political parties use social media in our political campaigning, do you think that this is going to hamper the priority that regulation and enforcement really need?
Damian Collins MP 40:59
So, social media has become an incredibly important part of political campaigning, and I think there are a couple of important things here. One is that political parties campaigning under their own badge do so with great jeopardy for themselves: parties routinely get things wrong, they say things they shouldn't say, they make mistakes in their campaigns. But if their name is attached to it, they get called out for it; it provokes a debate about why did you say this, why did you make this claim, is that accurate or not? That's part of political debate. I think part of the problem we've seen with disinformation campaigns around politics is that it's not always obvious who's doing it. You see organisations that pop up and suddenly spend a large amount of money, where you don't really know where that money is coming from and who is promoting that particular message. If, let's say, someone created a deepfake film of a politician saying something they'd never said, and paid to advertise that on Facebook, currently, and as Nina said, particularly now that Facebook have removed their political ads ban, Facebook would simply say it is not their role to intervene. So even if someone has created something totally fabricated and used the ad tools to promote it, there's no intervention that can be made; the company won't intervene on that. And I question whether that is right. The other questionable thing about political ads in particular is the targeting tools that are being used, and there's a dispute between the tech companies on this.
So, Facebook have got a tool called Lookalike Audiences, where you can take a set of data about people who are your known supporters, create a lookalike audience of people who have never contacted you and whom you don't know, and use assumptions made about them to micro-target them with messages. Now, Facebook let you do that; YouTube don't. YouTube have got a similar tool called Customer Match, and they've stopped its use for political campaigns. I'm not sure political debate is well served by some of the micro-targeting around people's narrow interests, but I think ads, and political ads, on social media have a role to play. Maybe some of the targeting tools should be slightly more general, based on age and geography, on a constituency basis. And it should be more transparent who is paying for those ads and who's posting them. We have still yet to translate into law online the sorts of rules we have for offline campaigning: if I put a leaflet through someone's door, I have to say who's paid for that leaflet and which candidate it's there to promote. There's no obligation at all to do that online. I think there are reforms we can make which will create a healthier ecosystem for online campaigning. Politicians have always wanted to campaign in the places where people are, and one of those places is social media, but I think it needs to be done in a fair and transparent way.
Nina Jankowicz 43:51
Yeah, I would agree. And we've got a similar situation in the United States; I won't bore you with too many of the details, since I don't think they really affect this audience too much. But I will say that one of the things most disturbing to me is that political advertisements are allowed to target people based on race and socioeconomic status. So it's not just about your district if you're running for Congress. We have seen a lot of evidence, in fact from a Channel 4 documentary if I'm not mistaken, that Republicans in particular have been targeting Black voters, not to get out the vote, but to suppress the Black vote, because they knew that was a challenge for them in their particular constituencies. So that's extremely worrisome to me. On the global coordination part of things: I think it's improving, but the structures that exist are unfortunately a bit outdated. We have spent the last four or five years trying to create new structures, in many cases creating what I call the band-aid effect: everyone says, okay, here's this new structure that the EU and NATO are working on together to fight disinformation, and that's going to cure all of our problems. We've seen this happen in multiple international organisations; everyone has, to some extent, their own counter-disinformation programme, and joining them up is extraordinarily difficult. In fact, many of these organisations are competing for funding most of the time. In part, I think this is because the United States has been so absent on this issue. We have left a vacuum of leadership, because we do throw around a lot of aid and democracy-support programming around the world, and we have not really been addressing this to the extent that we should have. So I'm hopeful that that will change under the Biden administration, and that US convening power can push this forward.
In lieu of that, the UK Government has been doing a lot on this issue. They're very involved in the G7 Rapid Response Mechanism dealing with disinformation, which is about communicating about threats as they're happening among the G7 nations. There is a working group that was created through the Cabinet Office dealing with these issues, and there's been a lot of upskilling and training that the UK Government has provided for allied governments around Europe and other key constituencies around the world. So I think there is hope. Unfortunately, the gears of government and bureaucracy are not well suited to responding to these threats, with their multiple levels of sign-off needed and the way that, frankly, governments move incredibly slowly. So the global coordination effort can certainly be stepped up. One thing that I'm also pushing for, in case there are any donors and implementers in the audience, is that in order to reduce some of the competition we've seen among grantee organisations, we encourage and require of grantees that they work together, that there is some element of transatlantic coordination or cooperation, and that it's not just individual organisations replicating work that's ongoing around the world. We really need to be more joined up and cooperative to the extent that is possible here.
Dr Danny Steed 47:09
Nina, thank you. I'm just spying your dog making a cameo in the corner; not sure if anyone else spotted it. Now, I'll just read out this question from the chat channel; it's not addressed to either of you in particular. Are current disinformation challenges best seen as appeals to outsiders, or just the so-called losers from globalisation, rather than appeals to the tribal left or right? If so, can the mainstream left and right work collaboratively in the national interest?
Damian Collins MP 48:29
I don't believe it's a question of people who have been the losers of globalisation, or the losers of the financial crisis, who in desperation look for disinformation. I don't think that at all. Actually, if you look at the people who attend QAnon events, or the sort of people who took to the streets of Washington on the sixth of January in protest at Joe Biden being sworn in as president, there was quite a diversity of incomes, backgrounds and interests reflected there. I think it is more widespread than that. There is a growing message that people have bought into for different reasons: you can't trust the government, you can't trust experts, you can't trust institutions, you can't trust the mainstream media; you have to seek out the truth in these forums that exist on social media, and the more you engage with them, the more you get this totally different worldview. Now, people have always been drawn to conspiracy theories and disinformation; that has always been the case. But we've never had a situation before where access to that sort of information is so widespread and so easy, where it's possible to gather a group of people with a propensity to believe something together, in large numbers and in spaces online, from where they can be effectively radicalised and spread the virus of disinformation even further. So the tools that have been created have become a really important part of the way this network effect has been created. Two hundred years ago, someone with a conspiracy theory might have stood on Hyde Park Corner and told the world about it, but the amount of damage they could do was relatively small. That's very different now.
And that's why part of the challenge is not just takedown. When we first started the select committee inquiry, I started out thinking that disinformation was a bit like other forms of bad content we saw online, like pirated, copyrighted material: this stuff is bad, so why can't the companies be more effective at taking it down? And of course, takedown of material that's clearly harmful and hateful is important. But it's the amplification effect that the tools of these companies allow that I think we should focus on the most, and I think that is what is causing this phenomenon.
Nina Jankowicz 50:56
Yeah, I would agree with that. And on the question of cross-party coordination, this is something that has been really difficult to watch from here in Washington. We've seen both political parties use disinformation as a political cudgel to serve their own momentary ends, and I think that has been to the detriment of democratic discourse here in the United States, and of the general awareness and understanding of these issues and how they trickle down to affect real people in the real world, not just online. This is something I repeat in every congressional testimony I do: disinformation is not a partisan issue, it is a democratic one. The sooner we can see that sort of coordination and cooperation across the political aisle, and example-setting, as I was talking about before, from our political leaders who set the tone of the discourse in the country, the sooner we'll finally be able to move forward on these issues: dealing with social media regulation, as Damian was just talking about, and updating the campaign regulations so that disinformation is not incentivised. Because right now, in the United States campaign environment, there's no real reason not to use disinformation; it seems to work. All of these things are incredibly important, but can't really be achieved without cross-partisan, bipartisan cooperation.
Dr Danny Steed 52:25
Wonderful, thank you. I'm sure we've got time for a couple more coming through. Can we go to Joy Wolfe? I know you've put in a couple down there, but would you mind just leaving us with your favourite question?
Joy Wolfe 52:44
My favourite question, I think, is probably whether one of the reasons for the misinformation taking root was that there was so little proper information coming out from official sources at the beginning. There was, as I say, a lot of misinformation circulating, and not nearly enough good communication, which really allowed the bad information to flourish.
Nina Jankowicz 53:19
I definitely would agree with that. That has been one of the big problems here in the United States. If you look at some of the countries that have done a good job communicating, in particular New Zealand, and to some extent Australia; even Germany, which is not doing so well at the moment, had been doing a lot better during the pandemic, and at least they got to see a summer, unlike us here in the United States. Clear, consistent communication that is nuanced, but still has clear messages for people to take home about their own behaviour, is something that we've missed in the United States, certainly, and it is refreshing to start to hear that again from our leaders since January 20.
Damian Collins MP 54:01
As I said earlier on, with the vaccine, the most effective tool so far to deal with anti-vaccine conspiracies has been to constantly reassure people: every day, here are the people who've been vaccinated, these are the vaccination figures, this is the impact it's having on public health. Seeing high-profile people endorse it, different sorts of voices that different audiences will trust, is very important. But you're absolutely right: something like COVID was a kind of gift for the disinformation movement, because you have something new where even the most official sources don't necessarily know all the facts, and where there's ambiguity, that's where conspiracies can thrive. As Nina said, it's not just about trying to tackle disinformation and expose it as disinformation; it's also about making sure that good, accurate public information is getting out there at scale as well.
Dr Danny Steed 55:00
Thank you both. I think we’ve got time just to squeeze one last one in. Can we go to Luke, please?
Luke Bliss 55:14
Just for both of you. Thank you very much for a really interesting talk. I was just wondering: across quite a lot of larger media companies, you're seeing fact-checker articles on the more controversial statements by larger figures. Do you think there's a policy space for these to be implemented by central governments, having a fact-checker department, or would that just be seen as entirely unworkable and too open to partisan abuse? An administration comes in, packs that department full of people who are ideologically aligned with the administration, and uses it as a kind of silencing tool, especially in the more vulnerable state the conversation around freedom of speech is in. Thank you.
Nina Jankowicz 56:10
Yeah, so in my book I actually detail a couple of instances where governments have tried to do this. One of them has been in the Czech Republic, surrounding anti-migrant disinformation: the Ministry of the Interior there stood up the Centre Against Terrorism and Hybrid Threats, which, among other duties, is set to fact-check statements in the public domain relating to their portfolio and immigration, and it met a lot of criticism from the general public, and even from within the government. In general, I do not think it is really government's duty to be the fact-checker for the general public; to the extent possible, this should be done by civil society organisations and by the media. And while I think there is a certain necessity for media and civil society to play this role, in terms of the efficacy of turning around disinformation, the research is extraordinarily mixed: it seems that most of the people who are reading and consuming these fact-checks are folks who already have good media consumption habits and would not be taken in by disinformation. The question is how we reach the people who are taken in, and a fact-check, as I explained before, usually isn't the way with extraordinarily emotionally based disinformation. So, certainly not government's role, and we need to think about how to harness the tools of civil society and the media to reach the vulnerable populations.
Damian Collins MP 57:43
I agree with that. I think it is difficult for governments. On financial data, the Government set up the Office for Budget Responsibility to try to give impartial data on the economy at the time the Budget is presented, but people don't necessarily believe what the OBR says, or they think its predictions are wrong. It's difficult for governments to be judge and jury on facts; indeed, in political discourse you can often get two politicians arguing about the same issue, quoting facts that suit their argument, all perfectly valid, but painting a totally different picture of what's going on. That's always been the case. I think there is a role for independent civil society groups, the media and politicians themselves to call out things that are clearly not true when they're being said, where evidence can be provided very quickly to demonstrate that a conspiracy or myth being circulated widely is making a claim that is clearly not true. And we need to think of the near future. I mentioned deepfakes earlier: we may soon have a problem where things can be very easily and credibly presented to people on social media, based not just on theories that may not be true, but on scenarios that never happened. What lever will we pull when someone is circulating a fake film or fake image which is highly inflammatory, which could lead to people committing violent or harmful acts, or which smears a politician in an election based on something that never happened? What will our response be then? Because that day will be with us far sooner than we realise.
Dr Danny Steed 59:24
Thank you both. I'm loath to hit the brakes, I really am, but I know you two are very busy people with plenty to do as well. So, I'm going to wrap it up here. Thank you both so much for your remarks, and to all of the audience for making it such a lively Q&A; I think we've covered some fascinating ground here today. Just to let you know of the next event here for the Cyber Centre: we'll be convening again on the 18th of March, on the role of underwriting risk in the cyber insurance markets, and what the cyber insurance market has been trying to do [inaudible] and they have this fright of, oh dear, we're sat on how many hundreds of millions in policies, and what happens when it all goes wrong. So, join us then, and otherwise I hope you all have an enjoyable rest of your day. Thank you and good afternoon.