Online Radicalisation and Algorithms, written by Isaac Kfir, looks at how extremists exploit platforms’ algorithms to radicalise vulnerable people. It finds that the algorithms used by social media platforms aid extremists and lead more people towards self-radicalisation: it is the technology itself that promotes these messages.
The internet has radically changed every aspect of society, and radicalisation is no different. The internet exists to enhance interaction, bring communities together and make information easier to acquire, and social media platforms are where most of these interactions and connections take place. This paper looks at how, since the early 2010s, extremists have become more media-literate and tailored their content to attract more people, using the tools of the internet to reach wider audiences.
These tools include:
- Harnessing algorithms that promote the most-clicked stories: stories that inspire disgust, fear and surprise.
- Creating communities where people’s viewpoints are affirmed and ‘facts’ go unquestioned, sustaining a closed ‘information loop’.
- Gaming the system by generating clicks and likes from these communities, which signals the algorithm to push this ‘popular’ content to more users. Around 70% of the content watched on YouTube comes from algorithm-driven recommendations.
- Sharing more extreme content privately, or on secondary platforms such as Telegram, once people have been drawn in on mainstream platforms.
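The gaming described above can be illustrated with a deliberately simplified sketch. This is a toy model, not any platform’s actual system (real recommenders are far more complex and proprietary); it only shows how a ranking driven purely by engagement counts can be shifted by a small coordinated community:

```python
# Toy model of engagement-weighted ranking (illustrative only; real
# recommender systems are far more complex and proprietary).

def rank(posts):
    """Order posts by a naive engagement score: clicks + likes."""
    return sorted(posts, key=lambda p: p["clicks"] + p["likes"], reverse=True)

posts = [
    {"id": "news",   "clicks": 500, "likes": 120},
    {"id": "howto",  "clicks": 300, "likes": 200},
    {"id": "fringe", "clicks": 40,  "likes": 10},
]

print([p["id"] for p in rank(posts)])   # fringe content ranks last

# A small coordinated community inflates the fringe post's signals...
posts[2]["clicks"] += 600
posts[2]["likes"] += 300

print([p["id"] for p in rank(posts)])   # fringe content now ranks first
```

The point of the sketch is that nothing in the scoring function asks whether the content is true or harmful; popularity alone decides what is promoted, which is exactly the weakness the report describes.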
Historically, studies have shown that radicalisation is a slow process, one that requires a physical community of like-minded individuals. Now, however, such a community can be accessed from anywhere in the world. Vulnerable people and children are especially susceptible to the promotion of extreme material. The report’s author recommends more community education to give people the tools needed to discern facts from falsehoods.
The issue of people self-radicalising without ever setting foot outside their own home is a real one. It is essential that the algorithms which shape our online spaces are free from manipulation and abuse, and that they move beyond promoting what is most popular towards what is true.