By: Guy Fiennes
Oxford Disinformation & Extremism Lab
The UK’s experience of the 2024 Southport riots is broadly comparable to the event-based disinformation deluge on Israeli social media post-October 7th. In both cases, an opportunistic ecosystem of far-right activists, anti-Muslim/racist actors, conspiracists and hostile state interference contributed to misinformation and incitement. Although the Israeli case was more severe and societal tensions further entrenched, small fact-checking organisations Bodkim (‘Let’s Verify’) and FakeReporter fought the spread of mis- and disinformation and incitement. Their approach – which notably includes crowdsourcing to identify potentially false or inciting content to be addressed by OSINT experts and analysts – provides a valuable model for safeguarding democracy, one that could be emulated by the UK and other countries.
Background
Between July and August 2024, misinformation spread on social media that the perpetrator of a stabbing incident in Southport, UK, was a Muslim asylum seeker named Ali al-Shakati. The misinformation was amplified by what has been described as a “Russian-linked fake news outlet”. It was further boosted by Russian state media, as well as by domestic and international extremists commonly labelled as ‘far-right’, ultimately contributing to unrest across the UK in the days that followed.
Misinformation and posts using hyperbolic and xenophobic language were further propagated by a range of high-profile accounts, some belonging to known public figures and others showing signs of coordinated inauthentic activity.
FakeReporter and Bodkim: Fact-checking Disinformation in the Israel-Gaza Conflict
Following Hamas’ attack into Israel on 7 October 2023, Israeli far-right extremists took to social media to organise and incite violence against local Muslim and Arab minority communities, contributing to and exploiting the misinformation and conspiracy ecosystem fuelled by the crisis across platforms. Misinformation demonised Arab citizens of Israel and other elements of Israeli society or cast them as part of a larger conspiracy, increasing the likelihood of political violence and fuelling appetite for more severe state measures against Palestinians in the region.
Fig 1. Russian state media and a far-right account amplify misinformation.
October 7 exacerbated societal tensions and the volume of domestically produced misinformation and hate circulating online, providing a wealth of material for bad actors, including hostile state actors, to exploit. Previous online inauthentic coordinated behaviour linked to Iran had already promoted misinformation about elections, impersonated public figures and journalists, and promoted hate against minority groups.
Two fact-checking organisations, FakeReporter and Bodkim, expanded their existing operations to cope with the sharp increase in mis- and disinformation online post-October 7. The organisations use crowdsourced digital activism: members of the public who see malicious or misleading content can send it to the organisations for review by experts. Once content is analysed, the organisations share updates on suspect or debunked material on their social media (Facebook and X) to inform the public, and report it to platforms.
FakeReporter is better known internationally, producing longer-form investigations and reports, and tends to focus on far-right activity and coordinated inauthentic behaviour, including exposing Iranian, Russian, and Israeli online operations. In contrast, Bodkim typically features one-off fact-checking of claims made by domestic public figures or of mis- and disinformation present on news channels and social media. However, their work frequently overlaps.
Fig 2. Bodkim fact-checks Israeli politicians and news channels. (Left) News channel misleadingly claims that a Palestinian Authority employee was involved in terrorist activity. (Right) Bodkim adds context to an otherwise misleading statement from a prominent Israeli politician.
These organisations play a key role on the frontline against hate and disinformation on social media in the Israeli context, exposing and removing hostile state disinformation, domestic misinformation, and hateful incitement. Crucially, they constitute a bridge between the public, tech platforms, civil society, and government bodies, grounded in their aligned interest in preventing the spread of mis- and disinformation and hateful incitement in their communities.
Crowdsourced Digital Activism
Crowdsourced digital activism has previously been wielded cynically by state actors against their critics, as with the controversial Act.il or its successor 4.il. However, while still appealing to the patriotic instinct of netizens, the crowdsourced digital activists in this context form a grassroots protection against misinformation and incitement, rather than contributing to aggressive online nationalism. In the chaos of and immediately after Hamas’ October 7 attack, these organisations and their participants acted as a buffer against the activities of hostile states, far-right extremists, and conspiracists who threatened to exacerbate tensions and inflame violence.
Fig 3. Alert for a hostile foreign state operation (left) calling for crowdsourced assistance, and disinformation intended to incite against the Arab Israeli minority (right), exposed by FakeReporter and Bodkim.
Although the Israeli context is more contentious than the UK’s, there is a similarity in the threat. Both communities are confronted by an opportunistic alignment of hostile state actors and far-right activists, who contribute to disinformation on social media that can incite against minorities and nurture tensions along pre-existing ethnic, religious, and political lines.
Fig 4. FakeReporter’s website
The bridging between concerned digital citizens and small groups of fact-checkers combines the wisdom of crowds and experts into a novel model of digital democratic resilience against extremism, state disinformation operations, conspiracies, hateful incitement, and political disinformation.
Fig 5. FakeReporter posts warnings against hostile state activity and domestic extremists.
Learning from Other Contexts over Waiting to React
When the Kremlin made its digital assault on America’s democracy in 2016, Estonia had already been educating its youth about online disinformation for six years, having previously been subject to Russian disinformation operations. On 10 August 2024, after the Southport riots, the UK education secretary announced that the country would begin to “arm our children against the disinformation, fake news and putrid conspiracy theories awash on social media.” This tendency to adopt proactive policies only after a local crisis suggests that democracies are waiting for the crisis to strike rather than acting preventatively on the basis of other countries’ experiences.
Multicultural liberal democracies protect free speech, do not have direct control over tech platforms and the media, and contain populations with diverse and often polarised beliefs and identities. Consequently, they are particularly vulnerable to bad actors who seek to exacerbate societal tensions and spread disinformation. The Israel-Hamas war, the war in Ukraine, polarised politicking in the US, and increasing activism against migrants, religious minorities, and the LGBTQ+ community both online and off provide fertile material for hostile state actors, conspiracists, and their varied allies of convenience, who seek to undermine social cohesion and, at worst, accelerate the collapse of the liberal democratic system. In this context, FakeReporter and Bodkim should be recognised as a valuable model for democratic digital resilience, to be emulated ahead of the next event-based disinformation crisis.
Guy Fiennes is a multilingual MENA specialist who has written about disinformation, far-right extremism, and gender. He was previously the Project Coordinator for the NATO & the Global Enduring Disorder Project while it launched the Disorder Podcast. He holds an MPhil in Modern Middle Eastern Studies with Persian from the University of Oxford.