Search results

Your search found 6 items
Date: 2026
Abstract: This research investigates how recommender algorithms on TikTok and Rumble expose UK minors to antisemitic content. Analysts created 10 TikTok profiles representing 15-year-old users with varied political and cultural interests: neutral interest in the Israel-Palestine conflict, left- and right-wing political interests, male lifestyle influencer content, far-right content, and two neutral accounts. The profiles were steered towards topics relevant to each interest through an hour and a half of manual content viewing, followed by content engagement via a bespoke bot over 14 days, resulting in over 5,500 recommended videos. Thematic analysis clustered content into 10 core themes, revealing pathways from neutral lifestyle content to highly politicised and conspiratorial clusters. Relevant themes were manually reviewed, revealing that harmful content persisted through videos, comments, and TikTok’s sticker and sound features, illustrating systemic gaps in safeguarding minors. On Rumble, analysts collected 4,412 videos from the platform’s “Editor’s Picks” over six months, filtered them for antisemitism-related keywords, and reviewed the 259 videos flagged as potentially relevant to antisemitism. Findings show Rumble hosts more overt antisemitic content than TikTok, including slurs, Holocaust distortion and conspiracies about Jewish control. These findings underscore urgent gaps in platform accountability and the need for robust enforcement of the Online Safety Act to protect children from the normalisation and mainstreaming of antisemitic content.
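The Rumble keyword-filtering step described in the abstract can be sketched in a few lines. This is a minimal, hypothetical illustration: the report does not publish its keyword list or data format, so the terms, field names, and helper function below are assumptions, not the analysts' actual pipeline.

```python
# Sketch of keyword-based triage: from a large pool of collected videos,
# keep those whose text fields match any term on a (hypothetical)
# antisemitism-related keyword list, for subsequent manual review.
# KEYWORDS, field names, and flag_for_review are illustrative assumptions.

KEYWORDS = ["example-term-1", "example-term-2"]  # placeholder terms

def flag_for_review(videos, keywords):
    """Return videos whose title or transcript contains any keyword (case-insensitive)."""
    flagged = []
    for video in videos:
        text = " ".join([video.get("title", ""), video.get("transcript", "")]).lower()
        if any(kw.lower() in text for kw in keywords):
            flagged.append(video)
    return flagged

videos = [
    {"title": "Cooking show", "transcript": "recipes"},
    {"title": "News example-term-1 clip", "transcript": ""},
]
print(len(flag_for_review(videos, KEYWORDS)))  # 1 of 2 flagged
```

In the study, a step like this reduced 4,412 collected videos to 259 candidates; the keyword pass only narrows the pool, and the flagged items were then manually reviewed to confirm relevance.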
Date: 2025
Abstract: Since long before the October 7 attacks, Jewish communities in Europe have experienced growing hate, harassment and hostility on social media. This policy paper articulates the key challenges of online antisemitism, and provides comprehensive and practical policy steps which governments, platforms, regulators and civil society organisations can take to address them. Built through 42 interviews with Jewish organisations and experts in antisemitism and digital policy from across CCOA’s five geographies (France, Germany, Italy, Poland and Sweden), it collates local experiences and channels them into a cohesive pan-European strategy, uniting communities and sectors in joint responses.

Interviewees identified five central challenges with online antisemitism:

Jewish communities and organisations across the five geographies report the significant behavioural, social and psychological impacts of online antisemitism, which have created a chilling effect on participation in public life.
Concerns exist not just over fringe violent extremist content, but the prevailing normalisation of mainstream antisemitism and a permissive culture which facilitates its spread across all areas of society.
There is a wide range of social media platforms in the social media ecosystem, each adopting distinctive approaches and standards to content moderation; however, the widespread accessibility of antisemitism suggests that significant barriers remain to the effective implementation of Terms of Service, and that many platforms are failing in this regard.
There is limited awareness and understanding of the Digital Services Act (DSA) in Jewish civil society, little capacity to implement it, and a lack of confidence in its efficacy in addressing antisemitism.
Law enforcement has lacked both the capacity and legislative tools to effectively respond to the scale of illegal activity on social media.
Mainstreaming Digital Human Rights
This policy paper presents policy recommendations for Governments, Tech Platforms, Digital Regulators, and Civil Society. These approaches constitute a collective pathway, but may apply differently across geographies, communities and jurisdictions.

Editor(s): Rose, Hannah
Date: 2024
Date: 2021
Abstract: Since the beginning of the Covid-19 pandemic, the economic uncertainties and anxieties around the virus have been weaponised by a broad range of extremists, conspiracy theorists and disinformation actors, who have sought to propagandise, radicalise and mobilise captive online audiences during global lockdowns. Antisemitic hate speech is often a common feature of these diverse threats, with dangerous implications for public safety, social cohesion and democracy. But the Covid-19 crisis has only served to exacerbate a worrying trend in online antisemitism. A 2018 Fundamental Rights Agency survey on Experiences and Perceptions of Antisemitism among Jews in the EU found nearly nine in ten respondents considered online antisemitism a problem, and eight in ten had encountered antisemitic abuse online. This report, conducted by the Institute for Strategic Dialogue (ISD), presents a data-driven snapshot of the proliferation of Covid-19 related online antisemitic content in French and German on Twitter, Facebook and Telegram. The study provides insight into the nature and volume of antisemitic content across selected accounts in France and Germany, analysing the platforms where such content is found, as well as the most prominent antisemitic narratives – comparing key similarities and differences between these different language contexts. The findings of this report draw on data analysis using social listening tools and natural language processing software, combined with qualitative analysis. Covering the period from January 2020 until March 2021 to build insights into the impact of the Covid-19 pandemic on online antisemitism, the study used the International Holocaust Remembrance Alliance (IHRA) working definition of antisemitism to identify channels containing antisemitic content, before developing keyword lists to identify antisemitic expressions widely used on these channels.