The Centre for Abuse and Trauma Studies (CATS) highlights the damaging impact that child sexual abuse material can have on content moderators

The dark corners of cyberspace can play host to some of the most distressing images imaginable, including child sexual abuse material (CSAM), which is highly damaging not only to the children depicted in those images and videos, but also to those who review and remove such content. Unfortunately, as digitalisation and online connectivity have increased, so has the pressure on the tech industry, law enforcement agencies and other organisations to police the internet.
The growth in online sexual offending has had significant implications for those tasked with responding to such crimes. A recent online event hosted by the Centre for Abuse and Trauma Studies (CATS) at Middlesex University featured leading academics, practitioners and frontline responders, who all emphasised the urgency of addressing the impact that exposure to CSAM and other potentially traumatic material has on the mental health of content moderators.
The aim was to share the preliminary findings of a project entitled ‘Invisible Risks: Content Moderators and the Trauma of Child Sexual Abuse Materials’, which has been conducted by CATS over recent years with support and funding from the Tech Coalition and End Violence Safe Online Initiative.
In this blog, the event chair and Middlesex University criminologist Dr Elena Martellozzo, with criminologist Dr Paul Bleakley, University of New Haven, highlight some of the key points raised in the webinar and share insights that may contribute to improving well-being for content moderators, no matter what type of organisation they may work for.
The first speaker of the event was child psychiatrist Dr Richard Graham, who has worked for many years to support positive mental health in children and young people, and in those who work with them, and who served as a consultant on the ‘Invisible Risks’ project. Dr Graham invited the audience to reflect on the meaning of occupational health in the content moderation field and, particularly, on the concept of burnout. He suggested that many of the challenges experienced in content moderation are not entirely new, referring back to the 1960s and 1970s, when the swift growth of international travel meant air traffic controllers were suddenly faced with inadequate equipment, changing shift patterns, long shifts without breaks, the stress and fatigue those shifts produced, the monotony of automated work, and the challenges of using new technology. Because of these rapid changes in working conditions, many felt burnt out and, tragically, this poor state of well-being even contributed to several mid-air collisions. Dr Graham urged that this scenario is “what we need to be thinking about when we’re looking at harms in the online world.” He continued: “understanding the impact of the work upon content moderators may have come late, but it is not too late to make changes, because we need these professionals to be able to work well and flourish.”
The next speaker, Denton Howard, CEO of the International Association of Internet Hotlines (INHOPE), referred to content moderators as “first digital responders who put out the fire that is CSAM.” He argued that no matter how cutting-edge the technology might be, the human eye is still instrumental in assessing potentially harmful online content and making the final decisions about how to respond to it. He stressed: “while the perception is that we’re technology driven and there is lots of fancy machines that do that [content moderation], behind that [technology] are the people, and unless we take care of the people, the machines won’t work and we won’t achieve what we’re supposed to do.” Whilst artificial intelligence continues to develop, Howard said it is not yet at the stage where it can make clear distinctions when assessing an image, or determine what response that material warrants.
However, there is substantial evidence to suggest that constant exposure to traumatic images can affect content moderators’ wellbeing, mental health, and quality of life. It can also have knock-on effects on their professional competence, satisfaction, and productivity. These arguments were supported by panellist Andy Briercliffe, an online child safety specialist who spent more than 20 years in British law enforcement investigating various types of serious internet crime. Andy shared some of his personal experiences and outlined the impact that viewing harmful content may have on content moderators’ lives, including (but not limited to) burnout, secondary traumatic stress, compassion fatigue, and vicarious trauma. As Briercliffe noted, if we want content moderators to act as ‘fire fighters’ and ‘put out’ CSAM effectively, it is important that they are appropriately supported, and that we understand their coping strategies and the organisational factors which may assist or hinder the efficacy of the coping mechanisms they use.
Several of the same issues were raised by Dr Ruth Spence, a researcher in psychology at CATS who also served as project manager of ‘Invisible Risks’. Following on from Briercliffe, Dr Spence presented some of the study’s preliminary results, based on a survey of 212 content moderators and in-depth interviews with a further 11. She explained that more than a third of content moderators (34.6%) showed signs of moderate to severe psychological distress, a level which would ordinarily result in a referral for treatment, while more than half (58.5%) showed signs of mild-to-low psychological distress; in all, 93.1% of content moderators exhibited indicators of distress at some level.
Dr Spence went on to observe that a high proportion of content moderators also experienced secondary trauma, characterised by intrusive thoughts, avoidance, hyperarousal (e.g., sleep disturbance, hypervigilance), and other cognitive and/or emotional effects. In spite of these profound challenges, Dr Spence also noted that content moderators seemed to cope relatively well with this distress, something the ‘Invisible Risks’ project seeks to explore further in order to determine appropriate intervention strategies.
Paula Bradbury, a criminologist and senior researcher with CATS, presented the project’s findings on leadership in the content moderation industry, which highlighted the need for a leadership environment that is empathetic, appreciative of the work of content moderation, and understanding of the challenges it involves.
Ultimately, Paula said, content moderators did not want to be part of a profession that is hidden: they called for more networking opportunities that would allow the field to professionalise, more training opportunities, and more direct feedback on their job performance from leadership. She added that content moderators also emphasised the importance of building “the right team”, which they defined as one in which they could rely on their teammates and felt comfortable being open with them about their experiences and concerns. Paula noted that most content moderators interviewed referred to their professional colleagues as a “second family”, bonded through the shared challenges faced in the course of their work. This second family was seen as central to the coping strategies moderators employed and serves as a vital informal support system.
Professor Antonia Bifulco, co-founder and director of CATS, closed the event, highlighting that when people hear about content moderation, they often assume it all depends on complex technology, such as AI.
However, she continued, “this work is about people and it is important to ensure they are kept in the picture, as they are the people behind the technology that protects us from seeing disturbing content.”
As mentioned over the course of the event, practitioners are working with material comparable to toxic waste, and Professor Bifulco added that “this work can involve moral injury. It’s about material that really offends your values and sense of what’s right and wrong.”
It is this type of material that content moderators face on a daily basis and, as the early results of the ‘Invisible Risks’ project suggest, it may have a significant (and hidden) impact on their well-being and psychological health. With a greater understanding of how the role of content moderator affects those who perform it, we will be better placed to offer evidence-based support to the industry, allowing content moderators to do their jobs effectively in a way that lessens the personal risk of harm.
About the authors
Dr Elena Martellozzo

Criminologist Dr Elena Martellozzo is an associate professor in Criminology at Middlesex University and the associate director of CATS, with extensive experience in researching online harms and abuse against women and children. Elena has acted as an advisor on child protection to governments and practitioners in Lithuania, Italy and Bahrain, supporting the development of national child internet safety policy frameworks, and has led and co-led research projects both in the United Kingdom and internationally.
Recognised as one of the world’s leading experts in criminology, she focuses primarily on online harms, online violence against children and women and online safety. She is an active member of the UK Centre for Internet Safety (UKCIS) and an expert advisor for EIGE on Cyber violence against Women and Girls, mapping national policies, research, data, and definitions on cyber violence against women and girls across the EU-27.
Elena regularly delivers expert training on online safety and prevention to professionals working in the online safety and sexual health fields. A published peer-reviewed author, she regularly appears on national TV and in print media.
Twitter: @E_Martellozzo
Dr Paul Bleakley

Dr Paul Bleakley is an assistant professor in Criminal Justice at the University of New Haven in Connecticut, USA, with expertise in policing, corruption and abuse studies. He has a particular focus on historical criminology, especially the use of historical research methods to examine police corruption and cold case crimes. Dr Bleakley has written two books on this subject, Under a Bad Sun: Police, Politics and Corruption in Australia (2021) and Policing Child Sexual Abuse (2022). He has published his research in a range of leading journals such as Criminology & Criminal Justice, Critical Criminology, Deviant Behavior, Policing, and Criminal Justice Studies.
A former journalist, Dr Bleakley previously worked for Middlesex University.
Twitter: @DrBleaks