Moderators protect us from the worst of the internet. That comes at huge personal cost
31st October 2024
Content moderators play a crucial role in safeguarding online spaces by filtering harmful and extreme content. However, this responsibility comes at a heavy emotional toll due to the graphic nature of the material they encounter.
This article was originally published by The Conversation.
By Alexandra Wake, Associate Professor, Journalism, RMIT University
Unless you’re a moderator for a local community group discussing garbage collections or dog park etiquette, you are unlikely to fully understand the sheer volume and scale of abuse directed at people online.
But when social media moderation and community management is part and parcel of your daily work, the toll on people and their loved ones can be enormous. Journalists, often early in their careers, can be on the receiving end of torrents of abuse.
Read more: Digital “preparedness is vital” – ABC’s Social Media Wellbeing Advisor
Many are reluctant to report that abuse or to seek support. For those who come from culturally or linguistically diverse backgrounds, that reluctance to report can be even higher than for other colleagues.
There’s growing employer concern about how moderating confronting content can affect people’s wellbeing. Employers also have a duty to keep their staff safe at work, including online.
The ABC wanted to understand what this looked like in practice. Its internal survey data shows just how bad the problem has become for moderators who are employed to keep audience members safe when contributing to online discussions.
What did the ABC find?
In 2022, the ABC asked 111 staff who were engaged in online moderation as part of their jobs to self-report the frequency of exposure to potentially harmful experiences.
First, it was important to understand just how long people were spending moderating content online. Of those who moderated content every day, 63% said they did it for less than an hour and a half, and 88% moderated for less than three hours.
The majority of staff surveyed saw potentially harmful content every week.
71% of moderators reported seeing denigration of their work weekly, with 25% seeing this daily.
Half reported seeing misogynistic content weekly, while more than half said they saw racist content weekly.
Around a third reported seeing homophobic content every week.
In the case of abusive language, 20% said they encountered it weekly.
It’s a confronting picture on its own, but many moderators see more than one type of this content at a time, which compounds the toll.
It is important to note the survey did not specifically define what was meant by racist, homophobic or misogynistic content, so those categories were open to interpretation by the moderators.
A global issue
We’ve known for a few years about the mental health problems faced by moderators in other countries.
Some people employed by Facebook to filter out the most toxic material have gone on to take the company to court.
In one case in the United States, Facebook reached a settlement with more than 10,000 content moderators that included US$52 million (A$77.8 million) for mental health treatment.
In Kenya, 184 moderators contracted by Facebook are suing the company over poor working conditions, including a lack of mental health support. They’re seeking US$1.6 billion (A$2.3 billion) in compensation.
🚨NEW:
At Facebook’s external content moderation facility in Africa, employees describe traumatic working conditions, alleged union-busting, and pay as low as $1.50 per hour, my investigation found.
🧵Thread (1/) https://t.co/wGHOVWVfyM
— Billy Perrigo (@billyperrigo) February 14, 2022
The case is ongoing and so too are other separate cases against Meta in Kenya.
In Australia, moderators during the height of the COVID pandemic reported how confronting it could be to deal with social media users’ misinformation and threats.
A 2023 report by Australian Community Managers, the peak body for online moderators, found 50% of people surveyed said a key challenge of their job was maintaining good mental health.
What’s being done?
Although it is not without its own issues, the ABC is leading the way in protecting its moderators from harm.
It has long worked to protect its staff from trauma exposure with a variety of programs, including a peer support program for journalists. The program was supported by the Dart Centre for Journalism and Trauma Asia Pacific.
But as the level of abuse directed at staff increased in tone and intensity, the national broadcaster appointed a full-time Social Media Wellbeing Advisor. Nicolle White manages the workplace health and safety risk generated by social media. She’s believed to be the first in the world in such a role.
As part of the survey, the ABC’s moderators were asked about ways they could be better supported.
Social media journalists, especially moderators and live bloggers, are like firefighters on the digital frontline, but putting out online fires is not without risks.
Hear from @DartAsiaPacific‘s Erin Smith and @Reporting4Work in their MPC column: https://t.co/bxW3QEZywl pic.twitter.com/dbFBd5UPyR
— Melbourne Press Club (@MelbPressClub) December 7, 2023
Turning off comments was unsurprisingly rated as the most helpful technique to promote wellbeing, followed by support from management, peer support, and preparing responses to anticipated audience reactions.
Turning off the comments, however, often prompts complaints from at least some people that their views are being censored. This is despite the fact that media publishers are legally liable for comments on their content, following a 2021 High Court decision.
Educating staff about why people comment on news content has been an important part of harm reduction.
Some of the other changes implemented after the survey included encouraging staff not to moderate comments relating to their own lived experience or identity, unless they feel empowered to do so.
The peer support program also links staff with others who have moderation experience.
Managers were urged to ensure that staff completed self-care plans to prepare for high-risk moderation days (such as the Voice referendum). These include documenting positive coping mechanisms and ways to set boundaries at the end of a news shift, debriefing, and asking staff to reflect on the value of their work.
Research shows one of the most protective factors for journalists is being reminded that the work is important.
But overwhelmingly, the single most significant piece of advice for all working on moderation is to ensure they have clear guidance on what to do if their wellbeing is affected, and that seeking support is normalised in the workplace.
Lessons for others
While these data are specific to the public broadcaster, it’s certain the experiences of the ABC are reflected across the news industry and other forums where people are responsible for moderating communities.
It’s not just paid employees. Volunteer moderators at youth radio stations or Facebook group admins are among the many people who face online hostility.
What’s clear is that any business or volunteer organisation building a social media audience needs to consider the health and safety ramifications for those tasked with maintaining those platforms, and to build in support strategies.
Australia’s eSafety Commissioner has developed a range of publicly available resources to help.
About the author
Alexandra Wake is a member of Dart Asia Pacific, having previously served as a director of its Board. She is currently a joint recipient of an Australian Research Council Discovery Grant, Australian Journalism, Trauma and Community.
The author would like to acknowledge the work of Nicolle White in writing this article and the research it reports.