Looking after the mental health of employees is important in any workplace, but in Trust and Safety, and content moderation in particular, it is an integral part of the role. Behind the scenes, Content Moderators sift through a deluge of user-generated content, confronting the darkest corners of the internet to uphold community standards and protect users from harm. Yet, despite their pivotal role in shaping online discourse, the mental health of Content Moderators often remains overlooked. Moderators can be exposed to a high volume of egregious content, for example hate speech, child-related abuse, harm to animals, and war zone footage, which in turn can adversely affect their mental health.
In this blog, we delve into the critical importance of prioritizing mental health in content moderation, exploring the unique challenges faced by moderators and the indispensable role of supportive measures in safeguarding their wellbeing.
Understanding the Challenges
Content moderation, by its very nature, exposes individuals to a barrage of distressing and potentially traumatic content. From graphic violence and hate speech to child exploitation and terrorist propaganda, moderators are constantly immersed in a digital landscape fraught with emotional triggers. The sheer volume and intensity of this content can take a significant toll on their psychological wellbeing.
There are other stressors related to the role that can increase the risk of mental health difficulties such as stress and anxiety. These include a high-volume workload, target-based performance metrics, frequent changes in the industry, and world events that directly affect the content moderators work with, a recent example being the conflict in Gaza. Perhaps unsurprisingly, there is a high level of burnout in this role, underscoring the importance of prioritizing mental health support.
The Risks of Vicarious Trauma
A mental health risk to be particularly aware of for Content Moderators is Vicarious Trauma (VT). VT is theorized to stem from indirect exposure to graphic content and can result in re-experiencing, avoidance, hypervigilance, and hyperarousal (Figley, 1995). VT can be triggered by a single exposure to egregious content, but the more moderators are exposed to graphic material, the more vulnerable they are to experiencing prolonged acute stress symptoms (Holman, 2013). Not every moderator will develop VT; as with any mental health difficulty, various factors can increase or decrease the likelihood of it developing. Risk factors include stress inside or outside of the workplace, challenging workplace dynamics, pre-existing mental health difficulties, and a lack of social support.
Mitigating the Risks
The good news is that the risks of developing VT, burnout, and other mental health difficulties in this role can be mitigated by prioritizing Content Moderator wellbeing. Moderation teams need extra layers of wellbeing support and sufficient time to access it. Providing clear information during onboarding on the signs and symptoms of mental health difficulties, VT in particular, raises awareness so symptoms can be caught and treated early. Access to a robust wellbeing programme during the working day, including peer support, psychoeducation training, and 1:1 counselling, is fundamental to supporting the mental health of Content Moderators. Early intervention is key to supporting mental health and aiding faster recovery (Rizkalla & Segal, 2019).
Further, providing wellbeing time that can be used to regulate shock is essential. Shock activates the fight, flight, freeze (FFF) response which, if not given time to deactivate, can develop into a state of chronic hypervigilance and increase the risk of VT. A 2018 study found that playing a short game, in this case Tetris, shortly after exposure to a traumatic incident reduced intrusive memories: at one-week and one-month follow-up, intrusive memories had declined more quickly than in the control group, who completed a written activity log instead (Iyadurai et al., 2018). Although the participants in this study were emergency department patients rather than Content Moderators, the research suggests that giving moderators short periods to engage in activities that support physiological regulation after viewing shocking content could reduce the likelihood of developing VT.
Creating a Culture of Care
Companies must prioritize the wellbeing and mental health of Content Moderators to support them in an essential but often challenging role. Though it may be uncomfortable to confront, it is not helpful to shy away from the reality of the mental health difficulties that many moderators face as a direct result of their work. Being open about potential harms and tackling them head-on through a robust wellbeing programme is the safest and most ethical way to protect Content Moderators: 'the unseen and silent guardians' who keep platforms and users safe, at times at the cost of their own health.
Designing and implementing a culture of care requires all stakeholders to be involved: Content Moderators, who can share their lived experience; management teams; support staff such as policy teams and HR; and senior leadership and decision-makers. Embedding wellbeing into the culture of an organization through robust policies that inform practice, and developing a sense of psychological safety, are essential to the wellbeing of Content Moderators.
Conclusion
In the fast-paced and often tumultuous world of content moderation, the mental health of moderators is of paramount importance. By acknowledging the unique challenges they face and implementing proactive measures to support their wellbeing, companies can ensure that moderators are equipped to thrive in their crucial role. Together, let us nurture and support moderators to uphold a vision of online spaces that are not only safe and inclusive but also supportive of the individuals who dedicate themselves to maintaining their integrity.