Content Moderation and Mental Health

Every day, millions, perhaps billions, of people post videos, images, and text to the internet. Most of this content is harmless, but not all of it. Someone has to make sure that what reaches visitors’ eyes is safe.

That’s where content moderators come in. These workers are hired by the companies that bring you user-generated content, and their job is to filter out anything inappropriate for the platform.

They often pay a price for it, too, because content moderation is a tough job. Moderators must look at violent, sexual, and otherwise disturbing content so that you and the people you love don’t have to.

These unsung heroes act as living human filters. Not a pretty job description for sure.

Unfortunately, merely viewing this content can take a serious toll on a moderator’s mental health, in some cases resulting in post-traumatic stress disorder, or PTSD.

Repeatedly viewing footage of a traumatic event can sometimes be more distressing than experiencing the event firsthand. A recent study made this point by comparing stress levels between people who attended the 2013 Boston Marathon bombing and those who only viewed media coverage of the attack.

The effects of PTSD are serious. Content moderators can experience a range of symptoms, from simple irritability to sudden anger to a persistent feeling of numbness. Some of those who have PTSD self-medicate with drugs, alcohol, or other harmful coping behaviors.

The Dart Center, which researches the effects of trauma on journalists, has spent decades analyzing this issue in the context of news reporting. They have developed some tips that are also useful for content moderators and their managers.

If you are a content moderator, there are several steps you can take to reduce your risk of developing PTSD.

For example, you can:

- Write notes as you view, so you don’t have to go back and recheck the original content.
- Reduce the size of your working window and mute the sound. This tactic creates distance between you and the content and is surprisingly effective.

Managers and other higher-ups ultimately have the most power to create a healthy environment. If you are a manager, you can limit how long the workers in your charge are exposed to disturbing content, develop self-care procedures, and keep the workplace calm and supportive.

However, a focus on the bottom line can keep social media companies from taking steps to ensure a healthy workforce. When the choice comes down to brand image or moderator wellbeing, most companies put the brand at the center of their vision, and worker wellness takes a backseat.

That’s why many companies are electing to use AI technology to reduce the risk to their content moderators.
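To picture how that can help: an automated pre-screening step can score incoming posts, remove the worst material outright, and queue the rest for blurred, muted human review, so moderators see less raw content. The sketch below is purely illustrative; the scoring function is a stand-in for a real trained model, and the names and thresholds are hypothetical.

```python
# Illustrative sketch of AI pre-screening before human review.
# harmful_score() is a placeholder for a real classification model;
# the thresholds and field names here are hypothetical.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    text: str


def harmful_score(post: Post) -> float:
    """Stand-in scorer: a real system would run a trained model here."""
    flagged_terms = {"violence", "gore"}  # hypothetical keyword list
    return 1.0 if flagged_terms & set(post.text.lower().split()) else 0.1


def route(post: Post, auto_remove_at: float = 0.95, review_at: float = 0.5) -> str:
    """Decide what happens to a post before any moderator sees it."""
    score = harmful_score(post)
    if score >= auto_remove_at:
        return "removed automatically"    # the moderator never sees it
    if score >= review_at:
        return "queued for human review"  # shown blurred and muted by default
    return "published"


if __name__ == "__main__":
    for post in [Post("1", "cute cat video"), Post("2", "graphic violence clip")]:
        print(post.post_id, "->", route(post))
```

The point of the design is simply that the model, not the person, takes the first look; whatever still needs human judgment can be shrunk, blurred, and muted before a moderator opens it, echoing the tips above.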

Many companies rely on contractors to moderate incoming content, though. The firms that employ these contractors may pressure them to work very long hours, and they may not feel the need to invest in AI technology themselves.

So there’s that issue too.

Content moderators protect the internet every day from harmful content. Maybe one day, the whole industry will have to work together to protect content moderators themselves.