Moderator mental health: How we make the Internet a safer place

Published On: May 20th, 2020

Content moderation, social media, and mental health

Tons of global brands seek us out to help them create safe and helpful online communities through user-generated content moderation support of all kinds, including image, text, and audio/video.

One of the big trends we’ve seen in the last two years (a change for the best!) has been a greater understanding of the psychological implications of content moderation. Articles about this seem to be everywhere these days. The New Yorker recently interviewed author Mary South, whose book features a content moderator as its main character. The Verge wrote an article in 2019 about “the secret lives of content moderators,” looking predominantly at Facebook moderators, and former Facebook moderators have since sued the company over PTSD and won a settlement.

YouTube moderators are now signing documents about their potential susceptibility to PTSD.  

This all raises the question: what is the responsibility of content moderation outsourcing companies, what is the responsibility of the end client, and how can we work together to ensure the health and well-being of moderators?

The Conectys approach to protecting content moderators

There are a few core tenets to how we approach well-being and mental health within content moderation and front-line customer experience work:

RECRUITMENT: We match employee personality and emotional resilience to future job requirements by diligently screening candidates and being transparent about both the benefits and the common challenges they will experience. We also use emotional intelligence testing in the recruitment process.

TRAINING: We do a lot of it every time we launch a new project or bring new agents on board, and part of our initial training for moderators is a full-day session called “Moderation Skill Pack.”

While it wasn’t initially designed as a mental health tool, we’ve seen its positive effects: it sets the mindset of newcomers on the mission of their job, which is making the internet a safer place. We talk about their exposure to negative content, and agents are introduced to the positive effects of social media while also learning how some people choose to use it with negative intent. (We typically refer to these people as “bad actors.”)

Simply establishing the mindset that each agent directly contributes to the wellbeing of a community by catching and blocking negative acts has a massive impact on mental health, because it emphasizes the positive force our agents provide.

REAL-TIME SUPPORT: We also have something called HR Connect. In short, a dedicated Human Resources professional is assigned to each site and checks in with moderators regularly and proactively, either by request or at random, to offer them guidance and catch any signs of mental distress caused by their moderation duties.

Additionally, our managers have extensive training on how to identify possible signs of mental health issues and how to offer support in such cases. We talk frequently with support agents about mental health, developing resilience, and more.

We run these programs from the moment we launch a new site or account, and we continuously improve and customize them throughout the life of the project. We’re actively developing a six-month follow-up to the Skill Pack training described above; the goal is to extend the ideas of digital trust and safety while bringing in contextual lessons from a moderator’s first half-year in the role.

The best news: in almost two years of activity, we have not had a single confirmed case of mental health issues caused by exposure to negative content on any account. That’s good for our agents (most importantly) and good for our clients’ communities.

Bottom line
It all comes back to training, support, and care. If you focus intensely on those areas, and on regular follow-up with those who work in content moderation, you can turn a potential negative into a strong positive for your people and the brands you work with.

See how to promote a positive company image with the help of a content moderation service.
