
J. Nathan Matias 🦣

This extraordinary story by 60 Minutes puts a face to the experiences of content reviewers who train AI, do content moderation, and carry out other microtasks at scale for large companies.

youtube.com/watch?v=qZS50KXjAX

There are two lawsuits from content moderators active in Kenya:

- Daniel Motaung's case against Sama *and* Meta: bbc.com/news/technology-645419
- Mercy Mutemi's lawsuit with 165 moderators against Sama & Meta: codastory.com/authoritarian-te

In both cases, courts have ruled that Meta can be sued directly. So we can expect more details on these cases in the coming years.

www.bbc.com: "Facebook's parent firm Meta can be sued by ex-moderator, judge rules." A Kenyan court has said that Facebook's parent company can be sued by a former content moderator.

Alexandra Gonzalez & I have spent the last year reviewing research on the mental health of content reviewers.

Scientists have known since the 1960s that this work has substantial mental health consequences. Meanwhile, global demand has expanded with generative AI. Yet research to measure mental health impacts and support moderators has been thin.

Grateful for everyone who's working to improve this dreadful dilemma that has harmed so many people over the decades.

citizensandtech.org/research/m

Citizens and Technology Lab: "Measuring mental health among content moderators." How can we create reliable measures of the psychological impacts of content moderation work?

I should note that tech firms have a very real supply chain problem when it comes to content reviewing.

Like anything companies turn into a global commodity, it's hard for customers who buy this product (content review work) in bulk to ensure ethical practices across the supply chain.

The problem is that it's a supply chain of PTSD. Here's a panel of industry leaders talking about the challenges, which @sarahgilbert and I blogged about in 2022:

citizensandtech.org/2022/09/mo

Citizens and Technology Lab: "What Can Companies Do For Moderator Well-Being?" What can Trust and Safety teams do to improve the well-being of the moderators who review content, a job that is often outsourced worldwide through third-party firms?

Alexandra presented an early version of our work on the mental health of content reviewers at the Trust & Safety Research Conference at Stanford earlier this fall.

We're currently finalizing our full systematic review of the science and will publish it after further feedback and validation from other scientists.