This extraordinary story by 60 Minutes puts a face to the experiences of content reviewers who train AI, do content moderation, and carry out other microtasks at scale for large companies.
There are two active lawsuits brought by content moderators in Kenya:
- Daniel Motaung's case against Sama *and* Meta https://www.bbc.com/news/technology-64541944
- Mercy Mutemi's lawsuit with 165 moderators against Sama & Meta: https://www.codastory.com/authoritarian-tech/social-media-authoritarian-tech/mercy-mutemi-meta-lawsuit/
In both cases, courts have ruled that Meta can be sued directly. So we can expect more details on these cases in the coming years.
Alexandra Gonzalez & I have spent the last year reviewing research on the mental health of content reviewers.
Scientists have known since the 1960s that this work has substantial mental health consequences. Meanwhile, global demand has expanded with generative AI, yet research to measure the mental health impacts and to support moderators has been thin.
Grateful for everyone who's working to improve this dreadful dilemma that has harmed so many people over the decades.
https://citizensandtech.org/research/moderator-mental-health/
I should note that tech firms have a very real supply chain problem when it comes to content reviewing.
As with anything that companies turn into a global commodity, it's hard for customers who buy content review work in bulk to ensure ethical practices across the supply chain.
The problem is that it's a supply chain of PTSD. Here's a panel of industry leaders discussing the challenges, which @sarahgilbert and I blogged about in 2022:
https://citizensandtech.org/2022/09/moderator-wellbeing-trustcon/
Alexandra presented an early version of our work on the mental health of content reviewers at the Trust & Safety Research Conference at Stanford earlier this fall.
We're currently finalizing our full systematic review of the science and will publish it after further feedback and validation from other scientists.