TITLE: Moderating illegal content online and staff welfare – Where do the responsibilities lie?
The price others pay for digital dumping – Aspects of online child protection.
Huge, unknowable quantities of unambiguously illegal or profoundly harmful material are circulating on the Internet. Child sexual abuse material (CSAM) and terrorist propaganda are the types of content most frequently mentioned in this context, but there are several others.
There has been an entirely proper focus on the supply chain companies use to manufacture or deliver their products or services. Typically, these initiatives have been designed to eliminate child labour, slavery or environmental harms. Isn’t it time internet businesses and institutions were pressed to do something for those who daily have to face the unfaceable on our behalf?
Already we are aware of at least one case being brought in a US court by ex-moderators who claim their former employer did not do enough to shield them from post-traumatic stress disorder. Whatever the eventual outcome of that case, it seems a portent of other actions that could be brought. However, our aim should be to avoid the possibility of such suits altogether by insisting on high standards.
INHOPE co-ordinates a global network of hotlines that receive reports of CSAM. To be a member of INHOPE, a hotline has to sign up to several commitments on staff welfare, e.g. access to counsellors and to safe, secure working spaces.
Any company, institution or platform employing moderators, either in-house or via third parties, should sign up to standards similar to those that organisations like INHOPE and the IWF apply to their own analysts, who review CSAM all day long. There should also be a mechanism to reassure the public that those standards are being honoured in practice, not just in theory.
- Moderator: Marie-laure Lemineur, ECPAT International, Thailand
- Remote moderator: David NG, eHelp Association, Hong Kong
- John Carr, ECPAT International, UK
- Larry Magid, member of the Safety Advisory Boards of Facebook, Twitter and Snapchat, USA
- Susie Hargreaves, Chief Executive, Internet Watch Foundation, UK
- Marco Pancini, Director of EU Public Policy, Google
- Karuna Nain, Global Safety Programs Manager, Facebook, USA
Links to sites of interest on the topic:
Le Monde article (in French) on Google's plan to expand its moderation team to 10,000 people: http://www.lemonde.fr/pixels/article/2017/12/05/critique-google-veut-etendre-son-equipe-de-moderation-a-10-000-personnes_5224748_4408996.html
Blog of YouTube CEO Susan Wojcicki: https://youtube.googleblog.com/2017/12/expanding-our-work-against-abuse-of-our.html
Internet Watch Foundation post on welfare of their analysts: https://www.iwf.org.uk/news/shorter-working-days-counselling-and-table-tennis-how-internet-watch-foundation-iwf-takes-care