At the recent DIODE meeting in Indonesia, participants split into groups to focus on particular topics, one of which was the development implications of crowdsourcing platforms. There is a significant critique of crowdsourcing platforms such as Amazon MTurk, focussing on the same criticisms levelled at the “gig economy”: the lack of holiday pay, sick pay, pensions, and health and safety protections. Whether the quintessential “gig platform” Uber has employees or is just a technology company is a topic for union activism and courtroom debate. Amazon MTurk, Crowdflower, Upwork, Fiverr and similar platforms have yet to come under the same level of scrutiny as Uber, which remains in the spotlight of union and other stakeholder activism.
This is perhaps because exemplary platform-based organisations such as Samasource bring paid work to marginalised individuals whose work options are limited and who would otherwise be unemployed or working in much worse conditions. However, this does not mean the work is risk free, and little is known about the nature of the work done in developing countries or its effects.
The late sociologist Ulrich Beck, in his book “Risk Society”[1], argued that there is a “systematic attraction between extreme poverty and extreme risk”, citing examples of people in developing countries poisoned by work such as the manual processing of asbestos brake linings and fertiliser application, or afflicted by the lax safety standards that led to the Bhopal chemical disaster.
How could crowdsourcing carry such a relationship? During the discussion we focussed on some of the problematic issues around content moderation, where there appears to be some evidence of Beck’s “poverty-risk” hypothesis at work.
The task of removing extreme pornographic images and other content considered offensive, such as ISIS beheadings, from platforms such as Facebook is undertaken by workers in the Philippines[2]. The mental health implications of this work are as yet unknown, but the interviews in the Wired article cited below indicate that significant trauma is experienced by the workers responsible for identifying and removing many hundreds of such images per day.
There is some literature from other occupations, such as police officers who deal with child exploitation images as part of investigations. The conclusions of one such study are worth considering:
“The results revealed that Internet Child Exploitation (ICE) investigators experience salient emotional, cognitive, social and behavioural consequences due to viewing ICE material and their reactions can be short and long term. The degree of negative impact appears to vary markedly across individuals, types and content of material and viewing context, with variation based on individual, case-related and contextual factors both in and outside the workplace.”[3]
Viewing these images clearly has a negative effect, yet measuring and monitoring that effect remains under-researched. Police forces recognise that repeated exposure to such images and video content is a problem and have put policies in place to manage the effects, such as counselling.
As yet there is no evidence that crowdsourcing platforms have considered the need for worker health and safety, partly because the responsibilities of platform-based organisations for the health, safety and ethical treatment of freelance gig workers, as opposed to employees, remain ill-defined.
There is clearly a research agenda for us to pursue regarding the ethics, training needs and mental health implications of platform-based crowdsourcing.
[1] Beck, U. Risk Society: Towards a New Modernity. Sage, 1992.
[2] https://www.wired.com/2014/10/content-moderation/
[3] Powell, M., Cassematis, P., Benson, M., et al. Police Officers’ Perceptions of their Reactions to Viewing Internet Child Exploitation Material. Journal of Police and Criminal Psychology (2015) 30: 103. doi:10.1007/s11896-014-9148-z