Crowdsourcing platforms for research

1. Dignity, digital citizenship rather than "cogs"

1.1. digital sweatshops (Whitehouse, 2016)

1.2. Anonymity (Bivins, 2016)

1.3. How do we ensure that requesters and researchers are respecting each other? (Milland, 2016) p. 265

1.4. Recognizing workers as citizens = the rigor of their work should bring them benefits (Milland, 2016) p. 265

2. Unfair payment (Milland, 2016)

2.1. Researchers or job requesters have the "freedom" to pay whatever they want (Pittman and Sheehan, 2016)

2.2. Crowdsource workers completing tasks "voluntarily" (Pittman and Sheehan, 2016)

2.3. "Many people work long hours completing surveys and other tasks for verylow wages, relying on those incomes to meet their basic needs" (Williamson, 2016)

2.3.1. "MTurk offers extraordinarily low compensation - about $2 an hour for workers in the United States" (Williamson, 2016)

2.3.2. "Workers have the option of refusing to accept any task if they consider the rate too low" (Williamson, 2016)

2.4. "Crowdsourcing participants lack employment protections..." (Williamson, 2016)

3. Research validity

3.1. Diversified research samples (Pittman and Sheehan, 2016)

3.1.1. "Moving beyond W.E.I.R.D samples" (Palmer and Strickland, 2016)

3.2. Quick collection of data at a low cost (Pittman and Sheehan, 2016); reiterated by Palmer and Strickland (2016)

3.2.1. Graduate students and faculty members with minimal funding support (Pittman and Sheehan, 2016; Williamson, 2016)

3.3. Researchers can collect data from enough workers to generate significant statistical power (Pittman and Sheehan, 2016)

3.3.1. "Compensation rates do not appear to affect data quality" (Williamson, 2016)

3.4. "Performance on crowd-sourced assessments may also differ because participants have previously been exposed to a similar experimental manipulation" (Palmer and Strickland, 2016)

3.4.1. "Crowdsourcing makes research tasks easier" (Williamson, 2016)

3.5. "Lack of reviewer knowledge...weakens peer review" (Palmer and Strickland, 2016)

3.6. "identifying an imposter rate of 24–83% across a series of MTurk studies" (Sheehan, 2017)

3.6.1. Data validity can be increased by following best practices (Sheehan, 2017); see p. 10
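
A minimal sketch of what such screening might look like in practice, assuming an attention-check item and a completion-time floor as examples of the best practices in question; the field names and thresholds here are illustrative, not Sheehan's specific criteria.

```python
# Hypothetical screening pass over crowdsourced survey responses.
# Assumes two common checks (an attention-check item and a minimum plausible
# completion time); thresholds and field names are illustrative only.
MIN_SECONDS = 120  # assumed floor for a plausible completion time

def keep_response(resp: dict) -> bool:
    """Keep a response only if it passes the attention check and took long enough."""
    passed_attention = resp.get("attention_check") == resp.get("attention_expected")
    plausible_time = resp.get("seconds_elapsed", 0) >= MIN_SECONDS
    return passed_attention and plausible_time

responses = [
    {"attention_check": "B", "attention_expected": "B", "seconds_elapsed": 340},
    {"attention_check": "A", "attention_expected": "B", "seconds_elapsed": 95},
]
print([keep_response(r) for r in responses])  # -> [True, False]
```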

3.7. "leaves methodological designs vulnerable to researchers' implicit assumptions about the crowd" "Contradicts the basic idea that we control who participates in our studies" (Stamm and Eklund, 2019)

4. The socioeconomic status of workers (e.g., disabled, poor)

4.1. "different social scientists are likely reaching many of the same participants" (Williamson , 2016) Example of skewed results - questions about economic inequality - page 79

4.2. "Participants must have internet access" (Palmer and Strickland, 2016)

5. Solutions

5.1. Create worker-run platforms (Milland, 2016)

5.1.1. Ex: Daemo

5.2. "Journal editors can commmit to publishing only articles that pay respondents an ethical rate" (Williamson, 2016)

5.2.1. "Disciplinary standards for reporting the methods by which a researcher completes mundane tasks" (Williamson, 2016)

5.2.1.1. "Researchers can set a minimum wage for their own research" (Williamson, 2016)

5.2.1.1.1. Bonuses can be given retroactively (Williamson, 2016); see the calculation sketch below

5.2.1.1.2. "Researchers are responsible for justifying their payment choices" (Milland, 2016)

5.2.1.1.3. Reiterated by Palmer and Strickland (2016)
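
A minimal sketch of the retroactive-bonus idea, assuming an MTurk-style flow in which a base reward is topped up after the fact; the $12/hour floor and the example numbers are illustrative assumptions, not values from Williamson.

```python
# Hypothetical calculation: top up a worker's pay so their effective hourly
# rate meets a researcher-chosen minimum wage (all values are illustrative).
MIN_HOURLY_RATE = 12.00  # assumed wage floor set by the researcher

def retroactive_bonus(base_pay: float, minutes_worked: float) -> float:
    """Return the bonus needed so base_pay + bonus covers MIN_HOURLY_RATE for the time spent."""
    owed = MIN_HOURLY_RATE * (minutes_worked / 60.0)
    return max(0.0, round(owed - base_pay, 2))

# Example: a task paid $0.50 up front and took 9 minutes.
print(retroactive_bonus(base_pay=0.50, minutes_worked=9))  # -> 1.3 (bonus in dollars)
```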

5.2.1.2. University IRBs should create guidelines (Williamson, 2016)

5.2.1.2.1. citizen-science projects vs. MTurk: Institutional Review Board guidelines (Williamson, 2016)

5.2.1.2.2. "Institutional Review Board concerns generally do not include low payments. Most IRBs instead worry about too high of a payment that might coerce someone to participate or continue in a study that they do not want to do (Sheehan & Pittman, 2016)" (Sheehan, 2017)

5.2.2. Job requesters consider ethical pay and treatment of workers before posting jobs (Milland, 2016).

5.3. "Confront the researchers with the fact that the workers are human beings" (Milland, 2016) pg 263

5.3.1. "Researchers should see the digital worker as a citizen, imbued with all the rights and protections the Fourteenth Ammendment calls for...catch up with the proliferation of labor" (Milland, 2016)

5.4. Grant makers should provide appropriate funds (Williamson, 2016)

5.5. "A Deweyan approach to citizenship can be helpful...if the digital worker is granted protections as a member of a global digital community, then their crowdsourced labor can be seen as a community service rather than a series of economic transactions" (Milland, 2016)

5.5.1. Dispute Channel, Citizenship (Deitz, 2016)

5.5.2. Have dialogue between community members in which disputes can be fairly ironed out (Milland, 2016) p. 265

5.6. Companies like Amazon could be responsible for managing their platforms with equal pay (Whitehouse, 2016; Pittman and Sheehan, 2016)

5.6.1. What qualifies as ethical pay? (Milland, 2016)

5.7. "By routing microtasks to workers based on demographics and appropriate pay, our framework mitigates biases in the contributor sample and increases the hourly pay given to contributors." (Barbosa and Chen, 2019)

5.7.1. "Our philosophy is that, instead of taking sides and defining which biases are wanted and which are not, our approach is to let a requester decide how “diverse” or “skewed” the distribution of a certain contributor demographic must be for a given labeling task." (Barbosa and Chen, 2019) **See Fig. 2

5.7.1.1. "We believe that mitigating biases and ethical issues in this process are parallel goals to rehumanizing crowd work, and that such a framework will ultimately contribute to rehumanizing crowd work via increased transparency in regards to how human factors affect the work to be completed on the platform and the resulting labels that will later be used in machine learning models." (Barbosa and Chen, 2019)

6. Uses

6.1. survey experiments (Williamson, 2016)

6.1.1. "regular feature of political analysis" (Williamson, 2016)

6.1.1.1. "attitudes towards inequality, war, and political representation- and published in prestigious journals" (Williamson, 2016)

6.1.1.2. other digital libraries (Williamson, 2016)

7. "We should not allow the perfect to be the enemy of the good" "If these values are important to study, they are also important to implement in our research practices" (Williamson, 2016)

8. Sources of Motivation: The Worker Perspectives

8.1. MTurk vs. Bentham Project = money vs. intellectual self-satisfaction, crowd vs. community (Bivins, 2016)

9. Power dynamics

9.1. Requesters have full power to accept or reject work, and the platform's involvement is minimal, which the researchers regard as a good thing (Pittman and Sheehan, 2016).

9.2. "unregulated markets" (Williamson, 2016)

9.2.1. "Workers have few legal protections “as the cyberspace in which they work remains essentially unregulated for employment and labor law purposes”"(Sheehan, 2017)

9.3. "Payment issues are just the tip of the “power dynamic iceberg”. Researchers have the ability to charge what they wish: Amazon will not set payment floors. Researchers can also to refuse payment to any worker and can block any worker from participating in their studies if they believe that the worker is providing substandard work. Given their hands-off policy, Amazon will not address any type of worker complaints, and will suspend workers who are researchers frequently block. Workers are not even protected by anonymity." (Sheehan, 2017)

9.4. "Completely undermines hard earned and fought for standards of fair labour" (Schmidt, 2013)