Crowdsourcing Research Tasks Using mTurk

A short description of the mTurk model for both participants and researchers or businesses.

Published December 11, 2014, last updated on October 5, 2017 under Voices of DGHI

By Emma Zhao and Ishan Thakore 

If you have ever done social science research, you’ve almost certainly faced the problem of gathering information from enough people in a timely manner. Juntos, our Bass Connections project, is no exception. Now in our final stages, our goal is to get feedback on the first iteration of a health resource site we’ve developed. But that’s easier said than done. Interviews in which we’ve asked participants to review website screenshots have ranged anywhere from 10 to 75 minutes. These interviews also take quite a while to transcribe, and we rely heavily on our amazing community partner, El Centro Hispano, for scheduling and recruitment. With limited time left in our project, could we devise a more efficient way to do this?

Maybe. Enter mTurk—or "Mechanical Turk"—an Amazon service dedicated to crowdsourcing “tasks” that require human participation. People complete tasks listed on the website, which can range from taking a simple survey to transcribing a long interview. Tasks have also been used to process visual data, like photos, since humans can still do this more easily than computers. By making these tasks viewable to Amazon users across the web and paying users to do them, mTurk saves time and resources that would otherwise be spent on advertising and recruiting. mTurk participants are paid a fraction of what we would pay a participant for an interview (think 50 cents compared to $30), but researchers have found that this has no significant effect on data quality.
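For the curious, listing a task like ours is also scriptable. mTurk’s API accepts an “ExternalQuestion” XML payload that embeds an externally hosted survey in a frame for workers. The sketch below builds that payload in Python; the survey URL, reward amount, and HIT settings are placeholders for illustration, not our actual listing.

```python
def build_external_question(survey_url: str, frame_height: int = 600) -> str:
    """Wrap a survey URL in mTurk's ExternalQuestion XML.

    mTurk renders the URL in a frame so workers can complete an
    externally hosted survey (e.g., one built in Qualtrics).
    """
    return (
        '<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">'
        f"<ExternalURL>{survey_url}</ExternalURL>"
        f"<FrameHeight>{frame_height}</FrameHeight>"
        "</ExternalQuestion>"
    )

# This XML would then be passed to the mTurk API, e.g. with boto3
# (values below are illustrative placeholders, not our real settings):
#   client = boto3.client("mturk")
#   client.create_hit(
#       Title="Review website screenshots (short survey)",
#       Reward="0.50",                    # the cents-scale payment discussed above
#       MaxAssignments=100,               # number of workers who may respond
#       AssignmentDurationInSeconds=1800,
#       LifetimeInSeconds=7 * 24 * 3600,
#       Question=build_external_question("https://example.qualtrics.com/..."),
#   )
```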

Despite these benefits, we still had some hesitations. Because mTurk relies on user-created profiles, researchers can never be 100% sure respondents are who they say they are. This is especially important for Juntos, as we specifically want to reach Latino men who have sex with men (MSM) and trans women. Juntos is also a local project. Our site is tailored to the surrounding community, based largely on interviews in Durham, Wake and Orange counties. How do we know that feedback from someone in California will be relevant for our target population? Will they even be looking for the same resources, since health needs vary by location?

We thought it was worth a try despite these doubts. We began by brainstorming how our interview guide could be adapted into a survey format. We removed as many open-ended questions as possible so that the online survey was not as long as the in-person format. We also decided to use Qualtrics for our survey instead of the mTurk survey builder, as Qualtrics offers powerful analytic tools. The software also allows us to keep participant responses anonymous. Based on our ideas, we built a first iteration in Qualtrics with some sample screenshots. We used a feature called “Heat Map,” where participants can click on areas of the screenshot that they liked or disliked. Qualtrics aggregates these data into a heat map, which displays the areas with the most and least clicks.
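The aggregation behind a heat map like this is simple to picture: divide the screenshot into a grid and count clicks per cell. Here is a minimal sketch of that idea (the function name, cell size, and sample click data are our own illustration, not how Qualtrics implements it internally):

```python
from collections import Counter

def aggregate_clicks(clicks, cell_size=50):
    """Bin (x, y) click coordinates into grid cells and count clicks per cell.

    A heat-map view can then shade each cell by its count, making the
    most- and least-clicked regions of a screenshot stand out.
    """
    return Counter((x // cell_size, y // cell_size) for x, y in clicks)

# Hypothetical click data: three clicks cluster near the top-left corner.
clicks = [(12, 40), (18, 44), (20, 41), (300, 500)]
counts = aggregate_clicks(clicks)
hottest = max(counts, key=counts.get)
# hottest == (0, 0): three of the four clicks fall in the top-left cell
```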

Our initial iterations seemed overly repetitive, because participants were reviewing screenshots with minimal content multiple times. We also thought some of our questions would bias respondents towards agreeing with positive statements about the website. We solicited feedback from our Juntos team as well as the Social Sciences Research Institute, and the survey went through several revisions. 

Once the Institutional Review Board (IRB) reviews our survey, we will translate it into Spanish before listing it under our mTurk account. When the task goes live, users complete it by following a link to Qualtrics and filling out the survey. After answering all the questions, they receive a randomly generated code, which they must enter into mTurk to complete our task and receive payment.
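The completion-code handshake is a common pattern for linking an external survey back to mTurk. One way it might be implemented, purely as an illustration (the function names and in-memory storage here are hypothetical, not our actual setup):

```python
import secrets

issued_codes = set()  # codes handed out on the survey's final page

def generate_completion_code(length=8):
    """Issue a random code shown to the participant when they finish the survey."""
    code = secrets.token_hex(length // 2).upper()  # e.g. "A3F09B2C"
    issued_codes.add(code)
    return code

def verify_completion_code(submitted):
    """Check a code the worker pasted into mTurk before approving payment.

    Codes are single-use, so one survey response pays out only once.
    """
    if submitted in issued_codes:
        issued_codes.discard(submitted)  # prevent reuse across workers
        return True
    return False
```

Because the code only exists if someone actually reached the end of the survey, it ties each mTurk payment to a completed Qualtrics response without linking the response to a worker's identity.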

While we initially had reservations, we are excited to see the results from our survey. We hope that by disseminating this information and soliciting feedback from a wider audience, we can ultimately create a better website.