Anatomy of a Crowdsourcing Platform - Using the Example of Microworkers.com

Matthias Hirth, Tobias Hoßfeld, Phuoc Tran-Gia
University of Würzburg, Institute of Computer Science, Germany
Email: [matthias.hirth|tobias.hossfeld|phuoc.trangia]@informatik.uni-wuerzburg.de

Abstract—Since Jeff Howe introduced the term "crowdsourcing" in 2006, crowdsourcing has become a growing market on the Internet. Thousands of workers categorize images, write articles, or perform other small tasks on platforms like Amazon Mechanical Turk (MTurk), Microworkers, or ShortTask. In this work, we give an inside view of the usage data from Microworkers and show that there are significant differences to the well-studied MTurk. Further, we look at Microworkers from the perspective of a worker, an employer, and the platform owner, in order to answer their most important questions: Which jobs pay best? How do I get my work done most quickly? When are the users of my platform active?

Keywords-crowdsourcing; user statistics; platform description;

I. INTRODUCTION

In 2006 Jeff Howe introduced the term crowdsourcing [1], which refers to "the act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call". Besides various non-profit crowdsourcing applications like the Wikipedia [2] and OpenStreetMap [3] projects, commercial usage of crowdsourcing becomes more and more interesting, and a large variety of crowdsourcing platforms has developed. These platforms act as mediators between the employers and the crowd. Some crowdsourcing platforms are specialized in certain tasks, e.g., InnoCentive [4] on research and development, Clickworker [5] on text creation, data categorization, web search, and surveys.
Other platforms like MTurk [6], Microworkers [7], or ShortTask [8] offer a framework to access the crowd, which enables employers to submit individually designed tasks. These non-specialized platforms are particularly interesting as they usually have large crowds and are used for a large variety of task types, such as tasks related to search engine optimization, audio transcription of sound data, user surveys for products, or recruiting people for scientific online tests.

In recent years, several publications have dealt with the quality of the workers, the types of tasks, and the workers and employers themselves. However, most of these studies were based on MTurk, which is highly biased in terms of the home countries of the workers and employers. In order to place a task on MTurk, a US bank account is required, and the money earned can only be transferred to a US or Indian bank account or be spent in the amazon.com shop. Thus, most of the employers are from the USA, and most of the workers are from India and the USA. Consequently, the question is whether these results are generalizable or biased because of the MTurk restrictions.

Therefore, we analyze the demographics of Microworkers, a crowdsourcing platform with no limitation regarding the home country of the workers or the employers, and compare them to the findings about the MTurk demographics. Further, we look at platform-specific measures to compare the two platforms directly. Hereby, we use three different viewpoints: the worker's, the employer's, and the platform's. Each of them has a different focus, e.g., the worker is interested in how much he can earn, the employer in how fast and properly his work is done. As the platform owner charges a fee for each submitted campaign and for each successfully completed task, he is interested in when the users are active and how correctly the workers perform the given tasks.

The rest of the paper is structured as follows.
Section II gives a short background on how MTurk and Microworkers work and summarizes the related work. In Section III we focus on the home countries of the Microworkers users and compare them to the home countries of the MTurk users. Section IV characterizes the jobs on both platforms, and Section V compares platform parameters of MTurk and Microworkers. Section VI concludes the paper.

II. BACKGROUND AND RELATED WORK

In this section, we give a brief overview of the crowdsourcing concept. Afterwards, we shortly describe MTurk and Microworkers, their terminology, and which information about the platforms was accessible for this paper. Further, we review the related work.

A. Crowdsourcing

The term crowdsourcing is a neologism combining the words crowd and outsourcing. In the traditional outsourcing approach, a firm subcontracts parts of the production process or certain tasks to a third-party provider. This is mostly done to reduce costs or because the know-how of the subcontractor is needed for a specific task. In order to maximize the benefits of the outsourcing process, a firm carefully chooses the outsourcing contractor to work with. In crowdsourcing, a task is not performed by a designated outsourcing company or worker, but is accomplished by "the crowd". This means that an employer using crowdsourcing