@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix orcid: <https://orcid.org/> .
@prefix this: .
@prefix sub: .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix prov: <http://www.w3.org/ns/prov#> .
@prefix pav: <http://purl.org/pav/> .
@prefix np: <http://www.nanopub.org/nschema#> .
@prefix doco: <http://purl.org/spar/doco/> .
@prefix c4o: <http://purl.org/spar/c4o#> .

sub:Head {
  this: np:hasAssertion sub:assertion;
    np:hasProvenance sub:provenance;
    np:hasPublicationInfo sub:pubinfo;
    a np:Nanopublication .
}

sub:assertion {
  sub:paragraph c4o:hasContent "Crowdsourcing [19] refers to the process of solving a problem formulated as a task by reaching out to a large network of (often previously unknown) people. One of the most popular forms of crowdsourcing is ‘microtasks’ (or ‘microwork’), which consists of dividing a task into several smaller subtasks that can be independently solved. Depending on the problem tackled, the level of task granularity can vary (microtasks whose results need to be aggregated vs. macrotasks, which require filtering to identify the most valuable contributions), as can the incentive structure (e.g., payments per unit of useful work vs. prizes for top participants in a contest). Another major design decision in the crowdsourcing workflow is the selection of the crowd. While many (micro)tasks can be performed by untrained workers, others might require more skilled human participants, especially in specialized fields of expertise, such as LD. Of course, expert intervention usually comes at a higher price, either in monetary rewards or in the form of the effort needed to recruit participants in another setting, such as volunteer work. Microtask crowdsourcing platforms such as Amazon Mechanical Turk (MTurk), on the other hand, offer a formidable and readily available workforce at relatively low fees.";
    a doco:Paragraph .
}

sub:provenance {
  sub:assertion prov:hadPrimarySource ;
    prov:wasAttributedTo orcid:0000-0003-0530-4305 .
}

sub:pubinfo {
  this: dcterms:created "2019-11-10T12:34:11+01:00"^^xsd:dateTime;
    pav:createdBy orcid:0000-0002-7114-6459 .
}