[4Rin1-74] Dynamic Microtask Re-posting for Efficient Answer Collection in Crowdsourcing
Keywords: Crowdsourcing, Quality control, Cost control
To improve output quality in crowdsourcing at low cost, it is effective to vary the number of answers collected for each question and decide the label by majority vote. We propose a framework that uses a development set of data with correct labels to estimate which labeling method is most efficient when additional answers are collected dynamically, and then collects labels for the remaining unlabeled data using the method determined from those results. Choosing appropriate conditions for when to collect additional answers is necessary to optimize cost-effectiveness. Using a binary classification task, we simulate dynamic answer collection on the development set to determine the most efficient labeling configuration, and then label the unlabeled data with that configuration. In an evaluation experiment, we confirmed that the error improvement rate and the monetary cost obtained by collecting data with the determined parameters were close to those observed in the simulation.
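The abstract does not specify the exact stopping rule or parameters, so the following is a minimal illustrative sketch of the general idea: simulate dynamic answer collection with majority voting on a development set with known labels, sweep candidate collection policies, and compare their error rate and monetary cost. The simulated worker model and the parameter names (`worker_accuracy`, `initial_answers`, `margin`, `max_answers`) are assumptions for illustration, not the authors' actual method.

```python
import random
from collections import Counter

def simulate_dynamic_collection(dev_labels, worker_accuracy=0.7,
                                initial_answers=1, margin=1, max_answers=7,
                                cost_per_answer=1.0, seed=0):
    """Simulate dynamic answer collection on a development set with known
    labels: each question first receives `initial_answers` answers, and
    further answers are requested only while the leading label's lead over
    the runner-up is below `margin`, up to `max_answers` answers in total.

    Workers are modelled as answering correctly with probability
    `worker_accuracy`. Returns (error_rate, total_cost)."""
    rng = random.Random(seed)
    errors, total_cost = 0, 0.0
    for true_label in dev_labels:
        votes = Counter()
        n = 0
        while n < max_answers:
            answer = true_label if rng.random() < worker_accuracy else 1 - true_label
            votes[answer] += 1
            n += 1
            total_cost += cost_per_answer
            if n >= initial_answers and vote_lead(votes) >= margin:
                break  # the majority is already clear enough; stop re-posting
        predicted = votes.most_common(1)[0][0]
        errors += int(predicted != true_label)
    return errors / len(dev_labels), total_cost

def vote_lead(votes):
    """Lead of the top label over the runner-up (top count if only one label)."""
    counts = sorted(votes.values(), reverse=True)
    return counts[0] - (counts[1] if len(counts) > 1 else 0)

# Sweep candidate collection policies on the development set and pick the
# one with the best error/cost trade-off before labelling the remaining data.
dev_labels = [random.Random(1).randint(0, 1) for _ in range(1000)]
for initial, margin, cap in [(1, 1, 1), (1, 2, 5), (3, 2, 7), (3, 3, 9)]:
    err, cost = simulate_dynamic_collection(
        dev_labels, initial_answers=initial, margin=margin, max_answers=cap)
    print(f"initial={initial} margin={margin} max={cap}: "
          f"error={err:.3f} cost={cost:.0f}")
```

The policy chosen on the development set would then be applied unchanged when re-posting microtasks for the unlabeled data, so that the observed error improvement and cost can be compared against the simulated values.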