Title

Incremental Relabeling For Active Learning With Noisy Crowdsourced Annotations

Abstract

Crowdsourcing has become a popular approach for annotating the large quantities of data required to train machine learning algorithms. However, obtaining labels in this manner poses two important challenges. First, naively labeling all of the data can be prohibitively expensive. Second, a significant fraction of the annotations can be incorrect due to carelessness or the limited domain expertise of crowdsourced workers. Active learning provides a natural formulation to address the former issue by affordably selecting an appropriate subset of instances to label. Unfortunately, most active learning strategies are myopic and sensitive to label noise, which leads to poorly trained classifiers. We propose an active learning method that is specifically designed to be robust to such noise. We present an application of our technique in the domain of activity recognition for eldercare and validate the proposed approach using both simulated and real-world experiments on Amazon Mechanical Turk. © 2011 IEEE.

Publication Date

12-1-2011

Publication Title

Proceedings - 2011 IEEE International Conference on Privacy, Security, Risk and Trust and IEEE International Conference on Social Computing, PASSAT/SocialCom 2011

Number of Pages

728-733

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

DOI Link

https://doi.org/10.1109/PASSAT/SocialCom.2011.193

Scopus ID

84863418365 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/84863418365
