File Download
Links for fulltext (may require subscription):
- Publisher Website (DOI): https://doi.org/10.1109/ICDE.2016.7498229
- Scopus: eid_2-s2.0-84980390225
Citations:
- Scopus: 0
Conference Paper: Crowdsourced POI labelling: location-aware result inference and task assignment
Title | Crowdsourced POI labelling: location-aware result inference and task assignment |
---|---|
Authors | Hu, H; Zheng, Y; Bao, Z; Li, G; Feng, J; Cheng, RCK |
Issue Date | 2016 |
Publisher | IEEE Computer Society. The conference proceedings are available at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000178 |
Citation | The 32nd IEEE International Conference on Data Engineering (ICDE 2016), Helsinki, Finland, 16-20 May 2016. In Conference Proceedings, 2016, p. 1-12 |
Abstract | Identifying the labels of points of interest (POIs), also known as POI labelling, provides significant benefits in location-based services. However, the quality of raw labels manually added by users or generated by automatic algorithms cannot be guaranteed. Such low-quality labels decrease usability and result in bad user experiences. In this paper, observing that crowdsourcing is well suited to such computer-hard tasks, we leverage crowdsourcing to improve the quality of POI labelling. To the best of our knowledge, this is the first work on crowdsourced POI labelling tasks. In particular, there are two sub-problems: (1) how to infer the correct labels for each POI based on workers' answers, and (2) how to effectively assign proper tasks to workers in order to make more accurate inferences for the next available workers. To address these two problems, we propose a framework consisting of an inference model and an online task assigner. The inference model measures the quality of a worker on a POI by carefully exploiting (i) the worker's inherent quality, (ii) the spatial distance between the worker and the POI, and (iii) the POI influence, and it can provide reliable inference results as soon as a worker submits an answer. As workers arrive dynamically, the online task assigner judiciously assigns proper tasks to them so as to benefit the inference. The inference model and the task assigner work alternately to continuously improve the overall quality. We conduct extensive experiments on a real crowdsourcing platform, and the results on two real datasets show that our method significantly outperforms state-of-the-art approaches. © 2016 IEEE. |
Persistent Identifier | http://hdl.handle.net/10722/232184 |
ISBN | 978-150902019-5 |
ISSN | 1084-4627 (2023 SCImago Journal Rankings: 1.306) |
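The abstract above describes an inference model that weights each worker's answer by (i) the worker's inherent quality, (ii) the worker's spatial distance to the POI, and (iii) the POI influence. The paper's exact formulation is not reproduced in this record; the following is a minimal sketch of that idea, assuming a multiplicative combination with exponential distance decay. The function names, the decay form, and the example values are all illustrative assumptions.

```python
import math
from collections import defaultdict

def worker_quality(inherent_quality, distance_km, poi_influence, decay=0.5):
    """Combine the three factors named in the abstract: the worker's
    inherent quality, the worker-POI distance (farther workers are
    assumed less reliable), and the POI influence. The multiplicative
    form and exponential decay are illustrative assumptions."""
    return inherent_quality * math.exp(-decay * distance_km) * poi_influence

def infer_label(answers):
    """Quality-weighted vote over (label, quality) pairs for one POI:
    the label with the largest total weighted support wins."""
    support = defaultdict(float)
    for label, quality in answers:
        support[label] += quality
    return max(support, key=support.get)

# Two nearby, reliable workers outweigh one distant worker.
answers = [
    ("restaurant", worker_quality(0.9, 0.2, 1.0)),
    ("restaurant", worker_quality(0.8, 0.5, 1.0)),
    ("cafe",       worker_quality(0.7, 5.0, 1.0)),
]
print(infer_label(answers))  # -> restaurant
```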
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Hu, H | - |
dc.contributor.author | Zheng, Y | - |
dc.contributor.author | Bao, Z | - |
dc.contributor.author | Li, G | - |
dc.contributor.author | Feng, J | - |
dc.contributor.author | Cheng, RCK | - |
dc.date.accessioned | 2016-09-20T05:28:18Z | - |
dc.date.available | 2016-09-20T05:28:18Z | - |
dc.date.issued | 2016 | - |
dc.identifier.citation | The 32nd IEEE International Conference on Data Engineering (ICDE 2016), Helsinki, Finland, 16-20 May 2016. In Conference Proceedings, 2016, p. 1-12 | - |
dc.identifier.isbn | 978-150902019-5 | - |
dc.identifier.issn | 1084-4627 | - |
dc.identifier.uri | http://hdl.handle.net/10722/232184 | - |
dc.description.abstract | Identifying the labels of points of interest (POIs), also known as POI labelling, provides significant benefits in location-based services. However, the quality of raw labels manually added by users or generated by automatic algorithms cannot be guaranteed. Such low-quality labels decrease usability and result in bad user experiences. In this paper, observing that crowdsourcing is well suited to such computer-hard tasks, we leverage crowdsourcing to improve the quality of POI labelling. To the best of our knowledge, this is the first work on crowdsourced POI labelling tasks. In particular, there are two sub-problems: (1) how to infer the correct labels for each POI based on workers' answers, and (2) how to effectively assign proper tasks to workers in order to make more accurate inferences for the next available workers. To address these two problems, we propose a framework consisting of an inference model and an online task assigner. The inference model measures the quality of a worker on a POI by carefully exploiting (i) the worker's inherent quality, (ii) the spatial distance between the worker and the POI, and (iii) the POI influence, and it can provide reliable inference results as soon as a worker submits an answer. As workers arrive dynamically, the online task assigner judiciously assigns proper tasks to them so as to benefit the inference. The inference model and the task assigner work alternately to continuously improve the overall quality. We conduct extensive experiments on a real crowdsourcing platform, and the results on two real datasets show that our method significantly outperforms state-of-the-art approaches. © 2016 IEEE. | -
dc.language | eng | - |
dc.publisher | IEEE Computer Society. The conference proceedings are available at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000178 | -
dc.relation.ispartof | International Conference on Data Engineering Proceedings | - |
dc.rights | ©2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | - |
dc.title | Crowdsourced POI labelling: location-aware result inference and task assignment | -
dc.type | Conference_Paper | - |
dc.identifier.email | Cheng, RCK: ckcheng@cs.hku.hk | - |
dc.identifier.authority | Cheng, RCK=rp00074 | - |
dc.description.nature | postprint | - |
dc.identifier.doi | 10.1109/ICDE.2016.7498229 | - |
dc.identifier.scopus | eid_2-s2.0-84980390225 | - |
dc.identifier.hkuros | 265278 | - |
dc.identifier.spage | 1 | - |
dc.identifier.epage | 12 | - |
dc.publisher.place | United States | - |
dc.customcontrol.immutable | sml 161004 | - |
dc.identifier.issnl | 1084-4627 | - |
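The abstract also describes an online task assigner that works alternately with the inference model: as each worker arrives, it chooses the task whose answer would most benefit the inference. The sketch below reuses the weighted-vote idea from the earlier sketch and assigns the arriving worker the nearby POI whose current label is least settled, i.e. has the smallest margin between its top two weighted labels. The nearness radius, the margin heuristic, and all names are illustrative assumptions, not the paper's exact assigner.

```python
import math
from collections import defaultdict

def distance_km(a, b):
    """Great-circle distance in km between two (lat, lon) pairs
    (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def label_margin(answers):
    """Gap between the top two quality-weighted labels for one POI;
    a small gap means the inferred label is still uncertain."""
    support = defaultdict(float)
    for label, quality in answers:
        support[label] += quality
    totals = sorted(support.values(), reverse=True)
    if not totals:
        return 0.0
    return totals[0] - (totals[1] if len(totals) > 1 else 0.0)

def assign_task(worker_location, pois, answers_by_poi, max_km=2.0):
    """Among POIs within max_km of the arriving worker, pick the one
    whose current inference is least settled (smallest margin)."""
    nearby = [p for p in pois
              if distance_km(worker_location, p["location"]) <= max_km]
    if not nearby:
        return None
    return min(nearby, key=lambda p: label_margin(answers_by_poi[p["id"]]))
```

Once the assigned worker answers, the new (label, quality) pair would be appended to `answers_by_poi` and the inference re-run, giving the alternating inference/assignment loop the abstract describes.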