Truth Serums for Massively Crowdsourced Evaluation Tasks

Author(s)
Kamble, Vijay
Shah, Nihar
Marn, David
Parekh, Abhay
Ramachandran, Kannan
Keywords
Computer Science - Computer Science and Game Theory
Computer Science - Artificial Intelligence
Computer Science - Multiagent Systems

URI
http://hdl.handle.net/20.500.12424/796095
Online Access
http://arxiv.org/abs/1507.07045
Abstract
Incentivizing effort and eliciting truthful responses from agents in the absence of verifiability is a major challenge in crowdsourced evaluation tasks such as labeling images or grading assignments in online courses. In this paper, we propose new reward mechanisms for such settings that, unlike most previously studied mechanisms, impose minimal assumptions on the structure and knowledge of the underlying generating model, can account for heterogeneity in the agents' abilities, require no extraneous elicitation from them, and allow their beliefs to be (almost) arbitrary. Moreover, these mechanisms have the simple and intuitive structure of output agreement mechanisms, which, despite not incentivizing truthful behavior, have nevertheless been quite popular in practice. We achieve this by leveraging a typical characteristic of many of these settings: the existence of a large number of similar tasks.
Date
2015-07-24
Type
text
Identifier
oai:arXiv.org:1507.07045
http://arxiv.org/abs/1507.07045
Collections
OAI Harvested Content
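
The abstract above says the proposed mechanisms share the structure of output agreement mechanisms, in which an agent is rewarded when their report matches that of a randomly chosen peer on the same task. As a point of reference only, here is a minimal Python sketch of that baseline payment rule; the function and parameter names (output_agreement_rewards, bonus) are hypothetical, and the paper's actual "truth serum" mechanisms go further by pooling statistics across the large number of similar tasks, which the abstract alone does not specify.

    import random

    def output_agreement_rewards(answers, bonus=1.0):
        """Baseline output agreement payment rule (illustrative sketch).

        answers: dict mapping (agent, task) -> reported label.
        Returns a dict mapping each agent to their total reward.
        """
        rewards = {agent: 0.0 for (agent, _task) in answers}
        tasks = {task for (_agent, task) in answers}
        for task in tasks:
            # All agents who evaluated this particular task.
            raters = [a for (a, t) in answers if t == task]
            for agent in raters:
                peers = [p for p in raters if p != agent]
                if not peers:
                    continue  # no peer to compare against on this task
                peer = random.choice(peers)
                # Pay the bonus only when the two reports agree.
                if answers[(agent, task)] == answers[(peer, task)]:
                    rewards[agent] += bonus
        return rewards

    # Tiny usage example: two tasks, three agents.
    reports = {
        ("alice", "img1"): "cat", ("bob", "img1"): "cat",
        ("alice", "img2"): "dog", ("carol", "img2"): "cat",
    }
    print(output_agreement_rewards(reports))

As the abstract itself notes, this plain agreement rule does not by itself incentivize truthful reporting (agents can profit by coordinating on a salient label regardless of the truth); the paper's contribution is to retain this simple structure while restoring truthfulness by exploiting the many similar tasks.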
