A Workshop Connecting Crowdsourcing and Online Education at HCOMP 2014

The online education and crowdsourcing communities are addressing similar problems in educating, motivating, and evaluating students and workers. The online learning community is increasing the supply side of the cognitively skilled labor market, while the crowdsourcing-at-scale community is creating a larger marketplace for cognitively skilled work. WorkLearn will be held at HCOMP 2014 in Pittsburgh on November 2, 2014.

Workshop: http://www.worklearn.org/
Venue: http://www.humancomputation.com/2014/

Call for Proposals

WorkLearn 2014 is a full-day workshop at HCOMP 2014 that will bring together researchers and practitioners from the crowdsourcing and online education communities to explore connections between learning and working online. We want to spark knowledge sharing and discussion on topics such as: integrating online learning platforms with online work platforms; solving shared problems such as training and evaluating both students and high-skill crowd workers; and using crowdsourcing methodologies to scale the labor-intensive components of education. We invite short (1-2 page) position papers that identify and motivate key problems at the intersection of crowd work/human computation and online learning/education. You are invited to include a short bio in your submission to provide context for your fit with the workshop. Please send your submission to work.learn.workshop@gmail.com. Authors of submissions invited to the workshop will be notified in September. We encourage submission of position papers focusing on:

  • Challenges and demands of industry
    • What skills do we need to train (crowd and online) workers for?
    • What can crowdsourcing do for learning at scale?
  • Proposals for platforms and software to connect online work and learning
    • How can a platform for online learning be linked to a platform for crowd work in a way that creates a more skilled workforce and better crowd work?
    • Visionary papers on the future of online work and learning

We look forward to seeing you in Pittsburgh!

Markus Krause, Leibniz University, Germany
Praveen Paritosh, Google, USA
Joseph Jay Williams, Stanford University, USA

Human Computation now accepting submissions

The interdisciplinary journal Human Computation is now accepting
manuscripts (http://hcjournal.org/ojs/HC-flyer.pdf). The editorial board
is inviting high-quality contributions in the field of Human Computation
from all related disciplines. This is an open-access, community-driven
journal with no author or subscriber fees.

Human Computation is an international and interdisciplinary forum for
the electronic publication and print archiving of high-quality scholarly
articles in all areas of human computation, which concerns the design or
analysis of information processing systems in which humans participate
as computational elements.

The journal aims to bring together Human Computation (HC) results and
perspectives from a wide field of HC-related disciplines, such as
Artificial Intelligence, Behavioral Sciences, Citizen Science, Cognitive
Science, Complexity Science, Computer Science, Evolutionary Biology,
Economics, HCI, Philosophy and others. Subtopics may include novel or
transformative applications of HC, interfaces, methods and task design,
result aggregation and selection methods, mechanism design, software and
infrastructure development, user studies related to HC, and others.

We are now accepting submissions.  The first issue will be published in
early Fall of 2014. To be considered for this issue, submissions must be
received by July 31.  More information on the submission guidelines and
process can be found on the journal site: http://hcjournal.org.

CrowdScale 2013: Call for Position Papers and Shared Task Challenge ($1500 in prizes)

CrowdScale 2013: Crowdsourcing at Scale
A workshop at HCOMP 2013: The 1st Conference on Human Computation & Crowdsourcing
November 9, 2013
http://www.crowdscale.org

Overview
Crowdsourcing and human computation at scale raise a variety of open challenges compared with crowdsourcing over smaller workloads and labor pools. We believe that focusing on such issues of scale will be key to taking crowdsourcing to the next level: from its uptake by early adopters today to its future as how the world's work gets done. To advance crowdsourcing at scale, CrowdScale will pursue two thrusts:

Track 1: Position Papers. We invite submission of 2-page position papers which identify and motivate focused, key problems or approaches for crowdsourcing at scale.

Track 2: Shared Task Challenge. We invite submissions to a shared task challenge on computing consensus from crowds: how to generate the best possible answer for each question, based on the judgments of five or more raters per question.  Participants will submit 4-page papers describing their systems and preliminary results, with $1500 in prize money awarded to top performers.
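As a point of reference for the consensus task, the simplest baseline is a per-question majority vote over rater judgments. The sketch below is illustrative only: the challenge does not prescribe a method, and the function name and example labels are hypothetical.

```python
from collections import Counter

def majority_vote(judgments):
    """Return the most frequent label among one question's judgments.

    `judgments` is a list of labels, one per rater (the shared task
    provides five or more raters per question). Ties are broken by
    whichever label was counted first.
    """
    counts = Counter(judgments)
    return counts.most_common(1)[0][0]

# Hypothetical example: five raters label a single question.
ratings = ["cat", "dog", "cat", "cat", "bird"]
print(majority_vote(ratings))  # prints "cat"
```

Published approaches to this problem go well beyond majority voting (e.g., by modeling per-rater reliability), which is precisely what the shared task is meant to explore.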

Participants may take part in either or both tracks. Submitted papers will not be peer-reviewed or archived, so work shared in these papers can later be submitted to peer-reviewed venues. All papers will be posted on the workshop website to promote discussion among workshop participants and beyond. Workshop organizers will review all submissions to ensure quality; a high acceptance rate is expected.

Position Papers
http://www.crowdscale.org/cfp
We invite submission of short (2-page) position papers which identify and motivate key problems or potential approaches for crowdsourcing at scale. We encourage submissions identifying and clearly articulating problems, even if no satisfactory solutions are proposed. Submissions focusing on problems should clearly describe a problem of scale, why it matters, why it is hard, existing approaches, and desired properties of effective solutions. We welcome early work, and particularly encourage submission of visionary, forward-looking position papers.

Each submitted paper should focus on one problem. We encourage multiple submissions per author for articulating distinct problem statements or methods.

Position papers are welcome to argue the merits of an approach or problem already published in earlier work by the author (or anyone else). In this case, the approach should be clearly attributed to the prior work, and the contribution of the position paper would be its argument of why the approach is promising for crowdsourcing at scale.

During the workshop, authors will self-organize into break-out groups, with each group further elaborating upon a particular critical area meriting further work and study. Each group will summarize and report its findings at the workshop’s close. In continuing discussion beyond the workshop, organizers and participants will co-author a summary paper articulating a road map of important challenges and approaches for our community to pursue.

Position paper ideas include (but are not limited to): http://www.crowdscale.org/position-paper-ideas

Shared Task Challenge
http://www.crowdscale.org/shared-task
To help advance research on crowdsourcing at scale, CrowdFlower and Google are sharing two new, large challenge datasets for multi-class classification. Both datasets are available for immediate download. To make it easy to participate, we provide the data in multiple formats, along with pointers to open-source software to help you get started.

All participants are expected to submit a paper (up to 4 pages) describing their method and preliminary results on the shared task metrics, and to present a poster at the workshop. Final results will be announced at the workshop, with prize money awarded to the best performer(s), along with recognition during the workshop and in our workshop report.

Shared task participants are also invited to participate in workshop discussion throughout the day.

Important Dates
October 14: Position papers due
October 20: Shared task runs due
October 27: Shared task papers due
November 9: Workshop

Please see workshop website for additional information on schedule.

Questions: Email the organizers at: crowdscale-organizers@googlegroups.com

Workshop Organizers
Tatiana Josephy, CrowdFlower
Matthew Lease, University of Texas at Austin
Praveen Paritosh, Google

Advisory Committee
Omar Alonso, Microsoft
Ed Chi, Google
Lydia Chilton, University of Washington
Matt Cooper, oDesk
Peng Dai, Google
Benjamin Goldenberg, Yelp
David Huynh, Google
Panos Ipeirotis, Google/NYU
Chris Lintott, Zooniverse/GalaxyZoo
Greg Little, oDesk
Stuart Lynn, Zooniverse/GalaxyZoo
Stefano Mazzocchi, Google
Rajesh Patel, Microsoft
Mike Shwe, Google
Rion Snow, Twitter
Maria Stone, Microsoft
Alexander Sorokin, CrowdFlower
Jamie Taylor, Google
Tamsyn Waterhouse, Google
Patrick Philips, LinkedIn
Sanga Reddy Peerreddy, SetuServ

Collective Intelligence 2014: Call for Papers

********************************************************

CALL FOR PAPERS

********************************************************

Collective Intelligence 2014

MIT, Cambridge, MA

June 10-12, 2014

www.collectiveintelligence2014.org

********************************************************

This interdisciplinary conference seeks to bring together researchers from a variety of fields relevant to understanding and designing collective intelligence of many types.

 

Topics of interest include but are not limited to:

  • human computation

  • social computing

  • crowdsourcing

  • wisdom of crowds (e.g., prediction markets)

  • group memory and extended cognition

  • collective decision making and problem-solving

  • participatory and deliberative democracy

  • animal collective behavior

  • organizational design

  • public policy design (e.g., regulatory reform)

  • ethics of collective intelligence (e.g., “digital sweatshops”)

  • computational models of group search and optimization

  • emergence and evolution of intelligence

  • new technologies for making groups smarter

 

CONFERENCE FORMAT

The conference will consist of

  • Invited talks from prominent researchers in different areas related to collective intelligence

  • Oral presentations (see below)

  • Poster/Demo sessions (see below)

  • “Ignite” sessions in which practitioners (e.g. policy makers) connect with researchers around collective-intelligence-based solutions to real-world problems

 

SUBMISSION

Submissions of two types are invited:

  • Reports of original research results

  • Demonstrations of tools/technology

 

All submissions should be formatted as three-page extended abstracts (see www.collectiveintelligence2014.org for Word and LaTeX templates) and should be submitted at https://cmt.research.microsoft.com/CI2014

 

In order to encourage a diversity of innovative ideas from a variety of fields, submissions may refer to work that is recently published, under review elsewhere, or in preparation, and may link to up to one publicly accessible paper for the purpose of describing the work in detail. However, submissions will be evaluated solely on the submitted abstract, which must therefore comprise an entirely self-contained description of the work.

 

After review by the Program Committee, a subset of submitted papers will be invited for oral presentation, as well as for presentation as posters and/or demos. A second subset will also be invited exclusively for presentation as posters and/or demos.

 

Accepted submissions (including posters and demos) will be compiled into a single report, which will be made available on http://arxiv.org. We emphasize that published abstracts are not intended to be archival publications or to preclude submission of the reported work to archival journals; however, we cannot guarantee that every journal's policies permit prior publication of extended abstracts.

 

Authors will not receive detailed feedback from the review process, and accepted abstracts will be included as submitted (i.e. submissions should be camera-ready).

 

IMPORTANT DATES

Extended abstract submission deadline:  January 15, 2014

Notification of acceptance / rejection:  February 15, 2014

Conference dates:  June 10-12, 2014

 

PROGRAM CHAIRS

Duncan Watts (Microsoft Research)

Michael Kearns (University of Pennsylvania)

 

GENERAL CHAIRS

Jeffrey Nickerson (Stevens Institute of Technology)

Thomas Malone (MIT)

 

PROGRAM COMMITTEE

Lada Adamic (Facebook, University of Michigan)

Christopher Chabris (Union College)

Iain Couzin (Princeton)

Winter Mason (Stevens Institute of Technology, Facebook)

Beth Noveck (NYU)

Scott Page (University of Michigan)

Paul Resnick (University of Michigan)

Matthew Salganik (Princeton, Microsoft Research)

Rajiv Sethi (Columbia University)

Anita Woolley (CMU)

 

COMMUNICATIONS CHAIR

Elizabeth Gerber (Northwestern)

 

LOCAL ARRANGEMENTS CHAIRS

Seyda Ertekin (MIT)

Lawrence Abeln (MIT)

 

PROCEEDINGS CHAIRS

Walter Lasecki and Jeff Bigham (University of Rochester)

 

CrowdConf Call for Abstracts

CrowdConf 2013, taking place Oct 22 in San Francisco, will bring together academic and industrial researchers, technologists, outsourcing entrepreneurs, and artists to discuss how crowdsourcing is transforming human computation and the future of work. We’re eager to receive research abstracts on crowdsourcing, human computation, and social computing.


As the premier industrial conference on crowdsourcing, CrowdConf offers researchers the opportunity to network with personnel from the platforms we use, to see what features are in the pipeline and influence them, to gain media exposure from tech journalists, and to connect with other researchers.

SUBMIT RESEARCH TOPICS OF INTEREST

Topics of interest include, but are not limited to the following:

  • Past, present, and future of crowdsourcing
  • Quality assurance and metrics
  • Social and economic implications of crowdsourcing
  • Task design/Worker incentives
  • Innovative projects, experiments, and applications

Share your crowdsourcing research ideas with us now. If your paper is selected for presentation or for a poster session, you’ll receive a complimentary pass to the conference, a $700 value.

Learn more about CrowdConf.

 

CfP: new Crowdsourcing area at ACM Multimedia 2013

Crowdsourcing Area at ACM MM 2013
The 21st ACM International Conference on Multimedia
October 21–25, 2013, Barcelona, Spain
Call for Papers: http://acmmm13.org/submissions/call-for-papers/

Following the successful CrowdMM workshop at ACM Multimedia last year, we have added Crowdsourcing as an official technical program area (long and short papers) for ACM MM 2013 in Barcelona, Spain. Multimedia is the flagship conference of ACM SIGMM.

AREA DESCRIPTION

Crowdsourcing makes use of human intelligence and a large pool of contributors to address problems that are difficult to solve using conventional computation. This new area cross-cuts traditional multimedia topics and solicits submissions dedicated to results and novel ideas in multimedia that are made possible by the crowd, i.e., that exploit crowdsourcing principles and techniques. Crowdsourcing is considered to encompass the use of microtask marketplaces, games-with-a-purpose, collective intelligence, and human computation. Topics include, but are not limited to:

  • Exploiting crowdsourcing for multimedia generation, interpretation, sharing or retrieval
  • Learning from crowd-annotated or crowd-augmented multimedia data
  • Economics and incentive structures in multimedia crowdsourcing systems
  • Crowd-based design and evaluation of multimedia algorithms and systems
  • Crowdsourcing in multimedia systems and applications such as Art & Culture, Authoring, Collaboration, Mobile & Multi-device, Multimedia Analysis, Search, and Social Media.

Submissions should have both a clear focus on multimedia and also a critical dependency on crowdsourcing techniques.

CONFERENCE INFO

Since the founding of ACM SIGMM in 1993, ACM Multimedia has been the premier worldwide conference and a key event for displaying scientific achievements and innovative industrial products in the multimedia field. At ACM Multimedia 2013, we will celebrate its twenty-first iteration with an extensive program: technical sessions covering all aspects of the multimedia field in the form of oral and poster presentations, tutorials, panels, exhibits, demonstrations, and workshops; competitions among research teams on challenging problems; and an interactive art program that brings artists and computer scientists together to explore the frontiers of artistic communication.

IMPORTANT DATES

  • Abstract for Full Papers: March 1, 2013
  • Manuscript for Full/Short Papers: March 8, 2013
  • Rebuttal: May 8–17, 2013
  • Author-to-Author’s Advocate contact period: May 8–13, 2013
  • Notification of Acceptance: June 25, 2013
  • Camera-ready submission: July 30, 2013
  • Conference: October 21–25, 2013, Barcelona, Spain

CONFERENCE WEBSITE

http://acmmm13.org

CfP: Crowdsourcing in Virtual Communities track at AMCIS 2013

Mini-track: Crowdsourcing in Virtual Communities
19th Americas Conference on Information Systems (AMCIS 2013)
August 15-17, 2013 in Chicago, Illinois, USA
Link: http://amcis2013.aisnet.org/?option=com_content&id=69

Following the successful crowdsourcing tracks at ACIS 2011 and AMCIS 2012, we are accepting submissions to this year's AMCIS 2013 crowdsourcing track in Chicago. AMCIS is one of the largest annual conferences in the field of Information Systems, with about 1,000 participants.

DESCRIPTION
Crowdsourcing harnesses the potential of large networks of people via open calls for contribution and thus enables organizations to tap into a diversity of knowledge, skills, and perspectives. Fueled by the increasing pervasiveness of the Internet, crowdsourcing has been rapidly gaining importance in a wide range of contexts, both in research and practice. In order to provide better guidance for future crowdsourcing efforts, it is crucial to gain a deeper and integrated understanding of the phenomenon. While research on crowdsourcing is multidisciplinary, information systems take a central role in realizing crowdsourcing approaches by interconnecting organizations and globally distributed contributors. By viewing crowdsourcing from an IS perspective, this track aims to channel related research directions and move from the consideration of isolated aspects and applications to a systemic foundation for the design of socio-technical crowdsourcing systems.

We encourage submissions from theoretical, empirical, and design science research on the following and adjacent topics:
– Crowdsourcing ecosystems and markets
– Platforms, tools, and technologies
– Task characteristics, task design, and task choice
– Contributor motivation and incentive structures
– Design of workflows and processes
– Mobile crowdsourcing
– Quality assurance and evaluation of contributions
– Economics of crowdsourcing
– Case studies of crowdsourcing effectiveness
– Adoption of crowdsourcing business models
– Innovative applications

IMPORTANT DATES
January 4, 2013: Bepress will start accepting paper submissions
February 22, 2013 (11:59 pm CST): Deadline for paper submissions
April 22, 2013: Authors notified of acceptance decisions
May 9, 2013: Camera-ready copy due for accepted papers

Call for participation: Microsoft Research India presents “The Whodunit? Challenge”

Rajan Vaish, UC Santa Cruz/ Microsoft Research India
Aditya Vashistha, Microsoft Research India
Bill Thies, Microsoft Research India
Ed Cutrell, Microsoft Research India

We’ve seen that social networks can mobilize people in rich countries, but how can people mobilize in environments lacking widespread Internet access? To understand how people will collaborate in an era of varied ICTs, where countries like India have huge mobile phone penetration, Microsoft Research India will launch the Whodunit? Challenge on February 1, 2013. The challenge is an India-wide social gaming competition that awards 1 lakh rupees to the winner.


Announcing HCOMP 2013 – Conference on Human Computation and Crowdsourcing

Bjoern Hartmann, UC-Berkeley 
Eric Horvitz, Microsoft Research

Announcing HCOMP 2013, the Conference on Human Computation and Crowdsourcing, to be held in Palm Springs, November 7–9, 2013. The paper submission deadline is May 1, 2013. Thanks to the HCOMP community for bringing HCOMP to life as a full conference, following the successful workshop series.

HCOMP 2013 at Palm Springs

The First AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2013) will be held November 7-9, 2013 in Palm Springs, California, USA. The conference was created by researchers from diverse fields to serve as a key focal point and scholarly venue for the review and presentation of the highest-quality work on principles, studies, and applications of human computation. The conference aims to promote the scientific exchange of advances in human computation and crowdsourcing among researchers, engineers, and practitioners across a spectrum of disciplines. Paper submissions are due May 1, 2013, with author notification on July 16, 2013. Workshop and tutorial proposals are due May 10, 2013. Poster & demonstration submissions are due July 25, 2013.

For more information, see the HCOMP 2013 website.