Crowdsourcing

Crowdsourcing is a neologism for the act of taking a job traditionally performed by an employee or contractor, and outsourcing it to an undefined, generally large group of people, in the form of an open call. For example, the public may be invited to develop a new technology, carry out a design task, refine an algorithm or help analyze large amounts of data.

History
The word was coined by Jeff Howe in a June 2006 Wired magazine article. Though the term is new, there are examples of significant crowdsourcing projects as early as the eighteenth century. In 1714, the British Government offered a public prize for a solution to the longitude problem. In the 1800s, the Oxford English Dictionary was compiled with the help of volunteer contributions submitted on millions of slips of paper. More recently, the Internet has been used to publicize and manage crowdsourcing projects.

Overview
In some cases the labor is well-compensated. In other cases the only rewards may be kudos or intellectual satisfaction. Crowdsourcing may produce solutions from amateurs or volunteers working in their spare time, or from small businesses which were unknown to the initiating organization.

Perceived benefits of crowdsourcing include:
 * Problems can be explored at comparatively little cost.
 * Payment is by results.
 * The organization can tap a wider range of talent than might be present within the organization itself.

The difference between crowdsourcing and ordinary outsourcing is that a task or problem is outsourced to the public rather than to another specific body. The difference between crowdsourcing and open source is that open source production is a cooperative activity initiated and voluntarily undertaken by members of the public. In crowdsourcing, the activity is initiated by a client, and the work may be undertaken on an individual as well as a group basis.

Recent examples of crowdsourcing

 * Wikipedia, a free encyclopedia edited by volunteers. The English Wikipedia currently contains over 2,000,000 articles.
 * In 2005, Amazon.com launched the Amazon Mechanical Turk, a platform on which crowdsourcing tasks called HITs (Human Intelligence Tasks) can be created and publicized, and on which people can complete the tasks and be paid for doing so (see the sketch after this list). Dubbed "Artificial Artificial Intelligence", it was named after The Turk, an 18th-century chess-playing "machine".
 * An online distributed search of the Nevada desert for the crash site of aviator Steve Fossett.
 * In 2006, Google Inc. launched Google Image Labeler, which pairs members of the public and encourages them to label random images with meaningful keywords.
 * In 2006, the American online DVD rental company Netflix announced a $1,000,000 prize for anybody who could improve the accuracy of its movie recommendation system by at least 10%. Contest participants can download vast amounts of anonymized data from Netflix to test their proposals. In addition to the grand prize, Netflix is offering annual progress prizes of $50,000. So far 17,000 attempts have been submitted; the best shows an improvement of 8.26% over Netflix's current system (a worked example of how such an improvement is measured appears after this list).
 * Stardust@home, an ongoing project begun in 2006 that uses internet volunteer "clickworkers" to find interstellar dust samples by inspecting images returned by the Stardust spacecraft.
 * Procter & Gamble posts problems on a website called InnoCentive, offering large cash rewards to more than 90,000 "solvers" who make up this network of backyard scientists.
 * YRUHRN used Amazon Mechanical Turk and other means of crowdsourcing to compile content for a book published just 30 days after the project was started.
 * NowPublic, a participatory news network in which contributors submit written and voice reports, photographs, and video about breaking news events.
 * Cambrian House applies a crowdsourcing model to identify and develop software and web-based businesses. Using a simple voting model, they attempt to find sticky software ideas that can be developed using a combination of internal and crowdsourced skills and effort.
 * The Canadian gold mining group Goldcorp made 400 megabytes of geological survey data on its Red Lake, Ontario property available to the public over the internet. It offered a $575,000 prize to anyone who could analyse the data and suggest places where gold could be found. The company claims that the contest produced 110 targets, over 80% of which proved productive, yielding 8 million ounces of gold worth more than $3 billion.
 * Threadless, an Internet-based clothing retailer that sells t-shirts designed and rated by members of the public.
 * Public Insight Journalism, a project at American Public Media to cover the news by tapping the collective and specific intelligence of the public. The approach takes the newsroom beyond its usual sources, uncovering unexpected expertise, stories, and new angles.
 * The ESP Game, which allows people to collaborate in labeling images.
 * Wired and NewAssignment have launched a "pro-am" collaboration called Assignment Zero that allows citizen journalists to work with professional editors on a story, with the team's research available for re-use.
 * NASA Centennial Challenges, a series of prize competitions for space-related technology.
 * Galaxy Zoo, a project that lets members of the public classify a million galaxies from the Sloan Digital Sky Survey.
 * reCAPTCHA, which uses CAPTCHA challenges to help correct errors in digitized texts.
 * Barrick Gold, which has offered a $10 million prize for improvements to its silver extraction process.
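
The Mechanical Turk entry above describes the general workflow: a requester publishes a small task (a HIT), sets a per-assignment reward, and workers complete it for payment. The following is a minimal sketch of that workflow using the boto3 MTurk client; the sandbox endpoint, question content, reward amount, and time limits are illustrative assumptions, not details from the original example.

```python
import boto3

# Sketch only: region, sandbox endpoint, reward, and question content are
# illustrative assumptions for demonstration purposes.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# A HIT bundles the question shown to workers, the reward paid per completed
# assignment, and limits on how long the task stays open.
question_xml = """<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <html><body>
      <script src="https://assets.crowd.aws/crowd-html-elements.js"></script>
      <crowd-form>
        <p>Does this sentence describe a product defect? "The hinge snapped after two days."</p>
        <crowd-input name="answer" placeholder="yes or no" required></crowd-input>
      </crowd-form>
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>400</FrameHeight>
</HTMLQuestion>"""

hit = mturk.create_hit(
    Title="Answer a short yes/no question",
    Description="Read one sentence and answer a simple question about it",
    Keywords="text, classification, quick",
    Reward="0.05",                        # USD paid per completed assignment
    MaxAssignments=3,                     # number of workers asked to do the task
    LifetimeInSeconds=24 * 60 * 60,       # how long the HIT remains available
    AssignmentDurationInSeconds=10 * 60,  # time a worker has once they accept
    Question=question_xml,
)
print("Created HIT:", hit["HIT"]["HITId"])
```

Completed work would later be fetched with list_assignments_for_hit and accepted with approve_assignment, which is the step at which workers are actually paid.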
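
The Netflix Prize entry measures "improvement" as the relative reduction in root-mean-square error (RMSE) of predicted ratings compared with Netflix's own system. Below is a small sketch of that calculation; the baseline figure of roughly 0.9514 is the widely reported RMSE of Netflix's system on the contest's quiz set, and the submission figure is chosen only to illustrate how an 8.26% improvement would arise.

```python
import numpy as np

def rmse(predicted, actual):
    """Root-mean-square error between predicted and actual ratings."""
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return float(np.sqrt(np.mean((predicted - actual) ** 2)))

# Example: a handful of predicted ratings vs. the ratings users actually gave.
print(rmse([3.8, 2.9, 4.6], [4, 3, 5]))  # small errors -> small RMSE

# Percent improvement is the relative reduction in RMSE versus the baseline.
baseline_rmse = 0.9514     # roughly Netflix's own system on the quiz set
submission_rmse = 0.8728   # illustrative submission score
improvement = (baseline_rmse - submission_rmse) / baseline_rmse * 100
print(f"Improvement over baseline: {improvement:.2f}%")  # ~8.26%
```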

Controversy
The design community has long debated the ethical, social, and economic implications of crowdsourcing. More recently, the negative effects of crowdsourcing on business owners have been highlighted, particularly with regard to how a crowdsourced project can end up costing a business more than a traditionally outsourced project.

Some of the pitfalls of crowdsourcing include:
 * Added costs after a project's nominal completion to bring it to an acceptable conclusion.
 * Increased likelihood that a crowdsourced project will suffer failure due to lack of monetary motivation, too few participants, lower quality of work, lack of personal interest in the project, global language barriers, or difficulty managing a large-scale crowdsourced project.
 * Below-market wages, or no wages at all. Barter agreements are often associated with crowdsourcing.
 * No written contracts, non-disclosure agreements, or employment agreements with crowdsourced workers.
 * Difficulties maintaining a working relationship with crowdsourced workers throughout the duration of a project.

Historical examples of crowdsourcing

 * The Alkali Prize
 * The Longitude Prize
 * Fourneyron's Turbine
 * Montyon Prizes
 * Nicolas François Appert and food preservation
 * Loebner Prize
 * Millennium Prize Problems