I am a big fan of House MD, a great homage to Sherlock Holmes. Last week’s episode featured (as it often does) a rather cantankerous patient who becomes frustrated when House’s team fails to properly diagnose his ailment. After the second or third failed treatment, the patient attempts to crowdsource his own diagnosis: he posts his symptoms to a website and promises a $25,000 bounty to the member of the crowd who comes up with the winning diagnosis.

The folks over at uTest drew four lessons about crowdsourcing from the episode. They are worth a read:

Lesson 1 – All opinions are not created equal
Throughout the episode, Foreman’s team must constantly dismiss terrible opinions submitted by the crowd. While the crowd is often a good way to get new perspectives on a problem, some ideas are better than others. Good crowds must cultivate experienced subject matter experts whose ideas and contributions can rise to the top.

Lesson 2 – Reputation matters
When Foreman’s team started receiving hundreds of faxed crowdsourced opinions, they were simply overwhelmed. What if Dr. House’s anonymous submission had noted that its author held a 98% approval rating and a 95% patient satisfaction rating? Chances are it would have been read and considered long before the others. Good crowdsourcing systems must include reputation management features to help separate the good ideas from the bad.

Lesson 3 – In-house experts are always critical
Crowdsourcing can’t replace the value of in-house expertise. Dr. Foreman’s team worked to evaluate many of the crowd’s opinions, ruling out bad ideas and testing promising ones. Crowdsourcing is not a replacement for in-house experts, but rather an exciting new tool that helps them gain greater perspective.

Lesson 4 – Money is a paradox
When the patient offered a reward of $25,000, he unleashed a torrent of opinions from around the world. The huge bounty certainly increased participation. On the other hand, Dr. House submitted his correct opinion because he was bored, not for the reward. Money is a blunt instrument for incentivizing crowds, and it can cut both ways: sometimes a larger reward is necessary to maximize valuable participation, but sometimes people in the crowd are looking for other things of value, like a sense of community, an intellectual challenge, or the promise of recurring work. Good crowdsourcing systems must strike the right balance of rewards to get the best results from the community.

Photo source: Fanpop!