Make Your Crowd Smart

A framework for tailoring your crowdsourcing approach to the complexity of your innovation challenge


Facing ever-increasing pressure to innovate, some companies turn to crowdsourcing for new ideas. Many crowdsourcing efforts, however, fall short of expectations or are abandoned. Amazon’s crowdsourced film script submission system, for example, was shut down after failing to attract scripts with global appeal. Quirky, a product invention startup, went bankrupt as it unsuccessfully attempted to crowdsource the entire product development process.

A common misconception is that there is only one approach to crowdsourcing; in fact, many crowdsourcing failures stem from asking crowds to address problems they are poorly suited to solve. Our research suggests instead that there are multiple approaches to crowdsourcing, each appropriate for tasks of differing scope and complexity. Our smart crowds framework of three distinct types of crowdsourcing provides guidance for managers wishing to address business problems and boost innovation opportunities through crowdsourcing.

The Smart Crowds Framework

Crowdsourcing tasks can be organized along a spectrum from low to high complexity — from a simple description of a new product idea drawing on a limited number of knowledge domains, to a highly complex working prototype of, say, a new type of spacecraft requiring integration across multiple knowledge domains. Creating connections among individuals so that they learn from one another within a wired crowd, or forming crowd teams, can produce smart crowds able to tackle problems of increased scope and complexity.

The best type of crowd depends on the scope and complexity of the problem you are seeking to solve. Our smart crowds framework proposes three distinct types of crowdsourcing — search crowds, wired crowds, and crowd teams — suited for different types of problems. (See “The Smart Crowds Framework.”)

Search Crowds: Defining Tasks and Rewards for Efficient Search

Search crowds are most effective at finding solutions to well-defined problems with a relatively small scope. The problem itself may be quite complex — say, addressing climate change — but the answer you seek may be a simple one-pager outlining an idea. Search crowds are at their strongest when the best skills or technical approaches to employ in problem-solving are not obvious. Search crowds operate on the premise that the solution to a problem already exists, likely in a domain distant from the problem, but needs to be discovered by reaching out to a large and diverse group of people. Some of the best-known crowdsourcing platforms have drawn on this approach. InnoCentive’s crowdsourced challenges, for example, enable “solvers” to connect with individuals seeking new ideas; the best-fitting idea surfaces through the parallel effort of many solvers. In this approach, the objective is to facilitate the search for solutions by connecting seekers to solvers, rather than attempting to improve what individuals already know.

Three factors are critical for success when using search crowds. First, you must provide a clear task description and set the right (monetary and nonmonetary) incentives for the people you want to attract. Second, you should make the problem accessible to as wide an audience as possible — for example, by reformulating a biomedical problem as a mathematical one. Third, because you do not know who will have the relevant insight, you need to recruit diverse solvers. Keep an open mind about who you target, advertise broadly, offer varied incentives that might attract diverse people, and allow multiple forms of engagement — for example, submissions of comments and ratings as well as ideas.

Wired Crowds: Promoting Learning in Crowdsourcing

Wired crowds help you find solutions to complex problems that draw from multiple knowledge areas. Managers can design mechanisms that allow and enhance interaction and learning among members of a crowd, even without explicit collaboration among crowd members. One method is to make past contributions public to serve as a trove for learning. Allowing solvers to build on past experience and reuse existing solutions enables them to tackle larger and more diverse problems. Individuals need to connect with one another to share information, but keep in mind that facilitating learning and building knowledge takes time.

Our research suggests that managers use three tools to foster successful wired crowds.1 First, they help the crowd understand what constitutes “good” work; they formally signal which ideas are best, lest people learn from bad ideas. Second, they combine the organization’s preferences with crowd preferences when determining top contributions. The crowd can provide the most valuable insights for innovation if the company provides guidance and feedback on what is suitable for its product line and feasible for its finances. Third, they ensure that crowd members receive feedback on their own work along with feedback about high-quality contributions. This feedback helps solvers learn what makes a quality contribution, equipping them to make higher-quality contributions over time.

Crowd Teams: Fostering Direct Collaboration in Crowdsourcing

Crowd teams can tackle the implementation of solutions such as prototypes or software. Composed of members knowledgeable in a range of domains, crowd teams can effectively become more than the sum of their parts, allowing further exploration of an area of innovation. For example, competing for the million-dollar Netflix Prize, self-organizing crowd teams successfully met the challenge of improving Netflix’s core recommendation algorithm. Crowd teams also competed for the $10 million Ansari X Prize, awarded for completing the first private suborbital flight above the 100-kilometer mark (generally recognized as the boundary of outer space).

Our research found that two factors are important for successful crowd teams: bursts of activity and diverse discussion topics.2

First, encourage bursts of activity. We found that teams that communicated in bursts of high activity performed much better than teams that exchanged a continuous stream of messages. Managers can facilitate “burstiness” by encouraging periods of independent work while providing tools that enable team members to see when others are active and periodically collaborate in real time.

Second, promote diverse discussion topics. High-performing teams had fewer topic repetitions in their communication, drawing on a more eclectic set of topics. This dynamic increased information diversity of the messages exchanged within a team, was fueled by members with advanced skills and expertise, and was significantly correlated with team performance. Managers can facilitate discussion of varied topics by creating teams whose members have different life experiences, technical training, and educational backgrounds.

Conclusion

Crowdsourcing can be an invaluable source of innovation when the right type of crowd tackles the right type of problem. Our smart crowds framework enables managers to match the right type of crowd with a problem of appropriate scope and complexity and pinpoints several critical success factors for each crowd approach — but successfully employing smart crowds for innovation is not always easy. Managers should be willing to conduct ongoing experiments in crowdsourcing, drawing from a range of crowd models as they strive to enhance their crowd-powered innovation processes. Although there’s no standard formula for the “right” level of participant diversity in search crowds or the “right” balance between company and participant evaluation in wired crowds, experimenting with this framework as a guide can unlock the power of smart crowds for your organization.

References

1. C. Riedl and V.P. Seidel, “Learning From Mixed Signals in Online Innovation Communities,” Organization Science 29, no. 6 (November-December 2018): 1010-1032.

2. C. Riedl and A.W. Woolley, “Teams vs. Crowds: A Field Test of the Relative Contribution of Incentives, Member Ability, and Emergent Collaboration to Crowd-Based Problem Solving Performance,” Academy of Management Discoveries 3, no. 4 (December 2017): 382-403.