How Algorithms Can Diversify the Startup Pool

Data-driven approaches can help venture capital firms limit gender bias and make better, fairer investment decisions.




Image courtesy of Greg Mably.

When pitching startups, men and women tend to have very different experiences of being evaluated for funding.1 Consider these questions that a venture capital investor might pose to aspiring business owners:

To a male entrepreneur: “Tell us about your vision for this venture.”

To a female entrepreneur: “Tell us about your track record for this type of venture.”

Research shows that men are more likely to receive promotion-focused (risk-loving) questions from investors; for women, prevention-focused (risk-averse) inquiries are the norm.2 Investors also tend to disfavor stereotypically female behaviors, such as being soft-spoken and nurturing (versus bold and assertive), whether those behaviors are exhibited by men or women.3 But even when ventures are pitched in the same way, investors significantly prefer pitches made by men over those made by women.4

One possible explanation for these biases is the so-called cupcake stigma — the perception of women as less serious in their business ventures than the typical male entrepreneur.5 This stigma is reinforced by venture capital funding decisions, which are made mostly by men and thus based primarily on heuristics derived by men. Indeed, fewer than 10% of decision makers at VC firms are women, and 74% of U.S. VC firms have no female investors.6 Despite evidence suggesting that companies with female owners and leaders tend to outperform male-owned startups,7 the share of VC funding going to female founders has grown from only 1% to 2.2% over the past decade.8 This scarcity of women in tech is exacerbated by perceptual biases related to gendered social norms and by the persistent structural challenges women face in fields related to science, technology, engineering, and math.

Some VC firms are starting to pay attention to how bias can affect funding decisions.
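One concrete way a data-driven process can limit this kind of bias is to blind evaluations to founder identity before any scoring happens. The sketch below is a minimal, hypothetical illustration, not the method of any firm mentioned in this article: the field names and weights are invented for the example. Identity-revealing fields are stripped from a venture profile so that the score depends only on venture-level metrics.

```python
# Hypothetical illustration: blind scoring of venture profiles.
# Field names and weights are invented for this sketch, not taken
# from any real VC firm's model.

IDENTITY_FIELDS = {"founder_name", "founder_gender", "founder_photo"}

def blind(profile: dict) -> dict:
    """Return a copy of the profile with identity-revealing fields removed."""
    return {k: v for k, v in profile.items() if k not in IDENTITY_FIELDS}

def score(profile: dict) -> float:
    """Toy linear score over venture-level metrics (illustrative weights)."""
    p = blind(profile)
    return (
        0.4 * p.get("revenue_growth", 0.0)
        + 0.3 * p.get("retention_rate", 0.0)
        + 0.3 * p.get("market_size_norm", 0.0)
    )

venture = {
    "founder_name": "A. Founder",
    "founder_gender": "F",
    "revenue_growth": 0.8,
    "retention_rate": 0.9,
    "market_size_norm": 0.5,
}
# Because identity fields are stripped before scoring, changing
# founder_gender cannot change the result.
print(round(score(venture), 3))
```

The key design choice is that blinding happens inside the scoring function itself, so no downstream step can accidentally consume the identity fields. As later sections of this article note, such models can still inherit bias from the historical data used to build them, so blinding inputs is a starting point rather than a complete fix.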



1. V. Yadav and J. Unni, “Women Entrepreneurship: Research Review and Future Directions,” Journal of Global Entrepreneurship Research 6, no. 12 (2016): 1-18; and K.A. Eddleston, J.J. Ladge, C. Mitteness, et al., “Do You See What I See? Signaling Effects of Gender and Firm Characteristics on Financing Entrepreneurial Ventures,” Entrepreneurship Theory and Practice 40, no. 3 (2016): 489-514.

2. D. Kanze, L. Huang, M.A. Conley, et al., “Male and Female Entrepreneurs Get Asked Different Questions by VCs — and It Affects How Much Funding They Get,” June 27, 2017.

3. L. Balachandra, T. Briggs, K. Eddleston, et al., “Don’t Pitch Like a Girl!: How Gender Stereotypes Influence Investor Decisions,” Entrepreneurship Theory and Practice 43, no. 1 (2019): 116-137.

4. A.W. Brooks, L. Huang, S.W. Kearney, et al., “Investors Prefer Entrepreneurial Ventures Pitched by Attractive Men,” Proceedings of the National Academy of Sciences 111, no. 12 (March 2014): 4427-4431.

5. V.K. Gupta, D.B. Turban, S.A. Wasti, et al., “The Role of Gender Stereotypes in Perceptions of Entrepreneurs and Intentions to Become an Entrepreneur,” Entrepreneurship Theory and Practice 33, no. 2 (2009): 397-417; and J.E. Jennings and C.G. Brush, “Research on Women Entrepreneurs: Challenges to (and From) the Broader Entrepreneurship Literature?” Academy of Management Annals 7, no. 1 (2013): 663-715.

6. K. Clark, “Female Founders Have Brought In Just 2.2% of U.S. VC This Year (Yes, Again),” Nov. 4, 2018.

7. N. Hashimzade and Y. Rodionova, “Gender Bias in Access to Finance, Occupational Choice, and Business Performance,” Economics & Management Discussion Papers, em-dp2013-01, Henley Business School, University of Reading, U.K., 2013.

8. V. Zarya, “Female Founders Got 2% of Venture Capital Dollars in 2017,” Fortune, Jan. 31, 2018; and Clark, “Female Founders.”

9. S. Brand, “How to Finally Fix the Gender Gap in VC,” Nov. 21, 2017.

10. “Palo Alto Venture Science: Company Details,” accessed May 14, 2019.

11. B. Schiller, “This AI Engine Takes Common Biases Out of the Venture Capital Process,” March 28, 2016; A. Mirhaydari and K. Clark, “Data-Driven Investing: Why ‘Gut Feel’ May No Longer Be Good Enough,” March 15, 2018; and F. Corea, “Artificial Intelligence and Venture Capital,” July 18, 2018.

12. L. Huang, “The Role of Investor Gut Feel in Managing Complexity and Extreme Risk,” Academy of Management Journal 61, no. 5 (October 2018): 1821-1847.

13. M. Lee and L. Huang, “Gender Bias, Social Impact Framing, and Evaluation of Entrepreneurial Ventures,” Organization Science 29, no. 1 (January-February 2018): 1-16.

14. L. Huang and J.L. Pearce, “Managing the Unknowable: The Effectiveness of Early-Stage Investor Gut Feel in Entrepreneurial Investment Decisions,” Administrative Science Quarterly 60, no. 4 (2015): 634-670.

15. Huang, “The Role of Investor Gut Feel.”

16. Gupta et al., “The Role of Gender Stereotypes”; and Lee and Huang, “Gender Bias.”

17. M. Palmer, “Artificial Intelligence Is Guiding Venture Capital to Startups,” Financial Times, Dec. 11, 2017.

18. Corea, “Artificial Intelligence and Venture Capital.”

19. A. Heathman, “Motherbrain: How AI Is Helping This VC Firm to Pick the Next Big Startup,” April 18, 2019.

20. Correlation Ventures, “Our Selection Model,” accessed July 8, 2019.

21. E. Alaluf, “How Does Follow[the]Seed Examine Investments?” April 20, 2017.

22. D. Coats, “Too Many VC Cooks in the Kitchen?” March 13, 2018; and Alaluf, “How Does Follow[the]Seed Examine Investments?”

23. F. Corea, “Data-Driven VCs: Who Is Using AI to Be a Better (and Smarter) Investor,” May 2, 2019.

24. F4 Capital, “Tomorrow’s Promise.”

25. K. Hannon and Next Avenue, “Meet Alice, the Siri for Female Entrepreneurs,” June 4, 2017; and “Frequently Asked Questions: What Is Circular Board?”

26. B. Cowgill, “Bias and Productivity in Humans and Machines,” working paper, Columbia University, New York City, Jan. 11, 2019; and A.P. Miller, “Want Less-Biased Decisions? Use Algorithms,” July 26, 2018.

27. B.J. Dietvorst, J.P. Simmons, and C. Massey, “Algorithm Aversion: People Erroneously Avoid Algorithms After Seeing Them Err,” Journal of Experimental Psychology: General 144, no. 1 (February 2015): 114-126.

28. S. Highhouse, “Stubborn Reliance on Intuition and Subjectivity in Employee Selection,” Industrial and Organizational Psychology 1, no. 3 (September 2008): 333-342; W.M. Grove and P.E. Meehl, “Comparative Efficiency of Informal (Subjective, Impressionistic) and Formal (Mechanical, Algorithmic) Prediction Procedures,” Psychology, Public Policy, and Law 2, no. 2 (June 1996): 293-323; and D. Newman, N.J. Fast, and D.J. Harmon, “Algorithms and Fairness,” working paper, University of Southern California, Los Angeles, 2019.

29. R.M. Dawes, “The Robust Beauty of Improper Linear Models in Decision-Making,” American Psychologist 34, no. 7 (1979): 571-582; Grove and Meehl, “Comparative Efficiency”; and Y.E. Bigman and K. Gray, “People Are Averse to Machines Making Moral Decisions,” Cognition 181 (December 2018): 21-34.

30. Newman, Fast, and Harmon, “Algorithms and Fairness.”

31. Mirhaydari and Clark, “Data-Driven Investing.”

32. H.J. Einhorn, “Accepting Error to Make Less Error,” Journal of Personality Assessment 50, no. 3 (1986): 387-395; and Highhouse, “Stubborn Reliance.”

33. Dietvorst et al., “Algorithm Aversion.”

34. J. Logg, J. Minson, and D.A. Moore, “Algorithm Appreciation: People Prefer Algorithmic to Human Judgment,” NOM Unit working paper 17-086, Harvard Business School, Cambridge, Massachusetts, April 24, 2019.

35. S.W. Gates, V.G. Perry, and P.M. Zorn, “Automated Underwriting in Mortgage Lending: Good News for the Underserved?” Housing Policy Debate 13, no. 2 (2002): 369-391.

36. Cowgill, “Bias and Productivity.”

37. J. Kleinberg, J. Ludwig, S. Mullainathan, et al., “Discrimination in the Age of Algorithms,” working paper 25548, National Bureau of Economic Research, Cambridge, Massachusetts, February 2019.

38. Mirhaydari and Clark, “Data-Driven Investing.”

39. N. Shadowen, “How to Prevent Bias in Machine Learning,” Jan. 29, 2018; K. Hosanagar and V. Jair, “We Need Transparency in Algorithms, But Too Much Can Backfire,” July 23, 2018; and A. Campolo, M. Sanfilippo, M. Whittaker, et al., AI Now 2017 Report (New York: AI Now Institute at New York University, 2017).
