Business
VCs who want better outcomes should use data to reduce founder team risk
VCs expect the companies they invest in to use data to improve their decision-making. So why aren’t they doing that when evaluating startup teams?
Sure, venture capital is a people business, and the power of gut feeling is real. But using an objective, data-backed process to evaluate teams — the same way we do when evaluating financial KPIs, product, timing and market opportunities — will help us make better investment decisions, avoid costly mistakes and discover opportunities we might have otherwise overlooked.
An objective assessment process will also help investors break free from patterns and back someone other than a white male for a change. Is looking at how we have always done things the best way to build for the future?
Sixty percent of startups fail because of problems with the team. Instinct matters, but a team is too big a risk to leave to intuition. I will use myself as an example. I have founded two companies. I know what it takes to build a company and to achieve a successful exit. I like to think I can sense when someone has that special something and when a team has chemistry. But I am human. I am limited by bias and thought patterns; data is not.
You can (and should) take a scientific approach to evaluating a startup team. A “strong” team isn’t a vague concept — extensive research confirms what it takes to execute a vision. Despite what people expect, soft skills can be measured. VCVolt, a computerized selection model developed by Eva de Mol, Ph.D., my partner at CapitalT, analyzes the performance of companies and founding teams.
We use it to inform every investment decision we make and to demystify a common hurdle to entrepreneurial success. (The technology also evaluates the company, market opportunity, timing and other factors, but since most investors aren’t taking a structured, data-backed approach to analyzing teams, let’s focus on that.)
VCVolt allows us to reduce team risk early on in the selection and due diligence process, thereby reducing confirmation bias and fail rates, discovering more winning teams and driving higher returns.
I will keep this story brief for privacy reasons, but you will get the point. While testing the model, we advised another VC firm not to move forward with an investment based on the model’s findings. The firm, enamored with the deal, moved forward anyway — and everything the model predicted transpired. It was a big loss for the investors, and a reminder that gut feeling can be wrong, or at least blind you to serious risk factors.
The platform uses a validated model based on more than five years of scientific research, data from more than 1,000 companies and input from world-class experts and scientists. Its predictive validity is noted in top-tier scientific journals and other publications, including Harvard Business Review. By asking the right questions — science-based questions validated by more than 80,000 data points — the platform analyzes the likelihood that a team will succeed. It considers: