Sam was one of the co-founders of Ansaro, a SaaS startup that aimed to revolutionize the recruiting industry through technologies like AI. They raised $2.25M from institutional investors and $750K from friends and family, grew the team to 6 members, and earned about $100K in total revenue. But with expenses of roughly $70K/month and no product-market fit, they had to shut down 2 years later.
Hi Sam! What's your background, and what are you currently working on?
I was one of the co-founders of Ansaro. Our mission at Ansaro was to use data science to improve hiring. Specifically, we focused on better interviewing.
At Ansaro, I was responsible for product design and sales and marketing. In our early days, I did a bit of data science too. We started Ansaro in late 2016 and shut down 2 years later, at the end of 2018. After 2 pivots (while staying within the scope of data science to improve hiring), we had failed to achieve product-market fit, and we shut down the business and returned our remaining capital to investors.
We were based in San Francisco, grew to a team of 6, and raised $3M in VC funding. We had a number of large enterprise customers, but never got beyond pilot contracts. We were aiming for a SaaS model, but our actual revenue was nearly entirely from services-based pilots.
Before Ansaro, I worked in consulting (Bain), tech investing (TPG), and software development (SolveBio, then a Series A startup). I studied math as an undergrad at Stanford and did an MBA at Harvard; at both universities, I was interested in applying statistics/math/data science to messy human problems (like hiring!). Today, I’m a product manager at Opendoor, where I work on machine-learning products for the real estate industry.
What motivated you to start Ansaro?
In my first job out of college (Bain), I was frustrated by how subjective hiring was. Along with a number of other recent Stanford grads, I was assigned to review Stanford student applications for the entry-level consulting position. I was struck by how differently my peers and I assessed candidates. We were aligned on the outcomes that defined a good hire (high performance ratings, retention), but we disagreed on the inputs that generated those outcomes. For example, some of us thought a STEM background was critical; others thought humanities majors did better. Some thought club leadership was a key input; others thought a high GPA was critical. It occurred to me that we could answer these questions analytically, by mapping our performance/retention records back to job applications and looking for correlations.
I started to dig into this problem in my free time in 2011-2012, and I quickly stumbled on 3 challenges. First, the data wasn’t readily available - it was spread out across homegrown systems that lacked APIs or basic file export capabilities. Second, HR leaders weren’t thinking about data science and their eyes tended to glaze over when I started talking about analytics. Third, I realized that the outcomes data (the “ground truth” represented by performance ratings and retention) could themselves be biased if the organization was biased in the way it treated some hires.
So I paused. But I kept watching the HR space, and by 2016, I thought the first 2 barriers were eroding. It seemed to me that most enterprises had moved to cloud-based HR systems with API/export capabilities, and HR leaders were starting to get excited about data science. In late 2016, I started asking companies if they would share anonymized employee data with us, and a few did. While I was still at Harvard, we confirmed that these companies’ job application data really did contain predictive signal about who would perform well and remain in the job - a signal the companies weren’t yet using or even aware of. With some proof of concept in hand, I went full-time on Ansaro as soon as I graduated from Harvard.
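To make that proof of concept concrete, here is a minimal sketch of the kind of backtest involved: join anonymized historical applications to HR outcomes and check whether a simple model beats chance on held-out data. The column names, the 12-month retention label, and the model choice are hypothetical placeholders for illustration, not Ansaro’s actual features or methodology.

```python
# Illustrative backtest: do application features carry signal about retention?
# The schema (columns, label) and the model choice are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Anonymized historical applications joined to HR outcomes (hypothetical file/schema)
df = pd.read_csv("anonymized_applications.csv")
X = pd.get_dummies(df[["major", "gpa", "prior_employers", "club_leadership"]])
y = df["retained_12_months"]  # 1 if still employed a year after hire

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC: {auc:.2f} (0.5 means no signal beyond chance)")
```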
How did you build it?
For our first handful of enterprise customers, we built a customized predictive hiring model for each. The advantage of this was that we weren’t beholden to any assumptions and were free to tailor to the customer’s data and business questions. This was great for exploratory models that resulted in presentations and discussions. The disadvantage was that these models were not ready to be put into production. With each new customer, we essentially started over.
After doing this a handful of times, we moved on to build a scalable platform that could do the following across customers: ingest data, fit a model, and return suggestions about live applicants. However, we failed on data ingestion, because none of our customers provided us live access to their HR systems. For some, their systems simply lacked APIs; for others, it was security concerns; for others still, it was plain bureaucracy.
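For a sense of the shape we were aiming for, the sketch below shows a per-customer pipeline with those three steps. The interface is a reconstruction for illustration, not Ansaro’s actual code; in practice, the connector piece is exactly what we could rarely get.

```python
# Rough shape of the intended cross-customer platform: ingest -> fit -> score.
# This is an illustrative reconstruction, not Ansaro's actual code.
from dataclasses import dataclass
from typing import Protocol

import pandas as pd
from sklearn.linear_model import LogisticRegression


class HRSystemConnector(Protocol):
    """One connector per customer ATS/HRIS (API pull, SFTP drop, file export, ...)."""
    def fetch_applications(self) -> pd.DataFrame: ...
    def fetch_outcomes(self) -> pd.DataFrame: ...


@dataclass
class HiringModelPipeline:
    connector: HRSystemConnector

    def ingest(self) -> pd.DataFrame:
        # Join historical applications to performance/retention outcomes.
        # In practice this step failed for most customers (no API, security, bureaucracy).
        apps = self.connector.fetch_applications()
        outcomes = self.connector.fetch_outcomes()
        return apps.merge(outcomes, on="candidate_id")

    def fit(self, training: pd.DataFrame) -> None:
        # Fit a customer-specific model on whatever features this customer has.
        y = training["retained_12_months"]
        X = pd.get_dummies(training.drop(columns=["retained_12_months", "candidate_id"]))
        self._columns = X.columns
        self._model = LogisticRegression(max_iter=1000).fit(X, y)

    def score_live_applicants(self, new_apps: pd.DataFrame) -> pd.Series:
        # Return a predicted-retention score per live applicant for the recruiter.
        X = pd.get_dummies(new_apps.drop(columns=["candidate_id"]))
        X = X.reindex(columns=self._columns, fill_value=0)
        return pd.Series(self._model.predict_proba(X)[:, 1], index=new_apps["candidate_id"])
```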
We also realized another fundamental problem: many companies don’t trust their own recruiting data - they know that applicants can game the system, or just straight-up lie.
So we pivoted to focusing on interviews, which most of our customers believed (a) can provide more signal than an application or resume when done well, but (b) are often done poorly today. We built a platform that allowed companies to plan structured interviews, assign an interviewer panel, and record structured feedback.
The (hopefully) headline-grabbing feature was an AI notetaker: for phone interviews, Ansaro provided a conference line, recorded the call (automatically notifying all participants beforehand), and sent the recruiter an AI-generated transcript and summary.
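As a rough illustration of that flow - not Ansaro’s actual stack, which I’m not reproducing here - the sketch below transcribes a recorded call with the open-source SpeechRecognition package and builds a crude extractive summary by word-frequency scoring. It also hints at why transcription quality was the weak link: everything downstream depends on it.

```python
# Illustrative notetaker flow: transcribe a recorded call, then summarize.
# The SpeechRecognition package and the naive frequency-based summarizer are
# stand-ins for illustration; this is not Ansaro's implementation.
import re
from collections import Counter

import speech_recognition as sr  # pip install SpeechRecognition


def transcribe_call(wav_path: str) -> str:
    recognizer = sr.Recognizer()
    with sr.AudioFile(wav_path) as source:
        audio = recognizer.record(source)
    # Any recognition errors here degrade everything downstream.
    return recognizer.recognize_google(audio)


def summarize(transcript: str, max_sentences: int = 5) -> str:
    # Score each sentence by the frequency of the words it contains (very rough).
    sentences = re.split(r"(?<=[.?!])\s+", transcript)
    freq = Counter(re.findall(r"[a-z']+", transcript.lower()))
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())),
        reverse=True,
    )
    return " ".join(scored[:max_sentences])


if __name__ == "__main__":
    transcript = transcribe_call("interview_call.wav")
    print(summarize(transcript))
```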
Problems with the 2nd version of the product included:
- Companies that prioritized improving their interviews were generally already using new applicant tracking systems like Lever and Greenhouse, which had all the functionality we had and more (the only thing they lacked was the AI notetaker).
- Companies that did not prioritize improving their interviews lacked Ansaro’s functionality - but they also weren’t interested in buying Ansaro!
- The AI notetaker didn’t solve a huge pain point. Notetaking in interviews is annoying, but not a billion-dollar problem. Moreover, the AI transcription quality wasn’t great, so recruiters preferred to spend the time taking their own notes to share with a hiring manager, rather than editing a mediocre transcript.
What were your marketing strategies to grow your business?
Our sales strategy relied entirely on personalized, outbound pitches from me to prospects.
- Prospects were generally CHROs or Heads of Recruiting from medium or large enterprises (>1000 employees)
- I primarily sourced prospects via LinkedIn, then used tools like Hunter.io to infer their email addresses, and then put them onto email drip campaigns that I ran using HubSpot. The cold emails had a fair amount of personalization. We had a ~3% positive response rate (willing to take a meeting or call) from these cold emails.
- Whenever possible, I would meet prospects in person. I would travel if the prospect was large enough or sufficiently engaged.
- Secondary sources of prospects were conference attendance and personal networks (Stanford/Harvard alumni network, intros from investors, intros from customers), but we fairly quickly exhausted these personal network connections.
We did a little bit of marketing:
- Writing blog posts about HR + Data Science, better interviewing, etc.
- Sponsoring “HR + Data Science Discussion and Dinner” events, where we would have ~3 guest speakers, and then invite 2-3 customers and ~10 prospects.
- We did not do any paid marketing.
What were the causes of Ansaro's failure?
First, we were too slow to pivot. I attribute that mainly to building a team/culture where people weren’t comfortable disagreeing with the fundamentals of our product plan. We all got along well, and we carved out different areas of responsibility. The downside of this was that my area of responsibility was our core product roadmap, and when our initial idea turned out to be bad, it took too many months for others to question it and for me to abandon it.
Second, we conflated our buyers and our users. CHROs spend a lot of time talking about how they want to hire better, but when the rubber meets the road, recruiters care more about hiring efficiency than new hire quality. We were pitching new hire quality improvement to our buyers (CHROs), but this wasn’t an acute pain point for our users (recruiters).
Third, we were tackling a problem where improvement takes a long time to measure. It requires months, and sometimes years, to see how a cohort of new hires turns out, and thus conclude that Ansaro is ROI positive. That’s way too long for a small startup. We tried to make the case for ROI based on backtesting, but that never resonated with HR buyers. I now believe that problems that require years to measure results are fundamentally better suited to large companies with deep pockets, as opposed to startups.
What were your expenses? Did you generate any revenue?
Immediately before we shut down, we were spending $60-70K per month to support a team of 6, with some minimal marketing and hosting/compute costs. We brought in about $100K in total revenue, almost entirely from non-recurring, customization-heavy pilot projects.
We raised a $3.0M seed round, which included $2.25M from institutional investors (Silicon Valley Data Capital was the lead investor) and $750K from friends and family. We pitched ~30 investors before we got a term sheet from SVDC. In hindsight, I wish we had waited longer and only engaged with institutional investors once we had (more) product-market fit / recurring software users. In other words, I wish we had raised money only from family and friends at that stage.
If you had to start over, what would you do differently?
- Focus on a product that can be easily tested. That means products that an SMB can try, or an individual within an enterprise can test - but that does not require the entire enterprise to test.
- Focus on a product that can be quickly tested. That means user benefit is measurable within hours or days, not weeks (and certainly not months or years). Ideally, the first time a user tries the product he or she can say “I see this benefiting me X amount in Y way, so I would pay Z for this.”
- Don’t raise money from VCs until customers are using a product and that product is ready for some level of scaling. “Working product” does NOT include services/consulting projects.
- Don’t rely on personal charisma for initial sales/marketing; it doesn’t scale, it’s hard to improve via experimentation, and it’s exhausting. Focus on more scalable approaches like online/offline paid marketing, SEO, and partnerships.
What are your favorite entrepreneurial resources?
I've really enjoyed these books:
- "Drive", by Daniel Pink, is about how to motivate people by giving them autonomy and spurring creativity.
- "Introduction To Statistical Learning" goes over a terrific primer on the practical basics of machine learning.
- "The Design of Everyday Things", by Don Norman, is about the key principles of design that can be applied to any system, including systems that go far beyond software UI/UX.
- "Venture Deals", by Brad Feld and Jason Mendelson, has great advice from years of experience on both sides of startup fundraising and structuring.
For random questions, I use Quora; for sourcing engineering hires, we found Hired and A-List (AngelList) to be very useful.
Where can we go to learn more?
You can check out this blog post I wrote about Ansaro’s failure.