Feedback versus Insights: A practical example for building a successful MVP

Brant Cooper
6 min read · Apr 19, 2018


This is a follow-up to my previous article, “Why customer feedback is killing your innovation efforts.” Many of the comments misunderstood my point about how asking customers for feedback can be detrimental to building an innovative product. To clarify, I’m going to use a real-world example of a startup I mentored at Techstars whose method for building their MVP embodies my point that insights are always more valuable than feedback.

“Feedback” is a loaded word. Asking for it in the wrong way will not only yield poor information, it can actually hurt your business, because more often than not feedback is predicated on opinions rather than facts. Bad feedback sends you in the wrong direction and produces both false positives and false negatives.

So how does one get valuable, actionable input from users, potential customers, and other stakeholders?

Focus on insights, not feedback.

Understanding the problem

Filtered.ai is a Techstars startup led by CEO Paul Bilodeau and Derek Bugley, whom I had the good fortune of mentoring as part of the 2018 Techstars Anywhere cohort.

Filtered got its start as a data science consulting firm trying to solve its own recruiting problem. Seeking to hire software developers, the team received hundreds of unqualified resumes from recruiters. In response, they developed a simple coding test to use as a filter.

Their minimum viable product (MVP) immediately paid dividends. Recruiters could only send resumes of engineers who achieved a specific score on the test, dramatically cutting the time and cost of recruiting new developers.

Unbeknownst to the consulting firm, some savvy recruiters started using the tool with their large enterprise clients. These corporations are often stuck in decades-old HR recruiting systems and processes that cannot take advantage of modern technology.

One particular Fortune 500 chemicals manufacturer recruiting software engineers had hiring results that looked like this:

  • Out of 10,000 phone screens, they whittled the pool down to 1,627 final-round coding interviews, saw 23% of offers rejected, and converted only 4% of final-round interviews into hires (the industry standard is 30%)
  • The average time to fill vacancies was 126 days
  • The cost of each hire was greater than $40k

This particular company asked Paul and his team to come in to discuss how their new technology might stop the bleeding.

Differentiating opinions from insights

While Filtered’s MVP solved their own small-scale problem, what does the MVP look like for a behemoth enterprise locked into a decades-old applicant tracking system and anachronistic HR practices? How does anyone figure that out, whether at a startup or inside the enterprise?

Some possible tactics, ranging from the surface level to greater depth:

  • request feature input using online tools such as UserVoice
  • send a demo or provide a trial of an MVP and ask for feedback with a survey
  • pitch the solution by phone or in person and see if the potential customer likes the idea
  • ask potential customers by phone or in person what they’d like to see in a modern recruiting software system
  • ask potential customers what their existing process looks like
  • ask potential users what they dislike about existing systems
  • interview other stakeholders, such as hiring managers, about existing systems, processes, and outcomes
  • go hang out on-site and observe the systems and processes at work
  • run experiments that measure behavior indicating whether or not the user receives value

Filtered used a combination of these tactics, of course, but with the key objective of discovering customer insights. Insights are the competitive advantage, not feedback. Feedback can only be given in response to something you do, so it limits the context and your universe of potential insight.

Paul and team shadowed the HR people who brought them in. They didn’t merely rely on what the HR team said; they observed problems in the process:

  • It took 5 days for HR to even look at a resume after it was uploaded into the applicant tracking system. It took another 5 days to schedule a phone call.
  • HR asked technical questions they didn’t understand, like asking Java programming questions when they were recruiting for a JavaScript developer.
  • Qualification questions boiled down to “why do you want to work here?”

When Filtered pointed this out, the HR person would say, “Oh well, it wasn’t a good fit anyway.”

“HR people ask for everything under the sun,” says Paul. “They were asking for features they didn’t really need just because other products have them.”

Filtered was pressured to build a product with the same features that were causing problems in the first place. “[HR] would spend 10 minutes complaining about their legacy system, then say it’s a requirement to integrate with it. And of course the system was built in 1998 and didn’t have an interface to integrate with,” Paul recounted.

People have been conditioned to provide a laundry list of features without coupling those features to real needs. This is the danger of “feedback.” But because Paul and his team had measurable insight into what the real problems were, they understood the worthlessness of the opinionated feedback.

Building for the end-user

Filtered learned that HR people are the decision makers on purchasing their software, but that the technical hiring managers would be the real beneficiaries. So they spent time with the hiring managers to understand their experience and pain points.

Incredibly, what they found was extreme frustration, low morale, and culture-destroying behavior. The hiring managers were so distrustful of the process that they were treating candidates callously. They had sat through so many interviews with unqualified candidates that they’d walk into the conference room and just have the candidate start coding.

“One hiring manager got up and walked out after 3 minutes and said, ‘This guy can’t code,’ and left the candidate there,” Paul said. “Candidates were treated like cattle.” No wonder 23% of people who received an offer declined. While no one would excuse the hiring manager’s behavior, Filtered got tremendous insight into the depth of pain.

Furthermore, Paul and his team noted that code testing was done on-site in an inconsistent manner. The grading was subjective and prone to biases based on mood or other impressions of the candidate, such as their prior work experience.

Based on all they learned, Filtered built a simple MVP that allowed candidates to record a video introduction, upload a resume, and take a standardized code test. They observed users’ behavior, looked at analytics, and iterated based on what people were using, the usability of features, and so on.

By spending time on-site with potential customers, learning before building, and iterating on their MVP, Filtered was able to create a unique product that provides tremendous value to technical hiring managers.

The results

At the chemicals manufacturer, the interview-to-hire rate went from 4% to 57%. Cost per hire dropped below $400. Time to fill dropped to 5 days. Rejected offers fell 3%.

By going deep into the problem rather than simply asking for feedback, Filtered solved a real problem that not only benefited hiring managers and HR personnel but potentially addressed a larger corporate culture issue. The passionate customer is now opening doors elsewhere for the Filtered team.

Even in a mostly well-understood market such as recruiting, going deep rather than merely asking for feedback reveals insights that both current-market product teams and long-view innovation teams can capitalize on.

