Putting the Fast in "Fail Fast"
To succeed, focus your efforts on minimal time, minimal expense, and maximum learning...here's how.
I believe there is great advantage in speed. In my book, Transformative, I describe a time-based competitive advantage: based on the principle of the OODA Loop, organizations can create their own edge in observing, orienting, making decisions, and acting.
This time-based competitive framework emphasizes the value of quick cycles of learning and adaptation, a competency that can distinguish any organization from its competitors.
Similarly, the concept of "Failing Fast" provides another time-based marker for creating advantage, though I believe "Learning Fast" more accurately captures the intention.
What’s the difference? The former positions failure against success, creating a false dichotomy. The better framing is success versus learning – recognizing that learning itself is a form of success.
Eric Ries, the author of The Lean Startup, himself acknowledges this, stating: "The Lean Startup methodology is not about failing fast. It's about learning fast—testing your vision continuously with real customers and adapting before it's too late."
Unfortunately, many founders and executives have lost the true meaning of the phrase. Too often, "Fail Fast" devolves into testing sales persuasiveness or convincing ourselves that we can get people to buy a product. Teams may keep searching until they find their "ideal customer profile," even if it's a market of one – hardly a foundation for sustainable business growth.
At its core, failing fast is about learning quickly. Ries makes clear that failure and learning are inextricably linked: "If you're afraid to fail, you can't learn. The goal is not to fail, but to learn quickly what works and what doesn't." He further emphasizes that "Success is not delivering a feature; success is learning how to solve the customer's problem."
Done correctly, the Fail Fast approach involves three key components:
Minimal time: Learning rapidly to avoid wasted cycles
Minimal expense: Learning economically to preserve resources
Maximum learning: Extracting valuable insights from each iteration
Without all three components, there is no “Fast” in “Fail Fast.”
So, let’s talk about the challenges of failing fast, and how to get to minimal time, minimal expense, and maximum learning.
As Ries notes, "The biggest risk is not that your idea will fail, but that you will waste time and money on the wrong idea." The true lesson of the Lean Startup is "not to fail fast, but to fail cheaply—to maximize learning while minimizing wasted effort."
Where teams go wrong is in leaning into an inherent desire to succeed, striving to make their projects successful regardless of the odds. This natural tendency can lead teams to ignore or defer addressing major issues, or to keep searching until they find the rare customer who agrees with their preconceptions. The result: too much time spent convincing and not enough time learning.
Beyond MVP Testing: A Comprehensive Approach
Contrary to popular belief, failing fast isn't just about Minimum Viable Product (MVP) testing. It encompasses:
1. Identifying Problems
Before rushing to solutions, organizations must thoroughly understand the problems they're attempting to solve. This means engaging with users early and often, and embracing genuine curiosity about their needs rather than simply validating preexisting assumptions.
See my previous article about Getting the Problem Right for more insight.
2. Testing Assumptions
Every innovation is built on assumptions. The most successful innovators rigorously test their riskiest assumptions first – those that would completely undermine the project if proven false. This is the least expensive way to learn early, before you’ve even built an MVP.
Google exemplifies this through its "Riskiest Assumption Test" methodology. Two powerful tools for this kind of testing include:
Pre-mortems: Teams simulate hypothetical failures before launching to identify flawed assumptions. For example, Google+ assumed users wanted another social network like Facebook, but pre-mortems might have identified privacy concerns earlier.
Steelman (Steelperson) arguments: Unlike "straw man" arguments that attack weak versions of opposing viewpoints, steelman arguments address the strongest possible version of counterarguments. Assigning devil's advocates to challenge project assumptions can reveal critical weaknesses before market exposure.
3. Validating Solutions
Once problems are understood and assumptions tested, solutions require validation with real users. This validation should be rapid, economical, and focused on learning—in other words, it’s iterative.
What does "fast" really mean in the context of this phase of innovation? It's not merely about accelerated development timelines. Consider these dimensions:
Balance speed and precision: Understand when to move quickly and when to slow down to prevent costly mistakes.
Break things responsibly: "Move fast and break things" is a pithy cliché that simply doesn’t translate well for most companies. Different industries have different tolerances for error. Moving fast may be appropriate for a social media feature but potentially catastrophic in healthcare, financial systems, AI, or anything touching private data.
Measure learning velocity: The true measure of fast is getting to the right job to be done. So, track the speed of learning, not just development or release cycles. How quickly can your team incorporate new insights?
Innovation is generally measured by time to build and launch. Consider these alternative measures of velocity instead:
1. Time to validated learning - How long does it take from initial hypothesis to having actionable customer insights? This measures the efficiency of your learning cycles.
2. Hypothesis-to-experiment ratio - How many hypotheses are actually tested through experiments versus those that remain untested? Higher ratios indicate greater velocity.
3. Pivot speed - How quickly can your organization change direction when data suggests a new approach? This measures organizational agility.
4. Feedback loop duration - The time between releasing a feature/prototype and incorporating user feedback into the next iteration.
5. Decision latency - How long do critical decisions take from identification to resolution? Shorter latency typically indicates higher velocity.
6. Experiment throughput - The number of meaningful experiments conducted in a given time period.
7. Cost per validated learning - How efficiently are you spending resources to gain actionable insights?
8. Customer feedback velocity - The volume and frequency of customer feedback obtained, indicating how quickly you're gathering market intelligence.
9. Innovation pipeline flow rate - How quickly do ideas move through your development pipeline from conception to validation?
10. Fail fast ratio - The percentage of initiatives terminated early due to negative validation versus those that continue despite warning signs.
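To make these metrics concrete, here is a minimal sketch of how a team might compute a few of them from a basic experiment log. All names, dates, and dollar figures below are hypothetical examples, not prescribed values:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Experiment:
    hypothesis: str
    started: date
    validated: Optional[date]  # date an actionable insight was reached, if any
    cost: float                # resources spent, in dollars
    killed_early: bool         # terminated early on negative validation

# Hypothetical experiment log for one quarter
log = [
    Experiment("Users want weekly digests", date(2024, 1, 8), date(2024, 1, 19), 1200, False),
    Experiment("SMBs will pay for tier 2", date(2024, 1, 15), date(2024, 2, 2), 3400, True),
    Experiment("Onboarding video lifts retention", date(2024, 2, 1), None, 800, False),
]

validated = [e for e in log if e.validated]

# 1. Time to validated learning: average days from hypothesis to insight
avg_days = sum((e.validated - e.started).days for e in validated) / len(validated)

# 2. Hypothesis-to-experiment ratio: share of logged hypotheses actually tested
tested_ratio = len(validated) / len(log)

# 7. Cost per validated learning: total spend divided by validated insights
cost_per_learning = sum(e.cost for e in log) / len(validated)

# 10. Fail fast ratio: share of initiatives terminated early on negative data
fail_fast_ratio = sum(e.killed_early for e in log) / len(log)

print(f"avg days to learning: {avg_days:.1f}")
print(f"tested ratio: {tested_ratio:.0%}")
print(f"cost per validated learning: ${cost_per_learning:,.0f}")
print(f"fail fast ratio: {fail_fast_ratio:.0%}")
```

The point of tracking these is trend, not absolute value: a shrinking time-to-learning and a healthy fail fast ratio signal that the team is learning, not just shipping.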
Unlike many business situations where speed comes at a premium, cost and speed can align in innovation contexts. For Ries, "failing fast" only works when it's low-cost. The goal is rapid learning without running out of money – hence his emphasis on MVPs, iterative testing, and strategic pivots.
Stepping back for a moment from individual validation to organizational capability, here are some best practices that create a better, less pressured approach – one where your team isn’t focused on validating “sales persuasiveness.”
1. Build an Innovation Pipeline
Don't depend on any single innovation or outcome. Create a portfolio approach where multiple ideas are explored simultaneously, allowing for cross-pollination and reducing dependence on any single initiative.
2. Develop Learning Systems
Establish practices that facilitate rapid learning and iteration. This might include regular customer interviews, data analysis routines, or cross-functional review sessions.
As Ries observes, "A pivot is a structured course correction designed to test a new fundamental hypothesis about the product, strategy, and engine of growth." The key is to "Persevere on vision, but be flexible on details."
3. Know When to Slow Down
Challenge the assumption that speed always equals progress. Again, the "move fast and break things" mantra has limitations. Speed should be defined as the pace of learning rather than rushing development.
Organizations must distinguish between the speed of learning versus the speed of execution and recognize when acceleration is beneficial versus when deceleration is prudent.
Properly understood, the Fail Fast methodology should accelerate validation and help define problems and jobs-to-be-done sooner. It's not about celebrating failure but about creating conditions where learning happens rapidly and economically.
By focusing on the velocity of learning rather than the pace of execution, organizations can build sustainable competitive advantages. The goal isn't to fail quickly but to discover the path to success through disciplined, efficient learning cycles.
In a world of limited resources and boundless opportunities, those who learn fastest will ultimately prevail. That's the true essence of failing fast, er, learning fast.
Things to consider:
1. How do we collectively respond to failure in our organization? Are we genuinely celebrating learning, or do we subtly penalize teams whose initiatives don't succeed as planned?
2. How can we focus our teams on improving learning velocity instead of emphasizing output metrics that might encourage persistence with flawed approaches?
3. What organizational barriers have we created that slow down our ability to test ideas quickly with customers? How might we reduce these barriers?
4. How do we currently distinguish between valuable persistence and counterproductive attachment to failing initiatives? What criteria should we be using?
5. When was the last time we killed a project early based on learning rather than pushing forward despite warning signs? What did we learn from that experience?
6. How do we balance the pressure for immediate results from stakeholders with the need for rapid experimentation and learning? What tensions exist in our decision-making process?
7. How do our resource allocation practices reflect our stated values around innovation and learning? Are we truly investing in validation before development?
8. How might our organizational culture be creating fear around acknowledging when approaches aren't working? What signals are we sending about the value of early pivots?
9. What organizational blind spots might be preventing us from seeing early warning signs in our most cherished projects? How can we create mechanisms to counter these?
10. How do our measurement and reward systems align with a true "fail fast" approach? Are we measuring learning velocity or primarily focusing on output metrics?
Until next time, lead with purpose.
Will
About Leading Matters:
Leading Matters is the trusted source for aspiring and seasoned leaders alike, providing them with the tools, insights, and inspiration to become intentional leaders who build more innovative, engaging, and agile organizations.
#innovation #transformation #founders #CEOs #failure #leadingmatters #failfast #learning #biases #leaders #growthmindset #growth #MVP