
start⬆Mngr handbook pt. 6: Before we start analyzing data…


First, a personal story.

I once helped a colleague on the customer success team (let’s call him Lou) analyze our retention data.

Lou asked me: “Can you help me get a report on the number of inbound service requests filed in the past quarter for each of our customers?” Easy enough I thought. I pulled the data from our help desk, created the report and sent it over to Lou. I thought that was the end of it.

A few days later, Lou came back and said: “Thanks for your help last time. Can you also get me a report on the amount of time that we spent responding to requests in the past quarter for each of our customers?”

The report wasn’t complicated to create, but we lacked the data. We did not track time spent servicing customers. After speaking with Lou, we decided to have the team start tracking their time. We recorded over a month of data before we created a first report. Lou looked happy with the results, so I thought this was again the end of that project.

Turns out, Lou didn’t really have a goal in mind…

Wrong. Over the following weeks, Lou requested a half-dozen more reports, and we started tracking many new data points. A good amount of time and energy went into this retention analysis.

After Lou’s requests died down, I asked out of curiosity: “So how did all the reports help you in the end? Did you find what you were looking for?”

Turns out, Lou didn’t really have a goal in mind… Lou was at first curious about how many resources we were spending per client, which led to follow-up questions along the way. Based on the data, Lou eventually suggested that customer success leadership start limiting how many hours of support each client could access per month. Yet because of other priorities and constraints, the suggestion was never implemented. So nothing came of the analysis.

The good news is that the new data points we tracked provided a ton of useful information that eventually led to other changes, which helped improve our retention goals. That took another few weeks, however. And the fact is, the whole project could have been a complete waste of time.

Having worked on hundreds of analyses with dozens of data-driven companies, I can confidently say that teams without an analytical process in place have an extremely high chance of wasting time performing data analysis.

Start-up companies today have an unprecedented amount of data at their disposal, but data alone doesn’t guarantee good decisions. It doesn’t matter what BI tools we use. They are all useless if we don’t know what questions need to be answered.

To avoid wasting time and energy while pursuing analytics projects, this blog post will showcase an analysis process and framework to follow before any analytical work begins. Let’s make sure every analysis has a clear purpose.

For analysis projects to be successful, we need three main ingredients: the relevant data, people who can interpret that data, and an analytical process to ensure that we’re asking the right questions and creating the relevant reports. In part six of our startup manager handbook, we’ll thus be exploring the process of initiating data analyses to help:

  • Gather evidence for a problem;
  • Measure success and evaluate performance;
  • Make data-driven decisions;
  • Avoid performing the wrong analyses;
  • Avoid answering the wrong questions.

Before going further, let’s clarify that depending on the organization, analyses can also be referred to as measures, metrics, reports, and other quantitative or qualitative evidence-based assessments.

We will not be discussing analytical/statistical methods (or data science methods), since there exists a ton of content on statistical methods out there already. However, to help those that are completely new to data analysis, I’ve included links to some of my favorite data science resources at the end of this blog post.

1. What is the problem and the goal?


The most critical step of an analysis is to ensure that it answers the right question. Yet more often than not, we are so eager that we jump right into the data without a clear goal. The result is wasted time and resources spent performing analyses that may not yield relevant insights and don’t help with decision making.

This widespread behavior likely stems from spending 15+ years as students, where problems are defined for us and all that’s expected of us is to solve them. Unintentionally, schools have failed to teach us how to define problems.

What is being asked?

Before going further, let’s first understand what is being asked. Here are the critical elements to acknowledge before any analysis work can be performed:

  • Is the question clear? There are often acronyms and ambiguous words used in describing a question, problem, or desired analysis. It’s important that there is clarity on how these words are interpreted to avoid miscommunication.
  • Does the requester have a specific vision for the end result chart or report? Analysis consumers may have an idea on the specific report(s) they’re looking for. So ask for it. While the envisioned report may not be best suited to their analysis goal, simply acknowledging it will help us understand the context and motivations behind the analysis. In addition, individuals with specific ideas on the end result often want to see their desired reports regardless of what we say. I thus recommend building the report, explaining why it doesn’t answer their question, and then revealing the better analysis. It shows that we acknowledged their need and understand the context of the problem.
  • Can I explain the goal of the analysis in my own words? Repeat the analysis goal in our own words and validate with the stakeholder(s) – this ensures that there is agreement on the goal. (e.g. “The goal of the analysis is to assess whether cars primarily driven on the highway have a longer service life than cars primarily driven in urban centers. Is this accurate?”)
  • Do I understand the motivations behind the goal? Understanding why the analysis goal is relevant to the team or organization will provide a sense of direction when we start identifying analyses to perform. It also helps us validate any assumptions we may have about an analysis’s motives. (e.g. A transport company may need to know whether cars primarily driven on highways have longer service lives, to see if there’s an opportunity to incentivize drivers to take the highway more often than local routes. There may thus be opportunities to also analyze why drivers currently prefer local vs. highway roads.)
  • What potential actions or decisions will be made based on the results? Why would we spend time on an analysis that doesn’t translate into a decision or action?

What motivations lie behind the analysis?

Of the five points explored above, understanding motivations can be particularly challenging. To help, let’s remind ourselves of a tool we’ve used before: the 5-why method for root cause analysis. This method can be leveraged to understand why an analysis makes sense to tackle. Questions such as “Why is _____ of interest?” or “Why does your team focus on _____?” will help kickstart the process.

2. Who exactly cares?


Stakeholders previously helped to explain the analysis goal. For the analysis results to be meaningful and used in decision-making, these same stakeholders need to participate in the analysis process as well. It’s therefore critical that responsibilities are agreed upon with stakeholders before an analysis begins. Let’s explore some common stakeholder responsibilities (an individual may certainly wear multiple hats):

  • Decision-taker(s): These are individuals that need the insight to drive a decision or assess a situation. Among decision-takers, I’ve found it helpful to identify one individual that also serves as an advocate for this analysis: A person that will take part in reviewing all progress. This ensures that the analysis has continuous buy-in from its stakeholders and remains a priority throughout its duration.
  • Data warehouse developer(s): These are individuals that have deep knowledge of the data warehouse. Among other duties, they can help us access the relevant data points and track new data points.
  • Subject matter expert(s): These are individuals who have context around the data that will help us make sense of questions that may come up when performing the analysis.
  • Observer(s): These are individuals that are curious about the analysis for reasons of their own. Perhaps they want to make decisions based on the results, perhaps they are curious about how analyses are carried out at the organization, or perhaps they are simply looking for something to investigate. Whatever the motive, these are individuals that the analysis team will need to update when major milestones are met.

Having these stakeholders participate in the analysis process ensures that everyone is on the same page throughout the exercise. In turn, they buy into the analysis and understand its nuance and caveats before final results are presented.

When stakeholders fail to participate in the analysis process, they may doubt the results presented in the end, losing trust in what the data has to say. This must be avoided at all costs.

3. What analyses will help answer our questions?


Next, it’s time to envision (not yet perform) analyses that will help answer our analytical questions. This translates into an analysis plan, avoiding the risk of analyzing blindly.

A good way to start envisioning what analyses to perform is to ask: “If I had access to any dataset, what analyses would I want to perform to answer this question?”

Assuming that we have access to any dataset makes us more creative. In the context of data analysis, our creativity can often be limited by data not being available, or data not being in the format we need. Yet chances are that once the ideal analysis is identified, a way to work around existing constraints will also be found: e.g. by tracking the missing data, or finding a similar dataset stored in the required format. Even if there are no workarounds, it is still valuable to acknowledge that there are important analyses we couldn’t perform due to ______.

Next, let’s review some characteristics of a good analysis:

  • Relevant: The analysis needs to directly relate to the goal. Every data point that does not answer the main question(s) or provide additional context becomes a distraction. Distractions do not help stakeholders with their decisions and should be avoided.
  • Trustworthy: Both the methods and the datasets used in the analysis need to be trustworthy. There should be no doubt that the data is accurately recorded and properly formatted, and that the methods used are appropriate to the analysis goal. This means that reasons and explanations are available to support every decision surrounding the analysis. Decision-takers will appreciate the diligence, but most importantly, trust that they can rely on the analysis for their decision.
  • To the point: At least one of the reports needs to answer the question directly. It should be as black and white as possible, revealing a clear insight that helps decision-takers come to a conclusion. Even if the conclusion is that we need to perform more analyses, that report needs to unequivocally and quickly show why that’s the case.


  • Communicates a story: To effectively communicate an insight, analyses need to be presented in the form of a story. To this effect, I highly recommend this book on Storytelling with Data to explore the basics of data communication. I also recommend adopting the following flow for the story:
    1. Communicate the recommendation first: I usually start with the final recommendation and reveal at least one data report that clearly shows why I’m making this recommendation (see “To the point” note above). This ensures that people do not wait to discover the final insight that the analysis achieved. In addition, it also prevents conversations from sidetracking before the final insight has been shared.
    2. Explore caveats and supporting arguments next: If there are other reports that provide additional contexts to the recommendation, explore them next. I recommend starting with reports that illustrate caveats or go against the recommendation to address concerns and skepticism right away. Then we can proceed with analyses that support the recommendation to show how they outweigh the negative arguments.
    3. Close by reiterating the recommendation: Finally, reiterate the initial recommendation by coming back to the main analysis, and allow the audience to raise questions.

As a final tip, I recommend reviewing results and rehearsing the story with a colleague before the final presentation. This helps to anticipate questions and catch mistakes before they affect the analysis’s trustworthiness.

So what’s the plan?

The outcome of this three-step approach to initiating analysis projects can be best summarized in the analysis canvas explored below.

Analysis Canvas
Analysis context

  • Goal: What is the main question that the analysis needs to answer?
  • Motivation(s): Why is the analysis goal and core question relevant?
  • Action(s) to drive: What are the decisions and/or actions that the analysis will empower?
Planned analyses (measures, metrics, reports…)

  • List of analyses to build
Stakeholders and participants

  • Decision-taker(s): Who needs this analysis to help take a decision?
  • Helper(s): Who can help answer questions with regard to the analysis goal and context?
  • Observer(s): Who are simply interested in the analysis with no stake in its results?
  • Analyst(s): Who will be performing the analysis?
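To make the canvas concrete, here is a minimal sketch in Python (purely illustrative; the field names mirror the canvas sections above, and the example values are hypothetical, based on the retention story from the start of this post):

```python
from dataclasses import dataclass, field

@dataclass
class AnalysisCanvas:
    """A lightweight record of the analysis canvas sections above."""
    goal: str                                              # main question to answer
    motivations: list = field(default_factory=list)        # why the goal matters
    actions_to_drive: list = field(default_factory=list)   # decisions the results empower
    planned_analyses: list = field(default_factory=list)   # measures, metrics, reports
    decision_takers: list = field(default_factory=list)
    helpers: list = field(default_factory=list)
    observers: list = field(default_factory=list)
    analysts: list = field(default_factory=list)

    def is_ready(self) -> bool:
        """Ready for stakeholder review once the goal, the actions to drive,
        the planned analyses, and the decision-takers are all filled in."""
        return bool(self.goal and self.actions_to_drive
                    and self.planned_analyses and self.decision_takers)

# Hypothetical example:
canvas = AnalysisCanvas(
    goal="How many support hours do we spend per client each month?",
    motivations=["Assess whether service cost threatens retention margins"],
    actions_to_drive=["Decide whether to cap support hours per client"],
    planned_analyses=["Hours spent per client per month, past quarter"],
    decision_takers=["Customer success lead"],
)
assert canvas.is_ready()
```

A check like `is_ready()` is just one way to operationalize the rule that no analysis starts before the canvas is reviewed and agreed upon.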

I personally only start performing analyses after core stakeholders, especially decision-makers, review and agree to the analysis canvas. This ensures that there is an agreement from the get-go with regard to the analysis goal, individual responsibilities, potential actions to take, and analyses to build.

In my experience, starting to analyze data without agreement on these points can lead to future conflicts and missed expectations. There’s no time to waste on any of that.

Data analysis / data science / statistics resources

Finally, as promised, allow me to share some resources on analysis methods:

Recommended exercise

Let’s pick an analysis that we want to perform and fill out an analysis canvas. What’s our goal? Why is that important? What actions do we plan to take based on the results? What are relevant analyses? Who else needs to be involved?


start⬆Mngr handbook pt. 4: Intro to innovation & quality improvement


First, a personal story.

Like most ambitious companies, our team is constantly looking to improve our performance. We’re never OK with the status quo. If sales conversion were at 90%, we’d want to take it to 100%. And as a start-up, there are so many things we can improve…

This meant that in the early days, before we had adult supervision and a clear company strategy, we changed processes a lot, switched focus weekly, and experimented with many different product features. On the customer success side, we’d set up retention email campaigns to keep churn rate down in one month, then leave that behind and work on up-selling customers the next month to boost revenue. Two quite different goals. On the product side, we’d get excited about an idea that a customer wanted, spend a couple of sprints developing it, but then never touch or improve it again. This speed of change is not uncommon at early-stage start-up companies.

…most successful innovation projects started with a problem definition stage before jumping to problem solving

On the good side, we were never bored. We always had something new and exciting to do. Yet on the downside, we had little consistency in our approach to improve the company’s performance, going in all directions and doing everything at once. Team members would be left wondering what our long-term goals were, and what set our product apart from the competition. We tried to be everything for everyone, without the resources necessary to do so.

This led to process changes and product features that didn’t improve our bottom line. We’d waste time and resources on non-coordinated initiatives that failed to convert more customers, did not attract more prospects, and didn’t retain users. Worst of all, we’d execute and move on from all these projects without documenting our learnings, so we’d repeat some of our mistakes.

To help correct for this lack of direction, I spent much time exploring how other companies managed innovation projects: from hospitals to grocery stores, from tech start-ups to auto-makers. To my surprise, I found a consistent theme: almost all innovation projects were carried out in a systematic way, following scientific processes that relied on data and experimentation. More importantly, most successful innovation projects started with a problem definition stage before jumping to problem solving (a very hard concept for engineers to grasp).

In this blog post, I’m going to explore how to properly challenge the status quo and innovate systematically. We will discuss a system to help us lead innovation and quality improvement (QI) projects in order to:

  • Increase effectiveness of a process or product;
  • Solve the root cause of a problem rather than the symptoms; and
  • Boost creativity in problem solving.

For a quick introduction to innovation and quality improvement, I highly recommend the following TED-style video by Professor Russell Ackoff.

The purpose of quality improvement projects is to change the way a problem is currently solved for the better. Better can be defined as doing something right, increasing effectiveness / efficiency, or lowering cost / margin for error. Since the market continuously changes, and our competitors never stop evolving, innovation projects are necessary for organizations to stay relevant.

Now let’s explore the basic steps of carrying out an innovation or QI project in an agile environment:

(Figure: the quality improvement process)

This process ensures that we solve the right problem, test different solutions, and validate results before scaling. It avoids jumping to solutions that solve the wrong problem and wasting resources (realize that finding solutions is the fourth step, not the first). Let’s explore this in detail.

I. What’s the problem?

As a first step, let’s find supporting evidence to the problem, understand why it is occurring, and whether it’s relevant to the company/team mission.

Does the problem really exist? Is there evidence?


Innovation projects tend to arise from frustrations and desires to improve existing systems or processes. Before jumping to solving what we believe is a problem, it’s critical to gather evidence in the form of quantitative or qualitative observations to prove that a problem actually exists. Through this step, I’ve often found that the actual problem is quite different than our initial perception.

Evidence can take the form of data reports, analyses, interviews, or surveys, to name a few examples. With recent innovations around analytics, there’s no excuse not to have at least one piece of statistical evidence.

For example, one may be frustrated at the slow speed at which team members answer and complete customer service calls. Evidence to support the fact that team members are working slower than possible can come from:

  • Listening in and interviewing call agents to understand if there are opportunities for further improvements on the call process.
  • Evaluating the average time it takes a group of fully trained agents to complete a call; and the average number of daily calls one agent can take.
  • Assessing whether the number of calls an optimal team can take in a day is higher/lower than the current volume.
  • Assessing the reasons customers call to receive help.
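As a sketch of how such evidence might be quantified, here is a small Python example (all numbers are hypothetical) that computes an average call length and compares the team’s observed capacity against the incoming volume:

```python
# Hypothetical per-agent daily logs: (calls_completed, minutes_on_calls).
agent_days = [(24, 192), (30, 210), (27, 216), (22, 198), (29, 203)]

total_calls = sum(calls for calls, _ in agent_days)
total_minutes = sum(minutes for _, minutes in agent_days)

avg_call_minutes = total_minutes / total_calls    # average time per call
avg_daily_calls = total_calls / len(agent_days)   # average calls per agent-day

# Compare the team's observed capacity against the incoming volume.
num_agents = 5        # assumed team size
daily_volume = 160    # assumed incoming calls per day
daily_capacity = avg_daily_calls * num_agents

print(f"Avg call length: {avg_call_minutes:.1f} min")
print(f"Capacity: {daily_capacity:.0f} calls/day vs volume: {daily_volume}")
```

In this made-up dataset, capacity falls short of volume, which would point toward a staffing or volume problem rather than slow agents.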

With evidence, a more complete picture of the problem(s) will emerge. Again, we’re very likely to discover that initial assumptions on the problem are wrong, or that they are only a small part of the actual problem. That’s great. That’s why we gather evidence.

What’s the root cause?


Once all the necessary evidence has been gathered, a root cause analysis can be performed using the 5 why’s method. This ensures that the QI project focuses on addressing the source of the problem rather than its symptoms and side effects.

In our example, we may find through root cause analysis that the actual problem is that the agents are receiving too many calls due to business growth. This could result from the fact that the number of agents has remained unchanged while the number of customers and calls has grown. Data could also show that agents’ response times and call durations have not varied over time. So the problem may not be that agents are too slow.

Ensuring that our problem is accurately defined is crucial. Otherwise, our solution will be useless. Yet more often than not, we are too eager and impatient, and jump right to identifying solutions. We enjoy solving things and coming up with ideas. Unfortunately, a good idea for a misdefined problem is not as effective as a bad idea for a well-defined problem. The result is wasted time and resources that may not bring the desired innovation and quality improvement.

This widespread behavior likely stems from spending 15+ years as students, where problems are defined for us and all that is expected is for us to solve them. We get rewarded for solving problems, not for defining them. Unintentionally, schools have failed to teach us how to pinpoint problems.

A good idea for a misdefined problem is not as effective as a bad idea for a well-defined problem

I personally have greatly benefitted from reading “Are Your Lights On?” in an attempt to get better at problem definition. I highly recommend it to everyone that solves problems, especially engineers.

What’s not part of the problem?

Every project should have a defined scope. This specifies the part of the problem that we’re aiming to solve (Do we want to solve climate change, or do we want to reduce the amount of CO2 released by our fleet of vehicles?). It is critical to set a clear scope for our problem, as the 5 Whys process has a tendency to surface potentially irrelevant problems that we may not want to tackle just yet. We need to resist the urge to take on every problem we discover and focus on the problem that is most important, most relevant, and that we can effectively make an impact on. If the 5 Whys reveals multiple large problems, the QI project can be broken down into phases, focusing on the most pressing problems first. There are more details on prioritization later in this blog post.

Is it relevant to the team goal and company strategy?


Next, compare the problem to the team’s goal and the company’s strategy: If they are aligned, the project will push the team and company toward its ultimate goal, but if they don’t align, the project will waste precious resources.

In our example, assuming that everyone agrees that “having too many inbound customer service calls that potentially don’t need agent help to resolve” is the problem, we will need to evaluate whether the behavior is desired or not in relation to the company strategy and team goal.

If the company strategy is to offer best in-class human support, then the problem identified may be irrelevant, and a potential business case should be made to hire additional agents to help with the lack of bandwidth. Yet if the company strategy is to empower customers to self-serve wherever possible and maximize the ratio of customers to agent, then this may be a project worth pursuing.

OUTCOME: At this stage, we’ve gathered evidence that validates the problem, a root cause analysis has been performed to identify the problem’s source, and we know whether the problem is relevant to the organization. There is thus enough preliminary information to decide whether this is a problem that the organization wants to pursue.

II. Is this a priority?


Most organizations are bound to have multiple innovation and QI projects running at once, yet limited resources. We thus need to prioritize. Here are some questions that can help us do that:

  • What’s the potential impact / value of this project?
  • What if we didn’t do it?
  • What if we didn’t do it now, but later?
  • What’s the cost (human, material, time, …)?
  • What’s the ROI? [Value / Cost]

Based on our answers to the questions above, we’ll be able to compare and rank projects by priority.
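For example, a simple Value / Cost ranking could be sketched as follows (the project names and numbers are made up for illustration):

```python
# Rank candidate projects by ROI (value / cost), as in the questions above.
projects = [
    {"name": "Self-service help center", "value": 80, "cost": 20},
    {"name": "Call-routing revamp",      "value": 45, "cost": 30},
    {"name": "Agent hiring round",       "value": 60, "cost": 60},
]

for p in projects:
    p["roi"] = p["value"] / p["cost"]

# Highest ROI first; low-ROI projects are candidates to cancel or defer.
ranked = sorted(projects, key=lambda p: p["roi"], reverse=True)
for rank, p in enumerate(ranked, start=1):
    print(f"{rank}. {p['name']} (ROI = {p['roi']:.1f})")
```

In practice, "value" and "cost" are rough estimates, so the ranking is a conversation starter for stakeholders rather than a final verdict.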

It’s important to realize that ultimately, a project’s priority is reflected by how many resources we allocate to it, so we need to ensure that each project has the appropriate allocation of time, people, and other resources. Generally speaking, important projects deserve more resources.

What about low priority projects? Again asking “what’s the impact if we didn’t pursue it?” will help us decide whether to:

  • Cancel the project and never revisit it; or
  • Re-assess its priority later when resources are freed.

When it comes to prioritization, choosing what not to do is more important than choosing what to do. This ensures that top priorities get the right amount of attention and resources, and that plans are realistic. This applies especially to ambitious teams that tend to take on too much. Let’s thus recognize that doing everything is not successful prioritization. It runs the risk of juggling too many balls, and consequently, dropping them all. Our startup companies have limited resources, so let’s choose how we spend our time wisely.

OUTCOME: Our project is now prioritized against other initiatives, which gives a clear idea of when to pursue it.

III. What do I expect?


Now’s the time to envision our desired end result and answer: What do we hope to achieve?

Of the three categories of goals (individual goals, functional team goals, project goals), projects need the clearest goals and expectations at the start. For example, before building a house, a clear blueprint needs to be drafted to communicate to all stakeholders what is expected in the end. The same applies to innovation and QI projects. Setting expectations before work begins ensures that all stakeholders can agree on or debate the desired outcome before it is implemented. Being specific in outlining expectations is key, so I recommend adopting a SMART approach.

Note that this is not yet the time to envision solutions or think of how to achieve the desired expectations. Doing so would be like starting to drive toward California before figuring out that we should actually be going to Florida. 

Based on our example case of a company wanting to decrease the number of inbound calls that can be resolved with self-service, the expectations for the project can be to:

  • Decrease the percentage of self-serviceable requests addressed on calls from X% to Y% in three months.
  • Lower customer call wait time to less than 5 minutes for 90% of all calls by X date.
  • Maintain Net Promoter Score at X and above beginning on Month/Day/Year. 

When it comes to measuring our goals, there are two types of metrics that we can leverage: progress metrics and outcome metrics.

  • An outcome metric assesses whether the desired outcome was achieved. e.g. In our example, a call’s wait time is an outcome metric, necessary to check whether 90% of all calls are answered within 5 minutes.
  • A progress metric tracks the impact of specific solutions, which in turn may contribute toward the desired outcome. e.g. In order to achieve the desired expectation around response time, one change or solution may be to have all calls completed or resolved within 10 minutes to ensure a high level of bandwidth availability. To that effect, the percentage of calls that are resolved within 10 minutes is a progress metric, which may in turn impact the desired outcome of answering calls within 5 minutes.
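As a small illustration of both metric types, here is how they could be computed from hypothetical call records (wait and handling times in minutes):

```python
# Hypothetical call records: (wait_minutes, handle_minutes) per call.
calls = [(2, 8), (4, 12), (6, 9), (3, 7), (1, 15), (4, 6), (7, 10), (2, 9)]

# Outcome metric: share of calls answered within 5 minutes (target: 90%).
answered_fast = sum(1 for wait, _ in calls if wait <= 5) / len(calls)

# Progress metric: share of calls resolved within 10 minutes, a change we
# hope frees up bandwidth and in turn improves the outcome metric.
resolved_fast = sum(1 for _, handle in calls if handle <= 10) / len(calls)

print(f"Answered within 5 min: {answered_fast:.0%}")
print(f"Resolved within 10 min: {resolved_fast:.0%}")
```

Tracking both side by side makes it visible when a solution moves its progress metric without moving the outcome metric, which is exactly the scenario the review step below needs to catch.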

OUTCOME: A clear set of expectations has now been identified, including measures that will help gauge success objectively.

IV. What are potential solutions?


To explore potential solutions, we’ll first need to identify relevant stakeholders that can help imagine solutions, and then use divergent thinking to dream of all possible solutions.

Who should participate?

Having the right people on the project will maximize the potential of finding successful solutions.

In my experience, it can be a good idea to have people who are frustrated by the current process, even if they aren’t top achievers, participate in innovation projects. This is because top achievers work so well within the current process that they may not want to change it. It may be hard for them to think outside the box. In other words, we need to find people that feel and understand the pain of the current problem to participate.

I also recommend the use of RACI to identify key players that need to be involved:

  • The product owner(s) / sponsor(s) / stakeholder(s): Who needs this project?
  • The scrum master(s) / project manager(s): Who’s managing progress of this project?
  • The team member(s): Who’s helping execute on this?
  • The subject matter expert(s): Who has special knowledge or skills that we need to consult? Who are the ones that live this process? Customers? Agents?

Without a clear solution or detailed project plan just yet, it is normal to involve additional players later on. Right now, let’s focus on involving the core people that are impacted by this problem.

Exploring solutions


Now comes the fun part. It’s time to identify solutions.

I will not discuss in much detail the processes by which teams and individuals solve problems – this demands deep knowledge of a specific industry, and special skills that I probably do not have. I do however recommend familiarizing ourselves with the concepts of Idealized Design and divergent thinking as frameworks to help think of solutions and answer the following questions:

  1. What’s the ideal solution? (If we don’t know what we want in an ideal world, how will we know what we want under constraints?)
  2. What’s preventing us from getting there today? (What constraints exist?)
  3. What’s the first step that we can take toward that solution? (Work backwards from the ideal solution to what is possible today under constraints. There may be multiple steps to eventually achieve the ideal solution, but what’s the first step that we can take? What’s the second?)

Another valuable resource that has helped me with problem solving is The Art of Problem Solving.

Once a series of potential solutions has been scoped out, I recommend that the project manager work with relevant players to:

  1. Describe the solutions in detail including:
    • What it is;
    • What impact is expected (both positive and potentially negative);
    • What resources are necessary;
    • Potential risks.
  2. Rank solutions by a set of criteria relevant to the problem and the organization
  3. Seek agreement with stakeholders on which solutions to test
  4. Design experiments that can prove the effectiveness of the solutions chosen. This can take the form of a scaled-down version of the full solution, a fake back-end where the solution is launched without the full infrastructure or resources necessary to support it, or even vaporware if one is testing for market need.

The reason that I recommend testing solutions before changing processes is to minimize risk. Should a solution not produce the expected results, only limited time and resources would have been sacrificed.

Once the list of solutions to test is chosen, the project manager drafts a project plan, including a timeline with clear deadlines and deliverables. This ensures that all tasks necessary for accurate implementation are outlined.

OUTCOME: We now have a list of solutions to test, including a detailed project plan.

V. Does the solution work?


Managing progress

To ensure that the project runs as planned, the project manager is responsible for managing and reporting on progress. Questions the project manager needs to ask include:

  1. Are we on track for the next milestone(s)?
  2. What challenges are we facing?
  3. Do adjustments need to be made to the project plan based on the latest progress?
  4. Are we discovering new insights as the work progresses? Should the end solution be adapted now that we know more?

Many project management methods exist to help answer these questions. In the context of start-ups that need to validate and deliver solutions quickly, I recommend adopting an agile approach to project management.

Reviewing test results and identifying next steps

After the experiments around our solutions are done, it's time to assess the results. Did we meet our SMART goals?

Specifically, all stakeholders and players need to:

  • Compare final results to initial expectations;
  • Understand where things went wrong, why, and how we can prevent that in the future;
  • Understand where things went well, why, and how we can consistently achieve this in the future.

It is especially important to review the results of both progress metrics, which assess how specific initiatives and solutions performed, and outcome metrics, which assess whether the desired outcome was achieved. The ideal scenario is one where both progress and outcome metrics changed as expected, but there are certainly scenarios where neither was impacted, or where the progress metric changed as expected yet failed to move the outcome.

Insights gathered at the review meeting will help us decide whether to:

  • Cancel a solution due to its ineffectiveness; or
  • Iterate a solution and further experiment to see if additional improvements can be achieved before scaling its implementation; or
  • Scale a solution to the entire organization / process.
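
The three decisions above can be sketched as a small decision function. The mapping from metric results to actions below is my own illustrative assumption, not a rule – a real review meeting will weigh context that a lookup like this cannot capture.

```python
# One possible mapping from review results to next steps (an illustrative
# assumption): scale when both metrics moved, cancel when the progress
# metric moved but the outcome did not (we pulled the wrong lever), and
# iterate when the solution never moved its own progress metric.

def recommend(progress_moved: bool, outcome_moved: bool) -> str:
    """Suggest a next step based on whether each metric changed as expected."""
    if progress_moved and outcome_moved:
        return "scale"    # evidence the solution works: roll it out
    if progress_moved:
        return "cancel"   # the initiative worked, but didn't move the outcome
    return "iterate"      # refine the solution and run another experiment

print(recommend(True, True))   # -> scale
```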

OUTCOME: By the end of this QI process, there is evidence to support the effectiveness of different solutions, and decisions are ready to be made as to whether to scale their implementation.

VI. We solved it. Are we done?

We’re never done. It’s human nature to never be satisfied. Once we’re on top of a peak, we will naturally shift focus onto the next peak to conquer. If we became market leaders in one segment, we’d shift focus to widen our lead or conquer other segments. We have the same mindset in our personal lives: The moment that I find parking in a downtown area where parking is scarce, my focus naturally shifts to whether I could have found parking closer to wherever I was going.

The fact is, most of the problems that we're solving today are the same ones our ancestors were solving hundreds of years ago: love, communication, food, shelter… Each generation solves them differently, using the technologies available at the time. We used to communicate with smoke signals, and now we have email and Twitter. We used to farm our own food, and now we have food delivered via apps. We used to listen to neighborhood announcements for news, and now we have email newsletters (so many of them). These fundamental problems will never disappear, and we will continuously innovate to solve them differently.

Whether our company will be leading these innovations and disruptions is another story. In “The Innovator’s Dilemma,” Clayton Christensen argues that some of the best companies risk becoming irrelevant precisely because they are too well managed. Sounds ironic? I highly recommend reading his book.

How do I project manage?

Considering that project management skills are essential to the successful execution of innovation projects, I recommend reading up on the subject if you are unfamiliar with it.

Recommended exercise

The next time that a problem comes up, let’s take a moment to investigate the problem, and accurately define it, before starting to solve it.
