00:00:07 Negative knowledge and its importance.
00:00:37 Positive vs. negative knowledge and their applications.
00:02:15 Why businesses are not that interested in negative knowledge.
00:04:11 Capturing negative knowledge and learning from past failures.
00:06:54 Changing company culture to embrace negative knowledge.
00:08:01 The importance of failing fast in supply chain management.
00:08:54 Factors that hinder the ability to fail fast and learn from failures.
00:11:02 Cherry-picking numbers and focusing on the wrong things.
00:12:44 Learning from high-profile failures and their causes.
00:15:33 Cultivating knowledge of past failures to avoid repeating mistakes.
00:17:48 Amazon’s approach to negative knowledge through written memos.
00:19:02 The lack of documentation on failures in the supply chain industry.
00:21:24 Leadership’s role in embracing and documenting failures.
00:23:00 Anecdote from the Manhattan Project about learning from failure despite dire consequences.

Summary

Kieran Chandler and Joannes Vermorel discuss the importance of negative knowledge in learning from mistakes, especially in complex domains like supply chain management. Vermorel highlights the need to embrace and document failures, as human flaws make mistakes inevitable. He shares an anecdote from the Manhattan Project, where a researcher’s documentation of a critical incident led to valuable findings. Vermorel emphasizes the importance of adopting a scientific mindset and addressing failures to ensure they become valuable lessons, rather than negative experiences. Acknowledging and learning from failures can help future team members benefit from this knowledge.

Extended Summary

In the interview, Kieran Chandler and Joannes Vermorel discuss the concept of negative knowledge and its importance in learning from mistakes. Negative knowledge refers to understanding what does not work, which can be especially relevant in complex or “soft” domains like sociology, business, and supply chain management.

Vermorel highlights the difference between positive knowledge, which comes from discovering and understanding laws and principles, and negative knowledge, which focuses on identifying and learning from what does not work. He uses the example of fundamental sciences, such as physics, where positive knowledge has been incredibly successful. In these fields, brilliant minds have made predictions about phenomena like black holes, which were later proven to be true.

However, in complex domains like supply chain management, positive knowledge is less helpful, as the situations encountered are often irreducibly complex. Supply chains involve many simple components, such as crates, trucks, and pallets, as well as various combinations of people, software, machines, and networks. These factors interact to create opacity and complexity, making it difficult to apply positive knowledge.

Negative knowledge is not fashionable or popular, but Vermorel suggests that it becomes increasingly important as one moves into more complex areas. He contrasts the super-complicated nature of quantum mechanics, governed by a few complex physical laws, with the complex nature of supply chains, where the individual components are simple but their interactions create complexity.

To capture and learn from negative knowledge, Vermorel recommends acknowledging the omnipresence of failures in supply chain management. He points out that despite vendors claiming a string of successes, failures are common. When he started Lokad, supply chain optimization was not a new concept, but the industry was still rife with failures and inefficiencies.

Vermorel talks about his experiences running Lokad, where he discovered multiple generations of failures in supply chain optimization projects. He highlights the difficulty of gaining knowledge about these failures, as they are often left undisclosed due to the fear of blame and the potential negative impact on resumes.

Vermorel then explores the concept of changing company culture to embrace failure. He argues that toxic mottos like “get it right the first time” create a culture where failures are hidden and not communicated. On the other hand, he promotes the “fail fast” mentality, popularized in Silicon Valley, which encourages people to take risks and iterate quickly towards success. The challenge in supply chain management is to fail fast without endangering the company or creating massive risks.

One way to enable a “fail fast” culture is to avoid long-term commitments with large suppliers, which can inhibit the ability to pivot quickly in the face of failure. Vermorel points out that he has seen multi-year contracts in industries like aerospace, a practice he believes is detrimental to the ability to recognize and address failure.

Another issue Vermorel raises is the focus on return on investment (ROI), which can incentivize vendors to exaggerate their success stories. Instead, he suggests that companies should focus on identifying failures quickly and stopping non-value-adding activities as soon as they are identified.

To move away from the negative connotations of failure, Chandler asks Vermorel how companies can learn from their mistakes. The first step, according to Vermorel, is acknowledging that even large, successful companies like Amazon and Google have had failed projects. Companies should focus on learning from these failures and iterating quickly to move forward in a more “capitalistic” way, meaning they should constantly seek to create value for the company.

Vermorel highlights the need to study past failures, such as the roughly half a billion euros Lidl wasted on a failed SAP project in Germany. He also mentions Accenture’s legal battle with Hertz over a failed web platform project, emphasizing the importance of understanding the reasons behind these costly mistakes.

According to Vermorel, people tend to focus on optimistic views of the future, as seen in TED Talks, rather than investigating the problems and failures that have occurred. The primary causes of these failures are human flaws, such as laziness, lack of curiosity, and sometimes incompetence. To learn from these mistakes, individuals need a different mindset, one that involves spending time researching and understanding what went wrong.

The supply chain industry is particularly vulnerable to costly mistakes, which often remain hidden despite their impact on company finances. When Chandler asks whether any industries are better at learning from negative knowledge, Vermorel mentions the Journal of Negative Results in Biomedicine as an example of a controversial attempt to promote the publication of negative results in scientific research.

Vermorel praises Amazon for its effective handling of negative knowledge, attributing this success to the company’s culture of written memos instead of PowerPoint presentations. He believes that plain text and full sentences make it harder to obscure the essence of a problem and that this approach can help identify evasive language.

However, the host raises the concern of information overload if every failure were to be documented. Vermorel disagrees, stating that in the supply chain industry, there is often very little documentation. He believes that this lack of documentation is related to the ubiquity of failures, as executives and management may not want to leave a trail of their mistakes, even if those failures ultimately led to success.

The key message is that leaders should accept and even celebrate failures within their teams, as human beings are fundamentally flawed and mistakes are expected. Vermorel emphasizes the need to approach failures with a scientific mindset, documenting the situation for future reference.

Vermorel shares an anecdote about a criticality incident during the early days of the Manhattan Project. Although the researchers involved ultimately died from radiation poisoning, one researcher had the foresight to take notes on the positions of everyone in the room. This information later led to important findings on radiation poisoning and its correlation to distance from the radiation source.

The discussion also touches on the stress and pressure people face when things are not going well. Vermorel acknowledges that it can be challenging to think clearly and maintain composure in such situations. However, he emphasizes the importance of making an effort to document and learn from failures so that future team members can benefit from this knowledge. This approach ensures that failures become valuable lessons, rather than just negative experiences.

Full Transcript

Kieran Chandler: Today on Lokad TV, we’re going to understand how many companies can actually make mistakes, but only the best learn from them using something called negative knowledge. So Joannes, this is something we’ve touched on before, but what do we mean by negative knowledge?

Joannes Vermorel: Negative knowledge is about knowing what doesn’t work. It’s kind of puzzling because there are some elements of knowledge, like fundamental science, where we have incredible success with positive knowledge. For example, we have uncovered laws of physics, and some incredibly brilliant minds predicted almost a century ago that black holes should exist. And lo and behold, literally like a century afterward, we finally managed to have the first direct observation of a black hole. That’s positive knowledge, and it works incredibly well for fundamental sciences like physics. But the reality is that in soft domains like sociology and business, it’s typically the negative knowledge that works best. You can never be sure that something is completely true or absolutely works, but you can be fairly confident that something is deeply wrong or doesn’t work. Negative knowledge is the mirror image of positive knowledge: you don’t know for sure what is true, but you do know that certain things are very incorrect or dysfunctional.

Kieran Chandler: Let’s look at some of those soft domains, as you called them. Why is negative knowledge so interesting to them?

Joannes Vermorel: Well, the main problem is that businesses are not that interested in negative knowledge. It’s very uncool and not fashionable at all. My own take is that negative knowledge becomes gradually more important as you move to fuzzy, complex topics. Quantum mechanics is super complicated, but there aren’t that many physical laws that govern quantum physics. On the other hand, a supply chain is kind of the opposite. There’s nothing that is fundamentally super complicated; you have crates, trucks, pallets, machines that produce things. All of the elementary parts are quite simple, but you have a lot of them, and the combinations of people, software, machines, and networks generate a lot of opacity. It’s very hard to approach these super complex domains with positive knowledge because you can’t frame your supply chain into a few axioms or a tiny set of elements that would explain everything. It’s irreducibly complex, and there’s almost nothing where you could say, “this is it, this is a recipe, and it will work.” Simplistic ideas tend to fail when confronted with the real world due to many edge cases that do not fit this simplistic perspective.

Kieran Chandler: Okay, so you talk about capturing this negative knowledge. What are the ways we can capture this knowledge?

Joannes Vermorel: The first thing is to acknowledge that failures are omnipresent in supply chains and IT. Most vendors claim they have only a stream of successes, but that’s not the reality. The most puzzling thing is, when I started Lokad about ten years ago, even at that time, supply chain optimization wasn’t a new thing.

It was already ancient, I mean, ancient by software standards. So, it’s like, let’s say, four decades old, which is as old as it can get, you know, as far as enterprise software is concerned. And thus, there had already been, even before I started, generations of failure. What is very interesting is that during my ten years running Lokad, I just discovered all those kinds of untold failures that happened before us. And when we had successes, it took a lot of effort to realize that we were actually succeeding. But there were like six previous attempts, mostly undocumented, silently buried, that did happen before us.

It was very difficult to even gain knowledge about those situations when we did fail, and it happened frequently. What was even more maddening is that when you’re failing, and at the end of the project when you kind of debrief, you finally manage to get some nuggets of information, and it appears that the two previous iterations failed for the same reasons, which had been left completely undisclosed.

It’s very hard for companies, which are like large groups of people, to put forward their previous failures because, you know, who gets the blame? Imagine you’re the executive who carried forward a large initiative that turned out to waste tens of millions of euros or dollars in your company. It’s not exactly the sort of thing you want to put forward on your resume. And despite the idea of “be tough on subjects and soft on people,” in the end, it’s only people who get the blame. You’re not going to blame the mathematical formula, even if it was incorrect. In the end, you blame whoever came up with the formula that yielded something wrong, which cost a lot of money or just failed to deliver what was expected.

Kieran Chandler: How can you possibly change this culture because, at the end of the day, we’re all human. We don’t want our names associated with things that have gone badly. So how can you actually change that culture?

Joannes Vermorel: There are many ways. First, there are ways to not make the problem worse. For example, there are super toxic mottos such as “get it right the first time.” That’s a recipe for an endless stream of problems. Supply chain is very complex, so if you want to achieve success, you need to iterate over something that is failing, failing, failing, and then you iterate, rinse, and repeat. And maybe, if you’re super fast at iterating, you will get success. It’s very iterative. It means that you’re going to keep failing until it works. So the idea of “let’s do it right the first time” is completely poisonous because it forces people to make all their failures as discreet as possible and not communicate on any failure that happens.

The opposite of that is, for example, “fail fast,” which is like a Silicon Valley spirit of “move fast and break things.” It does encourage people to take risks, and I think that’s the first step. What makes it very complicated in supply chain is that, typically, the cost of failure can be very high. You need to find ways to fail fast without endangering your company and without generating massive risk for your company in the process.

Kieran Chandler: It’s quite a funny concept, isn’t it? You want to fail and actually fail quickly. Are there a lot of ways in which you can speed that process up?

Joannes Vermorel: First, you want to put yourself in a position where it’s possible to fail. There are many ways that you can prevent yourself from ever being in that position, for example by locking yourself into a multi-year contract.

I think the worst situation I’ve seen was in aerospace, where there were 10-year deals being signed. That’s, for me, astonishing. It means that people around the table acknowledge that there is no potential failure that can be addressed any faster than that multi-year horizon. So, you first need to think about how much time it will take to shut down the initiative if you see that it doesn’t pan out.

Joannes Vermorel: As you move forward, people frequently ask the wrong questions, such as “prove to me that you have an ROI.” This basically gives a huge incentive to the vendor to come up with blatant lies. If you press a vendor to give you case studies on the ROIs they’ve generated, any large vendor will come up with fantastically high numbers. For example, if you deploy SAP in a company with a very bad starting point, you might see fantastic results, but it could be due to a new management team or an economic cycle that helped the company. Instead of focusing on the wrong things, you should try to identify what is failing super fast so that you can change and move forward in a capitalistic way.

But that means you need to acknowledge when you’re doing something that doesn’t seem capitalistic and stop doing it fast, like when you’ve identified something that doesn’t accrue value for the company. This goes against waterfall planning with a six-month roadmap rolled out, which is the opposite of failing fast.

Kieran Chandler: Let’s move away from the failing side of things and more towards the positive side. I mean, big companies like Amazon and Google have had projects that failed. How can companies learn from those failures, and what should they be doing to learn from them?

Joannes Vermorel: The first thing is that many failures nowadays are not secret. For example, Lidl wasted something like half a billion euros on a failed SAP setup in Germany. There is something to learn from that, and any client who wants to do a similar large-scale project should probably start by spending more than a few hours doing whatever they can to learn from this failure and see what went wrong. Those are the spectacular ones, but there are many other situations.

I know that, for example, Accenture is now going to court against Hertz, one of its clients in the US, because they disagree about the delivery of Hertz’s new website. It’s hard to pinpoint the blame, as it’s probably much more complicated than that, but no matter who is to blame, there is a failure to learn from in this situation.

Kieran Chandler: So, we’re talking about failures and how important it is to understand the root causes. What are your thoughts on this, Joannes?

Joannes Vermorel: There are plenty of failures that are accessible, but you need to spend time studying them. People are happy watching TED Talks and have an enthusiastic view of the future, but it takes a different mindset to dig up the failures. Human flaws are usually the root cause, such as laziness, lack of curiosity, and complacency. It takes a certain mindset to cultivate this knowledge and understand all the things that went wrong.

Kieran Chandler: Interesting. It sounds like a somewhat pessimistic mindset is required to really understand and learn from failures. In the supply chain industry, mistakes are costly, and many of them go unnoticed. Are there any industries that are doing better in this regard?

Joannes Vermorel: Negative knowledge is still fairly new. For example, the Journal of Negative Results in Biomedicine was launched in 2002, and it triggered a massive controversy at the time. Negative results are needed, but by publishing them, researchers were effectively demolishing the work of their colleagues. Even in hard science, communities have difficulty coming to terms with negative knowledge, although I believe it’s fundamental. Amazon is one company that’s quite good at it. They have a culture of written memos instead of PowerPoint presentations, which conveys the logic and the connections between ideas more naturally.

Kieran Chandler: That’s fascinating. So, you believe that written memos are more effective in conveying the essence of a problem?

Joannes Vermorel: Yes, because bullet points can be vague, and the message can be left untold. If you use plain text and full sentences, it’s much harder to disguise the essence of the problem. This approach helps to identify when a piece of text is evasive, which is a common problem in industries that generate a lot of documentation.

Kieran Chandler: If people started producing all the documentation of what went wrong and what didn’t work, would you not end up in a scenario where there are just too many things out there to possibly read?

Joannes Vermorel: I disagree with the idea that there would be too much documentation. The thing that is maddening is that, for example, in supply chain, most of our clients, even the clients that operate very large supply chains with over a billion dollars’ worth of stock, frequently rely completely on oral traditions. Nothing is ever written down, and the only things that get written are hundreds of PowerPoint slides with thousands of disconnected bullet points. It’s like you have to read between the lines, between the bullet points, to even start to make sense of what worked well and what didn’t.

That’s the crux of the problem: there is very little documentation. And again, I believe that is also related to the fact that failures are omnipresent. Most things fail, and management does not want to leave extensive traces of all their failures, even if those failures were actually the reason why they managed to succeed in the end. For example, Edison said that he had learned a thousand ways not to produce a light bulb before figuring out how to produce one. It doesn’t matter that you’ve failed a thousand times. What matters is that your failures were not terminal, that you did not bankrupt your company, and that you learned something each time, eventually leading to success.

But then, what about documenting all those failures? My own experience in the supply chain industry is that there is very little documentation, and the little that exists is like IT documentation, written for machines, not humans. It can be thousands of pages long but completely worthless, certainly not worth the time you would take to read it.

Kieran Chandler: So, let’s start bringing this all together now. The key message is that from a leadership perspective, our leaders should be celebrating the times when their teams fail and accepting that these things happen because humans are fundamentally flawed and it’s to be expected.

Joannes Vermorel: I would say more than that. It’s about being like a scientist and documenting your stuff. I have a terrible anecdote that comes to my mind from the early days of the Manhattan Project. They had what’s called a criticality incident, where they were playing with nuclear materials like uranium. In the lab, they brought two pieces of uranium too close and got a criticality incident, like a tiny nuclear explosion, a burst of radiation. The researcher who faced the burst told everyone to freeze, don’t move. He took a piece of chalk and noted on the ground the exact position where everybody in the room was standing. A couple of weeks later, everybody died of radiation poisoning. However, thanks to this researcher, who had the insight to have everyone freeze and take notes on their positions, they were able to prove that every meter counts: if you were standing one meter further away from the radiation burst, you died roughly two days later than someone standing closer. They had the first empirical confirmation that radiation poisoning is deadly and that its intensity is completely correlated with how fast you die. But you see, it takes a relatively resilient mindset when facing catastrophically bad events to have such insight.

Kieran Chandler: There’s a spectrum where you’re facing mortal danger, but even in companies where nobody’s life is at risk, people can still be under a tremendous amount of stress. It’s very difficult when things are falling apart and you’re not succeeding. Even if the company is not in danger, you as a person can still be under a tremendous amount of stress. It’s very hard to think straight and to make the effort to steel your mind, and say, “Well, we are still going to take notes. It’s depressing, but we are still going to write it so that the people that come after us might benefit from this knowledge.”

Joannes Vermorel: Right. It’s not just about celebrating the failure, but acting in a way that allows this failure to be capitalized upon. It’s not about being happy or content about it; it’s more about facing a catastrophic situation and asking, “How do we make sure that our future selves, or the people who come after us, will not repeat that?” It takes a very specific sort of effort to achieve that.

Kieran Chandler: Those are some nice, cheerful anecdotes. Well, that’s everything for this week. Thanks very much for tuning in. We’ll be back next week with hopefully a more cheerful episode. Until then, thanks for watching.