00:34 Today, we explore the concept of negative knowledge. What do we mean by this?
02:14 Let’s look at the soft domains. Why are those businesses so interested in negative knowledge?
04:09 What are the ways in which we can capture negative knowledge?
06:51 How can you change this culture?
08:42 How can you speed up the failing process?
12:39 How can companies learn from failures?
16:08 Are there any industries taking advantage of negative knowledge?
19:01 Are we not going to end up with lots of reports that nobody would ever read?
21:23 From a leadership perspective, should leaders celebrate when their teams fail?
The Supply Chain Antipatterns represent negative knowledge, that is, knowledge about what doesn’t work. Surprisingly, negative knowledge tends to be more robust and more lasting than its positive counterpart, that is, knowledge about what works.
When it comes to assessing previous projects, companies are very good at recognising their successes, but often not so good at keeping track of their failures. In this episode, we learn why discussing failure is so difficult and how often it can end up being somewhat of a taboo subject, particularly in big companies.
To err is human, and here we discuss the impact of treating failure as something shameful, learning how it often results in future teams repeating the mistakes of their predecessors. We talk about how some of the best companies actually learn from and document their failures, and also why relying on PowerPoint to articulate understanding can be dangerous…
Supply Chains involve lots of information that is simple in isolation; however, the sheer volume of data and possible decisions makes understanding the whole picture challenging. We discuss why in Supply Chains there are no guarantees of success, but a number of guarantees of failure, and how, without negative knowledge, spotting these can be far more difficult. We also look at how different industries approach negative knowledge, and in particular why the computer science domain and software initiatives can afford to take more risks than a typical Supply Chain initiative.
We also discuss toxic mottos such as “Get it right the first time”, which is actually a recipe for disaster, as it pushes people to hide their failures instead of communicating and building on them. Instead, we need to be ready to take risks: “Fail fast”, a Silicon Valley motto, is a much more positive and constructive way of seeing things. Hence the importance of documenting our failures, poignantly summed up by a rather tragic anecdote from the Manhattan Project, where two pieces of uranium got too close.