00:00:07 Scientism and its prevalence in supply chains.
00:00:34 Explanation of scientism and its consequences in various fields.
00:02:16 Real-world examples of over-reliance on scientific results and shortcuts.
00:04:07 Demand forecasting in supply chain as an example of scientism.
00:06:50 Alternatives to scientism in supply chain management and considering complex systems.
00:08:01 Oversimplification and its impact on supply chain professionals.
00:09:45 Addressing the issue of naive rationalism in supply chain management.
00:12:38 The importance of human judgment in supply chains and the limitations of quantitative models.
00:14:48 Balancing data-backed insights with high-level judgment in decision-making.
00:15:38 Ensuring client understanding and avoiding naive rationalism at Lokad.
00:17:06 The need to consider significant factors like Chinese New Year in supply chain planning.
00:18:30 Being skeptical of rational organizations and the importance of combining science with common sense.
00:19:51 Highly educated engineers can be clueless about real-world applications.
00:21:02 Need for expert insights, common sense, and avoiding nonsense at an industrial scale.

Summary

The founder of Lokad, Joannes Vermorel, discusses the concept of scientism in supply chain optimization in an interview with Kieran Chandler. Scientism is defined as an overreliance on science, assuming that all problems can be solved through a scientific approach. Vermorel cautions against an overemphasis on algorithmic approaches to demand forecasting, as these models can be overly simplistic and lead to self-fulfilling prophecies. He emphasizes the importance of high-level human judgment in supply chain management, as well as the need to consider second-order effects and acknowledge the complexity of the systems and humans involved. Vermorel advises clients to rely on expert insights and common sense when implementing supply chain optimization solutions.

Extended Summary

In this interview, host Kieran Chandler and Joannes Vermorel, the founder of Lokad, discuss the concept of scientism and how it impacts supply chain optimization. Scientism is described as an extreme belief in science, assuming that all problems can be solved through a scientific approach. In the context of supply chain management, this term is synonymous with naïve rationalism, suggesting that a single recipe, algorithm, or technology can solve all supply chain problems.

Vermorel points out that large organizations often mimic the scientific method in their supply chain management, focusing on numbers, formulas, and hiring people with PhDs. However, these attributes alone do not necessarily make their approach scientific. He likens this phenomenon to a façade, creating an illusion of scientific rigor without truly adhering to the scientific method.

As an example of scientism gone wrong, Vermorel discusses the controversy surrounding p-values in social sciences. P-values are used to measure the confidence in a hypothesis, but when millions of hypotheses are tested, some will inevitably show significant results due to randomness. This issue illustrates how the overreliance on seemingly scientific results can lead to misleading conclusions.
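The multiple-testing trap described above can be reproduced with a tiny simulation. A minimal sketch, where the 10,000 hypotheses, the 100 coin flips, and the 5% threshold are all illustrative choices:

```python
import random

# Small simulation of the multiple-testing trap: we "test" 10,000 null
# hypotheses on pure noise. Each one asks whether a fair coin looks
# biased over 100 flips at the usual 5% significance threshold.
random.seed(42)

def looks_significant(n_flips=100):
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    # Two-sided test via the normal approximation: for 100 fair flips the
    # mean is 50 and the standard deviation is 5; |z| >= 1.96 means p < 0.05.
    return abs(heads - n_flips / 2) >= 1.96 * (n_flips * 0.25) ** 0.5

false_positives = sum(looks_significant() for _ in range(10_000))
# Roughly 5% of perfectly random "hypotheses" come out "significant".
print(f"{false_positives} of 10,000 pure-noise hypotheses look significant")
```

Every single one of these "significant" findings is spurious by construction, which is exactly why a nice p-value alone proves very little once enough hypotheses have been tried.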

Applying this concept to supply chain management, Vermorel identifies demand forecasting as an area where scientism can be problematic. There are numerous books and models on demand forecasting, which may give the impression of a well-established, rational approach. However, when one examines these models more closely, their rationality becomes questionable.
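Part of what makes these models look so established is how simple they are to state. A minimal sketch of simple exponential smoothing, one of the classic models in that literature (the smoothing factor and the toy series are illustrative):

```python
# Simple exponential smoothing: each new forecast is a weighted blend of
# the latest observation and the previous forecast. alpha controls how
# quickly the forecast reacts to new data.
def exponential_smoothing(series, alpha=0.3):
    forecast = series[0]                 # initialize with the first observation
    for observed in series[1:]:
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecast                      # one-step-ahead forecast

print(exponential_smoothing([10, 12, 11, 13, 12]))
```

The model is trivially easy to benchmark and backtest, which is precisely Vermorel's point: ease of measurement is not the same thing as business relevance.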

They discuss the limitations of certain algorithmic approaches to demand forecasting and the need for high-level human judgment in supply chain management.

Vermorel explains that many forecasting algorithms can be overly simplistic and can lead to self-fulfilling prophecies. For example, if a forecast predicts zero demand for a product, the product may not be stocked, and thus no sales will occur, reinforcing the initial forecast. He also mentions that for fashion brands, the number of units produced is often equal to the number of units sold, due to heavy discounts on unsold items. These examples illustrate the issue of self-fulfilling prophecies in naive forecasting models.

To overcome the limitations of these models, Vermorel suggests that supply chain professionals should acknowledge the complexity of the systems they are working with and the humans involved. They should also consider second-order effects, such as how offering discounts can create customer expectations of future discounts. He argues that scientific methods that focus on first-order effects and simplifications may be easy to measure and benchmark, but are not necessarily relevant to the business.

Vermorel emphasizes the importance of human judgment in supply chain management. He believes that high-level human intelligence is needed to determine if a mathematical model is useful or if it is oversimplifying the problem. While it may be possible to approach supply chain problems scientifically, it requires subtlety and cannot be reduced to simple measurements.

He notes that supply chain professionals are typically aware of these issues, but the allure of scientism and rationalism can be tempting, especially for highly educated individuals. Such professionals may have been exposed to the success of quantitative models in other fields, like thermodynamics, and may hope to achieve the same predictive power in supply chain management. However, Vermorel cautions that humans are not particles and their behavior is more complex, necessitating a more nuanced approach to supply chain management.

One example of naive rationalism in supply chain management is the Sales and Operations Planning (S&OP) process. Vermorel points out that divisions within a company may simply send back forecasts to meet their own incentives, rather than producing accurate demand forecasts.

They touch upon the human tendency to be risk-averse, the importance of backing up forecasts with data, and the role of high-level judgment in supply chain optimization.

Vermorel acknowledges that people are naturally risk-averse and often rely on data to back up their claims and forecasts. He agrees that it is essential to use data to support one’s insights but cautions that one must be careful when dealing with statistics, especially when multiple variables are involved in highly complex, high-dimensional problems.

Vermorel emphasizes the importance of high-level judgment in overseeing the calculations and models used in supply chain optimization, arguing that it is crucial to ensure that there is a suitable perspective at the core of the analysis. He believes that true rationalism involves this high-level judgment to prevent potential pitfalls in the analysis that could arise from overlooking important aspects.

The conversation then moves to the concept of naive rationalism, with Chandler pointing out that Lokad’s approach might have some elements of it. He mentions that many of their clients may not fully understand the “black magic” behind the company’s supply chain optimization techniques. Vermorel responds by saying that it is not the technicalities that are most important but rather the high-level understanding of the supply chain processes.

According to Vermorel, many of the technicalities that clients do not understand are often inconsequential in the larger scheme of things. He argues that it is more important for Lokad to focus on getting the broader aspects of the supply chain right, such as accounting for events like the Chinese New Year, which can have significant impacts on lead times.
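Accounting for an event like Chinese New Year can be expressed as a simple lead-time rule. A minimal sketch, where the dates, the four-week shutdown, and the function name are all illustrative assumptions:

```python
import datetime as dt

# Hypothetical rule: inflate supplier lead times when the order window
# overlaps the Chinese New Year shutdown. Dates and the four-week
# penalty are illustrative, not actual factory calendars.
CNY_START = dt.date(2024, 2, 10)        # illustrative factory-closure start
CNY_SHUTDOWN = dt.timedelta(weeks=4)    # illustrative extra lead time

def adjusted_lead_time(order_date, base_lead_time):
    delivery = order_date + base_lead_time
    # If the order would be in production or transit over the shutdown,
    # add the closure period to the lead time.
    if order_date <= CNY_START <= delivery:
        return base_lead_time + CNY_SHUTDOWN
    return base_lead_time

base = dt.timedelta(weeks=6)
print(adjusted_lead_time(dt.date(2024, 1, 15), base))  # overlaps CNY: 10 weeks
print(adjusted_lead_time(dt.date(2024, 5, 1), base))   # no overlap: 6 weeks
```

This is exactly the kind of effect Vermorel describes as visible to the naked eye: a four-week jump dwarfs any part-per-million numerical subtlety in the optimization itself.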

Vermorel begins by admitting that the field of supply chain optimization is still very much in its infancy. While there have been significant improvements compared to previous methods, the industry has only scratched the surface. He emphasizes the importance of combining scientific approaches with common sense in order to achieve the best results.

He then cautions against the belief in a perfectly rational and well-defined organization, consisting of separate teams for forecasting, planning, and purchasing, each optimizing specific metrics. Vermorel argues that such systems might appear rational and scientific but are often nothing more than a mirage. He stresses the need for skepticism when encountering such organizations, as they may not be as rational or effective as they appear.

Looking back, Vermorel recognizes these failures as the obvious consequences of naive rationalism. He advises clients to rely on expert insights and common sense when implementing supply chain optimization solutions. No matter how many buzzwords or technical jargon are thrown at a problem, without a grounded understanding and practical approach, the result will be nothing more than industrial-scale nonsense.

Full Transcript

Kieran Chandler: Today on Lokad TV, we’re going to understand why this is such an easy hole to fall down and discuss whether something that looks clever on the surface actually is in reality. So Joannes, today we’re talking about scientism. It sounds a bit theoretical, but what is it exactly?

Joannes Vermorel: Scientism is, I would say, an extreme belief in science. It’s the idea that you can literally solve all your problems in life and society with a scientific approach, which kind of sounds good but actually is a bit naive. In the specific case of supply chain, it would be a synonym for naive rationalism, an approach where we have a recipe, algorithms, and a bit of technology to solve the problem with a definitive method. I have observed that, especially in supply chain, large organizations tend to mimic the scientific methods. They do things that share the attributes of science, like involving a lot of numbers, formulas, people with PhDs, metrics, measurements, and some kind of process. But sometimes, or actually very frequently, you don’t have anything that would actually, in my eyes, qualify as scientific. So you have the attributes, but it’s like a facade, an illusion.

Kieran Chandler: Before we get on to the supply chain side of things, do you have any real-world examples of how people have become overly reliant on scientific results and got a bit too carried away? Where have we seen shortcuts being taken?

Joannes Vermorel: There is a huge controversy nowadays in social sciences because the majority of the papers published during the last five decades in social sciences just do not reproduce, which is a massive problem. One of the root causes is p-values, a way to establish how much confidence you can have in a hypothesis. For example, let’s say I postulate that eating strawberries is good for your health. I make a measurement and validate this hypothesis. The problem is that if you test thousands or millions of hypotheses, you will generate tons of hypotheses that exhibit nice p-values, so things that seem very confident. But the issue is that you’ve tested so many hypotheses that according to the limited data, some of them happen to be true almost completely out of sheer randomness. In supply chain, you have many similarities where methods have the look of something very rational, but when you scratch under the surface, it’s profoundly irrational.

Kieran Chandler: Let’s look at supply chain examples. What are those things that look easy on the surface but actually, once you get deeper into it, are not at all and are a lot less rational?

Joannes Vermorel: Probably demand forecasting. You have entire books written about how you can build demand forecasting models. We have an entire literature on that, from the old vintage models of exponential smoothing, Holt-Winters, and whatnot. You would think statistical demand forecasting is something very established. You have a forecast, you can do a benchmark, you can do backtesting. It looks like the archetype of something that is super scientific, and my point is that frequently, it's not. It's not scientific at all. It's very naive actually. One of the first mistakes I made at Lokad a decade ago was to think that this kind of algorithmic approach to demand forecasting was actually working. It's not, for quite a lot of reasons. One of the reasons is that you end up with a self-fulfilling prophecy effect. You know, if you forecast that you're not going to have any demand for a product in a store, then maybe you're not even going to put the product in the store, and thus you will end up with a self-fulfilling prophecy. If you forecast zero demand, then you don't put any stock, and then you don't sell anything, and then your forecast is 100% correct. And then I guess you'd have more confidence in the forecast. It's statistically proven, and yet when you think about it, it's kind of dumb business-wise, and actually, it is dumb. So that's an extreme example. A slightly more advanced example, but still fairly dumb, is when you want to do a forecast for a fashion brand. If you want to forecast how many units are going to be sold for a given product, well, you just have to look at how many units were produced in the first place. If you produce 1,000 shirts, well guess what, you are going to sell 1,000 shirts, minus shrinkage. But how come? Well, it comes from the fact that if you don't sell all those 1,000 shirts, you're going to heavily discount your products, put them on sale, and ultimately they will get sold.
So, you have this naive self-fulfilling prophecy where whatever you produce is basically what you end up selling, no matter what.

Kieran Chandler: So what’s the alternative then? Because it sounds like you’ve got a method, the method kind of works, it’s getting in the right ballpark.

Joannes Vermorel: The alternative is first, you have to acknowledge that the situation is complex, that you have humans involved, that you have a complex system with feedback loops all over the place, that you have second-order effects. Second-order effects are, for example, when you give a discount to a client. What do you create? Obviously, this discount costs you. These are euros or dollars in margin that you don’t get. That’s the first-order effect of the discount. Another part of the first-order effect is that you have probably some kind of boost of the demand. You put a product on sale with a lot of discounts, sales are going to increase, usually. But the second-order effect is that you create an expectation in your customer base of actually buying products at a discount.

So again, naive rationalism or scientism is sort of a method where you focus mostly on first-order effects, where you take shortcuts, you kind of simplify the situation, and so you end up with something where it's easy to do measurements, it's easy to produce metrics, and it's easy to do benchmarks and test hypotheses. But the fact that it is easy doesn't mean that it's relevant. It's not because something is easy that it's actually good for your business.

Kieran Chandler: So how can oversimplification of this kind impact supply chain professionals? What's the result of it? I have a question about mathematical models and quantitative modeling. How important is human judgment in this process, especially when considering complex systems?

Joannes Vermorel: Human judgment is crucial. High-level human intelligence is needed to assess whether a model is the right way to look at a problem. This is part of science, but it’s not just a matter of simple measurement and testing hypotheses. It requires high-level judgment to ensure that the model is aligned with reality.

When tackling a certain domain, you need the right perspective. For example, when trying to tackle supply chain issues, you need a perspective that makes sense for large groups of people and societies. This cannot be proven in a scientific way, but rather through a difficult discussion among people of good faith who are trying to get closer to the truth. It’s not about having a naive measurement to prove someone right or wrong, but there is more subtlety involved.

Kieran Chandler: Have supply chain professionals noticed the shortfalls of these models?

Joannes Vermorel: Supply chain professionals are indeed highly educated and interested in their field. The issue with these seemingly scientific, seemingly rational models is not that they fool uneducated people, but rather that they occur among highly educated individuals. Someone with a limited educational background might not be impressed by formulas or protocols and would be skeptical of complicated things they don't understand. On the contrary, highly educated people are more likely to face these issues because they've witnessed the effectiveness of certain scientific models, like the laws of thermodynamics.

The problem arises when people try to apply the same approach to supply chains, expecting the same level of predictive power as in other scientific domains. However, humans are not particles, they think and react differently. For example, customers will adapt to discounts and anticipate future actions, effectively gaming the system. This happens repeatedly in supply chain management.

Kieran Chandler: The issue I have with this is that humans are naturally risk-averse. If there’s proof to back up their claims and forecasts, they will use them. So, what do you think about the role of science in this context? Is there really an alternative to using data in supply chain optimization?

Joannes Vermorel: On the surface, I tend to agree with the sentiment that it’s better to back up your claims with data. However, you have to be very careful, especially when dealing with complex, high-dimensional problems involving multiple variables and agents like humans or companies that can react to whatever you do. These problems can become quite wicked. Of course, you want to back up your insight with as much data as you can, but there is a difference between true rationalism and naïve rationalism. You still need to have high-level judgment supervising all your calculations and models to make sure there is a suitable perspective for the analysis and that there are no extreme loose angles that would undo everything you’ve just done.

Kieran Chandler: Let’s talk about that high-level judgment. Some might argue that what we do at Lokad involves a fair amount of naïve rationalism, and many of our clients don’t completely understand all of the “black magic” that our supply chain scientists use. How do you ensure that our clients understand what’s going on?

Joannes Vermorel: What I really care about is the high-level understanding, rather than the technicalities. The technicalities are mostly inconsequential. For example, I know that when I use a specific function, the calculation I get is an approximation that may be off by one part per million, but when I’m dealing with a supply chain where the uncertainty is around 40%, this level of approximation is inconsequential. For our clients using Lokad, the things they do not understand are often very technical but largely inconsequential. It’s much more important to make sure that Lokad gets it right when it comes to factors like lead times and taking into account events like the Chinese New Year, which adds four weeks of lead time every year.

Joannes Vermorel: Weeks of extra lead times are going to go straight into your lead times. It's not subtle, and that's the sort of thing where this high-level judgment lets you decide that, no, I need to take that into account. I can literally, through naive observation, judge whether you're even taking that into account. You don't need a microscope; the effect is strong. Potentially, one century from now, we will have refined the methods so fully that people will be able to analyze far subtler effects and be really scientific about it. But right now, we are still very much scratching the surface, and getting the system approximately right is a fantastic improvement compared to what we had before.

Kieran Chandler: So, if we start drawing things together today, what's the key lesson we should take? It's good to have these scientific approaches, but they need to be combined with a lot of common sense?

Joannes Vermorel: Well, yes. I mean, first, be very skeptical about having a rational organization where you could say, “Oh, we have a forecasting team, a planning team, and a purchasing team.” All of them have their formulas and are optimizing simple metrics. It looks perfectly well-defined, rational, and customer-driven or whatever. But when I face this sort of situation, my immediate observation is that this system is not rational. It’s only displaying the attributes of rationality and science. It’s just something that looks very scientific, but it’s not at all.

Another lesson would be not to underestimate the fact that highly dedicated engineers can still be impossibly clueless about the real world. When I say highly dedicated engineers, I’m including myself in this category. I started Lokad straight out of university, very proud of my mathematical background, and thrilled with the idea of applying all those beautiful high-dimensional statistics to real-world situations. It turned out that it didn’t work beautifully. In fact, it systematically blew up in seemingly impossibly surprising ways. Looking back, it was the obvious consequence of naive rationalism at play.

So, my suggestion for wrapping it up is that you need expert insights in what you’re doing, and you need to have this common sense of, “Okay, does this roughly make sense?” If not, no matter how many buzzwords and keywords you throw at the problem, what you’re going to get is nonsense at an industrial scale.

Kieran Chandler: I never thought we’d hear you say that science and math can sometimes not be all you need.

Joannes Vermorel: That’s true.

Kieran Chandler: Okay, that’s everything for this week. Thanks very much for tuning in, and we’ll see you again next time. Thanks for watching.