00:00:07 Demand-Driven Material Requirements Planning (DDMRP)
00:00:39 The big idea behind DDMRP and traditional MRP
00:02:21 Decoupling lead time in DDMRP and its impact on the supply chain
00:05:42 Net flow equation in DDMRP and its effectiveness
00:07:48 The importance of distinguishing known and unknown demand in DDMRP
00:09:00 Decoupled explosion and its consequences
00:10:25 Manual selection of decoupling points and concerns about human involvement
00:12:02 The importance of machine-driven numeric optimization
00:14:00 DDMRP’s relative priority and issues with maintaining core assumptions
00:16:01 Criticism of optimizing percentages instead of focusing on economic drivers
00:17:18 Comparing the effectiveness of DDMRP and Flow Casting
00:18:37 The lack of remaining insights from DDMRP when dysfunctional numerical recipes are removed
00:19:48 The usefulness of moving average in the frequency domain as an insight from DDMRP
00:22:12 Closing thoughts

Summary

In an interview, Kieran Chandler and Joannes Vermorel discuss Demand-Driven Material Requirements Planning (DDMRP), a method for improving supply chain efficiency using decoupling points or stock buffers. While DDMRP has innovations like strategic decoupling, net flow equation, decoupled explosion, and relative priority, Vermorel raises concerns about its reliance on manual intervention and optimization focus. He emphasizes the need for automation and prioritizing economic drivers over percentages. Vermorel suggests modern numerical optimization algorithms would render DDMRP redundant, but acknowledges its valuable insight of using moving averages in the frequency domain for erratic demand patterns. Overall, he believes modern techniques are better for supply chain optimization.

Extended Summary

In this interview, Kieran Chandler, the host, discusses Demand-Driven Material Requirements Planning (DDMRP) with Joannes Vermorel, the founder of Lokad, a software company specializing in supply chain optimization. They explore the big idea behind DDMRP, its practical applications, and the four major innovations that it claims to offer.

DDMRP is a multi-echelon planning and execution method that aims to improve supply chain efficiency by strategically placing decoupling points or stock buffers. These decoupling points are designed to help organizations overcome the limitations of classic Material Requirements Planning (MRP) software, which can struggle to make accurate calculations for complex supply chains.

Vermorel explains that MRP software works by representing the relationships between various components and sub-components in a product, such as a car, as a graph. This graph represents the dependencies between different parts and helps calculate the requirements for producing the finished product. However, MRP software often lacks accuracy and can produce poor results.
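To make the graph picture concrete, here is a minimal sketch, in Python, of how such a dependency graph can be walked to turn a demand for finished goods into requirements for components. The bill of materials, part names, and quantities are invented for illustration; this is a toy version of the idea, not the recipe of any particular MRP system.

```python
from collections import defaultdict

# Illustrative bill of materials: parent -> list of (component, quantity per parent).
bom = {
    "car": [("ac_unit", 1), ("wheel", 4)],
    "ac_unit": [("pump", 1), ("valve", 2)],
}

def explode(item, qty, requirements=None):
    """Recursively propagate a demand down the dependency graph (classic MRP-style explosion)."""
    if requirements is None:
        requirements = defaultdict(int)
    requirements[item] += qty
    for component, per_unit in bom.get(item, []):
        explode(component, qty * per_unit, requirements)
    return requirements

# 10 cars explode into 10 AC units, 10 pumps, 20 valves and 40 wheels.
print(dict(explode("car", 10)))
```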

DDMRP attempts to improve upon these limitations by introducing decoupling points into the graph. These points represent components or parts that have inventory, meaning they can be assumed to be always available. This allows for the calculation of lead times that are numerically much lower than what classic MRP software would produce. Vermorel notes that while this approach can improve upon the baseline provided by traditional MRP, it is still far from what could be achieved with modern numerical methods.
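As a rough illustration of that calculation, the sketch below (with hypothetical lead times and part names) takes the longest cumulative lead time through the graph, but stops the propagation whenever it reaches a decoupling point, which is assumed to be held in stock. It is an interpretation of the description above, not the official DDMRP computation.

```python
# Illustrative dependency graph: item -> (own lead time in days, list of components).
graph = {
    "car": (5, ["ac_unit", "chassis"]),
    "ac_unit": (10, ["pump"]),
    "chassis": (15, []),
    "pump": (30, []),
}

def decoupled_lead_time(item, decoupling_points):
    """Longest cumulative lead time, halting the propagation at decoupled (stocked) components."""
    own, components = graph[item]
    children = [
        decoupled_lead_time(c, decoupling_points)
        for c in components
        if c not in decoupling_points  # a decoupled part is assumed to be on hand
    ]
    return own + max(children, default=0)

print(decoupled_lead_time("car", decoupling_points=set()))     # 45 days: the classic MRP view
print(decoupled_lead_time("car", decoupling_points={"pump"}))  # 20 days: the pump buffer hides 30 days
```

The second figure is much lower even though nothing physical changed below the buffer, which is precisely the inertia concern raised next.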

One of the criticisms Vermorel raises about DDMRP is that while the decoupling points may reduce the calculated lead times, the supply chain still retains a significant amount of inertia. This means that despite the appearance of improvements, the actual performance of the supply chain may not be as optimized as it seems.

Strategic decoupling involves introducing points in the supply chain where lead times can be reduced, thus shortening the overall lead time. Vermorel argues that while this approach can numerically reduce lead times, it does not significantly reduce the inertia in the overall network. The challenge lies in the semantic problem of understanding how strategic decoupling points affect the supply chain as a whole.

The net flow equation, the second point of discussion, is a simplistic method for maintaining buffer points in the supply chain. It considers stock on hand, subtracting guaranteed demand or qualified units, to determine the stock remaining to serve uncertain demand. Vermorel believes that DDMRP (Demand Driven Material Requirements Planning) is correct to distinguish between known and unknown demand. Many early ERP (Enterprise Resource Planning) implementations would naively forecast all demand, including the portion that is already guaranteed. Vermorel argues that this approach is fundamentally flawed, as it attempts to predict a future that is already known, leading to forecasting difficulties.
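A minimal sketch of the net flow idea as described here: subtract the already-qualified demand from the stock position, and what remains is the stock available to serve the uncertain demand. The figures are invented; the optional on-order term reflects the fact that DDMRP write-ups commonly also count open supply in the position.

```python
def net_flow(on_hand, qualified_demand, on_order=0):
    """Stock position left over to serve the uncertain demand.

    on_hand          -- units physically in stock
    qualified_demand -- demand that is already (almost) certain, e.g. firm client orders
    on_order         -- open supply already triggered; commonly included in DDMRP write-ups
    """
    return on_hand + on_order - qualified_demand

# 500 on hand, 200 inbound, 300 already promised: 400 units remain for the uncertain demand.
print(net_flow(on_hand=500, qualified_demand=300, on_order=200))
```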

The third key innovation discussed is the decoupled explosion, which deals with the consequences of introducing two types of nodes in the supply chain graph: master nodes and decoupling points. Decoupling points are locations in the supply chain where lead time propagation is halted in calculation (but not in reality), and a certain degree of inventory is maintained. The decoupled explosion involves simplifying the Bill of Material (BOM) by skipping over secondary nodes and directly connecting to the decoupling points. This graph simplification aims to streamline the supply chain process.
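The following toy sketch, reusing the invented bill of materials from the earlier example, shows what stopping the explosion at decoupling points could look like: requirements are generated only down to the nearest buffered component, and everything below it is left to the buffer's own replenishment.

```python
from collections import defaultdict

# Same invented bill of materials as before.
bom = {
    "car": [("ac_unit", 1), ("wheel", 4)],
    "ac_unit": [("pump", 1), ("valve", 2)],
}

def decoupled_explode(item, qty, decoupling_points, requirements=None):
    """Explode requirements, but stop as soon as a decoupling (buffered) point is reached."""
    if requirements is None:
        requirements = defaultdict(int)
    requirements[item] += qty
    for component, per_unit in bom.get(item, []):
        if component in decoupling_points:
            # The buffer absorbs the demand; nothing below this node is exploded here.
            requirements[component] += qty * per_unit
        else:
            decoupled_explode(component, qty * per_unit, decoupling_points, requirements)
    return requirements

# With the AC unit decoupled, pumps and valves never appear in the explosion.
print(dict(decoupled_explode("car", 10, decoupling_points={"ac_unit"})))
# {'car': 10, 'ac_unit': 10, 'wheel': 40}
```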

Vermorel expresses concern over the reliance on manual intervention in supply chain management, specifically when it comes to the manual introduction of a hierarchy in the graph to mitigate the nonsensical consequences of simplistic numerical recipes. He explains that supply chain practitioners are often responsible for choosing the decoupling points, which may not be stable or consistent over time. This is due to the constantly changing nature of the supply chain environment, as well as the potential for suppliers to change their strategies or locations.

The discussion emphasizes the need for automation in this process, as relying on human intervention can lead to inefficiencies and inaccuracies. Vermorel points out that it is not a good use of practitioners’ time to manually select decoupling points for complex products with thousands of parts. This is especially true given that market conditions are constantly changing, making it difficult for practitioners to accurately predict or account for every variable.

Moving on to the concept of relative priority in supply chain management, Vermorel explains that this involves ranking items in terms of targeted stock quantity. While this method has merit, he believes it would be more effective to rank items based on their economic strengths. The introduction of DDMRP (demand-driven material requirements planning) style first-class citizen nodes, or decoupling points, in the supply chain graph relies on the assumption that stock is always available. When this assumption is violated, the entire system can falter.

Relative priorities aim to address this issue by prioritizing items that are diverging the most from the core assumption of ongoing availability. While Vermorel acknowledges that this is a sensible part of the overall methodology, he also points out that it still involves a level of human intervention and prioritization, which may not be the most efficient or accurate approach.

They discuss the effectiveness of Demand Driven Material Requirements Planning (DDMRP) in supply chain optimization. Vermorel criticizes DDMRP, stating that it focuses on optimizing percentages rather than optimizing financial aspects such as cost of stock, waste, and non-service. He argues that supply chain decisions should be prioritized according to overall business goals expressed as economic drivers.
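To make the contrast concrete, here is an illustrative comparison, with invented SKUs and figures, between ranking replenishments by how far stock has fallen below its target (a percentage-style priority) and ranking them by a euro-denominated score. The economic score used below is a deliberately crude stand-in for the richer economic drivers Vermorel has in mind.

```python
# Invented SKUs: on-hand stock, target stock, unit margin and unit holding cost in euros.
skus = {
    "pump":  {"on_hand": 20, "target": 100, "margin": 5.0,  "holding": 0.5},
    "valve": {"on_hand": 5,  "target": 10,  "margin": 1.0,  "holding": 0.2},
    "wheel": {"on_hand": 40, "target": 80,  "margin": 30.0, "holding": 2.0},
}

def penetration(sku):
    """Percentage-style priority: fraction of the target stock that is currently missing."""
    return 1 - sku["on_hand"] / sku["target"]

def economic_score(sku):
    """Crude euro-denominated priority: margin protected by refilling, net of holding cost."""
    missing = sku["target"] - sku["on_hand"]
    return missing * (sku["margin"] - sku["holding"])

print(sorted(skus, key=lambda s: penetration(skus[s]), reverse=True))
# ['pump', 'valve', 'wheel']: the percentage view ranks the most depleted buffer first
print(sorted(skus, key=lambda s: economic_score(skus[s]), reverse=True))
# ['wheel', 'pump', 'valve']: the euro view ranks the most money at stake first
```

The two rankings disagree: the percentage view puts the most depleted buffer first, while the euro view puts the item with the most money at stake first.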

Vermorel compares DDMRP to flowcasting, saying that while flowcasting has some fundamentally incorrect math, it offers valuable insights that would remain relevant even after fixing the math. On the other hand, DDMRP is seen as an incremental improvement on a flawed baseline. Vermorel suggests that using modern numerical optimization algorithms would render DDMRP redundant.

Despite the criticisms, Vermorel acknowledges one positive insight from DDMRP: the use of moving averages in the frequency domain, as opposed to the time domain. He explains that averaging demand over a fixed period (time domain) is less effective than averaging demand over the last 100 units served (frequency domain). This approach is more numerically well-behaved when dealing with erratic and spiky demand patterns. In conclusion, Vermorel sees value in the frequency domain analysis in DDMRP, but he believes that modern numerical optimization techniques are better suited for supply chain optimization.
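The sketch below illustrates the distinction with an invented, spiky demand history: the time-domain average looks at a fixed window of days, while the quantity-driven variant (the frequency-domain reading described here) averages over however many days it took to serve the last N units, which tends to be better behaved when demand is erratic.

```python
# Invented daily demand history (most recent day last); deliberately erratic and spiky.
daily_demand = [0, 0, 0, 120, 0, 0, 2, 0, 0, 85, 0, 0, 0, 3, 0, 0, 0, 90, 0, 1]

def time_domain_average(history, window_days=7):
    """Classic moving average: mean daily demand over a fixed number of recent days."""
    recent = history[-window_days:]
    return sum(recent) / len(recent)

def quantity_domain_average(history, window_units=100):
    """Quantity-driven average: daily rate implied by the days needed to serve the last N units."""
    served, days = 0, 0
    for demand in reversed(history):
        days += 1
        served += demand
        if served >= window_units:
            break
    return served / days

print(round(time_domain_average(daily_demand), 1))      # highly sensitive to whether a spike falls in the window
print(round(quantity_domain_average(daily_demand), 1))  # the window stretches or shrinks with the demand itself
```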

Full Transcript

Kieran Chandler: Today on Lokad TV, we’re going to understand if this method actually works in practice by looking at the four big innovations. So Joannes, we kind of touched upon it in the introduction, but what’s the big idea behind DDMRP?

Joannes Vermorel: The big idea is that you start from a very classic MRP perspective, where it all boils down to the analysis of a graph of dependencies. Just to clarify for people who are listening, let’s say you want to build a finished good, like a car. You need parts, but the parts you need for a car are themselves assemblies that also need parts of their own. So you have a hierarchy of components, like a car needing an air conditioning unit, and the air conditioning unit needing a pump, a valve, and so on. When you think of a product and all the parts it needs, it’s basically a mathematical graph, similar to a subway map with edges.

This graph starts with the finished product at the top and explodes into the subcomponents, and then every component has sub-subcomponents and so on, recursively. If you have a very complex product, you can have a very complex graph representing all the parts down to the very basic materials. The MRP, the material requirements planning software, first represents this information, so you can have this graph of dependencies represented. Then, it does a series of calculations to help you produce and execute all those requirements to end up with the finished goods. Typically, it doesn’t do such a good job at many of those calculations, and DDMRP provides a series of recipes to make it work better.

Kieran Chandler: So how does that actually work in practice, and would you say it’s somewhat of an oversimplification?

Joannes Vermorel: The first innovation they claim is the decoupling of lead times. We have to realize that their baseline for improvement is nonsensical numerical recipes, incredibly naive from a numerical optimization perspective. If you choose the decoupling points right, you will improve compared to a very poor baseline. You become less dysfunctional, but it doesn’t mean you’re anywhere close to what you could get with actual modern numerical methods.

The key idea of the decoupling points is that instead of having every node be just like any other node, we decide that we have first-class citizens, the decoupling points, and second-class citizens where we don’t decouple. Every point that is decoupled, every such part or component, is going to have inventory, and thus you can assume that this thing is always available. Instead of taking the longest path for manufacturing, you take the longest path only until you hit one of those decoupling points.

But my first criticism of the decoupled lead time is that, yes, when you introduce those decoupling points, you do end up with a lead time that is numerically much lower. However, your supply chain still carries a lot more inertia than that number suggests. You’ve gamed the way you compute the lead time by introducing those decoupling points.

But the inertia still exists; it goes beyond what the computed number says. That’s how they end up saying, “We introduced strategic decoupling points and we can reduce the lead time by 80%.” Numerically speaking, you end up with a lead time that is much shorter, but the reality is that you have not reduced the inertia of your overall network by a factor as large as those decoupling points suggest. There is a semantic problem here, and I will maybe get back to that.

Kieran Chandler: Let’s move on to the second innovation in DDMRP, the net flow equation. It’s basically a way of maintaining those buffer points, using things like pre-orders, things that we already know are going to happen. How well does this actually work in practice?

Joannes Vermorel: The net flow equation makes a bit of sense really. It is an incredibly simplistic equation: stock on hand minus already guaranteed demand, what they call qualified units. So, the demand that is kind of a sure thing. What you have with that is the stock that remains available to serve the uncertain demand. The net flow equation gives you the amount of stock that you have to cover things that are not already a pure matter of execution because you already know that it’s coming with almost no uncertainty.

I think DDMRP is correct to distinguish what is already known from what is unknown; these are very separate things. For example, if you have a complex manufacturing process and you’re serving other industrial clients, a client can tell you, “Two months from now, I want a thousand units to be delivered at this date.” If you have time to do that, then at this point you just have to execute this delivery. There is no forecast involved. If your lead times are less than two months in total, then basically it’s a pure matter of execution with zero uncertainty involved.

Of course, people can still cancel their orders and whatnot, but let’s say it’s fairly safe. It’s very different from maybe two months from now, there will be a client that shows up and actually asks for a thousand units. I believe DDMRP is completely correct in saying that you should not try to have this kind of super naive approach, which is to forecast everything including what you already know.

The question is, why are they even stating that? Well, it’s because most ERP systems, at least many early implementations, were doing incredibly naive things. They were saying, “We are just going to take the easy path,” which is kind of the dumb path, and they would forecast the demand, all of it, including the portion that is already guaranteed. But that is very dumb, because then you’re trying to guess a future that you already know, and guess what? It’s very hard to forecast. So if you know something about the future, you should not even try to use statistics to discover it; you already know it.

Kieran Chandler: If we move on to the third key innovation, this so-called “decoupled explosion” sounds really dramatic. What’s going on here?

Joannes Vermorel: This is another consequence of introducing two types of nodes in your graph of requirements. Remember, we introduced master nodes in the graph, those decoupling points, which are the points that stop the lead time propagation in the calculation (not in reality, only in the calculation), and that are the points where you will want to ensure some degree of inventory. Instead of having the bill of materials propagate from node to node directly, saying “I take the bill of materials and it propagates to my parent nodes, the subcomponents that I need to build the finished good,” they say that when decoupling points are present, the bill of materials is going to completely skip all the second-class citizen nodes and jump directly to the decoupling points. So, in a way, again, it’s a graph simplification technique, based on this hierarchy in the graph that was introduced with first-class citizen nodes and second-class citizen nodes.

Kieran Chandler: And who’s actually choosing those first-class nodes? If you’re looking at something like an aeroplane, it’s got millions of different levels to it. I mean, who’s actually making those choices?

Joannes Vermorel: Supply chain practitioners, which is also for me a big cause of concern. Because basically, yes, you can manually introduce a hierarchy in a graph to mitigate the nonsensical consequences of very simplistic numerical recipes. So yes, that will kind of work. But indeed, you end up with supply chain practitioners who need to manually introduce those decoupling points. And guess what? It’s not actually stable. What is a good choice for those decoupling points today is not a good choice once the environment changes. Why? Because if there is a part that you decide to externalize, you know, to buy from a supplier, or to buy from a supplier that is closer or, on the contrary, much more distant, you can change quite profoundly what is happening around whatever depends on this part in your supply chain network.

So, even if it kind of works to introduce this hierarchy in the graph, there is no reason to think that your decoupling points are stationary, that you can choose them once and they will be good forever. My perspective is that this should be something done completely automatically. Here we are talking about numerical recipes: we have a dysfunctional numerical recipe, and we say that with a lot of human insights and tweaks we can have a numerical recipe that is a little bit better.

Kieran Chandler: Okay, so basically, when humans get involved, we always manage to screw things up somehow.

Joannes Vermorel: Yes, but also, it’s not making such good use of the time of those practitioners. I mean, as you were describing, if you have a complex product with thousands of parts, why do you want to invest potentially hundreds, if not thousands, of man-hours of your supply chain experts to manually select those decoupling points? You could say, oh, they have such incredible insights, right? But the reality is that it’s very noisy. Thousands of parts, market conditions are changing kind of all the time, not necessarily radically, but at least changing a bit all the time. So, we need to refresh that. It’s profoundly something that should be done by the machine. You know, there is no added value. It’s a pure case of numeric optimization.

Kieran Chandler: Okay, let’s move on to the final innovation here, which is relative priority. It’s basically all about ranking in terms of the targeted stock quantity, and I assume there are big criticisms here. We’d prefer to rank by economic strengths, would you agree with that?

Joannes Vermorel: Yes, but again, several things. First, those relative priorities, why are they introduced? They start with the idea that classical MRP has a binary perspective on things, like “Am I okay or not okay?”, and they say it’s crude, super crude. And the answer is, yes, it’s crude to the point that it’s completely absurd; again, back in the 50s, people were already doing smarter things in terms of numerical optimization. So, a very, very bad baseline. Now, DDMRP introduces those first-class citizen nodes, the decoupling points, in your graph, and an assumption comes with that, which is that stock is always available. When this assumption is violated, obviously everything falls apart, because your decoupled explosion is built on this assumption and your decoupled lead times rest on this assumption. So, you need to get your supply chain system back on track with your core assumption, and the relative priorities say that you should act swiftly on the things that are diverging the most from your core assumption, which is ongoing availability at those decoupling points.

That is indeed good; that part of the recipe makes sense. But, guess what, you end up with a prioritization that is partly incorrect. I’m challenging the very motivation: the motivation is to get the system back on track with respect to the assumption that you need for DDMRP to work in the first place. It’s like the snake eating its own tail. You introduce a methodology, this methodology comes with assumptions, and your numerical calculations do not guarantee that those assumptions will be maintained while the system operates. So, you need to introduce tweaks, some kind of feedback loop, so that you can get back on track with respect to your own assumptions. But that doesn’t mean that you are back on track with something aligned with the end game of the business, and that’s where my criticism lies. You are optimizing against percentages, like percentage of accuracy, percentage of fulfillment, percentage of service levels, and optimizing in percentages is kind of bad. You want to optimize in Euros, which connects to the end perspective: the cost of stock, the cost of wastage, the cost of non-service, and, on the other side, all the costs and rewards of serving clients on time.

So, I very much agree with the idea of prioritizing the decisions, but I very much disagree with the idea of prioritizing decisions so that you can loop back with your methodology. You need to prioritize the decisions so that you loop back with the overall business goals expressed as economic drivers for the business as a whole. So, what your supply chain is delivering as a whole.

Kieran Chandler: Okay, now let’s start drawing things together. We’ve described a lot of the flaws with DDMRP. Should we completely discount it as a technique?

Joannes Vermorel: It’s interesting because, last week, we were discussing flow casting. Flow casting had some math that was dramatically incorrect, so it was even degrading the situation compared to the baseline, which was already really bad. But some of the insights it had were profoundly true and would actually survive if we fixed the math to make it work. It’s very funny because DDMRP is kind of the opposite. It’s basically something that works incrementally on top of a very, very bad baseline. If you step back and say, instead of trying to duct-tape something that is very bad, let’s directly start with good foundations, which is doing numerical optimization the right way with proper algorithms, like proper probabilistic graph-based algorithms, then I’m not sure that much of DDMRP remains once you’ve moved to a proper modern numerical framework to actually do an optimization. Because that’s the thing: classical MRP systems are not really optimizing anything in a modern sense where you actually perform an optimization.

Kieran Chandler: So, you’re doing all of that to basically duct-tape profoundly dysfunctional numerical recipes. Now, if we remove the dysfunctional numerical recipes, what remains?

Joannes Vermorel: The answer is very, very little. That’s where, for example, flow casting was very different, because if you remove the dysfunctional numerical parts of flow casting, the other insights that remain are profoundly interesting and, I believe, profoundly correct. DDMRP, much less so.

Kieran Chandler: If we were going to finish on a more positive note, are there any insights that DDMRP gives us that are actually pretty good?

Joannes Vermorel: Yes, I believe one is that the moving average works, and it frequently works even better in the frequency domain as opposed to the time domain. Let’s dig into that. For those of you who may have learned about the Fourier transform at engineering school, you know you can study time series in the time domain or in the frequency domain. That’s something that is done very frequently in acoustics.

When people think about forecasting demand with a moving average, it can work when the demand is stationary. Typically, when people think of moving average forecasts, they are thinking of an analysis in the time domain. So, what does that mean? I average demand over the last few weeks, a fixed period of time; that’s the time domain.

The frequency domain is to think that, instead of averaging over the last few weeks, where three weeks is a fixed duration, I’m going to average my demand over the last 100 units served. The good news is that this last-100-units view is much more numerically well-behaved with respect to demand that is super erratic and spiky.

Moving average in the frequency domain is actually interesting. By the way, those DDMRP buffers are actually forecasts: moving average forecasts done in the frequency domain instead of the time domain. They kind of rediscovered it, but it’s a very good insight. It is very valuable that analyzing things in the frequency domain works, and it has profound implications for the supply chain. It is a very interesting angle for optimization.

I think that’s the angle. I’m not sure if people from DDMRP see it like this, but I believe that’s a very cool and very good insight that came from DDMRP.

Kieran Chandler: Hopefully you’ve gone some way to repairing a few of those relationships. Anyway, that’s everything for this week. Thanks very much for tuning in. If you agree or disagree, make sure you leave us a comment, and we’ll see you again next time. Bye for now.