Given current advancements in computer technology, perhaps the time has come to once again reflect, evaluate, and challenge our conventional paradigms regarding “best practices” in the manufacturing environment.
Traditional “best practices”
Two decades ago, I worked in the materials organizations of a couple of major automotive tier-one companies. One of our “best practices” was “backflushing,” which eliminated the keying of material-issue transactions and the erroneous entries that inevitably came with them. Backflushing is the automatic issuing of components and raw materials from inventory to a production order based upon a product’s bill of material.
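The mechanics are simple: when a finished quantity is posted, the component issues are derived from the bill of material rather than keyed by hand. A minimal sketch, with hypothetical part numbers and quantities:

```python
def backflush(bom, inventory, item, qty_produced):
    """Relieve inventory for every component of `item` per its BOM."""
    for component, qty_per in bom[item]:
        inventory[component] -= qty_per * qty_produced
    # Receive the finished item into the perpetual inventory.
    inventory[item] = inventory.get(item, 0) + qty_produced

# Hypothetical single-level BOM: one widget consumes 1 housing and 4 screws.
bom = {"WIDGET-100": [("HOUSING-7", 1), ("SCREW-M4", 4)]}
inventory = {"HOUSING-7": 500, "SCREW-M4": 2000}

backflush(bom, inventory, "WIDGET-100", 50)
# 50 housings and 200 screws are issued automatically, no keying required.
```

One posted production quantity replaces a transaction per component, which is exactly where the labor savings and the error reduction come from.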
This was certainly much easier than keying in each individual component issue transaction, and it eliminated many part-number and quantity errors attributable to manual transaction entry.
Second, we had the ability to skip production operations with the creation of “phantom assemblies.” This meant we only had to post production at the final assembly process, and the manufacturing system would automatically backflush all the raw materials and components consumed in all preceding operations. This further reduced the number of transactions that had to be manually keyed into the system.
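In BOM terms, a phantom is a level the system “blows through” during explosion: posting at final assembly still relieves the raw materials consumed at the intermediate operations. A sketch of that explosion logic, with a hypothetical product structure:

```python
def explode(bom, phantoms, item, qty):
    """Return raw-material requirements, blowing through phantom levels."""
    requirements = {}
    for component, qty_per in bom.get(item, []):
        needed = qty_per * qty
        if component in phantoms:
            # Phantom level: recurse into it instead of issuing it.
            for part, q in explode(bom, phantoms, component, needed).items():
                requirements[part] = requirements.get(part, 0) + q
        else:
            # Real stocked part: issue it directly.
            requirements[component] = requirements.get(component, 0) + needed
    return requirements

bom = {
    "FINAL-ASSY": [("SUB-ASSY", 1), ("LABEL", 2)],
    "SUB-ASSY": [("BRACKET", 2), ("BOLT", 4)],  # phantom, never stocked
}
phantoms = {"SUB-ASSY"}

print(explode(bom, phantoms, "FINAL-ASSY", 10))
```

Posting 10 final assemblies relieves 20 labels, 20 brackets, and 40 bolts in one step; no transaction is ever posted against the sub-assembly itself.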
Third, we further reduced the burden of capturing and inputting transactions by posting batches of production quantities instead of posting each individual item produced. We could wait until the end of the shift or the end of the production day, and simply post our total production of end items in one batch. The manufacturing system would then add those items to the perpetual inventory and automatically relieve all the consumed materials used throughout the shift or day.
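The trade-off of batch posting is visible in a small sketch: the perpetual inventory is relieved only once, so for most of the shift it still shows components that were physically consumed hours earlier. Quantities and part numbers here are hypothetical:

```python
bom = {"WIDGET-100": [("HOUSING-7", 1)]}
inventory = {"HOUSING-7": 100, "WIDGET-100": 0}
produced_this_shift = 0

def build_one():
    """Physically consumes a housing, but posts nothing to the system."""
    global produced_this_shift
    produced_this_shift += 1

def post_batch(inventory, bom, item, qty):
    """End-of-shift posting relieves all consumed material at once."""
    for component, qty_per in bom[item]:
        inventory[component] -= qty_per * qty
    inventory[item] += qty

for _ in range(60):
    build_one()

# Mid-shift: the system still shows 100 housings, though 60 are already gone.
post_batch(inventory, bom, "WIDGET-100", produced_this_shift)
# Only now do the records catch up: 40 housings, 60 finished widgets.
```

The end-of-day totals reconcile, which is why the practice looks harmless on paper; the gap is everything that happens between builds and the posting.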
At the end of the day, and I emphasize, at the end of the day, our perpetual inventory records were acceptably accurate. The conventional and often-used practices of 1) material backflushing, 2) phantom assemblies, and 3) batch production posting allowed us to spend almost no effort in capturing transactions and greatly reduced the number of errors input to the system.
Based upon my many conversations with manufacturing practitioners, these practices are still prevalent today and considered a “lean” approach for setting up manufacturing systems.
Taking the bad with the good
The use of these “lean” approaches to transaction entry certainly saved us the burden of continuous input of thousands of inventory transactions. But what did we give up in order to execute production in this manner?
First and foremost, we gave up visibility of what was actually happening on the shop floor. Receipts of finished goods into inventory usually happened hours after the products were actually produced. According to our perpetual inventory system, we shipped product before it was produced. We had negative inventory balances. Our component inventory records showed hundreds of items on hand that had actually been consumed hours earlier. When we combined the results of these practices with delayed input of inbound supplier shipments, we could never look at our perpetual inventory and really know what inventory we actually had on the shop floor. Cycle-counting was challenging and could easily cause more inaccuracy since it was hard to reconcile the physical and perpetual inventory quantities.
This led to ongoing firefighting, expediting, and the use of premium freight to make up for our lack of inventory visibility. Manual processes were used and relied upon every day to find problems, diagnose problems, and fix problems. In reality, we mostly extinguished fires with short-term fixes that allowed us to make the day’s shipments. But the problems we had were symptoms of the bigger problem of not having the visibility we needed to see potential issues and address them before they became bigger real problems.
Changing business requirements and changing technology
Manufacturing companies are increasingly focusing their improvement efforts on more tightly monitoring and controlling their operations. Daily performance reports may be adequate for upper management, but the people in the trenches need real-time information about what is actually happening in the supply chain. If a key customer calls for a status update, a company representative needs to have the answers. An inappropriate delay sends an unfavorable signal to the customer. Future business or programs may be awarded to the suppliers that can provide high-quality products along with high-quality customer service.
The concept of a digital twin addresses this need. The digital system needs to accurately reflect what is actually happening on the shop floor, in the warehouse, and anywhere else that is depended on for satisfying the customer’s requirements.
The manufacturing system needs to reveal what is happening and not only what did happen. Managers at all levels need to have access to this data in order to make informed decisions throughout the production day. Without this level of real-time information, managers will encounter frequent surprises: unplanned events and circumstances that threaten on-time delivery and cost containment, and perhaps even product quality.
Further, companies are also collecting more data than ever before. Examples include machine data (mold temperature, oil pressure, line speed, etc.) collected at the time of production, which could be used to correlate varying production characteristics to product quality issues for items actually built at that time. This is nearly impossible if the actual time of product manufacture is not known. The collection of product-related data such as component lot/serial numbers may also be required during sub-assembly operations. The scanning of lot numbers can trigger the production posting and associated backflush of non-serialized parts.
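A lot-scan-driven posting can be sketched as follows: scanning the serialized component records a traceability entry with the actual time of manufacture, and the same event triggers the backflush of non-serialized parts. Part numbers and the lot-number format are hypothetical:

```python
import datetime

bom_non_serialized = {"PUMP-ASSY": [("GASKET", 1), ("CLIP", 2)]}
inventory = {"GASKET": 300, "CLIP": 900}
traceability = []  # (timestamp, item, scanned lot) records

def post_on_scan(item, lot_number):
    """Record the scanned lot, then backflush non-serialized components."""
    traceability.append((datetime.datetime.now(), item, lot_number))
    for component, qty_per in bom_non_serialized[item]:
        inventory[component] -= qty_per

# Operator scans the serialized lot label at the sub-assembly station.
post_on_scan("PUMP-ASSY", "LOT-20240117-A")
```

Because the posting happens at the moment of the scan, the timestamp can later be matched against machine data captured at that same instant, which is exactly the correlation that batch posting makes impossible.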
Fortunately, new technological advancements have made real-time transaction processing practical:
- Faster computing: Lot sizes of ONE combined with many more production reporting points very likely would have been too much for past-generation hardware and software to handle. This would almost certainly be the case for manufacturers of complex products with broad and deep product structures.
- Transaction collection: Great advancements in data collection methods such as barcoding, RFID, and BLE technology have made transaction entry more reliable and less costly.
- Real-time analytics: Today’s software not only provides passive analytics but can also proactively monitor system data and detect issues and potential issues, then alert the right people in time to take corrective action. This can help avoid incremental costs and customer service issues.
Are the past best practices still valid?
Backflushing, phantom assemblies, and batch reporting are practices that can corrupt the real-time nature of system data and the information it produces. The value of a real-time system is in the visibility it provides to the user community: proactively monitoring operational status, being alerted to issues, adequately evaluating those issues to determine appropriate corrective action, and then executing that action. The lack of real-time information impairs the ability to do any of these.
However, these old, established methods still have a place in our manufacturing systems. The key, as always, is balancing the costs of real-time data acquisition with the derived benefits of real-time information. Twenty years ago, we did not even attempt to capture real-time data because we did not depend on the computer system for real-time information. When we needed real-time information, we walked out to the plant floor to see what was actually happening. That was sufficient and acceptable in those days. This was how we were taught to do things.
Today, however, employees want and need real-time data from their manufacturing systems in order to do their jobs and maintain a company’s competitiveness. Does this mean that every single material movement or conversion needs to be captured in real time? Probably not; the answer is somewhere between absolute real time and no real time. It is apparent, however, that the most innovative, most aggressive companies are leaning towards increasing the amount of real-time data and information made available to their employees.
In other words, manufacturing companies are moving from merely maintaining “systems of record,” as required by their accounting teams, rapidly towards “systems of interaction” that mirror the events happening within their operations. The creation of this “digital twin” is essential if the information is intended to be of value to all of those people working in materials, sales, quality, maintenance, and practically everyone else within the manufacturing enterprise.
While the use of backflushing, phantom assemblies, and batch posting continues to be a widespread approach to setting up manufacturing systems, evolving business requirements and technology improvements are requiring all of us to reevaluate these approaches. New paradigms are quickly evolving in the world of the digital twin.
For more on digital twin technology, see Digital Twin Excellence: Two Shining Examples.