C. HARP, Seeq, New Orleans, Louisiana
Business philosophies often stress the importance of continuous improvement to gain sustainable competitive advantages. One of these philosophies, lean manufacturing, grew out of Taiichi Ohno's Toyota Production System, or the "Toyota Way." The approach rose to prominence out of necessity during the first oil crisis in 1973: in response to a sudden drop in customer demand, Toyota homed in on eliminating waste, leading to the widespread adoption of just-in-time production systems.
Lean manufacturing has evolved through numerous iterations since its introduction in the 1970s. However, despite advancements, the manufacturing industry still wastes an estimated 20% of every dollar spent.1 While reducing operating expenses (OPEX) and improving return on capital employed remain crucial objectives for continuous improvement and process optimization, some 9,000 companies have made commitments to shareholders to significantly reduce carbon intensity by 2030, according to the United Nations.2 Additionally, 929 of the Forbes Global 2000 companies have stated commitments to achieve net-zero sustainability goals by 2050.3
The evolving business landscape places increased emphasis on optimization, with companies striving to achieve both economic and environmental goals by utilizing process data. Beyond traditional cost-saving measures, companies are recognizing the pivotal role of process optimization in reducing carbon intensity and achieving sustainability targets.
Identify and eliminate waste. The concept of the "seven wastes" listed below is one framework within lean manufacturing that directly aligns with sustainability efforts. Waste can be defined as any activity or action that adds no value to a product or service from the customer's perspective. The seven wastes comprise:
• Overproduction
• Waiting
• Unnecessary transportation
• Overprocessing
• Excess inventory
• Unnecessary motion
• Defects
Each of these wastes has economic or environmental impacts, including energy consumption, emissions or resource depletion. For example, overproduction can lead to excess inventory, which may result in waste if products become obsolete or if they are disposed of prematurely; unnecessary transportation contributes to carbon emissions and resource usage; and motion waste and waiting time can increase energy consumption and reduce operational efficiency.
By identifying and eliminating these wastes, organizations not only improve operational efficiency and reduce costs, but also minimize their environmental footprint. This aligns with the broader goal of sustainability, making the concept of the seven wastes a powerful driver for achieving both economic and environmental objectives.
Data plays a crucial role in identifying and addressing these wastes. By leveraging data analytics and insights, organizations can pinpoint areas of waste within their processes and take targeted actions to eliminate them. This data-driven approach provides concrete evidence to support the need for change, helping to overcome complacency among stakeholders and secure leadership buy-in for sustainability-focused process optimization initiatives.
Once non-value-added activities are identified through process mapping or value stream mapping, the next step is to prioritize process optimization efforts. By analyzing the underlying data related to each activity, organizations can determine which activities have the most significant impact on the process's overall performance and value.
Prioritize process optimization. Collecting baseline data is a key step for data-driven prioritization, which necessitates comparing ongoing operational data with the baseline to measure improvement. This requires access to potentially years' worth of process and production data to normalize for seasonality and demand variability.
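To make this step concrete, the short sketch below (in Python with pandas) shows one way to build a seasonal baseline from historical data and compare ongoing operation against it. The file names and the steam_flow tag are hypothetical placeholders rather than references to any specific system, and a production implementation would also normalize for production rate.

import pandas as pd

# Historical and current process data (hypothetical CSV exports with a
# timestamp index and a "steam_flow" column).
hist = pd.read_csv("history.csv", index_col=0, parse_dates=True)
curr = pd.read_csv("current.csv", index_col=0, parse_dates=True)

# Baseline: median steam flow per calendar month, normalizing for
# seasonality across multiple years of history.
baseline = hist["steam_flow"].groupby(hist.index.month).median()

# Deviation of ongoing operation from the seasonal baseline.
deviation = curr["steam_flow"] - curr.index.month.map(baseline).values
print(deviation.describe())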
Steam generation and consumption optimization is a common high-value activity for manufacturing organizations, with the potential to significantly reduce greenhouse gas (GHG) emissions. Optimization first requires addressing key sources of waste, such as overproduction, unnecessary transportation and waiting time. Steam overproduction leads to excess inventory and wasted energy, and transporting steam over unnecessarily long distances results in energy loss and increased costs.
Conversely, insufficient steam capacity can create waiting time waste, causing production delays, inefficiencies and shortfalls. By optimizing steam generation and distribution processes to match demand and reduce excess, manufacturers can minimize these forms of waste, improving overall efficiency and sustainability.
Process optimization in manufacturing, such as with steam generation and consumption, requires an understanding of seasonality, demand-based usage and fixed energy consumption requirements. These components are challenging to track using outdated tools (e.g., spreadsheets, basic databases), but modern advanced analytics platforms significantly simplify the task. These software platforms empower users of varying process expertise to easily create empirical models and soft sensors that provide insight into the energy demand within processes and help operators identify anomalies for further investigation.
Increasing steam recovery to reduce GHG emissions. Allnex, a global producer of industrial coating resin and additives, used the author’s company to develop a supervised machine-learning (ML) algorithm to calculate total steam demand based on instrument data and steam valve positions. FIG. 1 displays area steam flowrates, control valve on/off statuses and steam demand from equipment (e.g., steam ejectors) throughout the facility. To account for seasonal impacts, ambient temperature was also considered for model training, and contextualized data identifying offline time periods enabled the quick elimination of insignificant periods.
The advanced analytics platform enabled a simple visualization of the model's fit (FIG. 2), along with statistical model performance metrics, providing insights and parameters to understand the underlying patterns in the data.
These insights enabled engineers to prioritize investigations of failed valves and steam traps, based on statistical significance in the model. The resulting energy model also enabled the company to identify anomalies throughout the entire automation system and process equipment to improve energy efficiency and reduce OPEX.
In one instance, the model returned an unexpected result: the steam ejector isolation valve's term showed no statistical significance, indicating a potential equipment malfunction. The valve was tested, and the team determined it was not closing properly, causing waste during idle periods. Separately, high fixed-energy loads pointed to opportunities centered on steam trap failures and other equipment issues.
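A minimal sketch of this style of model is shown below, using ordinary least squares in Python with statsmodels. The column names (steam_flow, valve_a through valve_c, ambient_temp, online) are hypothetical stand-ins for the facility's tags, and the published case does not specify the algorithm's internals. The point is that the intercept approximates the fixed steam load, each valve coefficient estimates that consumer's demand contribution, and a term with no statistical significance, like the ejector isolation valve described above, flags equipment worth testing.

import pandas as pd
import statsmodels.api as sm

# Historical data with hypothetical columns: total steam demand, 0/1
# valve statuses, ambient temperature and an "online" context flag.
df = pd.read_csv("steam_history.csv", index_col=0, parse_dates=True)
train = df[df["online"] == 1]  # exclude offline periods before fitting

X = sm.add_constant(train[["valve_a", "valve_b", "valve_c", "ambient_temp"]])
model = sm.OLS(train["steam_flow"], X).fit()
print(model.summary())

# Coefficients lacking significance point to consumers whose valve
# status does not explain steam usage, e.g., a valve that never closes.
suspect = model.pvalues[model.pvalues > 0.05]
print("Terms lacking significance:", list(suspect.index))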
After the baseline energy model was created and deployed, deviations from the baseline were used to prioritize maintenance efforts and initiate investigations, since a significant deviation suggested that an instrument or valve had likely failed.
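Continuing the sketch above (again with hypothetical tags, and an arbitrary illustrative threshold), scoring recent data against the fitted baseline turns that prioritization into a simple residual check:

# Score recent operating data against the baseline model; large
# residuals suggest a failed instrument, valve or steam trap.
live = df[df["online"] == 1].tail(1000)
X_live = sm.add_constant(live[["valve_a", "valve_b", "valve_c", "ambient_temp"]])
residual = live["steam_flow"] - model.predict(X_live)

threshold = 3 * residual.std()  # illustrative alarm limit
flagged = residual[residual.abs() > threshold]
print(f"{len(flagged)} periods exceed the deviation threshold")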
As a result, the company improved steam recovery by 30% while reducing steam consumption by 15%. Additionally, it reduced GHG emissions by approximately 4,600 t of CO2e/yr,4 which is equivalent to the carbon footprint of 1,000 passenger vehicles on the road in a year, or 86,600 MMBtu of natural gas.5
Optimizing water removal during distillation. A global manufacturer of oil and gas additives also used the author's company to create a first-principles energy model to determine vacuum distillation completion in a batch process. In the application, multi-stage steam ejectors were employed to create the vacuum necessary to remove excess moisture used in the slurry process for dry bulk charge. The recipe maintained the distillation step for a fixed duration after the batch achieved a pressure threshold, even though indications showed that water removal had ceased. This scenario exemplified overprocessing and waiting waste. Reducing the duration of the distillation step would enhance unit capacity while simultaneously decreasing the steam required to achieve the necessary vacuum.
During the distillation step, the heat input was kept constant to prevent excessive carryover. As moisture was removed, evaporative cooling caused a decrease in reactor temperature; when the temperature's rate of rise returned to its original value, the amount of water being removed had become negligible (FIG. 3).
The temperature rate of change was used to calculate energy removed from the reactor contents, and steam properties available in the advanced analytics platform's extensive formula tool were used to calculate the mass rate of moisture removed. Totalizing water removal over the active distillation period facilitated a direct comparison between the required and actual removal, and this was validated by analytical testing of the water concentration (FIG. 4).
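The underlying energy balance is simple: with a constant heat input, any heat that does not appear as a sensible temperature rise is attributed to evaporation. The sketch below illustrates the calculation with made-up values for the heat input, thermal mass and latent heat (the actual process values and the platform's built-in steam property functions are not reproduced here), using a constant latent heat near atmospheric conditions as a simplifying assumption.

import pandas as pd

H_FG = 2257.0   # kJ/kg, latent heat of vaporization of water (~100 degC)
Q_IN = 500.0    # kW, constant heat input (illustrative)
M_CP = 4000.0   # kJ/K, reactor contents mass x heat capacity (illustrative)

# Reactor temperature (degC) indexed by timestamp.
temps = pd.read_csv("reactor_temp.csv", index_col=0, parse_dates=True)["temp"]

# Sensible heat rate from the temperature rate of change (kW).
dt_s = temps.index.to_series().diff().dt.total_seconds()
q_sensible = M_CP * temps.diff() / dt_s

# Heat not raising the temperature is removing moisture (kg/s).
evap_rate = (Q_IN - q_sensible).clip(lower=0) / H_FG

# Totalize water removal over the distillation step (kg).
water_removed = (evap_rate * dt_s).sum()
print(f"Water removed: {water_removed:.0f} kg")

When the computed evaporation rate approaches zero, continuing the distillation step only adds overprocessing and waiting waste, which is exactly the signal used to shorten the recipe.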
The data conclusively demonstrated the feasibility of process optimization, enabling cross-functional teams to efficiently prioritize their implementation efforts. As a result, the optimized process reduced batch cycle time by 60 min and minimized vacuum ejector steam usage. This success not only validated the optimization strategies’ effectiveness, but it also underscored the tangible benefits of collaborative, data-driven decision-making for driving substantial operational efficiency improvements.
From the "Toyota Way" to the present day. The journey of process optimization, driven by lean principles and data-driven insights, highlights the profound impact of enhancing efficiency and promoting sustainability. From lean manufacturing's inception during the oil crisis to the modern-day focus on reducing carbon intensity, the evolution of process optimization has consistently delivered tangible operational improvements.
Advanced analytics platforms place process optimization in employees’ hands, fostering a culture of continuous improvement that drives long-term sustainability and competitiveness. By embracing lean principles and prioritizing data-driven process optimization efforts, organizations can strategically allocate resources, streamline decision-making, reduce waste and significantly increase process efficiency. HP
LITERATURE CITED
Chris Harp is a Senior Analytics Engineer at Seeq. He has a process engineering background and earned a BS degree in chemical engineering from Louisiana State University, and a BA degree in economics from the University of Alabama. Harp has nearly a decade of experience working for and with oil and gas companies, including ExxonMobil and Chevron, to solve high-value business problems. In his current role, he enjoys supporting industrial organizations as they maximize value from their time series data.