As artificial intelligence (AI) and machine learning (ML) have become more widespread and accessible, manufacturers have become increasingly interested in these new capabilities to increase productivity (use less, waste less), precision (drive quality and specification), and performance (use workforce and equipment resources better and more flexibly).

With a focus on Smart Manufacturing (SM), it is useful to think about AI as software systems that can recognize, explain, and forecast situations, conditions, and properties for real-time human and machine control, management, optimization, and automation. ML, a subset of AI, applies algorithms that map, interpret, and interpolate relations in numeric data, using prior data to identify the current state and predict future states. Other kinds of AI already useful in manufacturing operations include "feature recognition," used for image analysis, and "natural language processing," which is important for workforce and machine interactions. Digital twins, and a full spectrum of other kinds of simulation, are used as dynamically synchronized virtual models to analyze physical equipment and operations. ChatGPT is getting a lot of attention for applications involving text (i.e., large language models, or LLMs), and its underlying AI engine can also be used with numeric data. In general, AI depends on access to enough of the right data, the domain knowledge to interpret it, and the skills to engineer it for sustainable manufacturing solutions.

It is essential that manufacturers adopt both SM (which defines the breadth and scale of application) and AI (which provides important methods for learning from data). Large manufacturers and small and medium-sized manufacturers (SMMs) alike can leverage SM and AI capabilities to improve individual factory operations, drive product quality, increase flexibility, gain new insights, and pursue market growth. SM also has significant benefit beyond factories: supply chains are becoming digitally connected and interoperable for performance and resilience, and environmental sustainability depends on factory, supply chain, and industry performance together. SM and AI capabilities within factories supply the data needed to meet these cross-industry requirements.

Smart Manufacturing (SM) and AI

SM is about having the right data whenever and wherever it is needed for people and machines to control, manage, and optimize manufacturing operations at scale. Scale and interoperability span equipment, factories, supply chains, and ecosystems. The consistent organization of operational data begins in the factory, but it is critical to using AI/ML more broadly. With a factory's commitment to data, AI/ML grows by implementing solutions that deepen operational understanding, improve forecasts, help respond more quickly and proactively to issues, and support higher-level KPIs in less costly ways. Automation is the use of data and models in those well-defined situations where there is sufficient trust in, and understanding of, the data, operation, and machine action for human involvement to move into an oversight role.

What to Know Before Implementing AI

SM is about the real-time integration and orchestration of business, physical, and digital processes. AI provides the capability to learn from and expand the use of data. Successful SM and AI solutions require changing business and operational approaches so that the collection, availability, and use of data are all managed as key assets.

An AI strategy is best started with small, simple, low-cost solution objectives. There are plenty of situations where data from a few sensors can produce significant benefit, and there are practical ways to add sensors to existing operations, including older equipment. These "low-hanging fruit" solutions are equally important because they simultaneously build the experience and the data collection and contextualization infrastructure needed for expansion and scaling. That includes identifying consistent tag names, data types, and functional naming conventions so data are reusable. The challenge remains ensuring enough of the "right data" are available for the solution of interest. For example, recognizing a process deviation or excursion is different from diagnosing the excursion or forecasting its probability.
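To make the tag-naming point concrete, here is a minimal sketch of what a consistent convention might look like in code. The hierarchical scheme, tag names, and units are hypothetical, not from any particular standard; the point is that declaring names, data types, and units once makes the same data reusable across solutions:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tag:
    """One sensor or data point, described consistently for reuse."""
    name: str   # hierarchical tag name (hypothetical scheme): site/line/equipment/measurement
    dtype: str  # declared data type, e.g. "float"
    unit: str   # engineering unit, e.g. "degC"

# Illustrative tags for a single (hypothetical) piece of equipment.
TAGS = [
    Tag("plant1/line2/press3/temperature", "float", "degC"),
    Tag("plant1/line2/press3/vibration", "float", "mm_s"),
    Tag("plant1/line2/press3/state", "str", ""),
]

def tags_for_equipment(tags, prefix):
    """A consistent hierarchy lets later solutions select data by prefix."""
    return [t for t in tags if t.name.startswith(prefix)]
```

Because every tag carries its own type and unit, a later AI solution can discover and validate its inputs instead of relying on tribal knowledge about what a column means.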

Also, the number of algorithms packaged into different AI products is overwhelming. This is navigated more effectively, however, by keeping in mind that a clear problem description and the nature and availability of the data are the better starting points. As a rule, the more quality data available, the more robust the solution. Although not yet widespread, there is great benefit in sharing data to increase robustness and to avoid "reinventing the wheel."

Applications of AI

Asset management and product quality are good starting applications.

For example:

  • Asset monitoring analyzes groups of measurements as snapshots in time to identify patterns of excursions or irregularities, classify situations, or extract features. This kind of pattern classification is useful for identifying when maintenance is needed, when there has been an excursion from normal operation, and when product quality is expected to pass or fail.

  • Estimation/prediction modeling is used to model the relations between sets of input parameters and output attributes so that desired product attributes or operational performance metrics can be estimated. One useful class is the soft sensor (i.e., turning an offline measurement into an online one).

  • Operational mapping can be used to relate operational outputs back to inputs for control and management purposes. A management example would be mapping product material requirements to raw material supplies to reduce downtime when having to resupply.

  • Feature recognition (from images) is used to recognize observable conditions (e.g., product defects, material conditions, and equipment conditions). Using technologies similar to facial recognition, these systems spot defects as unwanted features and observe changes or problems with operations.
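The pattern-classification idea in the first bullet can be sketched with a deliberately simple baseline: learn what "normal" snapshots of sensor readings look like, then flag a snapshot when any reading drifts too far from that baseline. This is illustrative thresholding only (the sensor values and the 3-sigma limit are assumptions, not a production method), but the fit-then-classify shape is the same one richer models follow:

```python
import statistics

def fit_baseline(snapshots):
    """Learn per-sensor mean and standard deviation from normal-operation snapshots."""
    n = len(snapshots[0])
    means = [statistics.mean(s[i] for s in snapshots) for i in range(n)]
    stdevs = [statistics.stdev(s[i] for s in snapshots) for i in range(n)]
    return means, stdevs

def is_excursion(snapshot, means, stdevs, z_limit=3.0):
    """Flag a snapshot when any sensor deviates beyond z_limit standard deviations."""
    return any(abs(x - m) > z_limit * sd
               for x, m, sd in zip(snapshot, means, stdevs))

# Hypothetical baseline: [temperature, vibration] snapshots from normal operation.
means, stdevs = fit_baseline([[50.0, 1.0], [51.0, 1.1], [49.0, 0.9], [50.5, 1.0]])
is_excursion([50.2, 1.0], means, stdevs)  # -> False (within baseline)
is_excursion([50.0, 5.0], means, stdevs)  # -> True (vibration spike)
```

Note that this model only recognizes that an excursion occurred; diagnosing its cause or forecasting its probability, as the article points out, requires different (and usually more) data.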

Creating AI Solutions

Settling on an AI application is challenging because the possibilities are endless, and the algorithm cannot be counted on to figure out the solution by itself. A systematic approach to creating AI solutions also helps avoid pitfalls like building algorithms that look like they are working when they are not.

It is most important to start small and expand AI solutions by building on first successes. It is equally important to build necessary experience with the nature, condition, contextualization, and engineering of the data when the scope and complexity are lower. Every solution foundationally involves a data and AI model building lifecycle that includes:

  1.  Starting with a well-defined problem that can be addressed with available data
  2.  Collecting, contextualizing, and selecting relevant data and features
  3.  Preprocessing, cleaning, and aggregating selected data
  4.  Selecting an algorithm; dividing data for training, evaluation, and testing; iteratively training and refining the ML model by tuning algorithm parameters
  5.  Evaluating the model for performance, accuracy, precision, and recall
  6.  Deploying the model with a phased approach to build confidence and experience
  7.  Being prepared to tune and adjust as new data become available or changes occur
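The lifecycle above can be sketched end to end on a toy problem. Everything here is synthetic and assumed for illustration: the data are randomly generated, and the "model" is just a single threshold chosen on training data; the split/train/evaluate pattern, not the model, is the point:

```python
import random

random.seed(0)

# Steps 1-3: a well-defined toy problem with labeled data -- predict pass/fail
# quality from one process measurement (synthetic stand-in values).
samples = [random.gauss(50, 5) for _ in range(200)]
data = [(x, int(x > 55)) for x in samples]  # label 1 = "fail" above 55

# Step 4: divide into training and test sets, then "train" a threshold model
# by picking the cutoff that maximizes training accuracy.
train, test = data[:150], data[150:]
best_t = max(range(40, 70),
             key=lambda t: sum((x > t) == bool(y) for x, y in train))

# Step 5: evaluate accuracy, precision, and recall on held-out test data.
tp = sum(1 for x, y in test if x > best_t and y)
fp = sum(1 for x, y in test if x > best_t and not y)
fn = sum(1 for x, y in test if x <= best_t and y)
accuracy = sum((x > best_t) == bool(y) for x, y in test) / len(test)
precision = tp / (tp + fp) if tp + fp else 0.0
recall = tp / (tp + fn) if tp + fn else 0.0
```

Steps 6 and 7 have no code here by design: deployment is phased into the operation to build confidence, and the same evaluation is rerun as new data arrive so the threshold can be re-tuned when the process drifts.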

Although established, ready-to-use AI algorithms and computational engines exist, manufacturers must stay involved with the data needed to train and tune an algorithm. Off-the-shelf AI solutions, cloud services, AI providers, and consultants are important resources. Be wary of providers (and the resulting AI models) that attempt a solution without involving domain experts.

With so many AI resources available, manufacturers also need to keep in mind that infrastructure for managing data through the lifecycle steps will be needed. There are multiple infrastructure options, including cloud solutions that can minimize the on-premises investment. Like algorithm development, these decisions depend heavily on the nature of the data, the solution objectives, and the AI experience level. Lastly, and repeating a key point, a factory operation (including a single unit operation or machine tool) is not likely to generate enough of the right data for many objectives of interest. Aggregating data will likely be needed to build sufficiently robust algorithms.

Conquering AI With Trusted Partners

Recognizing that the application of AI is a journey, it is wise for SMMs to start small and partner with SM providers who can guide their development. Going the "DIY route" tends to be unproductive. There is no reason to reinvent the data, algorithm, or infrastructure requirements of an AI solution. Furthermore, doing so can easily lead to a one-off, unscalable solution with data that are not reusable. It can also increase incompatibilities and security vulnerabilities. Instead, trusted partnerships or coalitions can provide training and help tackle many common requirements that can be costly and time consuming. For example, they can suggest algorithms, configurations, and data processing methods that have been field-tested for similar applications. Partnerships are particularly valuable for hardware/software infrastructure decisions, as well as decisions on how to securely connect and network data.

CESMII, the Clean Energy Smart Manufacturing Innovation Institute, is specifically set up to provide this kind of training and support. Other public-private partnerships available to assist SMMs include the Department of Energy’s Industrial Assessment Center Network and California Manufacturing Technology Consulting (CMTC).

Smart Manufacturing, AI, and the need for a data savvy workforce are here to stay. Smart Manufacturing is essential for competitiveness, product precision, supply chain resilience, and safety — and for social, economic, and environmental sustainability.

The most important takeaway is don’t wait — get started on your AI journey today!

About the Author

Jim Davis

Jim Davis is Vice Provost Emeritus of IT at UCLA’s Office of Advanced Research Computing (OARC) and Special Advisor on Smart Manufacturing and Data Science in the Office of Research and Creative Activities. Jim co-founded the Smart Manufacturing Leadership Coalition (SMLC) and spearheaded UCLA’s leadership role in forming today’s national Manufacturing USA Institute, called the Clean Energy Smart Manufacturing Innovation Institute (CESMII), sponsored by the Department of Energy. Jim co-chaired the development of the report “Towards Resilient Manufacturing Ecosystems Through AI,” which was sponsored by NSF/NIST to address recommendations for AI in the National Strategy for Advanced Manufacturing. He was on the study committee for the report, “Options for a National Plan for Smart Manufacturing,” just released by the National Academies of Sciences, Engineering and Medicine.
