
Converting Knowledge into Insights – Generative AI

Introduction

Generative AI and large language models (LLMs) – we are hearing a lot about them these days. 

Let us try to uncover what they really are. 

Simply put, it is a class of AI models with human-like abilities to interpret text and numbers. This interpretation covers not only analysis and summarization of data but also generative capabilities, where the model can propose the next line of action. Generative models have become intelligent enough to predict the next action by recognizing patterns in complex, multi-dimensional datasets. 

Believe it or not, Generative AI is going to change the way manufacturing industries function today. 

People act, processes react, and people act again. That is what we call experience-based decision-making. During this act-react-act cycle, several dimensions of data are generated: some relational, some time series, and some in document formats. These data are stored in different systems, and apart from the relational and time-series data, hardly any of it is ever accessed to derive insights. 

Though this might look lucrative, Gen AI has no visibility into the domain or the specifics of an organization. It will not understand domain-specific keywords or the underlying constructs of the data. How, then, do we infuse that knowledge? 

Every generative AI model needs context to extract meaningful insights. In manufacturing, context is what becomes wisdom for the generative models. This context is presented to the language models through knowledge graphs, which store the relevant information about an organization's processes and operations. 
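To make this concrete, a knowledge graph can be thought of as a set of subject-predicate-object facts about the plant. The sketch below is a minimal, illustrative model in Python; all entity and relation names (Compressor-101, has_sensor, and so on) are hypothetical examples, not a real schema.

```python
# A minimal sketch of a knowledge graph as subject-predicate-object triples.
# Entity and relation names are illustrative, not from any real plant system.
triples = [
    ("Compressor-101", "located_in", "Plant-A"),
    ("Compressor-101", "has_sensor", "VibrationSensor-7"),
    ("VibrationSensor-7", "measures", "vibration"),
    ("Compressor-101", "maintained_by", "Team-Mechanical"),
]

def neighbors(entity, triples):
    """Return all (predicate, object) facts stored about an entity."""
    return [(p, o) for s, p, o in triples if s == entity]

# Everything the graph knows about Compressor-101 becomes the
# domain context handed to the language model:
context = neighbors("Compressor-101", triples)
```

In a production setting this would typically live in a graph database rather than a Python list, but the principle is the same: facts about equipment, sensors, and teams are interlinked and retrievable per entity.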

Let us go a little deeper into it. 

The underlying foundation of LLMs is the constructs of language: semantics and syntax. Semantics reveals the sentiment and meaning of a text or number in a given context, while syntax refers to the grammatical structure of the language. Together, these form what we call natural language processing (NLP). 

Let's build on the example use case – compressors – discussed in my previous article on knowledge graphs. In this follow-up article, I will take the example of equipment maintenance and troubleshooting workflows. 

Current Practice:

Imagine an abnormality has been observed in one of the compressors. The maintenance team would need an investigation that demands analysis of multiple datasets from multiple systems. This process is time-consuming and data-intensive and, above all, experience-driven. 

Enhanced Workflow (Gen AI-based):

With Generative AI powered by knowledge graphs, the system allows users to ask questions in human language. The LLM processes the user query, chunking/tokenizing it into smaller contexts. These contexts are the essential elements the LLM compares with the underlying data, and they are used to perform complex data analysis and similarity search across all the data sources. Domain context is handled by the knowledge graph, which interlinks all types of data; the LLM leverages this domain know-how to extract the required insights. 
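The query-processing step above can be sketched in miniature. The code below tokenizes a maintenance question and ranks hypothetical document snippets by a simple bag-of-words cosine similarity; it is a toy stand-in for the embedding-based similarity search a real LLM pipeline would use, and the document names and contents are invented for illustration.

```python
import math
from collections import Counter

def tokenize(text):
    """Chunk text into lowercase word tokens (a toy stand-in for an LLM tokenizer)."""
    return [w.strip(".,?!") for w in text.lower().split()]

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical snippets pulled from different source systems
# (maintenance logs, equipment manuals, work orders).
documents = {
    "log-42": "compressor vibration exceeded threshold during startup",
    "manual-3": "lubrication schedule for pump bearings",
}

query = "Why is the compressor vibration high?"
q_vec = Counter(tokenize(query))

# Rank documents by similarity to the user's question.
ranked = sorted(documents,
                key=lambda d: cosine(q_vec, Counter(tokenize(documents[d]))),
                reverse=True)
# ranked[0] is the most relevant snippet to feed back as context
```

A real deployment would replace the bag-of-words vectors with learned embeddings and a vector index, and would enrich the retrieved snippets with the related facts from the knowledge graph before the LLM generates its answer.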

 

The following is a key comparison between the two workflows:

Current workflow                    | Generative AI-based workflow
------------------------------------|---------------------------------------------------------
Manual, time-expensive              | Automated
Experience-driven decision-making   | Data-driven decision-making (captures all historical data)
Data not available in real time     | Data available in real time
Reactive actions                    | Predictive insights