UBC Theses and Dissertations
Reconstruction of process topology using historical data and process models
Ongalbayeva, Aigerim
Modern process industries are large and complex, and their units are highly interconnected. When an abnormal situation occurs, faults can propagate from one part of the process to another, so knowing the causality and connectivity relationships of the process is vital for keeping it safe. Alarm systems monitor process variables and notify operators when a fault occurs. During a process malfunction, alarms start from a single process variable and quickly propagate to other variables, leading to alarm flooding: a continuous stream of alarms appearing on the monitoring panels. During alarm flooding, it is difficult for operators to find the root cause and resolve the problem in time. Causality analysis between process variables is one method to mitigate alarm flooding; it yields a process topology based on process models and data. A process topology is a map that shows how all units and parts of the process are connected, and it helps to find root causes of faults and to predict future abnormalities. Among the many techniques for causality detection, transfer entropy is a popular method applicable to both linear and nonlinear systems; it quantifies the directed information transfer between variables, estimated from their probability distributions. This thesis focuses on transfer entropy computed from historical data of the Tennessee Eastman process, a widely used benchmark in process control studies. The thesis aims to detect the causality and connectivity map of the continuous process measurements. Particle filters, or sequential Monte Carlo methods, are also considered to approximate the density functions of the filtering problem by propagating particles.
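To make the transfer entropy idea concrete, the following is a minimal histogram-based sketch of the pairwise estimate T(X→Y), which compares the probability of the next value of Y conditioned on its own past with and without knowledge of X. The function name, the equal-width binning, and the single-lag history are illustrative assumptions, not the specific estimator used in the thesis.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Histogram estimate of transfer entropy T(X -> Y) in bits.

    Sketch under simplifying assumptions: one-step histories and
    equal-width binning; not the thesis's exact estimator.
    """
    # Discretize both series into equal-width bins.
    xd = np.digitize(x, np.histogram_bin_edges(x, bins))
    yd = np.digitize(y, np.histogram_bin_edges(y, bins))

    # Joint samples of (y_{t+1}, y_t, x_t).
    triples = np.stack([yd[1:], yd[:-1], xd[:-1]], axis=1)
    vals, counts = np.unique(triples, axis=0, return_counts=True)
    n = counts.sum()
    p_xyz = counts / n

    # Marginal probability tables over subsets of the columns.
    def marginal(cols):
        sub, c = np.unique(triples[:, cols], axis=0, return_counts=True)
        return {tuple(k): v / n for k, v in zip(sub, c)}

    p_yz = marginal([0, 1])   # p(y_{t+1}, y_t)
    p_zx = marginal([1, 2])   # p(y_t, x_t)
    p_z = marginal([1])       # p(y_t)

    # T = sum p(y1,y0,x0) * log2[ p(y1,y0,x0) p(y0) / (p(y0,x0) p(y1,y0)) ],
    # an identity for p(y1|y0,x0) / p(y1|y0) in terms of joint probabilities.
    te = 0.0
    for v, p in zip(vals, p_xyz):
        y1, y0, x0 = int(v[0]), int(v[1]), int(v[2])
        te += p * np.log2(p * p_z[(y0,)] / (p_yz[(y1, y0)] * p_zx[(y0, x0)]))
    return te
```

Applied to every pair of measured variables, such estimates give a directed score matrix from which a causality and connectivity map can be thresholded.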
Attribution-NonCommercial-NoDerivatives 4.0 International