Scholars Journal of Engineering and Technology | Volume-11 | Issue-08
Fault Detection Method based on Local and Global Attention Mechanisms
LU Zhen Jie
Published: Aug. 21, 2023
DOI: 10.36347/sjet.2023.v11i08.004
Pages: 177-182
Abstract
In recent years, the wide application of distributed control systems has allowed large volumes of production process data to be collected and stored, providing a solid data foundation for process monitoring techniques based on deep learning. The Transformer is a fully connected attention model that captures the global dependencies of data by computing the correlation between any two positions. This paper proposes a Transformer model based on local and global attention mechanisms. First, the data is standardized to eliminate the effect of differing dimensions, and positional encoding is applied to mark position information. The data is then split into two equal parts along the feature dimension: one part passes through the standard attention mechanism to capture the global information of the sequence, while the other passes through a local attention mechanism to capture its local information. The captured local and global information is then fused, which reduces computational complexity and compensates for the Transformer's weakness in capturing local information. Applying the proposed model to fault detection in the penicillin fermentation process, experiments verify that it achieves higher fault detection accuracy than the standard Transformer model.
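The split-and-fuse idea described in the abstract can be sketched as follows. This is an illustrative NumPy sketch under stated assumptions, not the authors' implementation: the learned query/key/value projections, multi-head structure, and feed-forward layers are omitted, and the local branch is assumed to use a sliding-window mask with a hypothetical `window` parameter.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_attention(x):
    # standard (full) self-attention: every time step attends to all others,
    # capturing global dependencies via pairwise correlations
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)  # (T, T) correlation between any two positions
    return softmax(scores) @ x

def local_attention(x, window=3):
    # windowed self-attention: each time step attends only to neighbors
    # within `window` steps, capturing local information at lower cost
    T, d = x.shape
    scores = x @ x.T / np.sqrt(d)
    idx = np.arange(T)
    mask = np.abs(idx[:, None] - idx[None, :]) > window
    scores = np.where(mask, -np.inf, scores)  # block attention outside window
    return softmax(scores) @ x

def local_global_attention(x, window=3):
    # split the features into two equal halves: one half goes through the
    # global branch, the other through the local branch; fuse by concatenation
    half = x.shape[-1] // 2
    g = global_attention(x[:, :half])
    l = local_attention(x[:, half:], window)
    return np.concatenate([g, l], axis=-1)

# toy sequence: 8 time steps, 4 (standardized) process variables
x = np.random.default_rng(0).standard_normal((8, 4))
y = local_global_attention(x, window=2)
```

Because each branch operates on only half the feature dimension, the score computations are cheaper than running full attention over all features, which matches the complexity reduction claimed in the abstract.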