Transactions of China Electrotechnical Society  2023, Vol. 38 Issue (5): 1390-1400    DOI: 10.19595/j.cnki.1000-6753.tces.220285
Incremental Partial Discharge Recognition Method Combining Knowledge Distillation with Graph Neural Network
Zhang Yi, Zhu Yongli
School of Electrical and Electronic Engineering North China Electric Power University Baoding 071003 China

Abstract  Partial discharge (PD) is a primary hidden danger threatening the insulation safety of high-voltage power equipment. Discharge type is typically correlated with the kind of insulation damage, so identifying PD types allows many insulation faults to be predicted or detected in a timely manner. Recently, deep learning (DL) has been increasingly applied to PD, showing excellent performance in PD pattern recognition. However, the learning process of such models terminates once all currently available data have been learned, which means these PD recognition models cannot be trained further on new PD data collected later. To address this, an incremental learning method for PD recognition combining knowledge distillation and a graph neural network (GNN) is proposed in this paper, which gradually extends the generalization ability of the original recognition model.
First, a deep neural network (DNN) is trained on the old PD data set as the original model M. Then, following knowledge distillation theory, the prior knowledge in M is transferred to avoid forgetting during incremental training by replaying a small amount of old PD data; meanwhile, the new PD data are learned with the assistance of this prior knowledge, improving the generalization ability of M. Finally, to adapt to the uncertain size of the new data set, GNN layers are adopted to extract the rich correlation information among the various types of PD data, compensating for the information shortage of limited samples. In this way, the DL-based PD model learns the continuously growing PD data efficiently without retraining on all the old PD data, and achieves better incremental recognition across different sizes of new data.
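The paper does not publish implementation details, but the distillation step it describes can be illustrated with a standard combined loss: a hard cross-entropy term on the true label plus a soft cross-entropy term against the original model's temperature-scaled outputs. The function below is a minimal pure-Python sketch of that idea; the names, the temperature of 2.0, and the weighting alpha are illustrative assumptions, not values from the paper.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                     # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=2.0, alpha=0.5):
    """Hard cross-entropy on the true label, blended with soft
    cross-entropy against the frozen original model's (teacher's)
    tempered outputs. alpha=1.0 recovers plain supervised training."""
    hard_probs = softmax(student_logits)
    hard_loss = -math.log(hard_probs[true_label])
    soft_student = softmax(student_logits, temperature)
    soft_teacher = softmax(teacher_logits, temperature)
    soft_loss = -sum(t * math.log(s)
                     for t, s in zip(soft_teacher, soft_student))
    # T^2 rescaling keeps the soft-term gradients comparable in magnitude
    return alpha * hard_loss + (1 - alpha) * (temperature ** 2) * soft_loss
```

During incremental training, this loss would be minimized over batches mixing replayed old samples with new ones, so the student both matches the teacher on old classes and fits the new labels.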
The experimental results show that, with sufficient new PD data, the proposed incremental PD recognition method increases accuracy by roughly 18%. Compared with traditional knowledge distillation, the proposed method with GNN improves recognition accuracy by 2.75% to 9.25% on several new datasets with fewer samples, and reduces the adverse effects of the class imbalance that normally results from the randomness of insulation defects. Moreover, the method generalizes well and is also effective for the incremental updating of other PD recognition models, such as those based on AlexNet, ResNet, or MobileNet. More significantly, it requires fewer computational resources than retraining, reducing the GPU and RAM footprints by 67.9% and 72.7%, respectively.
The following conclusions can be drawn from the experimental analysis: (1) By introducing knowledge distillation, the DL-based PD recognition model inherits the recognition ability of the original PD model while gradually learning the new PD data arriving at the monitoring platform, which improves the generalization ability of PD models. (2) The added GNN builds graph data by randomly combining multiple types of PD samples, which increases the diversity of training samples; it is therefore well suited to incremental learning from limited or category-imbalanced datasets. (3) Compared with retraining the model, this method requires less hardware during incremental training, making local deployment and maintenance of DL-based PD recognition models feasible. (4) The proposed method is a universal incremental method, effective for many common PD recognition models based on classical DNNs.
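Conclusion (2) describes building graph data by randomly combining PD samples of several types. The exact construction is not given in this abstract, so the sketch below shows one plausible reading: randomly draw a few replayed old-class samples and several new-class samples, and connect them into a fully connected graph whose edges would let GNN layers propagate cross-type correlation information. The function name, the old/new split, and the fully connected topology are assumptions for illustration.

```python
import random

def build_graph_batch(old_samples, new_samples, k_old=2, k_new=4, seed=None):
    """Randomly combine k_old replayed old-type samples with k_new
    new-type samples into one fully connected graph.

    Returns (nodes, edges), where edges is a directed edge list over
    node indices, suitable as input to message-passing GNN layers.
    Each call draws a different random combination, which is what
    increases the diversity of training samples."""
    rng = random.Random(seed)
    nodes = rng.sample(old_samples, k_old) + rng.sample(new_samples, k_new)
    rng.shuffle(nodes)                  # mix old and new nodes together
    n = len(nodes)
    # Fully connected directed graph without self-loops.
    edges = [(i, j) for i in range(n) for j in range(n) if i != j]
    return nodes, edges
```

Because every call resamples and reshuffles, even a small pool of samples yields many distinct graphs, which is consistent with the paper's claim that the GNN mitigates limited and imbalanced data.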
Key words: Partial discharge; deep learning; incremental learning; knowledge distillation; graph neural network
Received: 27 February 2022     
PACS: TM85  
Cite this article:   
Zhang Yi, Zhu Yongli. Incremental Partial Discharge Recognition Method Combining Knowledge Distillation with Graph Neural Network[J]. Transactions of China Electrotechnical Society, 2023, 38(5): 1390-1400.
URL:  
https://dgjsxb.ces-transaction.com/EN/10.19595/j.cnki.1000-6753.tces.220285     OR     https://dgjsxb.ces-transaction.com/EN/Y2023/V38/I5/1390
Copyright © Transactions of China Electrotechnical Society