TinyML-enabled edge implementation of transfer learning framework for domain generalization in machine fault diagnosis
Journal
Expert Systems with Applications
ISSN
0957-4174
Date Issued
2023-03-01
Author(s)
Asutkar, Supriya
Chalke, Chaitravi
Shivgan, Kajal
Tallur, Siddharth
Abstract
TinyML has the potential to be a major enabler of smart sensor nodes for machine fault diagnosis by embedding powerful machine learning algorithms in low-cost edge devices. Static models trained on pre-collected data may not perform satisfactorily when deployed in use cases whose data follow a significantly different distribution, owing to changes in machine instances, installation, environment, and operating conditions. In this work, we present a transfer learning framework, in conjunction with a TinyML-powered convolutional neural network (CNN) architecture, for vibration-based fault diagnosis of vastly different machines. We report that fine-tuning (retraining) the parameters (weights and biases) of the convolutional layers while transferring the parameters of the dense layers results in higher accuracy for domain generalization than the conventional approach of transferring the convolutional-layer parameters and fine-tuning the dense layers. Transfer learning also yields high accuracy when the total number of training samples used for fine-tuning on the target-domain data sets is as low as 25, i.e., less than 5% of the data set size, which makes the proposed architecture viable for scenarios with a limited number of data points available for training the model. Furthermore, we demonstrate memory-efficient retraining at the edge by fine-tuning only the biases of the hidden layers.
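The two fine-tuning schemes described in the abstract — retraining the convolutional layers while transferring the dense layers, and bias-only retraining for memory-constrained edge devices — can be sketched as selective parameter freezing. This is a hypothetical PyTorch illustration, not the authors' code: the network shape, layer sizes, and class count are assumptions chosen only to make the freezing logic concrete.

```python
import torch
import torch.nn as nn


class VibrationCNN(nn.Module):
    """Toy 1-D CNN for vibration signals (illustrative architecture only)."""

    def __init__(self, n_classes=4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=9), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(8, 16, kernel_size=9), nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),
        )
        self.dense = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 8, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):
        return self.dense(self.conv(x))


model = VibrationCNN()

# Scheme 1 (proposed in the paper): transfer (freeze) the dense-layer
# parameters and fine-tune the convolutional layers -- the reverse of
# conventional transfer learning.
for p in model.dense.parameters():
    p.requires_grad = False
conv_only = [n for n, p in model.named_parameters() if p.requires_grad]

# Scheme 2 (memory-efficient edge retraining): fine-tune only the biases
# of the hidden layers, freezing every weight tensor. This overrides the
# flags set above; in practice one scheme would be chosen per deployment.
for n, p in model.named_parameters():
    p.requires_grad = n.endswith("bias")
bias_only = [n for n, p in model.named_parameters() if p.requires_grad]

# A forward pass still works regardless of which parameters are trainable.
out = model(torch.randn(2, 1, 256))
```

An optimizer built over `filter(lambda p: p.requires_grad, model.parameters())` then updates only the selected subset, which is what keeps the retraining footprint small in the bias-only case.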
Volume
213
Subjects