DCA-BiGRU
A PyTorch implementation of the paper *Fault diagnosis for small samples based on attention mechanism*.
In fact, the title *Fault diagnosis for small samples based on an interpretable improved space-channel attention mechanism and improved regularization algorithms* would describe the research content of the paper more precisely.
The dataset is sampled at 12 kHz under a 1 hp load.
Contributions:
- 1D-signal attention mechanism [code]
- AMSGradP [code]
- 1D-Meta-ACON [code]
- At the beginning, I found that many model designs did not connect a GAP operation after the BiGRU/BiLSTM, even though this is basically a routine operation. GAP turned out to work very well. [code]
- 1D-Grad-CAM++ [code]
- AdaBN [code]
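The BiGRU-plus-GAP idea from the contributions above can be sketched as follows. This is a minimal illustration, not the paper's exact architecture: the layer sizes, class count, and sequence layout here are assumptions, and the linked [code] holds the actual implementation.

```python
import torch
import torch.nn as nn

class BiGRUWithGAP(nn.Module):
    """Sketch: a BiGRU followed by global average pooling (GAP)
    over the time dimension, instead of taking only the last step.
    All sizes below are illustrative assumptions."""
    def __init__(self, in_features=64, hidden=32, n_classes=10):
        super().__init__()
        self.bigru = nn.GRU(in_features, hidden, batch_first=True,
                            bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):          # x: (batch, time, features)
        out, _ = self.bigru(x)     # (batch, time, 2 * hidden)
        out = out.mean(dim=1)      # GAP: average over all time steps
        return self.fc(out)

model = BiGRUWithGAP()
logits = model(torch.randn(8, 128, 64))
print(logits.shape)  # torch.Size([8, 10])
```

Averaging over every time step lets all hidden states contribute to the class logits, which is likely why GAP helps compared with using the final hidden state alone.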
Attention Block(SCA)
How does it work?
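As a rough illustration of how a space-channel attention (SCA) block on 1D signals can work, the sketch below applies channel attention from pooled statistics and then spatial attention from a 1D convolution. This is a hypothetical simplification with assumed kernel sizes and reduction ratio; the paper's actual SCA design is in the linked [code].

```python
import torch
import torch.nn as nn

class SCA1d(nn.Module):
    """Hypothetical sketch of a 1D space-channel attention block:
    first reweight channels, then reweight positions along the signal.
    Kernel size and reduction ratio are assumptions, not the paper's."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.channel = nn.Sequential(         # channel attention branch
            nn.AdaptiveAvgPool1d(1),
            nn.Conv1d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(),
            nn.Conv1d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.spatial = nn.Sequential(         # spatial attention branch
            nn.Conv1d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x):              # x: (batch, channels, length)
        x = x * self.channel(x)        # scale each channel by its weight
        return x * self.spatial(x)     # scale each position by its weight

att = SCA1d(16)
y = att(torch.randn(4, 16, 256))
print(y.shape)  # torch.Size([4, 16, 256])
```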
If it is helpful for your research, please kindly cite this work:
@article{ZHANG2022110242,
  title = {Fault diagnosis for small samples based on attention mechanism},
  journal = {Measurement},
  volume = {187},
  pages = {110242},
  year = {2022},
  issn = {0263-2241},
  doi = {https://doi.org/10.1016/j.measurement.2021.110242},
  url = {https://www.sciencedirect.com/science/article/pii/S0263224121011507},
  author = {Xin Zhang and Chao He and Yanping Lu and Biao Chen and Le Zhu and Li Zhang}
}
Our other works
@article{HE,
  title = {Physics-informed interpretable wavelet weight initialization and balanced dynamic adaptive threshold for intelligent fault diagnosis of rolling bearings},
  journal = {Journal of Manufacturing Systems},
  volume = {70},
  pages = {579-592},
  year = {2023},
  issn = {1878-6642},
  doi = {https://doi.org/10.1016/j.jmsy.2023.08.014},
  author = {Chao He and Hongmei Shi and Jin Si and Jianbo Li}
}
Environment
- pytorch == 1.10.0
- python == 3.8
- cuda == 10.2
Contact
- Chao He
- chaohe#bjtu.edu.cn (please replace # by @)