Human Muscle sEMG Signal and Gesture Recognition Technology Based on Multi-Stream Feature Fusion Network
DOI: https://doi.org/10.4108/eetpht.10.7230

Keywords: Multi-stream Characteristics, Convolutional Neural Networks, Surface Electromyography Signal, Gestures, Recognition

Abstract
Surface electromyography (sEMG) signals are valuable for gesture recognition because they reflect muscle activity in real time. However, existing gesture recognition technologies do not fully exploit these signals, leading to unsatisfactory recognition results. To address this, a Butterworth filter was first applied to remove high-frequency noise from the signal, and a moving-threshold method was introduced to extract the effective signal segments. A gesture recognition model based on a multi-stream feature fusion network was then constructed: multiple parallel paths perform feature extraction and fusion, combining convolutional neural networks with a residual attention mechanism. Compared with popular methods of the same type, the proposed method achieved the highest recognition accuracy of 92.1% and the lowest recognition error of 5%. Its recognition time for a single gesture image was as short as 4 s, with a maximum Kappa coefficient of 0.92. This method based on multi-stream feature fusion networks can therefore effectively improve the accuracy and robustness of gesture recognition and has high practical value.
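The preprocessing stage described above (Butterworth low-pass filtering followed by moving-threshold extraction of active segments) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sampling rate, cutoff frequency, window length, and threshold ratio are assumed values chosen for the example, and the moving-threshold rule (sliding-window mean of the rectified signal compared against a fraction of its peak) is one common interpretation of such a method.

```python
import numpy as np
from scipy.signal import butter, filtfilt


def lowpass_butterworth(x, fs, cutoff=450.0, order=4):
    """Zero-phase Butterworth low-pass to suppress high-frequency noise.

    cutoff and order are illustrative defaults, not values from the paper.
    """
    b, a = butter(order, cutoff, btype="low", fs=fs)
    return filtfilt(b, a, x)


def active_segments(x, fs, win_ms=50, thresh_ratio=0.2):
    """Moving-threshold extraction of active (effective) signal segments.

    A sliding-window mean of the rectified signal |x| is compared against a
    threshold set as a fraction of its peak; contiguous supra-threshold runs
    are returned as (start, end) sample indices.
    """
    win = max(1, int(fs * win_ms / 1000))
    envelope = np.convolve(np.abs(x), np.ones(win) / win, mode="same")
    mask = envelope > thresh_ratio * envelope.max()
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return []
    breaks = np.flatnonzero(np.diff(idx) > 1)
    starts = np.r_[idx[0], idx[breaks + 1]]
    ends = np.r_[idx[breaks], idx[-1]]
    return list(zip(starts, ends))


if __name__ == "__main__":
    # Synthetic sEMG-like trace: low-level noise with one activity burst.
    fs = 2000
    rng = np.random.default_rng(0)
    t = np.arange(2 * fs) / fs
    x = 0.01 * rng.standard_normal(t.size)
    x[fs // 2 : fs] += np.sin(2 * np.pi * 100.0 * t[fs // 2 : fs])
    filtered = lowpass_butterworth(x, fs)
    print(active_segments(filtered, fs))
```

On a synthetic trace like the one in the demo, the detector isolates the burst as a single (start, end) run, which would then be fed to the multi-stream feature extraction paths.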
License
Copyright (c) 2024 Xiaoyun Wang
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
This is an open access article distributed under the terms of the CC BY-NC-SA 4.0 license, which permits copying, redistributing, remixing, transforming, and building upon the material in any medium, so long as the original work is properly cited.
Funding data
University Natural Science Research Project of Anhui Province (Grant No. 2022AH040280)