Feature extraction of dance movement based on deep learning and deformable part model
DOI: https://doi.org/10.4108/eai.5-1-2022.172783

Keywords: DPM, dance movement feature extraction, deep neural network model

Abstract
This article has been retracted, and the retraction notice can be found here: http://dx.doi.org/10.4108/eai.8-4-2022.173790.
In complex scenes, the accuracy of dance movement recognition is low. This paper therefore proposes a dance movement feature extraction method based on deep learning and the deformable part model (DPM). First, the number of filters in the DPM is increased and combined with a branch-and-bound algorithm to improve detection accuracy. Second, a deep neural network model is used to sample points of interest from human dance movements. The features extracted by the DPM and the deep neural network are then fused, which greatly reduces the number of model parameters and keeps the network from becoming too deep. Finally, dance movement recognition is performed on the input data through a fully connected layer. Experimental results show that the proposed method obtains recognition results more quickly and accurately on the dance movement data set.
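The fusion-and-classification step summarized above can be illustrated with a minimal sketch. The following PyTorch code is an assumed illustration, not the authors' implementation: the FusionClassifier class, the feature dimensions, and the number of dance classes are hypothetical placeholders. It only shows the general idea of concatenating a DPM-derived feature vector with a deep-network feature vector and performing recognition with a single fully connected layer.

# Minimal sketch (not the paper's code) of fusing DPM features with deep
# neural network features and classifying them with a fully connected layer.
# All dimensions and the class count below are illustrative assumptions.
import torch
import torch.nn as nn

class FusionClassifier(nn.Module):
    def __init__(self, dpm_dim=128, dnn_dim=256, num_classes=10):
        super().__init__()
        # Project each feature stream to a common size before fusion,
        # which keeps the fused vector (and the final layer) small.
        self.dpm_proj = nn.Linear(dpm_dim, 64)
        self.dnn_proj = nn.Linear(dnn_dim, 64)
        # A single fully connected layer performs the final recognition.
        self.classifier = nn.Linear(64 + 64, num_classes)

    def forward(self, dpm_feat, dnn_feat):
        fused = torch.cat([torch.relu(self.dpm_proj(dpm_feat)),
                           torch.relu(self.dnn_proj(dnn_feat))], dim=1)
        return self.classifier(fused)

if __name__ == "__main__":
    model = FusionClassifier()
    dpm_feat = torch.randn(4, 128)   # placeholder DPM part-filter responses
    dnn_feat = torch.randn(4, 256)   # placeholder deep-network descriptors
    logits = model(dpm_feat, dnn_feat)
    print(logits.shape)              # torch.Size([4, 10])

In this sketch, projecting both streams to the same width before concatenation is one simple way to limit the parameter count of the fused classifier, in the spirit of the parameter reduction the abstract claims.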
License
Copyright (c) 2022 EAI Endorsed Transactions on Scalable Information Systems
This work is licensed under a Creative Commons Attribution 3.0 Unported License.
This is an open access article distributed under the terms of the CC BY-NC-SA 4.0 license, which permits copying, redistributing, remixing, transforming, and building upon the material in any medium, so long as the original work is properly cited.