To improve the recognition rate of speaker recognition systems, a model that fuses a Convolutional Neural Network with a Gated Recurrent Unit and is trained with the Additive Margin--Softmax loss function is proposed from the perspective of model discriminability. The loss simultaneously reduces the distance between features of the same speaker and increases the distance between features of different speakers, while layer normalization constrains the distribution of the high-dimensional features. To address the poor robustness of speaker recognition systems in real-world scenes, the SpecAugment data-augmentation method is applied when training the speaker model to counter external environmental interference. The recognition performance of the proposed and traditional methods is analyzed on experimental data. The results show that, compared with the other models, the Additive Margin--Convolutional Neural Network--Gated Recurrent Unit method achieves an equal error rate of 4.48% and a recognition rate of 99.18%; adding layer normalization to the training model improves training speed to a certain extent, and the resulting speaker model is more robust.
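The two key ingredients named in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the scale `s = 30.0` and margin `m = 0.35` below are common illustrative defaults for Additive Margin--Softmax, not values from the paper, and `spec_augment` applies only a single frequency mask and a single time mask, whereas the full SpecAugment policy also includes time warping and multiple masks.

```python
import numpy as np

def am_softmax_loss(features, weights, labels, s=30.0, m=0.35):
    """Additive Margin--Softmax loss on a batch of embeddings.

    features: (N, D) speaker embeddings; weights: (D, C) class weights;
    labels: (N,) integer speaker ids. s and m are illustrative defaults.
    """
    # L2-normalize embeddings and class weights so the logits are cosines
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = f @ w                                   # (N, C) cosine similarities
    # Subtract the margin m from the target-class cosine only: this shrinks
    # intra-class distance and enlarges inter-class distance at the same time.
    margin_cos = cos.copy()
    margin_cos[np.arange(len(labels)), labels] -= m
    logits = s * margin_cos                       # scale restores logit range
    # Numerically stable cross-entropy over the margin-adjusted logits
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def spec_augment(spec, freq_mask=8, time_mask=10, rng=None):
    """Toy SpecAugment: zero one random frequency band and one time band
    of a (freq, time) spectrogram, simulating environmental corruption."""
    rng = rng or np.random.default_rng(0)
    out = spec.copy()
    n_freq, n_time = out.shape
    f0 = rng.integers(0, n_freq - freq_mask)
    out[f0:f0 + freq_mask, :] = 0.0               # frequency masking
    t0 = rng.integers(0, n_time - time_mask)
    out[:, t0:t0 + time_mask] = 0.0               # time masking
    return out
```

Because the margin is subtracted only from the target class, the loss with `m > 0` is strictly larger than plain normalized softmax on the same batch, which is what forces tighter same-speaker clusters during training.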
Lan, Chaofeng; Wang, Yuqiao; Zhang, Lei; Zhao, Hongyun
Affiliations: College of Measurement and Communication Engineering, Harbin University of Science and Technology, Harbin, China; College of Measurement and Communication Engineering, Harbin University of Science and Technology, Harbin, China; Beidahuang Industry Group General Hospital, Harbin, China; College of Measurement and Communication Engineering, Harbin University of Science and Technology, Harbin, China
JAES Volume 70 Issue 7/8 pp. 611-620; July 2022
Publication Date: July 19, 2022