Mutual-optimization Towards Generative Adversarial Networks for Robust Speech Recognition

Author(s)
Ding, K
Luo, N
Xu, Y
Ke, D
Su, K
Date
2018
Location

Beijing, China

Abstract

In the context of Automatic Speech Recognition (ASR), improving noise robustness remains a challenging task. Speech enhancement combined with Generative Adversarial Networks (GANs), as in SEGAN, is effective at denoising raw waveform speech signals. Replacing waveforms with Mel filterbank spectra as the GAN input has also been proposed and performs better for ASR. However, these techniques still discard information that is useful for recognition when the GAN is applied. In this paper, we investigate how to preserve this useful information and propose a novel model, the Discriminator Generator Classifier-GAN (DGC-GAN). Whereas a standard GAN with only two networks steers the model towards denoising rather than recognition, DGC-GAN adds a third network, a classifier: an ASR system that tunes the GAN output to be easier to recognize. By adding this classifier to the previous GAN to obtain DGC-GAN, we achieve a 29.1% relative improvement in Phone Error Rate (PER) on a tiny dataset and a 47.4% relative PER improvement on a large dataset.
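
The abstract describes a three-network setup in which a classifier (an ASR network) is trained alongside the usual generator and discriminator so that enhancement is steered towards recognition. The sketch below is a minimal, hypothetical illustration of such a mutual-optimization training step, not the authors' implementation: it assumes PyTorch, per-frame Mel filterbank features, frame-level phone labels, and illustrative network shapes and a weighting factor `lam`.

```python
import torch
import torch.nn as nn

feat_dim, hidden, n_phones = 40, 256, 48  # hypothetical Mel-filterbank dim, hidden size, phone set size

# Hypothetical stand-ins for the three networks named in the abstract.
generator = nn.Sequential(nn.Linear(feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, feat_dim))
discriminator = nn.Sequential(nn.Linear(feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))
classifier = nn.Sequential(nn.Linear(feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, n_phones))

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
opt_c = torch.optim.Adam(classifier.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()
ce = nn.CrossEntropyLoss()

def train_step(noisy, clean, phone_labels, lam=1.0):
    """One mutual-optimization step on a batch of (noisy, clean) Mel frames."""
    enhanced = generator(noisy)

    # 1) Discriminator: clean frames are "real", enhanced frames are "fake".
    d_loss = bce(discriminator(clean), torch.ones(clean.size(0), 1)) + \
             bce(discriminator(enhanced.detach()), torch.zeros(noisy.size(0), 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Classifier (ASR): learns to predict phones from the enhanced frames.
    c_loss = ce(classifier(enhanced.detach()), phone_labels)
    opt_c.zero_grad(); c_loss.backward(); opt_c.step()

    # 3) Generator: fool the discriminator AND keep the frames recognizable,
    #    so enhancement is pushed towards recognition rather than denoising alone.
    g_loss = bce(discriminator(enhanced), torch.ones(noisy.size(0), 1)) + \
             lam * ce(classifier(enhanced), phone_labels)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The key design point this sketch tries to capture is that the generator's loss mixes the adversarial term with the classifier's recognition loss, so the enhanced features are optimized jointly for realism and for being easy to recognize.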

Conference Title

Proceedings - International Conference on Pattern Recognition

Subject

Deep learning

Neural networks

Science & Technology

Computer Science, Artificial Intelligence

automatic speech recognition

Citation

Ding, K; Luo, N; Xu, Y; Ke, D; Su, K, Mutual-optimization Towards Generative Adversarial Networks for Robust Speech Recognition, Proceedings - International Conference on Pattern Recognition, 2018, pp. 2699-2704