I-Smooth for improved minimum classification error training

DOI: http://doi.org/10.1109/ICASSP.2010.5495109
Proceedings title: IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2010
Conference: IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2010), March 14-19, 2010, Dallas, Texas, USA
Pages: 4906-4909; # of pages: 4
Subject: Hidden Markov Model; Speech Recognition; Minimum Classification Error
Abstract: Increasing the generalization capability of Discriminative Training (DT) of Hidden Markov Models (HMMs) has recently gained increased interest within the speech recognition field. In particular, achieving such gains with only minor modifications to an existing DT method is of significant practical importance. In this paper, we propose a way to increase the generalization capability of a widely used training method, the Minimum Classification Error (MCE) training of HMMs, with limited changes to its original framework. To this end, we define boundary data, obtained by applying a large steepness parameter to the training samples, and confusion data, obtained by applying a small steepness parameter, and then perform a soft interpolation between the two according to the occupancy counts of the boundary data and the ratio between the boundary and confusion occupancies. The final HMM parameters are then tuned in the same manner as in MCE, using the interpolated boundary data. We show that the proposed method achieves lower error rates than a standard HMM training framework on a phoneme classification task on the TIMIT speech corpus.
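The occupancy-based soft interpolation described in the abstract can be sketched as follows. This is a minimal illustration in the spirit of I-smoothing, assuming a simple count-based back-off weight; the function name, the parameter `tau`, and the exact weighting rule are illustrative assumptions, not the paper's actual formulation.

```python
def interpolate_stat(boundary_occ: float,
                     boundary_stat: float,
                     confusion_stat: float,
                     tau: float = 100.0) -> float:
    """Blend a boundary-data statistic with the corresponding
    confusion-data statistic. With few boundary occupancies the
    result backs off toward the confusion statistic; with many,
    it approaches the boundary statistic. `tau` (hypothetical)
    controls how quickly the boundary data dominates."""
    w = boundary_occ / (boundary_occ + tau)  # -> 1 as occupancy grows
    return w * boundary_stat + (1.0 - w) * confusion_stat
```

Under this sketch, the interpolated statistics would then replace the raw boundary statistics in the usual MCE parameter update.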
Publication date:
Affiliation: National Research Council Canada (NRC-CNRC); NRC Institute for Information Technology
Peer reviewed: Yes
NPARC number: 15236563
Record identifier: 4115fe80-e050-489d-9cf8-45e945938a65
Record created: 2010-06-10
Record modified: 2016-05-09