Publication
INTERSPEECH - Eurospeech 2001
Conference paper
Language models conditioned on dialog state
Abstract
We consider various techniques for using the state of the dialog in language modeling. The language models we built were for use in an automated airline travel reservation system. The techniques we explored include (1) linear interpolation with state-specific models and (2) incorporating state information using maximum entropy techniques. We also consider using the system prompt as part of the language model history. We show that using state results in about a 20% relative gain in perplexity and about a 9% relative gain in word error rate over a system using a language model with no information about the dialog state.
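Technique (1), linear interpolation with state-specific models, can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the word probabilities, the mixing weight `lam`, and the function name `interpolated_prob` are all assumptions chosen for the sketch.

```python
# Hypothetical sketch: linearly interpolate a general language model
# with a dialog-state-specific model. All numbers are illustrative.

def interpolated_prob(word, p_general, p_state, lam=0.7):
    """P(w) = lam * P_state(w) + (1 - lam) * P_general(w).

    `p_general` and `p_state` map words to probabilities; a word
    unseen by a component contributes 0.0 from that component.
    """
    return lam * p_state.get(word, 0.0) + (1.0 - lam) * p_general.get(word, 0.0)

# Toy distributions: in a flight-reservation "ask-destination" state,
# city names are far more likely than under the general model.
p_general = {"boston": 0.01, "the": 0.05, "flight": 0.02}
p_state   = {"boston": 0.20, "the": 0.03, "flight": 0.05}

print(round(interpolated_prob("boston", p_general, p_state), 4))  # 0.143
```

In practice the weight `lam` would be tuned per dialog state on held-out data, so states with little training data fall back toward the general model.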