
AUTHOR

A. J. Buchsbaum


Shrinking language models by robust approximation

2002

We study the problem of reducing the size of a language model while preserving recognition performance (accuracy and speed). A successful approach has been to represent language models by weighted finite-state automata (WFAs). Analogues of classical automata determinization and minimization algorithms then provide a general method to produce smaller but equivalent WFAs. We extend this approach by introducing the notion of approximate determinization. We provide an algorithm that, when applied to language models for the North American Business task, achieves 25-35% size reduction compared to previous techniques, with negligible effects on recognition time and accuracy.
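The abstract's starting point is classical minimization: represent the language model as a weighted finite-state automaton (WFA) and merge states that behave identically, shrinking the machine without changing the weighted language it accepts. The sketch below illustrates that baseline idea with a naive one-pass signature-based merge; it is a minimal illustration under assumed data structures (a dict of state-to-edge lists), not the paper's approximate-determinization algorithm.

```python
from collections import defaultdict

def merge_equivalent_states(transitions, finals):
    """Naively merge WFA states with identical behavior.

    `transitions` maps state -> list of (label, weight, dest) edges.
    States sharing finality and outgoing edges are collapsed, reducing
    automaton size while preserving the weighted language. Illustrative
    only; real minimizers iterate partition refinement to a fixpoint.
    """
    # Signature: (is-final, sorted outgoing edges).
    signature = {
        s: (s in finals, tuple(sorted(transitions.get(s, ()))))
        for s in transitions
    }
    rep = {}    # signature -> representative state
    remap = {}  # old state -> merged state
    for s, sig in signature.items():
        remap[s] = rep.setdefault(sig, s)
    # Rewrite all edges through the state remapping.
    merged = defaultdict(set)
    for s, edges in transitions.items():
        for label, weight, dest in edges:
            merged[remap[s]].add((label, weight, remap.get(dest, dest)))
    return {s: sorted(e) for s, e in merged.items()}

# States 1 and 2 behave identically and are merged: 3 states become 2.
wfa = {
    0: [("a", 0.5, 1), ("b", 0.5, 2)],
    1: [("c", 1.0, 3)],
    2: [("c", 1.0, 3)],
}
small = merge_equivalent_states(wfa, finals={3})
print(len(small))  # prints 2
```

Approximate determinization, the paper's contribution, relaxes the equivalence requirement so that states with merely *similar* weighted behavior can also be merged, trading exact equivalence for the reported 25-35% additional size reduction.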

Topics: Theoretical computer science, Finite-state machine, Nested word, Computer science, Quantum finite automata, Automata theory, Language model, Algorithm, Natural language, Automaton

Published in: Proceedings of the 1998 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP '98 (Cat. No.98CH36181)