Using counterexamples, we show that vocabulary size and static and dynamic branching factors are all inadequate as measures of the speech recognition complexity of finite-state grammars. Information-theoretic arguments show that perplexity (the logarithm of which is the familiar entropy) is a more appropriate measure of equivalent choice. It, too, has certain weaknesses, which we discuss. We show that perplexity can also be applied to languages having no obvious statistical description, since an entropy-maximizing probability assignment can be found for any finite-state grammar. Table I shows perplexity values for some well-known speech recognition tasks.
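For concreteness, a worked statement of the relation the abstract's parenthetical refers to (the notation here is ours, not the paper's): if a model P assigns probability P(w1, ..., wn) to a test sequence of n words, its per-word entropy H and its perplexity are related by

```latex
H = -\frac{1}{n}\,\log_2 P(w_1, w_2, \ldots, w_n),
\qquad
\mathrm{perplexity} = 2^{H}.
```

A uniform choice among k equally likely alternatives gives H = log2 k and perplexity k, which is the sense in which perplexity measures "equivalent choice."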

TABLE I. Perplexity values for some well-known speech recognition tasks.

Task          Perplexity (phone)   Perplexity (word)   Vocabulary size   Dynamic branching factor
IBM-Lasers          2.14                21.11                1000                1000
IBM-Raleigh         1.69                 7.74                 250                   7.32
CMU-AIX05           1.52                 6.41                1011                  35
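The abstract's claim that an entropy-maximizing probability assignment exists for any finite-state grammar can be illustrated with a classical result due to Shannon: the maximum attainable per-word entropy equals the base-2 logarithm of the largest eigenvalue of the grammar's transition count matrix, so the maximum-entropy perplexity is that eigenvalue itself. The sketch below applies this to a toy two-state grammar of our own invention, not to any of the tasks in Table I, and does not reproduce the paper's own construction.

```python
import numpy as np

# Transition count matrix of a hypothetical two-state grammar:
# A[i][j] = number of distinct words that move the grammar
# from state i to state j.
A = np.array([[1.0, 2.0],
              [3.0, 0.0]])

# Shannon (1948): the maximum per-word entropy over all probability
# assignments to the grammar's transitions is log2 of the largest
# eigenvalue of A; the corresponding perplexity is the eigenvalue.
lam = np.abs(np.linalg.eigvals(A)).max()
print("maximum-entropy perplexity:", lam)           # 3.0
print("maximum per-word entropy:", np.log2(lam))    # ~1.585 bits
```

For this matrix the largest eigenvalue is 3, so no probability assignment on the grammar can present the recognizer with an average equivalent choice of more than three words per position.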
