Optimal estimation of a coin's bias using noisy data is surprisingly different from the same problem with noiseless data. We study this problem using entropy risk to quantify estimators' accuracy. We generalize the "add β" estimators that work well for noiseless coins, and we find that these hedged maximum-likelihood (HML) estimators achieve a worst-case risk of O(N^{-1/2}) on noisy coins, in contrast to O(N^{-1}) in the noiseless case. This increased risk is unavoidable and intrinsic to noisy coins, which we demonstrate by numerical construction of minimax estimators. Practically speaking, the minimax criterion is not a good fit for noisy coins: minimax estimators introduce extreme bias in return for slight improvements in the worst-case risk. We introduce a pointwise lower bound on the minimum achievable risk, and use it to show that HML estimators are pretty good. Finally, we survey scientific applications of the noisy coin model in social science, physical science, and quantum information science.
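To make the idea concrete, here is a minimal sketch of an "add β" style hedged estimator applied to a noisy coin. It is an illustration, not the paper's exact construction: it assumes a simple symmetric noise model with a known flip probability `delta` (each outcome is misreported with probability `delta`), and the choice `beta = 0.5` and the names are the sketch's own.

```python
def hml_estimate(k, n, delta, beta=0.5):
    """Hedged 'add-beta' estimate of a noisy coin's bias.

    Illustrative noise model (an assumption of this sketch): the coin
    has true bias p, but each of the n flips is misreported with known
    probability delta, so the observed heads rate is
        q = (1 - 2*delta) * p + delta.
    k is the number of observed heads.
    """
    # Hedged (add-beta) estimate of the *observed* heads probability;
    # the pseudocounts keep the estimate away from the boundary.
    q_hat = (k + beta) / (n + 2 * beta)
    # Invert the known noise map to recover an estimate of p, then
    # clip to the physically valid range [0, 1].
    p_hat = (q_hat - delta) / (1 - 2 * delta)
    return min(1.0, max(0.0, p_hat))
```

The clipping step is where the noisy problem differs from the noiseless one: observed frequencies near `delta` or `1 - delta` map to boundary values of `p`, which is one intuition for why worst-case risk degrades near the boundary.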
