Maximum entropy is an approach for obtaining posterior probability distributions of modeling parameters. The approach is based on a cost function that quantifies the data-model mismatch and relies on an estimate of an appropriate temperature. Selecting this "statistical temperature" is closely related to estimating the noise covariance. A method for selecting the statistical temperature is derived from analogies with statistical mechanics, in particular the equipartition theorem. With the equipartition-theorem estimate, the statistical temperature can be obtained from a single data sample rather than via the ensemble approach used previously. Examples based on a toy model demonstrate how the choice of temperature affects the resulting posterior probability distributions and illustrate the advantages of the equipartition-theorem approach for selecting the temperature.
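To make the equipartition idea concrete, the sketch below is an illustration only: the straight-line toy model, the least-squares setup, and all variable names are assumptions, not the paper's actual model. It uses the equipartition-style relation that, at the appropriate temperature, each remaining degree of freedom contributes T/2 to the mean cost, so a single data sample yields the estimate T ≈ 2 E_min / (N − M) for N data points and M fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (assumption): straight line with Gaussian noise.
N = 200
sigma_true = 0.5                     # true noise standard deviation
x = np.linspace(0.0, 1.0, N)
y = 1.3 * x + 0.7 + rng.normal(0.0, sigma_true, N)

# Least-squares fit with M = 2 model parameters (slope, intercept).
A = np.vstack([x, np.ones(N)]).T
theta, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ theta

# Cost function quantifying the data-model mismatch.
E_min = 0.5 * np.sum(residuals**2)

# Equipartition-style temperature estimate from this single data sample:
# each of the N - M remaining degrees of freedom contributes T/2 to the cost.
M = 2
T_hat = 2.0 * E_min / (N - M)

# For Gaussian noise, T_hat approximates the noise variance sigma_true**2,
# so the posterior ~ exp(-E(theta) / T_hat) is scaled consistently with the noise.
print(T_hat, sigma_true**2)
```

In this Gaussian setting the estimate coincides with the usual reduced-chi-square noise-variance estimate, which is consistent with the abstract's point that selecting the statistical temperature amounts to estimating the noise covariance.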
