We investigate, in a Bayesian framework, the performance of two alternative modifications of the 200-year-old method of least squares. The first modification admits arbitrary real positive exponents α instead of α = 2 in the distance measure. This modification leads to estimates that are less sensitive to outliers than traditional least squares. Moreover, even when data are simulated with a Gaussian random number generator, the optimum exponent α may well deviate from α = 2. The second modification consists of abandoning the assumption that the data uncertainties entering the distance measure are exact. Instead, we treat the experimentally determined uncertainties sᵢ as point estimates of the unknown true uncertainties σᵢ. The remarkable result of this modification is a likelihood which, unlike traditional least squares, is perfectly robust against outliers in the case of inconsistent data, yet approaches least-squares results for consistent data. These properties make it unnecessary to select data on the basis of their numerical values.
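To make the two modifications concrete, the following Python sketch fits a straight line by (i) minimizing a generalized distance measure Σ|rᵢ|^α with an arbitrary exponent α, and (ii) maximizing a likelihood in which the true uncertainties σᵢ have been marginalized given the point estimates sᵢ. The straight-line model, the choice α = 1.2, and the particular closed-form marginalized likelihood (Sivia's "conservative" form, obtained from a Jeffreys prior on σᵢ with σᵢ ≥ sᵢ) are illustrative assumptions and need not coincide with the derivation in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def alpha_cost(theta, x, y, s, alpha):
    """Generalized least-squares distance with exponent alpha.

    alpha = 2 reproduces ordinary chi-squared; smaller exponents
    down-weight large residuals and hence outliers.
    """
    model = theta[0] + theta[1] * x          # illustrative straight-line model
    r = (y - model) / s                      # residuals scaled by quoted uncertainties
    return np.sum(np.abs(r) ** alpha)

def robust_neg_log_like(theta, x, y, s):
    """Negative log-likelihood with the true sigma_i marginalized out,
    treating s_i only as point estimates.

    Uses p(d_i) proportional to (1 - exp(-R_i^2 / 2)) / R_i^2, the form that
    follows from a Jeffreys prior on sigma_i with sigma_i >= s_i (an
    assumed stand-in for the likelihood derived in the paper).
    """
    model = theta[0] + theta[1] * x
    r2 = ((y - model) / s) ** 2
    r2 = np.maximum(r2, 1e-12)               # guard the well-defined R -> 0 limit
    return -np.sum(np.log((1.0 - np.exp(-0.5 * r2)) / r2))

# Synthetic consistent data with one gross outlier injected.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 20)
y = 1.0 + 0.5 * x + rng.normal(0.0, 0.2, x.size)
y[5] += 5.0                                  # outlier
s = np.full(x.size, 0.2)                     # quoted (point-estimate) uncertainties

fit_alpha = minimize(alpha_cost, x0=[0.0, 0.0], args=(x, y, s, 1.2))
fit_robust = minimize(robust_neg_log_like, x0=[0.0, 0.0], args=(x, y, s))
print("alpha-norm fit:", fit_alpha.x)
print("marginalized-sigma fit:", fit_robust.x)
```

Both fits leave the outlier in place: no datum is discarded on account of its numerical value, and for consistent data the marginalized likelihood approaches the ordinary least-squares result.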
