The equations for a least‐squares fit when the data have errors in both X and Y are developed using a Taylor expansion of both the fitting function y and the "independent" variable x. The method is an extension of the usual treatment for fitting data with a nonlinear function. It is not especially complex and can be implemented in a small program. Examples show the strengths and weaknesses of the approach.
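One common way to fold an X error into a least-squares fit, closely related to the Taylor-expansion idea in the abstract, is the effective-variance weighting, where each point's variance is inflated by the slope of the fitting function at that point. The sketch below applies this to a straight-line fit; the iteration scheme, starting guess, and convergence count are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def fit_line_xy_errors(x, y, sx, sy, n_iter=20):
    """Straight-line fit y = a + b*x with errors in both x and y.

    Each point is weighted by 1 / (sy**2 + (b*sx)**2): the x error,
    propagated through the local slope b, is added to the y variance.
    Because the weights depend on b, the weighted fit is iterated.
    Illustrative sketch only; starts from an ordinary weighted fit (b = 0).
    """
    a, b = 0.0, 0.0
    for _ in range(n_iter):
        w = 1.0 / (sy**2 + (b * sx)**2)     # effective variances
        S = w.sum()
        Sx, Sy = (w * x).sum(), (w * y).sum()
        Sxx, Sxy = (w * x * x).sum(), (w * x * y).sum()
        d = S * Sxx - Sx**2                 # determinant of normal equations
        a = (Sxx * Sy - Sx * Sxy) / d
        b = (S * Sxy - Sx * Sy) / d
    chi2 = (w * (y - a - b * x)**2).sum()   # chi-square with final weights
    return a, b, chi2
```

For a nonlinear fitting function the same idea applies with the local derivative df/dx replacing the constant slope b, which is where the Taylor expansion of both y and x enters.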
Research Article | March 01 1993
Least‐squares fits when there are errors in X
Peter L. Jolivette; Least‐squares fits when there are errors in X. Comput. Phys. 1 March 1993; 7 (2): 208–212. https://doi.org/10.1063/1.168460