The chapter presents a rigorous mathematical foundation for the theories of geometric correction and parametric fitting. For both, the problem is stated in general terms without assuming Gaussian noise: the role played by the covariance matrix of a Gaussian distribution is taken over by the Fisher information matrix. A lower bound on the covariance matrix of any unbiased estimator of the parameter is derived, corresponding to the Cramér–Rao lower bound of traditional statistics. The maximum likelihood estimator is then proved to attain this bound to first order when the problem belongs to the exponential family. The maximum likelihood estimation process is expressed in a computationally convenient form, in which the rank-constrained generalized inverse is used to discuss the ill-posedness of the problem and the numerical instability of the solution. A statistical problem closely related to parametric fitting is the Neyman–Scott problem: observing multiple data, each having a distribution characterized by a common parameter, called the "structure parameter" or the "parameter of interest," and a distinct parameter, called the "nuisance parameter," that varies from observation to observation, one must estimate the structure parameter without knowing the nuisance parameters.
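For orientation, the classical Cramér–Rao inequality referred to above can be sketched as follows; the notation here (estimator $\hat{u}$, density $p(x;u)$, Fisher information matrix $J$) is generic and may differ from the chapter's own symbols:

```latex
% Cramér–Rao lower bound, generic notation (a sketch, not the
% chapter's formulation).  For any unbiased estimator \hat{u} of u,
V[\hat{u}] \;\equiv\; E\!\left[(\hat{u}-u)(\hat{u}-u)^{\top}\right]
\;\succeq\; J^{-1},
\qquad
J \;=\; E\!\left[
  \left(\frac{\partial \log p(x;u)}{\partial u}\right)
  \left(\frac{\partial \log p(x;u)}{\partial u}\right)^{\!\top}
\right],
```

where $\succeq$ means that the difference of the two matrices is positive semidefinite. The chapter's generalization replaces the Gaussian covariance matrix with the Fisher information matrix $J$ in exactly this role.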