
   
Estimation

A number of different statistical methods are used for estimating parameters from a data set. The most commonly used is the least squares method, which estimates a parameter $\theta$ by minimizing the function:

 \begin{displaymath}S(\theta) = \sum_i ( y_i - f(\theta;x_i) )^2
\end{displaymath} (2.6)

where y is the dependent variable and x the independent variable, while f is a given function. Equation 2.6 can be extended to more parameters if needed. For functions f that are linear in the parameters an analytic solution can be derived, whereas an iterative scheme must be applied in most non-linear cases. Several conditions must be fulfilled for the method to give a reliable estimate of $\theta$. The most important assumptions are that the errors in the dependent variable are normally distributed, that the variance is homogeneous, and that the independent variables are error-free and uncorrelated.
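For the linear case mentioned above, the analytic solution can be sketched as follows. This is an illustrative example, not part of the original text; the straight-line model f(theta; x) = a + b*x and all function names are chosen here for demonstration:

```python
# Least squares fit of a straight line f(theta; x) = a + b*x.
# Illustrative sketch: the closed-form solution exists because
# f is linear in the parameters, as noted in the text.

def least_squares_line(xs, ys):
    """Return (a, b) minimizing S = sum_i (y_i - (a + b*x_i))**2."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    # Normal-equation solution for the two-parameter linear model.
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = ybar - b * xbar
    return a, b

def residual_sum(xs, ys, a, b):
    """The objective S(theta) of Equation 2.6 for the fitted line."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

# Noise-free data on the line y = 2 + 3x recovers the parameters exactly.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 + 3.0 * x for x in xs]
a, b = least_squares_line(xs, ys)
```

With noisy data the fitted parameters would only approximate the true values, with a residual sum S > 0 at the minimum.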

The other main technique for parameter estimation is the maximum likelihood method, in which the likelihood function, i.e. the joint probability of the observed data regarded as a function of the parameter $\theta$:

 \begin{displaymath}l(\theta) = \prod_i P(\theta,x_i)
\end{displaymath} (2.7)

is maximized. In Equation 2.7, P denotes the probability density of the individual data points. Normally, the log-likelihood $L = \log(l)$ is used, since it turns the product into a sum and thereby simplifies the maximization. The method can be applied to any given distribution. For normally distributed errors the two methods give the same result.
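As an illustration (not from the original text), the log-likelihood of Equation 2.7 can be maximized numerically for a normal density with known variance; in this case the maximum coincides with the sample mean, which is also the least squares estimate, consistent with the equivalence noted above. The data values and grid are arbitrary examples:

```python
import math

def log_likelihood(mu, data, sigma=1.0):
    """Log of Equation 2.7 for a normal density with known sigma."""
    return sum(
        -0.5 * math.log(2.0 * math.pi * sigma ** 2)
        - (x - mu) ** 2 / (2.0 * sigma ** 2)
        for x in data
    )

data = [1.0, 2.0, 3.0, 2.5, 1.5]

# Crude grid search over candidate values of mu (illustrative only;
# in practice an analytic solution or a proper optimizer is used).
grid = [i / 100.0 for i in range(401)]  # mu in [0, 4]
best_mu = max(grid, key=lambda mu: log_likelihood(mu, data))

sample_mean = sum(data) / len(data)     # analytic maximum for this case
```

For distributions without a closed-form maximum, the same grid search would be replaced by an iterative optimizer, but the principle of maximizing L is unchanged.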


Petra Nass
1999-06-15