Maximum Likelihood


Definition: Maximum Likelihood

Maximum Likelihood is one of the fundamental methods in statistical estimation theory for constructing estimators of unknown parameters. It was introduced in 1912 by the English statistician R. A. Fisher.


Given a data set and a statistical model for it, maximum likelihood estimation provides an estimate of the model's parameters. When the sample size is large, the method typically yields an excellent estimator of the model parameters.


Mathematical representation:

Assume the random variables X1, X2, ..., Xn form a random sample from a distribution with probability density function (pdf) f(x|θ), where θ is the unknown real-valued parameter of the distribution.


Because the Xi are independent, the joint density of the sample factors into the product of the individual densities:

f(x1, x2, ..., xn | θ) = f(x1|θ) · f(x2|θ) · ... · f(xn|θ)

Here f(x|θ) is the pdf of a single observation and f(x1, ..., xn | θ) is the joint density function of the sample.


Viewed as a function of θ for the observed data, this joint density is called the likelihood function and is denoted L(θ).


Hence, the likelihood function is

L(θ) = f(x1, x2, ..., xn | θ) = f(x1|θ) · f(x2|θ) · ... · f(xn|θ)
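
As a minimal illustration (the sample values, the normal model with known variance 1, and the candidate values of θ below are assumptions made for this example, not part of the definition), the following Python sketch evaluates L(θ) as the product of the individual densities:

import numpy as np
from scipy.stats import norm

# Hypothetical sample, assumed drawn from N(theta, 1) with theta unknown
x = np.array([1.2, 0.8, 1.5, 1.1, 0.9])

def likelihood(theta):
    # L(theta) = f(x1|theta) * ... * f(xn|theta), the joint density of the sample
    return np.prod(norm.pdf(x, loc=theta, scale=1.0))

for theta in [0.5, 1.0, 1.1, 1.5]:
    print(f"L({theta}) = {likelihood(theta):.6f}")

L(θ) is largest near the sample mean (1.1 here), which is in fact the maximum likelihood estimate of a normal mean.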


Now the idea is to find a good estimate of the unknown parameter θ, namely the value that maximizes the likelihood of the data we observed. In practice one usually works with the log-likelihood log L(θ), which is maximized at the same point; taking the derivative with respect to θ and setting it equal to zero yields the maximum likelihood estimate of θ.
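
For instance, under an exponential model f(x|θ) = θ·e^(−θx), the log-likelihood is log L(θ) = n·log θ − θ·Σxi; setting its derivative n/θ − Σxi equal to zero gives the closed-form estimate θ̂ = n/Σxi = 1/x̄. The Python sketch below (with simulated data; the choice of scipy's optimizer is purely illustrative) checks this closed form against a direct numerical maximization:

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1000)  # simulated sample; true theta = 1/scale = 0.5

def neg_log_likelihood(theta):
    # -log L(theta) for f(x|theta) = theta * exp(-theta * x)
    return -(len(x) * np.log(theta) - theta * x.sum())

# Minimizing -log L(theta) is equivalent to maximizing L(theta)
res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 10.0), method="bounded")
print("numerical MLE:", res.x)
print("closed form 1/mean(x):", 1.0 / x.mean())  # from setting the derivative to zero

Both values agree closely (roughly 0.5 here), confirming that maximizing log L(θ) and solving the derivative equation give the same estimate.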

