Maximum Likelihood


Definition: Maximum Likelihood

Maximum likelihood is one of the fundamental methods in statistical estimation theory for constructing estimators of unknown parameters. It was introduced in 1912 by R. A. Fisher, an English statistician.

For a given data set and its statistical model, maximum likelihood estimation provides an estimate of the model's parameters. When the sample size is large, the method yields an excellent estimator of the model parameter.

Mathematical representation:

Assume the random variables X1, X2, ..., Xn form a random sample from a distribution with probability density function (pdf) f(x|θ), where the Xi are continuous random variables and θ is the unknown real-valued parameter of the distribution.

Since the Xi are independent, the joint density of the sample factors into a product of the individual densities:

f(x1, x2, ..., xn | θ) = f(x1|θ) · f(x2|θ) ··· f(xn|θ)

Viewed as a function of θ for the observed data, this joint density is called the likelihood function and is denoted L(θ).


Likelihood function: L(θ) = f(x1, x2, ..., xn | θ)

The idea is to find the value of the unknown parameter θ that maximizes the likelihood of the data we observed. In practice, it is usually easier to maximize the log-likelihood, log L(θ), which turns the product into a sum. Taking the derivative of the (log-)likelihood with respect to θ and setting it equal to zero yields the maximum likelihood estimate of θ.
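As a minimal sketch of this procedure, consider estimating the rate parameter λ of an exponential distribution (the sample data, seed, and function names below are illustrative assumptions, not part of the original definition). Setting the derivative of the log-likelihood to zero gives the closed-form estimate λ̂ = n / Σ xᵢ, i.e. the reciprocal of the sample mean:

```python
import random
import math

def log_likelihood(lam, data):
    # For an exponential pdf f(x|lam) = lam * exp(-lam * x),
    # log L(lam) = n * log(lam) - lam * sum(x_i)
    n = len(data)
    return n * math.log(lam) - lam * sum(data)

# Simulate a sample from Exponential(rate = 2.0); the true rate is
# a hypothetical value chosen for this illustration.
random.seed(42)
true_rate = 2.0
data = [random.expovariate(true_rate) for _ in range(10_000)]

# Setting d(log L)/d(lam) = 0 gives the closed form: lam_hat = n / sum(x_i)
mle = len(data) / sum(data)
print(f"closed-form MLE: {mle:.3f}")

# Sanity check: the closed-form estimate attains a higher log-likelihood
# than nearby candidate values of lambda.
candidates = [mle * 0.9, mle, mle * 1.1]
best = max(candidates, key=lambda lam: log_likelihood(lam, data))
print(f"best candidate: {best:.3f}")
```

With a large sample, the estimate lands close to the true rate, illustrating the remark above that maximum likelihood performs well when the sample size is large.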

Hence, this concludes the definition of Maximum Likelihood along with its overview.
