\[f({\bf y};{\bf\theta})=\prod_{i=1}^n f_i(y_i;{\bf\theta})=L({\bf\theta};{\bf y})\]
The function $L({\bf\theta};{\bf y})$, regarded as a function of the unknown parameter ${\bf\theta}$ given the data ${\bf y}$, is called the likelihood function.
Analytically, maximizing a product is more difficult than maximizing a sum, so we take the natural logarithm of the likelihood function; the result is called the log likelihood function:
\[\log L({\bf\theta};{\bf y})=\sum_{i=1}^n \log f_i(y_i;{\bf\theta})\]
For example, suppose the outcome of an experiment is observed as a series of random values. What is the (log) likelihood that this outcome came from $X \sim \mathrm{Bino}(10,0.5)$?
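Writing out the binomial pmf explicitly, the log likelihood of a sample $y_1,\dots,y_n$ under parameters $(N,p)$ is
\[\log L(N,p;{\bf y})=\sum_{i=1}^n \left[\log\binom{N}{y_i}+y_i\log p+(N-y_i)\log(1-p)\right]\]
which is exactly what the MATLAB commands below compute term by term.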
First, generate some random values, but assume that we do not know the parameters that generated them, i.e., we do not know that they came from Bino(10,0.5).
X = binornd(10,0.5,1,100);   % 100 random draws from Bino(10,0.5)
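As a small convenience (not part of the original commands), the log likelihood could be wrapped in an anonymous function so that each candidate parameter pair is evaluated the same way:
loglik = @(N,p) sum(log(binopdf(X, N, p)));   % log likelihood of X at (N, p)
The calls below are then equivalent to loglik(10,0.6), loglik(10,0.5), and loglik(11,0.5).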
The log likelihood value of X with parameters (10, 0.6) is
>> sum(log(binopdf(X,10,0.6)))
ans =
-200.0399
The log likelihood value of X with parameters (10, 0.5) is
>> sum(log(binopdf(X,10,0.5)))
ans =
-184.4945
And the log likelihood value of X with parameters (11, 0.5) is
>> sum(log(binopdf(X,11,0.5)))
ans =
-187.4201
Of the three values above, the log likelihood for the parameters (10, 0.5) is the highest, so from the MLE standpoint we should select this model.
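To go beyond comparing three hand-picked candidates, one could hold N = 10 fixed and scan a grid of p values for the maximizer. This is a minimal sketch under that assumption; the grid and variable names are illustrative, and for a binomial with known N the MLE of p also has the closed form mean(X)/N.
N = 10;                              % assume the number of trials is known
p_grid = 0.01:0.01:0.99;             % candidate success probabilities
logL = zeros(size(p_grid));
for k = 1:numel(p_grid)
    logL(k) = sum(log(binopdf(X, N, p_grid(k))));   % log likelihood at each p
end
[~, idx] = max(logL);
p_hat = p_grid(idx)                  % grid-based maximum likelihood estimate
p_closed = mean(X)/N                 % closed-form MLE of p when N is known
Both estimates should land near 0.5, the value used to generate X.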