Wednesday, June 9, 2010

Bayes Decision Rule

Given a feature vector $\bf x$, we assign it to class $\omega_j$ if

\[P(\omega_j|{\bf{x}})>P(\omega_i|{\bf{x}});\;\;\;i=1,...,n;i\ne j\]
or
\[P({\bf{x}}|\omega_j)P(\omega_j)>P({\bf{x}}|\omega_i)P(\omega_i);\;\;\;i=1,...,n;i\ne j\]
Here $P(\omega)$ is the prior and $P({\bf{x}}|\omega)$ is the likelihood or class-conditional probability. Using these two quantities, we assign $\bf{x}$ to whichever class has the highest posterior probability $P(\omega|{\bf{x}})$.
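The equivalence of the two forms above follows from Bayes' theorem: the evidence $P({\bf{x}})$ in the denominator is the same for every class, so it can be dropped from the comparison.

\[P(\omega_j|{\bf{x}})=\frac{P({\bf{x}}|\omega_j)P(\omega_j)}{P({\bf{x}})};\;\;\;P({\bf{x}})=\sum_{i=1}^{n}P({\bf{x}}|\omega_i)P(\omega_i)\]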

The following MATLAB code demonstrates an application of the Bayes decision rule. Suppose that the class-conditionals are normal densities, with $\phi(x;\mu,\sigma^2)$ denoting the normal density with mean $\mu$ and variance $\sigma^2$:

\[P(x|\omega_1)= \phi(x;-1,1)\]
\[P(x|\omega_2)= \phi(x;1,1)\]
and the priors are
\[P(\omega_1)=0.6\]
\[P(\omega_2)=0.4\]
What class does $x = -0.75$ belong to?
>> x=-0.75;
>> normpdf(x,-1,1)*0.6
ans =
    0.2320

>> normpdf(x,1,1)*0.4
ans =
    0.0345
Since $0.2320 > 0.0345$, $x$ belongs to class $\omega_1$.
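The same comparison can also be normalized by the evidence $P(x)$ to report the actual posterior probabilities. Below is a minimal sketch assuming the same setup; the variable names (mu, sigma, prior, joint, posterior) are my own, not from the original example:

% Parameters of the example above
mu    = [-1 1];     % means of the class-conditional normals
sigma = [ 1 1];     % standard deviations (normpdf takes sigma, not sigma^2)
prior = [0.6 0.4];  % priors P(w1), P(w2)

x = -0.75;
joint     = normpdf(x, mu, sigma) .* prior;  % P(x|w_i)*P(w_i) for i = 1,2
posterior = joint ./ sum(joint);             % divide by the evidence P(x)
[~, class] = max(posterior)                  % pick the class with the largest posterior

Running this returns class = 1, with posteriors of roughly 0.87 and 0.13, in agreement with the hand computation.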
