Naive Bayesian Algorithm
Given a conditional probability, how do we find the conditional probability with the two events swapped? In other words, given P(A|B), how do we compute P(B|A)? Bayes' theorem gives the answer:
$$P(B|A) = \frac{P(A|B)\,P(B)}{P(A)}$$
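As a quick numeric sanity check (the probabilities below are made up purely for illustration), Bayes' theorem can be evaluated directly:

```python
# Made-up probabilities, only to illustrate Bayes' theorem numerically.
p_b = 0.01          # prior P(B)
p_a_given_b = 0.9   # likelihood P(A|B)
p_a = 0.05          # evidence P(A)

# Posterior: P(B|A) = P(A|B) * P(B) / P(A)
p_b_given_a = p_a_given_b * p_b / p_a
print(p_b_given_a)  # roughly 0.18
```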
Formal Definition
- Let $x = \{a_1, a_2, \dots, a_m\}$ be a sample, where each $a_j$ is one of its attribute values.
- Given a set of classes $C = \{y_1, y_2, \dots, y_n\}$.
- Calculate $P(y_1|x), P(y_2|x), \dots, P(y_n|x)$.
- If $P(y_k|x) = \max\{P(y_1|x), P(y_2|x), \dots, P(y_n|x)\}$, then $x \in y_k$.
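As a small sketch of this decision rule, assuming the posteriors $P(y_i|x)$ for one sample have already been computed (the class names and numbers here are made up):

```python
# Hypothetical, already-computed posteriors P(y_i|x) for a single sample x.
posteriors = {"y1": 0.2, "y2": 0.7, "y3": 0.1}

# Decision rule: assign x to the class with the largest posterior probability.
predicted = max(posteriors, key=posteriors.get)
print(predicted)  # "y2"
```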
How, then, do we calculate these conditional probabilities? Follow the steps below:
- Find a training set.
- Calculate the conditional probability of each attribute value under each class, i.e.
  $$\begin{matrix}
  P(a_1|y_1), & P(a_1|y_2), & \dots, & P(a_1|y_n) \\
  P(a_2|y_1), & P(a_2|y_2), & \dots, & P(a_2|y_n) \\
  \vdots & \vdots & & \vdots \\
  P(a_m|y_1), & P(a_m|y_2), & \dots, & P(a_m|y_n)
  \end{matrix}$$
- Suppose that the attributes are conditionally independent of each other given the class. By Bayes' theorem,
  $$P(y_i|x) = \frac{P(x|y_i)\,P(y_i)}{P(x)}.$$
  Since the denominator $P(x)$ is the same constant for every class, we only need to maximize the numerator. Using the conditional-independence assumption, we can write
  $$P(x|y_i)\,P(y_i) = P(a_1|y_i)\,P(a_2|y_i)\cdots P(a_m|y_i)\,P(y_i) = P(y_i)\prod_{j=1}^{m} P(a_j|y_i).$$
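The whole procedure can be sketched for categorical attributes as follows. This is a minimal illustration, not a production implementation: the class name `NaiveBayes`, the toy weather data, and the lack of any smoothing are assumptions made here for brevity.

```python
from collections import Counter, defaultdict

class NaiveBayes:
    """Minimal categorical Naive Bayes: estimates P(y_i) and P(a_j|y_i) by counting."""

    def fit(self, samples, labels):
        n = len(samples)
        # Prior P(y_i): relative frequency of each class in the training set.
        class_counts = Counter(labels)
        self.priors = {y: c / n for y, c in class_counts.items()}
        # Conditional P(a_j|y_i): frequency of value a_j at position j among samples of class y_i.
        value_counts = defaultdict(Counter)   # (class, attribute index) -> Counter of values
        for x, y in zip(samples, labels):
            for j, a in enumerate(x):
                value_counts[(y, j)][a] += 1
        self.conditionals = {
            (y, j): {a: c / class_counts[y] for a, c in counter.items()}
            for (y, j), counter in value_counts.items()
        }
        return self

    def predict(self, x):
        # Pick the class maximizing P(y_i) * prod_j P(a_j|y_i).
        best_class, best_score = None, -1.0
        for y, prior in self.priors.items():
            score = prior
            for j, a in enumerate(x):
                score *= self.conditionals[(y, j)].get(a, 0.0)
            if score > best_score:
                best_class, best_score = y, score
        return best_class

# Toy usage with made-up weather data (attributes: outlook, temperature).
samples = [("sunny", "hot"), ("sunny", "mild"), ("rainy", "mild"), ("rainy", "cool")]
labels = ["no", "no", "yes", "yes"]
model = NaiveBayes().fit(samples, labels)
print(model.predict(("rainy", "mild")))  # "yes"
```

Note that an attribute value never observed with a class makes the whole product zero; Laplace (add-one) smoothing is the usual remedy, omitted here to keep the sketch close to the formulas above.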