A brief insight into conditional probability

Before you go further

An event in probability is an outcome of a random experiment, and it cannot be determined in advance. Tossing a coin is a random experiment: its outcome is an event, and you cannot know it before the toss, so the coin behaves like a non-deterministic machine. Generating a "random" number with code, by contrast, is deterministic, because a pseudorandom algorithm runs behind the curtains and its output can, in principle, be reproduced. A coin flip uses no algorithm at all: you just flip it and observe the result.
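To see the "behind the curtains" determinism concretely, here is a minimal sketch using Python's standard `random` module (my example, not from the original post): fixing the seed makes the pseudorandom generator replay exactly the same sequence, something a real coin can never do.

```python
import random

# A pseudorandom generator is deterministic: the same seed
# reproduces the same "random" sequence every time.
random.seed(42)
first_run = [random.randint(0, 1) for _ in range(5)]

random.seed(42)
second_run = [random.randint(0, 1) for _ in range(5)]

print(first_run == second_run)  # the two runs are identical
```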

Dependent and Independent events

The outcome of a dependent event depends on some other event. For instance, suppose a bag contains 5 black balls and 5 red balls, and you pick a ball without putting it back. The next time you pick a ball from the bag, the probabilities depend on the previous pick: the bag's composition has changed, so the probability of the second 'picking a ball' is different because of the first event. Tossing a coin twice, on the other hand, gives independent events. The outcomes of the two tosses do not depend on each other; in other words, the probabilities in the second event are unchanged by the first.
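The bag example can be worked out with exact fractions. This is a small sketch of my own, assuming (as above) that the first ball is not returned to the bag, which is what makes the second draw dependent on the first:

```python
from fractions import Fraction

# Bag with 5 black and 5 red balls, drawn WITHOUT replacement,
# so the second draw depends on the first.
p_black_first = Fraction(5, 10)  # 5 black out of 10 balls

# If the first ball drawn was black, only 4 black balls
# remain among the 9 left in the bag.
p_black_second_given_black_first = Fraction(4, 9)

print(p_black_first)                     # 1/2
print(p_black_second_given_black_first)  # 4/9
```

The second probability (4/9) differs from the first (1/2), which is exactly what "the probability changed due to the first event" means.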

Conditional Probability

Suppose we have two events A and B. The probability of both events happening together is written P(A and B).

For independent events, P(A and B)= P(A)*P(B) — (i)
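Equation (i) is easy to check by simulation. The sketch below (my own illustration) estimates the probability of two heads in two independent coin tosses and compares it with P(A) * P(B) = 0.5 * 0.5 = 0.25:

```python
import random

random.seed(0)  # fixed seed so the estimate is reproducible

trials = 100_000
# Each trial tosses two fair coins; random.random() < 0.5 models heads.
both_heads = sum(
    1 for _ in range(trials)
    if random.random() < 0.5 and random.random() < 0.5
)

estimate = both_heads / trials
print(estimate)  # close to 0.5 * 0.5 = 0.25
```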

Now suppose A and B are dependent events. The probability of one of them changes depending on the other, as discussed previously. Equation (i) then becomes P(A and B) = P(A)*P(B|A), which says that the probability of event B has changed, in our case, due to event A. The symbol '|' simply denotes that the probability of event B is not the same now; it is conditioned on event A. Note that we could equally condition the other way around and write P(A and B) = P(B)*P(A|B); both decompositions describe the same joint probability. Rearranging the first form gives P(B|A) = P(A and B)/P(A). This is the formula for the conditional probability of event B, given that A has already occurred. The Venn diagram below shows the concept more clearly.
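Putting the pieces together with the earlier bag example (my sketch, using the same assumption of drawing without replacement, with A = "first ball is black" and B = "second ball is black"):

```python
from fractions import Fraction

# Bag example: 5 black + 5 red balls, two draws without replacement.
# A = "first ball is black", B = "second ball is black".
p_a = Fraction(5, 10)          # P(A)
p_b_given_a = Fraction(4, 9)   # P(B|A): 4 black left among 9 balls

# Multiplication rule for dependent events:
p_a_and_b = p_a * p_b_given_a  # P(A and B) = P(A) * P(B|A)

# Rearranging recovers the conditional-probability formula:
assert p_b_given_a == p_a_and_b / p_a  # P(B|A) = P(A and B) / P(A)

print(p_a_and_b)  # 2/9
```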

An extension of conditional probability is Bayes' theorem, which underlies some of the most widely used classification methods today. Using machine learning, it can be applied in models such as email spam detection, Twitter sentiment classification, and medical diagnosis.
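As a taste of how that works, here is a toy spam-filter calculation using Bayes' theorem. All the numbers (spam rate, word frequencies, and the word "free" itself) are hypothetical, chosen only to illustrate the formula P(spam|word) = P(word|spam) * P(spam) / P(word):

```python
from fractions import Fraction

# Hypothetical numbers for illustration only:
p_spam = Fraction(2, 10)             # 20% of all mail is spam
p_word_given_spam = Fraction(9, 10)  # "free" appears in 90% of spam
p_word_given_ham = Fraction(1, 10)   # ...and in 10% of legitimate mail

# Law of total probability: P(word) over both kinds of mail.
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Bayes' theorem: flip the conditioning direction.
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(p_spam_given_word)  # 9/13, roughly 69%
```

Seeing the word raises the spam probability from 20% to about 69%; chaining many such words is essentially what a Naive Bayes spam classifier does.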

I write blogs, poems and codes | I find myself in Machine Learning | Electronics Engineering (ZHCET)