PROBABILITY Part-2

Main points to be discussed here:

1. Random Experiment
2. Different types of events
3. Variable description of the events
4. Conditional Probability
5. Multiplication Theorem on Probability
6. Law of Total Probability
7. Bayes' Theorem
8. Bernoulli Trials
9. Mean and Variance of the Distribution
10. Shortcut Method of finding Mean, Variance and Standard Deviation

Conditional probability of A given B:
\[P(A|B)=\frac{P(A\cap B)}{P(B)},\quad P(B)\neq 0\]

If A and B (and C) are independent events:
\[P(A\cap B)=P(A)\cdot P(B)\]
\[P(A\cap B\cap C)=P(A)\cdot P(B)\cdot P(C)\]
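The conditional-probability formula above can be checked by brute-force enumeration. The following sketch (an assumed example, not from the original notes) uses two fair dice, with A = "the sum is 8" and B = "the first die shows 3":

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered rolls of two fair dice (equally likely outcomes).
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event (a set of outcomes) under equally likely outcomes."""
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] + w[1] == 8}   # sum of the dice is 8
B = {w for w in omega if w[0] == 3}          # first die shows 3

# Conditional probability: P(A|B) = P(A ∩ B) / P(B)
p_A_given_B = prob(A & B) / prob(B)
print(p_A_given_B)  # 1/6
```

Given that the first die shows 3, only the outcome (3, 5) gives a sum of 8, so the conditional probability is 1/6.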
Explanation of some symbols and terms used in the problems

Variable description of the event ⇒ Set theory notation

A or B (at least one of A or B) = A ∪ B

A and B = A ∩ B

\[\text{Not } A = \overline{A}\]

\[\text{A but not B} = A\cap\overline{B}\]

\[\text{Either A or B} = A\cup B\]

\[\text{Neither A nor B} = \overline{A}\cap\overline{B}\]

\[\text{All three of A, B and C} = A\cap B\cap C\]

\[\text{At least one of A, B, or C} = A\cup B\cup C\]

\[\text{Exactly one of A and B} = (A\cap\overline{B})\cup(\overline{A}\cap B)\]

De Morgan's Laws:
\[\overline{A}\cap\overline{B}=\overline{A\cup B}\qquad\text{and}\qquad \overline{A}\cup\overline{B}=\overline{A\cap B}\]
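De Morgan's laws can be verified concretely with Python sets. This is an illustrative check on one small assumed universe, not a proof:

```python
# Illustrative check of De Morgan's laws on a small universe, using Python sets.
U = set(range(1, 11))        # universe of outcomes (assumed example)
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

def complement(S):
    """Complement of S relative to the universe U."""
    return U - S

# A' ∩ B' = (A ∪ B)'   and   A' ∪ B' = (A ∩ B)'
law1 = (complement(A) & complement(B)) == complement(A | B)
law2 = (complement(A) | complement(B)) == complement(A & B)
print(law1, law2)  # True True
```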
|
Addition rules:
\[P(A\cup B)=P(A)+P(B)-P(A\cap B)\]
\[P(A\cup B\cup C)=P(A)+P(B)+P(C)-P(A\cap B)-P(B\cap C)-P(C\cap A)+P(A\cap B\cap C)\]
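A quick worked instance of the two-event addition rule (an assumed example: one card drawn from a standard 52-card deck, A = "heart", B = "face card"):

```python
from fractions import Fraction

# One card from a standard 52-card deck (assumed example).
p_heart = Fraction(13, 52)   # A: the card is a heart
p_face = Fraction(12, 52)    # B: the card is a face card (J, Q, K)
p_both = Fraction(3, 52)     # A ∩ B: the heart face cards J♥, Q♥, K♥

# Addition rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
p_union = p_heart + p_face - p_both
print(p_union)  # 11/26
```

Subtracting P(A ∩ B) avoids double-counting the three heart face cards.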
MULTIPLICATION THEOREM ON PROBABILITY

If A and B are two events associated with a random experiment, then
\[P(A\cap B)=P(A)\,P(B|A)=P(B)\,P(A|B)\]
If \(A_{1},A_{2},A_{3},\dots,A_{n}\) are n events associated with the random experiment, then
\[P(A_{1}\cap A_{2}\cap A_{3}\cap\dots\cap A_{n})=P(A_{1})\,P(A_{2}|A_{1})\,P(A_{3}|A_{1}\cap A_{2})\,P(A_{4}|A_{1}\cap A_{2}\cap A_{3})\dots P(A_{n}|A_{1}\cap A_{2}\cap\dots\cap A_{n-1})\]
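The chain rule above can be illustrated with a standard without-replacement example (assumed here, not from the original notes): the probability of drawing three aces in a row from a 52-card deck:

```python
from fractions import Fraction

# Chain rule: P(A1 ∩ A2 ∩ A3) = P(A1) · P(A2|A1) · P(A3|A1 ∩ A2)
# A_i = "the i-th card drawn is an ace", drawing without replacement.
p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p)  # 1/5525
```

Each conditional factor shrinks both the number of aces left and the deck size, which is exactly what the conditioning on the earlier draws encodes.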
BAYES' THEOREM

Let S be a sample space and let E1, E2, E3, ………, En be n mutually exclusive and exhaustive events associated with a random experiment. If A is any event which occurs with E1 or E2 or E3 or …… or En, then
\[P(E_{i}|A)=\frac{P(E_{i})\,P(A|E_{i})}{\sum_{j=1}^{n}P(E_{j})\,P(A|E_{j})},\quad i=1,2,\dots,n\]

If n = 2, Bayes' Theorem is written as:
\[P(E_{1}|A)=\frac{P(E_{1})P(A|E_{1})}{P(E_{1})P(A|E_{1})+P(E_{2})P(A|E_{2})}\]
\[P(E_{2}|A)=\frac{P(E_{2})P(A|E_{2})}{P(E_{1})P(A|E_{1})+P(E_{2})P(A|E_{2})}\]

If n = 3, Bayes' Theorem is written as:
\[P(E_{1}|A)=\frac{P(E_{1})P(A|E_{1})}{P(E_{1})P(A|E_{1})+P(E_{2})P(A|E_{2})+P(E_{3})P(A|E_{3})}\]
\[P(E_{2}|A)=\frac{P(E_{2})P(A|E_{2})}{P(E_{1})P(A|E_{1})+P(E_{2})P(A|E_{2})+P(E_{3})P(A|E_{3})}\]
\[P(E_{3}|A)=\frac{P(E_{3})P(A|E_{3})}{P(E_{1})P(A|E_{1})+P(E_{2})P(A|E_{2})+P(E_{3})P(A|E_{3})}\]
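Bayes' theorem translates directly into code: the denominator is the total probability of A, and each posterior is one term of that sum divided by it. The sketch below uses an assumed two-bag example (Bag 1 has 3 red balls out of 7, Bag 2 has 5 red out of 11; a bag is chosen at random and a red ball is drawn):

```python
from fractions import Fraction

def bayes(priors, likelihoods):
    """Posteriors P(E_i | A) for mutually exclusive, exhaustive events E_i.

    priors[i] = P(E_i), likelihoods[i] = P(A | E_i).
    """
    # Denominator: total probability of A (law of total probability).
    total = sum(p * l for p, l in zip(priors, likelihoods))
    return [p * l / total for p, l in zip(priors, likelihoods)]

priors = [Fraction(1, 2), Fraction(1, 2)]          # P(E1), P(E2): bag chosen at random
likelihoods = [Fraction(3, 7), Fraction(5, 11)]    # P(red | Bag 1), P(red | Bag 2)
posteriors = bayes(priors, likelihoods)
print(posteriors)  # [Fraction(33, 68), Fraction(35, 68)]
```

The posteriors always sum to 1, since the E_i are exhaustive and the denominator is exactly the sum of the numerators.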
|
Bernoulli Trials

A Bernoulli trial is a random experiment with exactly two outcomes, "success" and "failure", where the probability of success is the same in every trial and the trials are independent.

The probability of r successes in n Bernoulli trials is given by
\[P(r\ \text{successes})=\ ^{n}C_{r}\,p^{r}q^{n-r}=\frac{n!}{r!(n-r)!}\,p^{r}q^{n-r}\]
where
n = number of trials,
r = number of successful trials = 0, 1, 2, 3, ………, n,
p = probability of success in a trial,
q = probability of failure in a trial, and p + q = 1.
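The formula maps directly onto Python's `math.comb`. A minimal sketch, using an assumed coin-tossing example (exactly 2 heads in 5 tosses of a fair coin):

```python
from fractions import Fraction
from math import comb

def binomial_pmf(n, r, p):
    """P(exactly r successes in n Bernoulli trials), with q = 1 - p."""
    q = 1 - p
    return comb(n, r) * p**r * q**(n - r)

# Assumed example: exactly 2 heads in 5 tosses of a fair coin.
p_two_heads = binomial_pmf(5, 2, Fraction(1, 2))
print(p_two_heads)  # 5/16
```

Summing `binomial_pmf(n, r, p)` over r = 0, …, n gives 1, as a probability distribution must.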
The probability distribution of the number of successes for a random variable X can be written as:
\[\begin{array}{c|ccccc} X=r & 0 & 1 & 2 & \dots & n \\ \hline P(X=r) & q^{n} & ^{n}C_{1}\,p\,q^{n-1} & ^{n}C_{2}\,p^{2}q^{n-2} & \dots & p^{n} \end{array}\]
This probability distribution is called the binomial distribution with parameters n and p.

Shortcut Method of finding Mean, Variance and Standard Deviation

The binomial distribution with n Bernoulli trials and probability of success p is also denoted by B(n, p), where
n denotes the number of Bernoulli trials,
p denotes the probability of success, and
q = 1 − p denotes the probability of failure. Then:

Mean = np
Variance = npq
Standard Deviation = \(\sqrt{npq}\)
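The shortcut formulas can be checked against the definitions of mean and variance by summing over the whole distribution. A sketch for one assumed example, B(n = 6, p = 1/3):

```python
from fractions import Fraction
from math import comb

# Check Mean = np and Variance = npq against the definitions,
# for an assumed example B(n = 6, p = 1/3).
n, p = 6, Fraction(1, 3)
q = 1 - p
pmf = [comb(n, r) * p**r * q**(n - r) for r in range(n + 1)]

mean = sum(r * pr for r, pr in enumerate(pmf))                 # E[X]
variance = sum((r - mean)**2 * pr for r, pr in enumerate(pmf)) # E[(X - E[X])^2]
print(mean, variance)  # 2 4/3
```

Here np = 6 · 1/3 = 2 and npq = 6 · 1/3 · 2/3 = 4/3, matching the sums over the distribution exactly.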


