Mathematics Revision Notes Of Probability For NDA

Important Terminology

Here, we introduce some important terms that are used in probability.

(i) Experiment - An activity which results in some well-defined outcomes is called an experiment.
(ii) Random Experiment - An experiment whose outcome cannot be predicted with certainty is called a random experiment.
(iii) Sample Space - The set of all possible outcomes of an experiment is called the sample space. It is denoted by `S`.
(iv) Event - Every subset of the sample space is defined as an event. It is denoted by `E`.

Types of Events

According to the number and nature of their sample points, events are classified as follows:
(i) Null Event or Impossible Event - It has no element and is denoted by `phi`.
(ii) Sure Event - Since, `S subseteq S` , so `S` is an event which is called sure event.
(iii) Simple Event or Elementary Event - It has only one sample point.
(iv) Mixed Event or Composite Event or Compound Event - This event has two or more than two sample points or elements.
(v) Equally Likely Events - Outcomes are said to be equally likely when we have no reason to believe that one is more likely to occur than the other.
(vi) Mutually Exclusive Events - Events `E_1` and `E_2` are mutually exclusive, if `E_1 cap E_2 = phi`.
Events `E_1, E_2, E_3, ......, E_n` are said to be mutually exclusive events, if `E_i cap E_j = phi` for all `i ne j`.
(vii) Mutually Exclusive and Exhaustive Events - Events `E_1, E_2 , E_3,......... , E_n` are mutually exclusive and exhaustive events, if
`E_i cap E_j = phi` where, `i ne j` and `i, j = 1, 2, 3,... , n` and `E_1 cup E_2 cup E_3 cup ........ cup E_n = S`
(viii) Complement of an Event `E` - It is denoted by `E'` or `E^c` or `bar E` and it is the set of all sample points of sample space other than sample points of event `E`.

Probability of an Event

Let `E` be an event and `P(E)` denotes the probability of occurrence of the event `E`. Then, we have
`P(E)=(n(E))/(n(S))`

`= ("Number of cases favourable to event E")/("Total number of cases in the sample space")`

It must be clearly understood that all the cases considered in the above definition should be equally likely.
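
For example, in a single throw of a fair die, `S = {1, 2, 3, 4, 5, 6}` and the event `E` of getting an even number is `{2, 4, 6}`, so `P(E) = (n(E))/(n(S)) = 3/6 = 1/2`.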

Note : (i) `P(S)=1; P(phi)=0`
(ii) `0 le P(E) le 1`
(iii) If `A subseteq B` then `P(A) le P(B)`
(iv) `P(A) + P(bar A) =1`
(v) If `P(A) = m/n` then

(a) Odds in favour of `A = (P(A))/(P(bar A)) = m/(n-m)`
(b) Odds against `A = (P(bar A))/(P(A)) =(n-m)/m`
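
For example, if `P(A) = 3/5`, then with `m = 3` and `n = 5`, the odds in favour of `A` are `3/(5-3) = 3/2`, i.e. `3 : 2`, and the odds against `A` are `2 : 3`.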

Addition Theorems on Probability

If `P(A +B)` or `P(A cup B)=` Probability of occurrence of at least one of the events `A` or `B` and

`P(AB)` or `P(A cap B)=` Probability of happening of events `A` and `B` together, then

(i) When Events are Not Mutually Exclusive - If `A` and `B` are two events which are not mutually exclusive, then

`P(A cup B)= P(A) + P(B)- P(A cap B)`

or `P(A +B)= P(A) + P(B)- P(AB)`
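
For example, in drawing one card from a well-shuffled pack of `52` cards, let `A` = the card is a king and `B` = the card is a heart. Then `P(A cup B) = 4/52 + 13/52 - 1/52 = 16/52 = 4/13`.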

For any three events `A, B, C,`

`P (A cup B cup C)= P(A) + P(B) + P(C)- P(A cap B)- P(B cap C)- P(C cap A)+ P(A cap B cap C)`

or `P(A + B +C)= P(A) + P(B) + P(C)- P(AB)- P(BC) - P(CA) + P(ABC)`

(ii) When Events are Mutually Exclusive - If `A` and `B` are mutually exclusive events, then

`n (A cap B)=0 => P(A cap B)=0`

`P(A cup B)= P(A) + P(B)`

For any three events `A, B, C` which are mutually exclusive,

`P(A cap B)= P(B cap C)= P(C cap A)= P(A cap B cap C)= 0`

`:. P(A cup B cup C)= P(A) + P(B) + P(C)`

The probability of happening of any one of several mutually exclusive events is equal to the sum of their probabilities, i.e. if `A_1, A_2, ... , A_n` are mutually exclusive events, then

`P(A_1 + A_2 +...+ A_n) = P(A_1) + P(A_2) +...+ P(A_n)`

i.e., `P(sum A_i) = sum P(A_i)`
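
For example, in a single throw of a die, the events of getting a `1` and getting a `2` are mutually exclusive, so the probability of getting a `1` or a `2` is `1/6 + 1/6 = 1/3`.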

(iii) Some Other Theorems

(a) Let `A` and `B` be two events associated with a random experiment, then

`P(bar A cap B)= P(B)- P(A cap B)`

`P(A cap bar B)= P(A)- P(A cap B)`

(b) If `B subset A`, then `P(A cap bar B)= P(A)- P(B)`

`P(B) le P(A)`

Similarly, if `A subset B`, then `P(bar A cap B)= P(B) - P(A)`

`P(A) le P(B)`

NOTE : Probability of occurrence of neither `A` nor `B` is

`P(bar A cap bar B) = P(bar(A cup B)) = 1- P(A cup B)`
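
For example, if `P(A) = 0.5`, `P(B) = 0.3` and `P(A cap B) = 0.2`, then `P(A cup B) = 0.5 + 0.3 - 0.2 = 0.6` and `P(bar A cap bar B) = 1 - 0.6 = 0.4`.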

(iv) Generalisation of the Addition Theorem - If `A_1, A_2, ... , A_n` are `n` events associated with a random experiment, then

`P( cup_(i=1)^n A_i ) = sum_(i=1)^n P(A_i) - sum_(i < j) P(A_i cap A_j) + sum_(i < j < k) P(A_i cap A_j cap A_k) - .... + (-1)^(n-1) P(A_1 cap A_2 cap .... cap A_n)`

If all the events `A_i, i = 1, 2, ... , n` are mutually exclusive, then `P(cup_(i=1)^n A_i) = sum_(i=1)^n P(A_i)`

NOTE

`• P(bar A cap bar B) = 1- P(A cup B)`
`• P(bar A cup bar B) = 1 - P(A cap B)`
`• P(A) = P(A cap B)+ P(A cap bar B)`
`• P(B) = P(B cap A)+ P(B cap bar A)`
`• P` (exactly one of `E_1, E_2` occurs)
`= P(E_1 cap E_2' ) + P(E_1' cap E_2)`
`= P(E_1)- P(E_1 cap E_2 ) + P(E_2) - P(E_1 cap E_2)`
`= P(E_1) + P(E_2 )- 2P(E_1 cap E_2)`

`• P` (neither `E_1` nor `E_2` ) `= P(E_1' cap E_2') = 1- P(E_1 cup E_2)`

`• P(E_1' cup E_2') = 1- P(E_1 cap E_2)`
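
For example, with `P(E_1) = 0.5`, `P(E_2) = 0.3` and `P(E_1 cap E_2) = 0.2`, the probability that exactly one of `E_1, E_2` occurs is `0.5 + 0.3 - 2(0.2) = 0.4`.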

Multiplication Theorems on Probability

If `A` and `B` are two events associated with a random experiment, then

`P(A cap B)= P(A) * P(B/A)` , if `P(A) ne 0`

or `P(A cap B) = P(B) * P(A/B)`, if `P(B) ne 0`

Extension of Multiplication Theorem

If `A_1, A_2 ,.... , A_n` are `n` events related to a random experiment, then

`P(A_1 cap A_2 cap A_3 cap .... cap A_n)=P(A_1) * P(A_2/A_1) * P (A_3/(A_1 cap A_2)) ....P (A_n/(A_1 cap A_2 cap....cap A_(n-1)))`
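
For example, if two cards are drawn one by one without replacement from a pack of `52` cards and `A_1, A_2` denote getting an ace on the first and second draw, then the probability that both are aces is `P(A_1 cap A_2) = P(A_1) * P(A_2/A_1) = 4/52 * 3/51 = 1/221`.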

Conditional Probability

Let `A` and `B` be two events associated with a random experiment. Then, the probability of occurrence of `A` under the condition that `B` has already occurred, where `P(B) ne 0`, is called the conditional probability and it is denoted by `P(A/B)`. Thus,
`P(A/B)=` Probability of occurrence of `A` given that `B` has already happened `= (P(A cap B))/(P(B)) =(n (A cap B))/(n (B))`

Similarly, `P(B/A) =` Probability of occurrence of `B`, given that `A` has already happened

`P(B/A) =(P(A cap B))/(P(A)) =(n(A cap B))/(n(A))`
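
For example, in a single throw of a die, let `A` = getting an even number `= {2, 4, 6}` and `B` = getting a number greater than `3 = {4, 5, 6}`. Then `P(A/B) = (n(A cap B))/(n(B)) = 2/3`.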

Properties of Conditional Probability

Let `A` and `B` be two events of the sample space `S` of an experiment, then

(i) `P(S/A) = P(A/A)=1`

(ii) `P((A')/B) = 1 - P(A/B)`, where `A'` is the complement of `A`.

Total Probability Theorem

If an event `A` can occur with one of the `n` mutually exclusive and exhaustive events `B_1, B_2, .... , B_n` and the probabilities

`P(A/(B_1)), P(A/(B_2)),....., P(A/(B_n))` are known, then

`P(A) = sum_(i=1)^n P(B_i) * P(A/B_i)`
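
For example, suppose bag `B_1` contains `3` red and `2` black balls and bag `B_2` contains `2` red and `3` black balls. A bag is chosen at random and a ball is drawn. If `A` = the ball is red, then `P(A) = P(B_1) * P(A/B_1) + P(B_2) * P(A/B_2) = 1/2 * 3/5 + 1/2 * 2/5 = 1/2`.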

Bayes' Theorem

If an event `A` can occur with one of the `n` mutually exclusive and exhaustive events `B_1, B_2, ... , B_n` and the probabilities `P(A/(B_i))` are known, then

`P((B_i)/A) = (P(B_i) * P(A/(B_i)))/(sum_(i=1)^n P(B_i) * P(A/(B_i)))`
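
In the two-bag example above, if the ball drawn is red, then the probability that it came from bag `B_1` is `P(B_1/A) = (P(B_1) * P(A/B_1))/(P(B_1) * P(A/B_1) + P(B_2) * P(A/B_2)) = (3/10)/(1/2) = 3/5`.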

Independent Events

Two events are said to be independent, if the occurrence of one does not affect the occurrence of the other. If `E_1, E_2, ..., E_n`
are independent events, then `P(E_1 cap E_2 cap E_3 cap ... cap E_n)= P(E_1) * P(E_2) ... P(E_n)`

If `E` and `F` are independent events, then the pairs `E` and `bar F, bar E` and `F, bar E` and `bar F` are also independent.


`"Properties of Independent Events"`

If `A` and `B` are two independent events, then
(i) `A'` and `B` are also independent events.
(ii) `A` and `B'` are also independent events.
(iii) `A'` and `B'` are also independent events.
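
For example, if a coin is tossed and a die is thrown, the events `A` = getting a head and `B` = getting a `6` are independent, so `P(A cap B) = P(A) * P(B) = 1/2 * 1/6 = 1/12`.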

Important Results

If `n` letters corresponding to `n` envelopes are placed in the envelopes at random, then

(i) Probability that all letters are in the right envelopes `= 1/(n!)`

(ii) Probability that all letters are not in the right envelopes `= 1 - 1/(n!)`

(iii) Probability that no letter is in the right envelope `= 1/(2!) - 1/(3!) + 1/(4!) - .... + (-1)^n 1/(n!)`

(iv) Probability that exactly `r` letters are in the right envelopes
`= 1/(r!) (1/(2!) - 1/(3!) + 1/(4!) - ...... + (-1)^(n-r) 1/((n-r)!))`
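
For example, for `n = 3` letters, the probability that no letter is in the right envelope is `1/(2!) - 1/(3!) = 1/2 - 1/6 = 1/3`.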

Probability Distribution

A random variable is a real-valued function whose domain is the sample space of a random experiment. A random variable is usually denoted by the capital letters `X, Y, Z, ...` and so on.

The random variable may be of two types as given below.

Discrete Random Variable

A random variable which can take only a finite or countably infinite number of values is called a discrete random variable.

Continuous Random Variable

A random variable which can take any value between two given limits is called a continuous random variable.

Probability Distribution of a Random Variable

If the values of a random variable together with the corresponding probabilities are given, then this description is called a probability distribution of the random variable.

If a random variable `X` takes values `x_1, x_2, x_3, ... , x_n` with respective probabilities `p_1, p_2, p_3, .... , p_n`, then this listing of values together with their probabilities is known as the probability distribution of `X`.

Mean

If `X` is a discrete random variable which assumes values `x_1, x_2, x_3, ... , x_n` with respective probabilities

`p_1, p_2, p_3, ... , p_n`, then the mean `bar X` of `X` is defined as

`bar X = p_1 x_1 + p_2 x_2 + ......... + p_n x_n` or `bar X = sum_(i=1)^n p_i x_i`

The mean of a random variable `X` is also known as its mathematical expectation and it is denoted by `E(X)`.

Variance

If `X` is a discrete random variable which assumes values `x_1, x_2, x_3, ... , x_n` with the respective probabilities `p_1, p_2, ...... , p_n`, then the variance of `X` is defined as

Var `(X) = sum_(i=1)^n p_i x_i^2 -(sum_(i=1)^n p_i x_i)^2`
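
For example, if `X` is the number shown in a single throw of a fair die, each value `1, 2, ... , 6` has probability `1/6`, so `bar X = E(X) = (1 + 2 + ... + 6)/6 = 7/2` and Var `(X) = (1^2 + 2^2 + ... + 6^2)/6 - (7/2)^2 = 91/6 - 49/4 = 35/12`.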

NOTE :
`=>` The mean of a random variable `X` is also known as its mathematical expectation or expected value and is denoted by `E(X)`.

`=>` The variance and standard deviation of a random variable are always non-negative.

Bernoulli Trials

Trials of a random experiment are called Bernoulli trials, if they satisfy the following conditions:

(i) There should be a finite number of trials.

(ii) The trials should be independent of each other.

(iii) Each trial has exactly two outcomes, i.e. success or failure.

(iv) The probability of success (or failure) remains the same in each trial.

Binomial Distribution

The probability of `r` successes in `n` independent Bernoulli trials is denoted by `P(X = r)` and is given by

`P(X=r) = text()^n C_r p^r q^(n-r)`

where, `p =` Probability of success
and `q =` Probability of failure and `p + q = 1`

(i) Mean `= np`

(ii) Variance `= npq`

(iii) Mean is always greater than variance, since the variance `npq` is the mean `np` multiplied by `q < 1`.
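
For example, if a fair coin is tossed `4` times and success = getting a head, then `p = q = 1/2`, so `P(X = 2) = text()^4 C_2 (1/2)^2 (1/2)^2 = 6/16 = 3/8`, mean `= np = 2` and variance `= npq = 1`.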
