Bayes’ Theorem - Discrete Mathematics and its Applications - Lecture Slides

During the study of discrete mathematics, I found this course very informative and applicable. The main points in these lecture slides are: Bayes’ Theorem, Discrete Probability, Conditional Probability, Expected Values, Random Variable, Binomial Distribution, Expected Number, Pair of Dice, Geometric Distribution, Normalization, Variance.


Lecture 9

5.3 Discrete Probability

5.3 Bayes’ Theorem

We have seen that the following holds:

P(E|F) = P(E ∩ F) / P(F)  ⟹  P(E ∩ F) = P(E|F) P(F)

P(F|E) = P(E ∩ F) / P(E)  ⟹  P(E ∩ F) = P(F|E) P(E)

Equating the two expressions for P(E ∩ F) and dividing by P(F):

P(E|F) = P(F|E) P(E) / P(F)

We can write one conditional probability in terms of the other: Bayes’ Theorem
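To see the theorem in action, here is a small numeric check; the sample space and the events E and F (two fair dice, "sum is 8" and "first die shows 5") are my own illustrative choices, not from the slides.

from fractions import Fraction

# Sample space: two fair dice. E = "sum is 8", F = "first die shows 5".
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    # probability of an event under the uniform distribution on omega
    return Fraction(sum(1 for s in omega if event(s)), len(omega))

E  = lambda s: s[0] + s[1] == 8     # sum equals 8: P(E) = 5/36
F  = lambda s: s[0] == 5            # first die is 5: P(F) = 1/6
EF = lambda s: E(s) and F(s)        # intersection: P(E and F) = 1/36

p_F_given_E = prob(EF) / prob(E)                 # P(F|E) = 1/5
bayes  = p_F_given_E * prob(E) / prob(F)         # right-hand side of Bayes' theorem
direct = prob(EF) / prob(F)                      # P(E|F) from the definition
assert bayes == direct == Fraction(1, 6)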

5.3 Expected Values

The definition of an expected value of a random variable is:

E(X) = Σ_{s ∈ S} X(s) p(s)

This is equivalent to:

E(X) = Σ_{r ∈ X(S)} P(X = r) · r
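As a quick check that the two formulas agree, here is an illustration of mine using the sum of two fair dice as X:

from fractions import Fraction

omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p = Fraction(1, len(omega))          # uniform p(s) on the sample space
X = lambda s: s[0] + s[1]            # X = sum of the two dice

# E(X) as a sum over outcomes s
e_outcomes = sum(X(s) * p for s in omega)

# E(X) as a sum over values r of X
e_values = sum(r * sum(p for s in omega if X(s) == r) for r in set(map(X, omega)))

assert e_outcomes == e_values == 7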

Example: What is the expected number of heads if we toss a fair coin n times?

We know that the distribution for this experiment is the Binomial distribution:

P(k, n; p) = n! / (k!(n-k)!) · p^k (1-p)^(n-k)

Therefore we need to compute:

E(X) = Σ_{k=0}^{n} k · P(X = k)
     = Σ_{k=0}^{n} k · n! / (k!(n-k)!) · p^k (1-p)^(n-k)
     = n p
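The closed form can be verified by direct summation; the parameters n = 10, p = 0.3 below are arbitrary illustrative choices:

from math import comb

n, p = 10, 0.3
e = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
assert abs(e - n * p) < 1e-12        # E(X) = n p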

More examples: A coat-check attendant mixes the labels up randomly. When someone collects a coat, the attendant hands back a coat chosen at random from those remaining. What is the expected number of correctly returned coats? There are n coats checked in.

Let Xi = 1 if coat i is correctly returned, and 0 if wrongly returned. Since the labels are randomly permuted, E(Xi) = 1/n, so E(X1 + ... + Xn) = n · (1/n) = 1 (independent of the number of checked-in coats).
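A short simulation of this argument (a sketch of mine, not part of the slides): the average number of fixed points of a random permutation stays near 1 for any n.

import random

def correct_coats(n):
    # shuffle the labels and count coats that end up with their owner
    labels = list(range(n))
    random.shuffle(labels)
    return sum(1 for owner, label in enumerate(labels) if owner == label)

trials = 20_000
for n in (5, 50, 500):
    avg = sum(correct_coats(n) for _ in range(trials)) / trials
    print(n, round(avg, 3))          # ~1.0 for every n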

5.3 Geometric distribution

Q: What is the distribution of waiting times until a tail comes up, when we toss a fair coin?

A: Possible outcomes: T, HT, HHT, HHHT, HHHHT, ... (infinitely many possibilities). With p the probability of a tail: P(T) = p, P(HT) = (1 - p) p, P(HHT) = (1 - p)^2 p, ...

P(X = k) = (1 - p)^(k-1) p    (the geometric distribution)

Normalization:

Σ_{k=1}^{∞} P(X = k) = 1  ⟹  Σ_{k=1}^{∞} (1 - p)^(k-1) p = 1

(matlab) X(s) = number of tosses up to and including the first success.
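The slide refers to a MATLAB demo; as a stand-in, here is a Python sketch comparing simulated waiting times against the geometric pmf, assuming the same fair coin (p = 1/2):

import random

p, trials = 0.5, 200_000             # success (tail) with probability 1/2

def waiting_time():
    # number of tosses up to and including the first success
    k = 1
    while random.random() >= p:      # failure with probability 1 - p
        k += 1
    return k

counts = {}
for _ in range(trials):
    k = waiting_time()
    counts[k] = counts.get(k, 0) + 1

for k in range(1, 6):
    # empirical frequency vs. (1-p)^(k-1) p
    print(k, counts.get(k, 0) / trials, (1 - p) ** (k - 1) * p)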

5.3 Independence

Definition: Two random variables X(s) and Y(s) on a sample space S are independent if, for all r1 and r2:

P(X(s) = r1 ∧ Y(s) = r2) = P(X(s) = r1) · P(Y(s) = r2)

Examples

1. A pair of dice is rolled. X1 is the value of the first die, X2 the value of the second. Are these independent? P(X1 = r1) = 1/6, P(X2 = r2) = 1/6, and P(X1 = r1 AND X2 = r2) = 1/36 = P(X1 = r1) P(X2 = r2): YES, independent.

2. Are X1 and X = X1 + X2 independent? P(X = 12) = 1/36 and P(X1 = 1) = 1/6, but P(X = 12 AND X1 = 1) = 0, which is not the product P(X = 12) P(X1 = 1): NO, not independent.
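Both answers can be checked by brute force over all 36 outcomes (my own sketch):

from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))
P = lambda pred: Fraction(sum(1 for s in omega if pred(s)), len(omega))

# 1. X1 and X2 are independent: the joint probability factors for all r1, r2.
for r1 in range(1, 7):
    for r2 in range(1, 7):
        assert (P(lambda s: s[0] == r1 and s[1] == r2)
                == P(lambda s: s[0] == r1) * P(lambda s: s[1] == r2))

# 2. X1 and X = X1 + X2 are not: r1 = 1, X = 12 breaks factorization.
joint = P(lambda s: s[0] == 1 and s[0] + s[1] == 12)             # 0
prod_ = P(lambda s: s[0] == 1) * P(lambda s: s[0] + s[1] == 12)  # 1/216
assert joint != prod_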

5.3 Independence

Theorem: If two random variables X and Y are independent over a sample space S then: E(XY)=E(X) E(Y). (proof, read book)

Note 1: The converse is not true: two random variables do not have to be independent for E(XY) = E(X) E(Y) to hold. Note 2: If two random variables are not independent, E(XY) need not equal E(X) E(Y), although it still might.

Example: X counts the number of heads when a coin is tossed twice: P(X=0) = 1/4 (TT), P(X=1) = 1/2 (HT, TH), P(X=2) = 1/4 (HH), so E(X) = 1·(1/2) + 2·(1/4) = 1. Y counts the number of tails: E(Y) = 1 as well (by symmetry, switching the roles of H and T). However, P(XY=0) = 1/2 (HH, TT) and P(XY=1) = 1/2 (HT, TH), so E(XY) = 0·(1/2) + 1·(1/2) = 1/2 ≠ E(X)E(Y) = 1.
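The same computation, done directly over the four equally likely outcomes (a sketch of mine):

from fractions import Fraction
from itertools import product

omega = list(product("HT", repeat=2))    # HH, HT, TH, TT, each with prob 1/4
p = Fraction(1, 4)
X = lambda s: s.count("H")               # number of heads
Y = lambda s: s.count("T")               # number of tails

EX  = sum(X(s) * p for s in omega)       # 1
EY  = sum(Y(s) * p for s in omega)       # 1
EXY = sum(X(s) * Y(s) * p for s in omega)
assert EXY == Fraction(1, 2) != EX * EY  # E(XY) = 1/2, but E(X)E(Y) = 1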

5.3 Variance

Theorem: For independent random variables the variances add (proof in book):

E(X + Y) = E(X) + E(Y)    (always true)
V(X + Y) = V(X) + V(Y)    (X, Y independent)

Example:

1. We toss 2 coins, Xi(H) = 1, Xi(T) = 0. What is the STD of X = X1 + X2?

X1 and X2 are independent, so V(X1 + X2) = V(X1) + V(X2) = 2 V(X1). E(X1) = 1/2 and V(X1) = (0 - 1/2)^2 · (1/2) + (1 - 1/2)^2 · (1/2) = 1/4, so V(X) = 1/2 and STD(X) = sqrt(1/2) ≈ 0.707.
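The same numbers, checked by enumerating the four outcomes (illustrative sketch):

from fractions import Fraction
from itertools import product
from math import sqrt

omega = list(product((0, 1), repeat=2))  # (X1, X2) values, each outcome prob 1/4
p = Fraction(1, 4)
X = lambda s: s[0] + s[1]

EX = sum(X(s) * p for s in omega)                # 1
VX = sum((X(s) - EX) ** 2 * p for s in omega)    # 1/2
print(VX, sqrt(VX))                              # 1/2  0.7071...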

5.3 Variance

What is the variance of the number of successes when n independent Bernoulli trials are performed?

V(X) = V(X1 + ... + Xn) = n V(X1)
V(X1) = (0 - p)^2 · (1 - p) + (1 - p)^2 · p = p^2 (1 - p) + p (1 - p)^2 = p(1 - p)
V(X) = n p (1 - p)

(matlab demo)
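As a Python stand-in for that MATLAB demo (the parameters n = 100, p = 1/2 are chosen to match the next example):

import random

n, p, trials = 100, 0.5, 50_000
samples = [sum(random.random() < p for _ in range(n)) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print(var, n * p * (1 - p))              # both close to 25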

Example: What is the probability that with 100 Bernoulli trials we find more than 89 or fewer than 11 successes, when the probability of success is 1/2?

X counts the number of successes. E(X) = 100 · (1/2) = 50 and V(X) = 100 · (1/2) · (1/2) = 25.

By Chebyshev's inequality: P(|X - 50| >= 40) <= 25 / 40^2 = 1/64.
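For comparison, a short computation of the bound alongside the exact binomial tail probability (the exact-tail computation is my own addition, showing how loose the Chebyshev bound is here):

from fractions import Fraction
from math import comb

bound = Fraction(25, 40 ** 2)                        # Chebyshev: V(X)/a^2 = 1/64
tail  = sum(comb(100, k) for k in range(0, 11))      # X <= 10
tail += sum(comb(100, k) for k in range(90, 101))    # X >= 90
exact = tail / 2 ** 100
print(bound, float(bound), exact)                    # 1/64  0.015625  ~3e-17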