Discrete Mathematics lecture slides. Main points: Bayes' Theorem, Discrete Probability, Conditional Probability, Expected Values, Random Variable, Binomial Distribution, Expected Number, Pair of Dice, Geometric Distribution, Normalization, Variance.
We have seen that the following holds:
$$P(A \mid B)\,P(B) = P(A \cap B) = P(B \mid A)\,P(A)$$

We can write one conditional probability in terms of the other (Bayes' Theorem):

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$
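A minimal Python sketch of this formula (not part of the original slides); the prior and the two likelihoods below are hypothetical numbers chosen only for illustration, and P(B) is expanded by the law of total probability:

```python
# Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B), with P(B) expanded over A and not-A.
def bayes(p_a, p_b_given_a, p_b_given_not_a):
    """Return P(A|B) from the prior P(A) and the likelihoods P(B|A), P(B|not A)."""
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)  # total probability
    return p_b_given_a * p_a / p_b

# Hypothetical values: P(A) = 0.3, P(B|A) = 0.8, P(B|not A) = 0.1
print(bayes(0.3, 0.8, 0.1))  # P(A|B) is approximately 0.774
```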
The definition of the expected value of a random variable is:

$$E(X) = \sum_{s \in S} P(s)\, X(s)$$

This is equivalent to:

$$E(X) = \sum_{r \in X(S)} P(X = r)\, r$$
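A short Python sketch (my own, not from the slides) that evaluates both forms of the definition for a fair die, where the random variable X(s) is the face value:

```python
from fractions import Fraction

# Sample space of a fair die: each outcome s has probability 1/6.
S = [1, 2, 3, 4, 5, 6]
P = {s: Fraction(1, 6) for s in S}

def X(s):
    return s  # the random variable is just the face value

# E(X) = sum over outcomes s of P(s) * X(s)
E1 = sum(P[s] * X(s) for s in S)

# Equivalent form: E(X) = sum over values r of P(X = r) * r
values = set(X(s) for s in S)
E2 = sum(r * sum(P[s] for s in S if X(s) == r) for r in values)

print(E1, E2)  # both 7/2
```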
Example: What is the expected number of heads if we toss a fair coin n times?
We know that the distribution for this experiment is the Binomial distribution:
$$P(k, n; p) = \frac{n!}{k!\,(n-k)!}\, p^k (1-p)^{n-k}$$
Therefore we need to compute:
$$E(X) = \sum_{k=0}^{n} k\, P(X = k) = \sum_{k=0}^{n} k\, \frac{n!}{k!\,(n-k)!}\, p^k (1-p)^{n-k} = np$$
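As a numerical check (my own sketch, not from the slides), the sum of k·P(X = k) can be computed directly and compared with np; the values n = 10 and p = 0.5 below are arbitrary:

```python
from math import comb

def binomial_expectation(n, p):
    """E(X) = sum_k k * C(n,k) * p^k * (1-p)^(n-k); should equal n*p."""
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

n, p = 10, 0.5
print(binomial_expectation(n, p), n * p)  # both 5.0
```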
More examples: A coat-check attendant mixed the labels up randomly. When someone collects their coat, the attendant returns a coat chosen at random from the remaining coats. What is the expected number of correctly returned coats? There are n coats checked in.
Let Xi = 1 if coat i is correctly returned, and 0 if wrongly returned. Since the labels are randomly permuted, E(Xi) = 1/n. By linearity of expectation, E(X1 + ... + Xn) = n · (1/n) = 1, independent of the number of checked-in coats.
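A simulation sketch of this (my own, not from the slides): the number of correctly returned coats is the number of fixed points of a random permutation, and its average should come out close to 1 for any n:

```python
import random

def correctly_returned(n):
    """Number of fixed points of a random permutation of n coats."""
    perm = list(range(n))
    random.shuffle(perm)
    return sum(1 for i, j in enumerate(perm) if i == j)

n, trials = 20, 100_000
avg = sum(correctly_returned(n) for _ in range(trials)) / trials
print(avg)  # close to 1, independent of n
```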
Q: What is the distribution of waiting times until a tail comes up, when we toss a fair coin?
A: Possible outcomes: T, HT, HHT, HHHT, HHHHT, ... (infinitely many possibilities). With p the probability of a tail on a single toss: P(T) = p, P(HT) = (1-p)p, P(HHT) = (1-p)^2 p, ...
This is the geometric distribution:

$$P(X = k) = (1-p)^{k-1}\, p, \qquad k = 1, 2, 3, \ldots$$

Normalization:

$$\sum_{k=1}^{\infty} P(X = k) = \sum_{k=1}^{\infty} (1-p)^{k-1}\, p = \frac{p}{1 - (1-p)} = 1$$
(matlab) X(s) = number of tosses up to and including the first tail (the "success").
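In place of the matlab demo, a rough Python sketch (my own) that samples the waiting time and compares empirical frequencies with the geometric probabilities; the comparison with the mean 1/p is a standard fact about the geometric distribution, not derived on these slides:

```python
import random

def waiting_time(p=0.5):
    """Sample the number of tosses up to and including the first tail."""
    k = 1
    while random.random() >= p:  # a head comes up with probability 1 - p
        k += 1
    return k

trials = 100_000
samples = [waiting_time() for _ in range(trials)]
print(sum(samples) / trials)      # close to 1/p = 2
print(samples.count(3) / trials)  # close to (1-p)^2 * p = 0.125
```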
Definition: Two random variables X(s) and Y(s) on a sample space S are independent if the following holds:

$$\forall r_1, r_2: \quad P\big(X(s) = r_1 \wedge Y(s) = r_2\big) = P\big(X(s) = r_1\big)\, P\big(Y(s) = r_2\big)$$
Examples
A pair of dice is rolled. X1 is the value of the first die, X2 the value of the second die. Are these independent? P(X1 = r1) = 1/6, P(X2 = r2) = 1/6, and P(X1 = r1 AND X2 = r2) = 1/36 = P(X1 = r1) P(X2 = r2): YES, independent.
Are X1 and X = X1 + X2 independent? P(X = 12) = 1/36, P(X1 = 1) = 1/6, but P(X = 12 AND X1 = 1) = 0, which is not the product P(X = 12) P(X1 = 1) = 1/216: NOT independent.
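A brute-force check of both claims (my own sketch) that simply enumerates all 36 equally likely outcomes:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of rolling a pair of dice.
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability that the event (a predicate on an outcome) occurs."""
    return Fraction(sum(1 for o in outcomes if event(o)), 36)

# X1 and X2 are independent: the joint probability factors for every r1, r2.
independent = all(
    prob(lambda o: o[0] == r1 and o[1] == r2)
    == prob(lambda o: o[0] == r1) * prob(lambda o: o[1] == r2)
    for r1 in range(1, 7) for r2 in range(1, 7)
)
print(independent)  # True

# X1 and X = X1 + X2 are not independent:
print(prob(lambda o: sum(o) == 12 and o[0] == 1))                # 0
print(prob(lambda o: sum(o) == 12) * prob(lambda o: o[0] == 1))  # 1/216
```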
Theorem: If two random variables X and Y are independent over a sample space S then: E(XY)=E(X) E(Y). (proof, read book)
Note 1: The converse is not true: two random variables do not have to be independent for E(XY) = E(X)E(Y) to hold. Note 2: If two random variables are not independent, E(XY) does not have to equal E(X)E(Y), although it might still happen to.
Example: X counts the number of heads when a coin is tossed twice: P(X = 0) = 1/4 (TT), P(X = 1) = 1/2 (HT, TH), P(X = 2) = 1/4 (HH), so E(X) = 1·(1/2) + 2·(1/4) = 1. Y counts the number of tails: E(Y) = 1 as well (symmetry: switch the roles of H and T). However, P(XY = 0) = 1/2 (HH, TT) and P(XY = 1) = 1/2 (HT, TH), so E(XY) = 0·(1/2) + 1·(1/2) = 1/2, which is not equal to E(X)E(Y) = 1.
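The same numbers can be reproduced by enumerating the four outcomes; a small sketch of that check (not from the slides):

```python
from fractions import Fraction
from itertools import product

# Two tosses of a fair coin; X counts heads, Y counts tails.
outcomes = list(product("HT", repeat=2))  # HH, HT, TH, TT, each with probability 1/4
p = Fraction(1, 4)

E_X  = sum(p * o.count("H") for o in outcomes)                 # 1
E_Y  = sum(p * o.count("T") for o in outcomes)                 # 1
E_XY = sum(p * o.count("H") * o.count("T") for o in outcomes)  # 1/2

# E(XY) = 1/2 != E(X)*E(Y) = 1, so X and Y cannot be independent.
print(E_X, E_Y, E_XY)
```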
Theorem: For independent random variables the variances add (proof in book):

$$E(X + Y) = E(X) + E(Y) \quad \text{(always true)}$$
$$V(X + Y) = V(X) + V(Y) \quad \text{($X$, $Y$ independent)}$$
Example (here X1 and X2 can be read as the indicators of heads on two tosses of a fair coin, with X = X1 + X2):
X1 and X2 are independent, so V(X1 + X2) = V(X1) + V(X2) = 2V(X1). E(X1) = 1/2, V(X1) = (0 - 1/2)^2 · (1/2) + (1 - 1/2)^2 · (1/2) = 1/4, so V(X) = 1/2 and STD(X) = sqrt(1/2).
What is the variance of the number of successes when n independent Bernoulli trials are performed?
V(X) = V(X1 + ... + Xn) = nV(X1).
V(X1) = (0 - p)^2 · (1 - p) + (1 - p)^2 · p = p^2(1 - p) + p(1 - p)^2 = p(1 - p), so V(X) = np(1 - p).
(matlab demo)
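Alongside the matlab demo referred to above, a rough Python simulation (my own sketch) that estimates the variance empirically and compares it with np(1 - p); the values of n, p and the number of trials are arbitrary:

```python
import random

def successes(n, p):
    """Number of successes in n Bernoulli(p) trials."""
    return sum(1 for _ in range(n) if random.random() < p)

n, p, trials = 100, 0.3, 50_000
samples = [successes(n, p) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print(var, n * p * (1 - p))  # both close to 21
```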
Example: What is the probability that with 100 Bernoulli trials we find more than 89 or fewer than 11 successes, when the probability of success is 1/2?
X counts the number of successes: E(X) = 100 · (1/2) = 50 and V(X) = 100 · (1/2) · (1/2) = 25.
By Chebyshev's inequality, P(|X - 50| >= 40) <= 25/40^2 = 1/64.
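Chebyshev gives only an upper bound; a short sketch (my own, not from the slides) comparing it with the exact binomial tail probability:

```python
from math import comb

# Exact tail probability P(|X - 50| >= 40) for X ~ Binomial(100, 1/2),
# compared with the Chebyshev bound V(X)/40^2 = 25/1600 = 1/64.
n, p = 100, 0.5
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k)
            for k in range(n + 1) if abs(k - n * p) >= 40)
print(exact)       # about 3e-17, far smaller than the bound
print(25 / 40**2)  # 0.015625 = 1/64
```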