Probability for Communication Engineering, Lecture notes of Communications Engineering

This document is for beginners who want to learn probability.

Probability

  • Probability is a measure of how likely an event is to occur.
  • For example:
    • Today there is a 60% chance of rain.
    • The odds of winning the lottery are a million to one.
  • Probability of an event = p, where 0 <= p <= 1
    • 0 = certain non-occurrence
    • 1 = certain occurrence
    • 0.5 = even odds
    • 0.1 = 1 chance out of 10

Probability varies between 0 and 1

  • If an event will NEVER happen, the probability of the event is 0 or 0% (0 indicates impossibility).
  • If an event is certain to happen, the probability of the event is 1 or 100% (1 indicates certainty).
  • If an event is just as likely to happen as not to happen, the probability of the event is ½ = 0.5 = 50%.
  • The higher the probability of an event, the more likely the event is to occur.
  • Each outcome has a probability; the sample space consists of all possible outcomes, and the probabilities of all outcomes in the sample space sum to 1.

Probability scale: Impossible (0 = 0%), Unlikely, Equal chances (½ = 0.5 = 50%), Likely, Certain (1 = 100%)

  • The probability of an event is written as:

    P(event) = (number of ways the event can occur) / (total number of outcomes)

  • An outcome (X) is a possible result of a probability experiment.
  • When rolling a number cube, the possible outcomes are 1, 2, 3, 4, 5, and 6.
  • When rolling a number cube, the event "roll an even number" contains 3 outcomes (you could roll a 2, 4 or 6).

What is the probability of getting heads when flipping a coin?

P(heads) = (1 head on a coin) / (2 sides to a coin) = ½ = 0.5 = 50%
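As a quick illustration of the counting formula above, here is a minimal Python sketch (the helper name classical_probability is my own, not from the notes) that computes P(event) as favourable outcomes over total outcomes using exact fractions.

```python
from fractions import Fraction

def classical_probability(favorable, total):
    """P(event) = favorable outcomes / total outcomes, as an exact fraction."""
    return Fraction(favorable, total)

# Flipping a fair coin: 1 favorable outcome (heads) out of 2 sides.
p_heads = classical_probability(1, 2)
print(p_heads, float(p_heads))   # 1/2 0.5

# Rolling a number cube: 3 even faces (2, 4, 6) out of 6 faces.
p_even = classical_probability(3, 6)
print(p_even, float(p_even))     # 1/2 0.5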

TRY THESE:

[Spinner figures: one spinner divided into four parts A, B, C, D; one divided into three parts labeled 1, 2, 3; one divided into three areas A, B, C.]

  1. What is the probability that the spinner will stop on part A?

  2. What is the probability that the spinner will stop on
     (a) an even number?
     (b) an odd number?

  3. What is the probability that the spinner will stop in the area marked A?

Answers:

P(A) = 1/4 = 25%
P(even) = 1/3
P(area A) = 1/3

Probability Word Problem:

  • Lawrence is the captain of his track team. The team is deciding on a color, and all eight members wrote their choice down on equal-size cards. If Lawrence picks one card at random, what is the probability that he will pick blue?

Cards: yellow, red, blue, blue, blue, green, black, black

Number of blues = 3
Total cards = 8

P(blue) = 3/8 = 0.375 = 37.5%
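A tiny Python check of this count (a sketch, with the card colours taken from the problem):

```python
from fractions import Fraction

cards = ["yellow", "red", "blue", "blue", "blue", "green", "black", "black"]

# P(blue) = number of blue cards / total cards
p_blue = Fraction(cards.count("blue"), len(cards))
print(p_blue, float(p_blue))   # 3/8 0.375
```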

Let's Work These Together

  • Donald is rolling a number cube labeled 1 to 6. What is the probability of the following?

a.) an odd number
    odd numbers: 1, 3, 5
    total numbers: 1, 2, 3, 4, 5, 6
    P(odd) = 3/6 = ½ = 0.5 = 50%

b.) a number greater than 5
    numbers greater than 5: 6
    total numbers: 1, 2, 3, 4, 5, 6
    P(>5) = 1/6 ≈ 0.166 = 16.6%

TRY THESE:

[Spinner figure: a spinner divided into four equal sections numbered 1, 2, 3, 4.]

  1. What is the probability of spinning a number greater than 1?

  2. What is the probability that a spinner with five congruent sections numbered 1-5 will stop on an even number?

  3. What is the probability of rolling a multiple of 2 with one toss of a number cube?

Answers:

P(>1) = 3/4 = 75%
P(even) = 2/5 = 40%
P(multiple of 2) = 3/6 = 50%

Probability Properties

X is an outcome of an event.

  • A simple example is the tossing of a coin. The two outcomes ("heads" and "tails") are equally probable; the probability of "heads" equals the probability of "tails", and since no other outcomes are possible, the probability of either "heads" or "tails" is 1/2 (which can also be written as 0.5 or 50%).

Ex: A coin is tossed; find the probability of the outcome "heads".
P(X) = 1/2

Ex: A die is thrown; find the probability of each outcome.
Probability(X) = desired outcomes / total no. of outcomes
P = 1/6

Ex: A coin is tossed twice; find the probability of each outcome.
The possible outcomes are "head-head", "head-tail", "tail-head", and "tail-tail".
Sample space S = {HH, HT, TH, TT}
The probability of getting the outcome "head-head" is 1 out of 4 outcomes, or 1/4 = 0.25 (25%).

Suppose X = number of heads:

X      2     1     0
P(x)   1/4   1/2   1/4

Total probability = 1/4 + 1/2 + 1/4 = 1

Probability of Event A

The probability of an event A is written as P(A).

Probability of not Event A

The opposite or complement of an event A is the event [not A] (that is, the event of A not occurring), often denoted A′ or Aᶜ; its probability is given by

P(not A) = 1 − P(A)

Probability of Event A and not event A

Since A and [not A] cannot both occur, P(A and not A) = 0, while A or [not A] is certain, so P(A or not A) = 1.

Ex: A coin is tossed three times. Write the sample space and find the probability of each number of heads.

S = {HHH, HHT, HTH, THH, HTT, TTH, THT, TTT}

Let X be the random variable X = count of heads when a coin is tossed three times.

  • X = 3 heads: one outcome (HHH)
  • X = 2 heads: three outcomes (HHT, HTH, THH)
  • X = 1 head: three outcomes (HTT, TTH, THT)
  • X = 0 heads: one outcome (TTT)

For a single outcome, use the intersection (multiply the probabilities of the individual tosses):
P(HHH) = (1/2 x 1/2 x 1/2) = 1/8

For multiple outcomes, e.g. X = 2 heads, first use the intersection for each outcome and then the union (add them):
P(X = 2) = (1/2 x 1/2 x 1/2) + (1/2 x 1/2 x 1/2) + (1/2 x 1/2 x 1/2) = 3/8

X = count of heads   3     2     1     0
P(x)                 1/8   3/8   3/8   1/8

The probabilities in the distribution must sum to 1: 1/8 + 3/8 + 3/8 + 1/8 = 1.
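To make the counting above concrete, here is a small Python sketch (not from the notes) that enumerates the eight equally likely outcomes of three tosses and tallies the distribution of X = count of heads.

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# Enumerate all 2^3 = 8 equally likely outcomes of three coin tosses.
outcomes = list(product("HT", repeat=3))

# X = count of heads in each outcome.
counts = Counter(o.count("H") for o in outcomes)

# Each outcome has probability (1/2)^3 = 1/8.
distribution = {x: Fraction(n, len(outcomes)) for x, n in counts.items()}

for x in sorted(distribution, reverse=True):
    print(x, distribution[x])       # 3 1/8, 2 3/8, 1 3/8, 0 1/8
print(sum(distribution.values()))   # 1  (the distribution must sum to 1)
```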

Difference of random variables

X = difference between the number of heads and the number of tails, e.g. when two coins are tossed.

Probability distribution table for the difference, with sample space S = {HH, HT, TH, TT}:

  • X(HH, TT) = 2, i.e. P(X = 2) = (1/2 x 1/2) + (1/2 x 1/2) = 1/2
  • X(HT, TH) = 0, i.e. P(X = 0) = (1/2 x 1/2) + (1/2 x 1/2) = 1/2

Difference X   2     0     Sum
P(x)           1/2   1/2   1

Probability distribution table

Ex: If the outcome of a coin toss is Head, 4 bits are sent per second; if the outcome is Tail, no bits are sent through the Tx line. When three coins are tossed, (i) find the bits sent for each type of outcome in the sample space and (ii) find the probability distribution.

Possibilities     No. of H   No. of T   Bits sent X (4 per head, 0 per tail)
HHH               3          0          12
HHT, THH, HTH     2          1          8
HTT, THT, TTH     1          2          4
TTT               0          3          0

P(HHH) = (1/2 x 1/2 x 1/2) = 1/8
P(HHT, THH, HTH) = (1/2 x 1/2 x 1/2) + (1/2 x 1/2 x 1/2) + (1/2 x 1/2 x 1/2) = 3/8
P(HTT, THT, TTH) = (1/2 x 1/2 x 1/2) + (1/2 x 1/2 x 1/2) + (1/2 x 1/2 x 1/2) = 3/8
P(TTT) = (1/2 x 1/2 x 1/2) = 1/8

Probability Distribution

X      12    8     4     0
P(x)   1/8   3/8   3/8   1/8

Total = 1/8 + 3/8 + 3/8 + 1/8 = 1
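A hedged Python sketch of the same mapping (the variable names are mine): each head contributes 4 bits, each tail contributes none, and the distribution of X follows from the three-toss sample space.

```python
from fractions import Fraction
from itertools import product
from collections import defaultdict

BITS_PER_HEAD = 4   # from the example: a head sends 4 bits, a tail sends none

# Probability of each bits-sent value X over all equally likely three-toss outcomes.
dist = defaultdict(Fraction)
outcomes = list(product("HT", repeat=3))
for o in outcomes:
    bits = BITS_PER_HEAD * o.count("H")
    dist[bits] += Fraction(1, len(outcomes))

for x in sorted(dist, reverse=True):
    print(x, dist[x])         # 12 1/8, 8 3/8, 4 3/8, 0 1/8
print(sum(dist.values()))     # 1
```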

Mean of random variables

X      x1   x2   x3   x4
P(x)   P1   P2   P3   P4

For a discrete random variable, the expected value (also called the mean of the probability distribution, or the mean of the random variable) is

Expected Value E(x) = X1 P1 + X2 P2 + X3 P3 + X4 P4 + ... + Xn Pn

mean (μ) = E(x) = Σ xi P(xi), summed over all xi

The probability distribution of a discrete random variable x, for example:

x      0      1      2
P(x)   0.16   0.48   0.36

mean (μ) = E(x) = 0 x 0.16 + 1 x 0.48 + 2 x 0.36 = 1.2
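A one-line Python check of this example (a sketch):

```python
xs = [0, 1, 2]
ps = [0.16, 0.48, 0.36]

# mean (mu) = E(x) = sum of x * P(x)
mean = sum(x * p for x, p in zip(xs, ps))
print(round(mean, 2))   # 1.2
```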

Ex: X is the real number associated with the sample space when one die is thrown.

X      1     2     3     4     5     6
P(x)   1/6   1/6   1/6   1/6   1/6   1/6
X²     1     4     9     16    25    36

E(X) = 1/6 + 2/6 + 3/6 + 4/6 + 5/6 + 6/6 = 21/6 = 3.5

Variance and Standard Deviation

Variance = E(x²) − (E(x))²
Standard deviation (σ) = √Variance

For the sample space S = {1, 2, 3, 4, 5, 6}:

E(x) = 21/6 = 3.5
E(x²) = 1/6 + 4/6 + 9/6 + 16/6 + 25/6 + 36/6 = 91/6 = 15.17
Variance = E(x²) − (E(x))² = 15.17 − 12.25 = 2.92
Standard deviation (σ) = √2.92 ≈ 1.71
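A short Python check of these numbers (a sketch, using exact fractions and then a float for the square root):

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)                       # each face of a fair die is equally likely

e_x  = sum(x * p for x in faces)         # E(X)   = 21/6 = 7/2
e_x2 = sum(x * x * p for x in faces)     # E(X^2) = 91/6
variance = e_x2 - e_x ** 2               # E(X^2) - (E(X))^2 = 35/12 (about 2.92)
std_dev = float(variance) ** 0.5         # about 1.71

print(e_x, e_x2, variance, round(std_dev, 2))   # 7/2 91/6 35/12 1.71
```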

Ex: Two dice are thrown at a time. If the outcome x is the number of sixes, write the probability distribution and find the mean, variance, and standard deviation.

Die 1    Die 2    X = No. of sixes
6        6        2
6        not 6    1
not 6    6        1
not 6    not 6    0

P(6) = 1/6, P(not 6) = 5/6
P(6, 6) = 1/6 x 1/6 = 1/36
P(6, not 6) = 1/6 x 5/6 = 5/36
P(one 6 and one not-6, in either order) = (1/6 x 5/6) + (5/6 x 1/6) = 5/36 + 5/36 = 10/36
P(not 6, not 6) = 5/6 x 5/6 = 25/36

X      0       1       2
P(x)   25/36   10/36   1/36
X²     0       1       4

E(x) = (10/36) + (2 x 1/36) = 12/36 = 1/3
E(x²) = (10/36) + (4 x 1/36) = 14/36
Variance(x) = E(x²) − (E(x))² = 14/36 − 1/9 = (14 − 4)/36 = 10/36 ≈ 0.278
Standard deviation (σ) = √0.278 ≈ 0.527
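The same distribution can be checked by brute-force enumeration of the 36 pairs; a minimal Python sketch:

```python
from fractions import Fraction
from collections import Counter
from itertools import product

# Count the number of sixes over all 36 equally likely (die1, die2) pairs.
counts = Counter((d1 == 6) + (d2 == 6) for d1, d2 in product(range(1, 7), repeat=2))
dist = {x: Fraction(n, 36) for x, n in sorted(counts.items())}
print(dist)   # X=0: 25/36, X=1: 10/36 (=5/18), X=2: 1/36

mean = sum(x * p for x, p in dist.items())               # 1/3
var = sum(x * x * p for x, p in dist.items()) - mean**2  # 10/36 (=5/18)
print(mean, var, float(var) ** 0.5)                      # 1/3 5/18 0.527...
```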

The larger the variance, the less stable the discrete random variable is.

Probability Distributions

Probability distributions are of two types:

  1. Cumulative Distribution Function (CDF)
  2. Probability Density (Distribution) Function (PDF)

Continuous random variables: the random variable takes values over a range, e.g. Y = 2, 2.4, 2.8, ...

  • For a uniform distribution, the PDF is constant over the interval (a, b).
  • The corresponding CDF is continuously increasing; we get a ramp function.

CDF: the probability of each previous outcome is added to the probability of the next outcome; this running total is the Cumulative Distribution Function (CDF), symbol Fx(x). A CDF is non-decreasing.

PDF: the function that gives the probability (density) for every value of the random variable is the Probability Density Function (in the discrete case, the Probability Distribution Function), symbol fx(x).

Cumulative Distribution Function (example: one fair die)

F(1) = P(X ≤ 1) = 1/6
F(2) = P(X ≤ 2) = 1/6 + 1/6 = 2/6
F(3) = P(X ≤ 3) = 1/6 + 1/6 + 1/6 = 3/6

Probability density (distribution) Function:

Two cases arise when working with a PDF:
(1) finding the CDF when there is only one function in the PDF
(2) finding the CDF when there is more than one function (a piecewise PDF)
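As an illustration of the running-total idea (a sketch, not from the notes), the discrete CDF of a fair die can be built by accumulating the PMF:

```python
from fractions import Fraction
from itertools import accumulate

# PMF of a fair die: each face has probability 1/6.
faces = list(range(1, 7))
pmf = [Fraction(1, 6)] * 6

# CDF F(x) = P(X <= x) is the running (cumulative) sum of the PMF.
cdf = list(accumulate(pmf))

for x, F in zip(faces, cdf):
    print(f"F({x}) = {F}")   # F(1)=1/6, F(2)=1/3, F(3)=1/2, ..., F(6)=1
```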

Conditional probability

Conditional probability is the probability of some event A, given that another event B has occurred.

Conditional probability is written P(A|B), or P_B(A), and is read "the probability of A, given B".

P(A|B) = P(A ∩ B) / P(B)

If P(A|B) = P(A), then events A and B are said to be independent.

Also, in general, P(A|B) (the conditional probability of A given B) is not equal to P(B|A).

Example on conditional probability

Ex: Suppose that somebody secretly rolls two fair six-sided dice, and we must predict the outcome (i.e. the sum of the two upward faces).
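The notes set up this two-dice example without finishing it; here is a hedged Python sketch of one possible version, where the specific events (A: the sum is 8, B: the first die shows a 3) are my own choice for illustration.

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely (d1, d2) pairs

A = {r for r in rolls if sum(r) == 8}          # event A: the sum of the faces is 8
B = {r for r in rolls if r[0] == 3}            # event B: the first die shows 3

def prob(event):
    return Fraction(len(event), len(rolls))

# P(A|B) = P(A and B) / P(B)
p_A_given_B = prob(A & B) / prob(B)
print(prob(A), p_A_given_B)   # P(A) = 5/36, P(A|B) = 1/6 -> knowing B changes the probability

# In general P(A|B) != P(B|A):
p_B_given_A = prob(A & B) / prob(A)
print(p_B_given_A)            # 1/5
```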

  • The moments of a distribution are a set of parameters.
  • A moment is defined by two values, c and n (the general form is E[(X − c)^n]); depending on the values of c and n, we can calculate various quantities that are essential in statistics and are used in communication.
  • If c = 0, the moments are taken with respect to the value 0, i.e. we are talking about the raw or absolute moments.
  • If c = μ, the moments are taken with respect to the population mean, i.e. we are talking about the central moments.
  • n = 0 gives the 0th moment, n = 1 the 1st moment, n = 2 the 2nd moment, and so on.

Moments and Expectation: Discrete Random Variable

c = 0 and n = 1: 1st (absolute) moment:  E[X] = Σ x p(x)
c = 0 and n = 2: 2nd (absolute) moment:  E[X²] = Σ x² p(x)
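A small Python helper (my own naming, following the c and n parametrisation described above) that evaluates E[(X − c)^n] for a discrete distribution; with c = 0 it gives the raw moments, with c = μ the central moments.

```python
from fractions import Fraction

def moment(xs, ps, n, c=0):
    """n-th moment of a discrete random variable about the point c: E[(X - c)^n]."""
    return sum(p * (x - c) ** n for x, p in zip(xs, ps))

# Fair die: raw moments (c = 0) ...
xs = list(range(1, 7))
ps = [Fraction(1, 6)] * 6
m1 = moment(xs, ps, 1)           # E[X]   = 7/2
m2 = moment(xs, ps, 2)           # E[X^2] = 91/6

# ... and the 2nd central moment (c = mean) is the variance.
variance = moment(xs, ps, 2, c=m1)
print(m1, m2, variance)          # 7/2 91/6 35/12
```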

Moments and Expectation: Continuous Random Variable

c = 0 and n = 1: 1st (absolute) moment:  E[X] = ∫ x f(x) dx   (integral over −∞ to +∞)
c = 0 and n = 2: 2nd (absolute) moment:  E[X²] = ∫ x² f(x) dx

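For a concrete (hypothetical) case, here is a sketch that approximates these integrals numerically for a uniform PDF f(x) = 1/(b − a) on (a, b), the constant-PDF case mentioned earlier; the simple midpoint-rule integrator is my own stand-in, not a routine from the notes.

```python
def integrate(f, a, b, steps=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

a, b = 2.0, 4.0
pdf = lambda x: 1.0 / (b - a)                     # uniform PDF, constant on (a, b)

e_x  = integrate(lambda x: x * pdf(x), a, b)      # E[X]   -> (a + b)/2 = 3.0
e_x2 = integrate(lambda x: x * x * pdf(x), a, b)  # E[X^2] -> about 9.333
variance = e_x2 - e_x ** 2                        # (b - a)^2 / 12, about 0.333

print(round(e_x, 3), round(e_x2, 3), round(variance, 3))
```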
Ex: Expectation E(X)

X = number of workouts; the probability of each number of workouts in a week is given as a table. Find the expected value E(x), or mean (μ), for the given data.

E(X) = μ = 2.

1st moment:  E[X] = Σ xi p(xi)
2nd moment:  E[X²] = Σ xi² p(xi)

Ex: Variance(X) for a 0/1 (Bernoulli) random variable with P(X = 1) = p:

E[X]  = (1 − p)·0 + p·1 = p
E[X²] = (1 − p)·0² + p·1² = p

Var(X) = E[X²] − (E[X])²
       = p − p²
       = p(1 − p)

Check: Var(X) = E[X²] − (E[X])² = p − p² = p(1 − p).
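A quick numerical check of the p(1 − p) result (a sketch; the values of p are arbitrary):

```python
from fractions import Fraction

def bernoulli_stats(p):
    """Mean and variance of a 0/1 random variable with P(X=1) = p, from the moment formulas."""
    e_x  = (1 - p) * 0 + p * 1     # first raw moment
    e_x2 = (1 - p) * 0 + p * 1     # second raw moment (0^2 = 0, 1^2 = 1)
    return e_x, e_x2 - e_x ** 2    # mean, variance = E[X^2] - (E[X])^2

for p in (Fraction(1, 2), Fraction(1, 6), Fraction(9, 10)):
    mean, var = bernoulli_stats(p)
    print(p, mean, var, p * (1 - p))   # the variance always matches p(1 - p)
```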

Law of large numbers

X1, X2, ... is an infinite sequence of i.i.d. random variables. The sample average converges to the expected value.

Sample average: X̄n = (X1 + X2 + ... + Xn) / n

Mean of the sample average:
E[X̄n] = E[(X1 + X2 + ... + Xn) / n] = (E[X1] + E[X2] + ... + E[Xn]) / n = nμ / n = μ

Variance of the sample average:
Var(X̄n) = Var((X1 + X2 + ... + Xn) / n) = (Var(X1) + Var(X2) + ... + Var(Xn)) / n² = nσ² / n² = σ² / n

The Weak Law of Large Numbers states that (this follows from Chebyshev's inequality):
P(|X̄n − μ| ≥ ε) ≤ Var(X̄n) / ε² = σ² / (n ε²), which tends to 0 as n → ∞.

Two different versions of the law of large numbers are described; they are called the strong law of large numbers and the weak law of large numbers. Stated for the case where X1, X2, ... is an infinite sequence of i.i.d. Lebesgue integrable random variables with expected value E(X1) = E(X2) = ... = μ, both versions of the law state that, with virtual certainty, the sample average X̄n converges to μ:

P(|X̄n − μ| > ε) → 0 as n → ∞, for every ε > 0.
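A hedged simulation sketch of the law in action (die rolling is my own choice of example): as n grows, the running sample average of i.i.d. die rolls settles near the expected value μ = 3.5, and its spread shrinks like σ²/n.

```python
import random

random.seed(0)        # reproducible demo

mu = 3.5              # expected value of a fair die
for n in (10, 1_000, 100_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    sample_avg = sum(rolls) / n
    print(n, round(sample_avg, 3), round(abs(sample_avg - mu), 3))
    # The deviation |sample average - mu| tends to shrink as n grows,
    # in line with Var(sample average) = sigma^2 / n.
```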