

M. PHIL. IN STATISTICAL SCIENCE
Thursday, 27 May, 2010 1:30 pm to 3:30 pm
INTRODUCTION TO PROBABILITY
Attempt no more than THREE questions.
There are FIVE questions in total.
The questions carry equal weight.
STATIONERY REQUIREMENTS: Cover sheet, Treasury Tag, Script paper
SPECIAL REQUIREMENTS: None
You may not start to read the questions
printed on the subsequent pages until
instructed to do so by the Invigilator.

(a) Let (Xn, n ≥ 0) be a sequence of random variables, and let X be a random variable. What does it mean to say that Xn converges to X in distribution? Show that if Xn and X take values in {0, 1, ...}, then this is equivalent to

P(Xn = k) → P(X = k)

as n → ∞, for every k ∈ {0, 1, ...}. Let (cn) be a sequence of nonnegative numbers such that cn → c > 0 as n → ∞. Let Xn have a binomial distribution with parameters (n, cn/n). Show that Xn converges to X in distribution, for some random variable X to be determined.
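The limit in part (a) is the classical Poisson approximation to the binomial. A minimal numerical sketch of the convergence (the value c = 2 and the choice n = 10 000 are illustrative, not part of the question):

```python
import math

def binom_pmf(n, p, k):
    """P(Bin(n, p) = k), computed directly from the definition."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(c, k):
    """P(Poisson(c) = k)."""
    return math.exp(-c) * c**k / math.factorial(k)

c = 2.0  # stands in for the limit of c_n; any c > 0 works
n = 10_000
for k in range(5):
    # As n grows, the Bin(n, c/n) pmf approaches the Poisson(c) pmf at every k.
    gap = abs(binom_pmf(n, c / n, k) - poisson_pmf(c, k))
    print(k, gap)  # the gaps are small and shrink as n grows
```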

(b) Let λ > 0, and suppose that N = ⌊λn⌋ balls are placed uniformly at random, independently from one another, in n urns labelled 1 through n. Let Zn(i) denote the total number of balls in urn number i when all balls have been placed. What is the distribution of Zn(i) for a fixed 1 ≤ i ≤ n? Explain briefly why the random variables (Zn(i))_{1 ≤ i ≤ n} cannot be independent. Find a random variable Z such that Zn(i) → Z in distribution as n → ∞.

(c) Deduce the following: let Wn denote the number of empty urns when all balls have been placed. Then, as n → ∞,

E(Wn) ∼ e^(−λ) n,

i.e., the ratio of the two sides converges to 1.
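Parts (b) and (c) can be sanity-checked by simulating the urn scheme. The parameters below (λ = 1.5, n = 500, 200 trials) are arbitrary Monte Carlo choices, not values from the question:

```python
import math
import random

def empty_urn_fraction(n, lam, trials=200, seed=0):
    """Monte Carlo estimate of E(W_n)/n for the urn scheme:
    N = floor(lam * n) balls dropped uniformly at random into n urns."""
    rng = random.Random(seed)
    N = int(lam * n)
    total_empty = 0
    for _ in range(trials):
        occupied = set(rng.randrange(n) for _ in range(N))
        total_empty += n - len(occupied)  # W_n for this trial
    return total_empty / (trials * n)

lam = 1.5
est = empty_urn_fraction(500, lam)
# Part (c) predicts E(W_n)/n -> e^(-lam); the estimate should be close.
print(est, math.exp(-lam))
```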


(a) State and give a proof of the Borel-Cantelli lemmas.

(b) Let X1, ..., Xn be random variables such that var(Xi) < ∞ for all 1 ≤ i ≤ n. Show that

var(X1 + ... + Xn) = ∑_{i=1}^{n} var(Xi) + 2 ∑_{1 ≤ i < j ≤ n} cov(Xi, Xj).

[Hint: it may help to introduce the random variable Yi = Xi − E(Xi).]

Hence deduce that var(X 1 +... + Xn) < ∞.
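The variance identity of part (b) is exact for population moments, so it can be verified on any concrete finite sample; the joint sample values below are arbitrary illustrative data:

```python
def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    """Population covariance of two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def var(xs):
    return cov(xs, xs)

# Three "random variables" represented by their joint sample values.
data = [
    [1.0, 2.0, 4.0, 3.0],   # X1
    [0.5, 1.5, 1.0, 2.0],   # X2
    [2.0, 0.0, 1.0, 3.0],   # X3
]
sums = [sum(col) for col in zip(*data)]  # pointwise X1 + X2 + X3
lhs = var(sums)
rhs = sum(var(x) for x in data) + 2 * sum(
    cov(data[i], data[j]) for i in range(3) for j in range(i + 1, 3)
)
print(lhs, rhs)  # the two sides agree exactly
```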

(c) Let (A1, A2, ...) be a sequence of events such that ∑_{i=1}^{∞} P(Ai) = ∞. We also assume that these events are pairwise independent, i.e., for every i ≠ j, Ai and Aj are independent. For k ≥ 1 define a random variable Nk by

Nk = ∑_{i=1}^{k} 1_{Ai}.

Let mk = E(Nk). Show that mk → ∞ as k → ∞ and that var(Nk) ≤ mk.

(d) Using Chebyshev’s inequality, show that P(Nk ≤ mk/2) → 0 as k → ∞. Conclude that the events Ai occur infinitely often almost surely. Explain briefly why this result is stronger than the second Borel-Cantelli lemma.
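A quick simulation illustrates the mechanism behind (c) and (d). It uses fully independent events (a special case of pairwise independence) with P(Ai) = 1/i, so that ∑ P(Ai) diverges while mk grows only logarithmically; the value k = 10 000 is an arbitrary choice:

```python
import math
import random

def simulate_Nk(k, seed=0):
    """One realisation of N_k = sum of indicators 1_{A_i},
    with the A_i independent and P(A_i) = 1/i (so sum P(A_i) diverges)."""
    rng = random.Random(seed)
    return sum(1 for i in range(1, k + 1) if rng.random() < 1 / i)

k = 10_000
mk = sum(1 / i for i in range(1, k + 1))  # m_k = k-th harmonic number -> infinity
n = simulate_Nk(k)
# Chebyshev with var(N_k) <= m_k forces N_k/m_k -> 1 in probability,
# so the realised N_k should sit within a few sqrt(m_k) of m_k.
print(n, mk)
```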


Let S = {1, 2, 3} and consider a Markov chain X = (X0, X1, ...) with values in S, defined by the transition matrix P given as follows:

P =

(a) Draw a diagram to represent the possible one-step transitions of X, including the transition probabilities. What happens if X0 = 3? Is X irreducible?

(b) Define a stopping time τ by τ = inf{n ≥ 0 : Xn = 3}. For n ≥ 0 and x ∈ S define αn(x) = Px(τ > n). Show that αn(2) = (1/3) αn−1(1), and deduce a recurrence relation for αn(1). Deduce that there exist A, B ∈ R and λ > μ such that αn(1) = A λ^n + B μ^n (you are not asked to compute A and B, but you should find the values of λ and μ). Conclude that

αn(1) ∼ A λ^n

as n → ∞, where A ∈ R is the same as above.

(c) Let S′ = {1, 2}. Using part (b) above, compute, for x, y ∈ S′,

q(x, y) = lim_{n→∞} Px(X1 = y | τ > n).

Show that, for any k ≥ 1 and x1, ..., xk ∈ S′,

lim_{n→∞} Px(X1 = x1, ..., Xk = xk | τ > n) = Px(Y1 = x1, ..., Yk = xk),

where Y = (Y0, Y1, ...) is the Markov chain on S′ with transition probabilities determined by q(x, y).

(d) Determine whether or not Y has an invariant distribution, and find the invariant distribution if it exists.
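The entries of the matrix P did not survive transcription above, so the conditioned chain q of part (c) cannot be reproduced here. As a generic illustration of part (d), this sketch finds the invariant distribution of an arbitrary two-state chain by solving πq = π with π1 + π2 = 1; the probabilities a = 0.25 and b = 0.5 are placeholders, not values from the question:

```python
def invariant_2state(a, b):
    """Invariant distribution (pi1, pi2) of the two-state chain
        q = [[1 - a, a],
             [b, 1 - b]],
    obtained by solving pi q = pi with pi1 + pi2 = 1. Requires a + b > 0."""
    return (b / (a + b), a / (a + b))

# Placeholder transition probabilities, NOT taken from the exam question.
a, b = 0.25, 0.5
pi = invariant_2state(a, b)
print(pi)  # -> approximately (2/3, 1/3)

# Verify the first component of pi q = pi.
assert abs(pi[0] * (1 - a) + pi[1] * b - pi[0]) < 1e-12
```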
