



Introduction to Probability
Thursday, 27 May, 2010 1:30 pm to 3:30 pm
Attempt no more than THREE questions. There are FIVE questions in total. The questions carry equal weight.
STATIONERY REQUIREMENTS: Cover sheet, Treasury Tag, Script paper.
SPECIAL REQUIREMENTS: None.
You may not start to read the questions printed on the subsequent pages until instructed to do so by the Invigilator.
(a) Let (Xn, n ≥ 0) be a sequence of random variables, and let X be a random variable. What does it mean to say that Xn converges to X in distribution? Show that if Xn and X take values in {0, 1, ...}, then this is equivalent to

P(Xn = k) → P(X = k)

as n → ∞, for every k ∈ {0, 1, ...}. Let cn be a sequence of nonnegative numbers such that cn → c > 0 as n → ∞. Let Xn have a binomial distribution with parameters (n, cn/n). Show that Xn converges to X in distribution, for some random variable X to be determined.
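[As an aside, not part of the exam text: the limit in part (a) is the Poisson distribution with mean c. A quick numerical check of the pmf convergence might look like the following sketch, where taking cn = c for every n is an illustrative assumption, one admissible choice of the sequence in the question.]

# Illustrative sketch only: the Binomial(n, c_n/n) pmf approaches the
# Poisson(c) pmf as n grows; here c_n = c is held constant.
import math

c = 2.0
for n in (10, 100, 10000):
    p = c / n
    for k in range(5):
        binom_pmf = math.comb(n, k) * p**k * (1 - p)**(n - k)
        poisson_pmf = math.exp(-c) * c**k / math.factorial(k)
        print(n, k, round(binom_pmf, 6), round(poisson_pmf, 6))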
(b) Let λ > 0, and suppose that N = ⌊λn⌋ balls are placed uniformly at random, independently from one another, in n urns labelled 1 through n. Let Zn(i) denote the total number of balls in urn number i when all balls have been placed. What is the distribution of Zn(i) for a fixed 1 ≤ i ≤ n? Explain briefly why the random variables (Zn(i), 1 ≤ i ≤ n) cannot be independent. Find a random variable Z such that Zn(i) → Z in distribution as n → ∞.
(c) Deduce the following: let Wn denote the number of empty urns when all balls have been placed. Then, as n → ∞,

E(Wn) ∼ e^(−λ) n,

i.e., the ratio of the two sides converges to 1.
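[As an illustration, not part of the exam text: a direct simulation of the urn scheme in parts (b) and (c) shows both the behaviour of a single urn count and the asymptotics of the expected number of empty urns. The parameter values below are arbitrary choices.]

# Illustrative sketch only: place N = floor(lambda*n) balls uniformly at
# random into n urns, then look at one urn's count and at the number of
# empty urns, which should be roughly e^(-lambda) * n for large n.
import math
import random
random.seed(0)

lam, n = 1.5, 100000
N = int(lam * n)
counts = [0] * n
for _ in range(N):
    counts[random.randrange(n)] += 1

empty = counts.count(0)
print(counts[0])                       # one sample of Z_n(1)
print(empty, math.exp(-lam) * n)       # W_n versus e^(-lambda) * n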
(a) State and give a proof of the Borel-Cantelli lemmas.
(b) Let X1, ..., Xn be random variables such that var(Xi) < ∞ for all 1 ≤ i ≤ n. Show that

var(X1 + ... + Xn) = ∑_{i=1}^{n} var(Xi) + 2 ∑_{1 ≤ i < j ≤ n} cov(Xi, Xj).
[Hint: it may help to introduce the random variable Yi = Xi − E(Xi).]
Hence deduce that var(X 1 +... + Xn) < ∞.
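[As an aside, not part of the exam text: the identity in part (b) can be checked numerically on the empirical distribution of simulated data. The use of numpy and the sample sizes below are incidental choices.]

# Illustrative sketch only: verify the variance-of-a-sum identity on the
# empirical (population, ddof=0) distribution of some simulated vectors.
import numpy as np

rng = np.random.default_rng(0)
n = 4
X = rng.normal(size=(1000, n))            # 1000 samples of (X_1, ..., X_n)
S = X.sum(axis=1)

lhs = S.var()                              # var(X_1 + ... + X_n)
C = np.cov(X, rowvar=False, ddof=0)        # empirical covariance matrix
rhs = C.diagonal().sum() + 2 * C[np.triu_indices(n, k=1)].sum()
print(lhs, rhs)                            # agree up to floating-point error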
(c) Let (A1, A2, ...) be a sequence of events such that ∑_{i=1}^{∞} P(Ai) = ∞. We also assume that these events are pairwise independent, i.e., for every i ≠ j, Ai and Aj are independent. For k ≥ 1 define a random variable Nk by

Nk = ∑_{i=1}^{k} 1_{Ai}.
Let mk = E(Nk). Show that mk → ∞ as k → ∞ and that var(Nk) ≤ mk.
(d) Using Chebyshev's inequality, show that P(Nk ≤ mk/2) → 0 as k → ∞. Conclude that the events Ai occur infinitely often almost surely. Explain briefly why this result is stronger than the second Borel-Cantelli lemma.
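[As an aside, not part of the exam text: the conclusion of parts (c) and (d) can be seen in a simulation. The sketch below takes fully independent events with P(Ai) = 1/i, which is one special case of the pairwise independence assumed in the question.]

# Illustrative sketch only: independent events A_i with P(A_i) = 1/i, so that
# sum_i P(A_i) diverges; N_k then tracks m_k = E(N_k) = 1 + 1/2 + ... + 1/k,
# consistent with P(N_k <= m_k/2) -> 0 and with the A_i occurring infinitely often.
import random
random.seed(1)

K = 100000
N_k, m_k = 0, 0.0
for i in range(1, K + 1):
    m_k += 1.0 / i
    if random.random() < 1.0 / i:
        N_k += 1
print(N_k, m_k)        # N_K stays close to m_K, which is about log K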
Let S = {1, 2, 3} and consider a Markov chain X = (X0, X1, ...) with values in S, whose transition matrix P is defined as follows:
(a) Draw a diagram to represent the possible one-step transitions of X , including the transition probabilities. What happens if X 0 = 3? Is X irreducible?
(b) Define a stopping time τ by τ = inf{n ≥ 0 : Xn = 3}. For n ≥ 0 and x ∈ S define αn(x) = Px(τ > n). Show that αn(2) = (1/3) αn−1(1), and deduce a recurrence relation for αn(1). Deduce that there exist A, B ∈ R and λ > μ such that αn(1) = A λ^n + B μ^n (you are not asked to compute A and B, but you should find the values of λ and μ). Conclude that αn(1) ∼ A λ^n as n → ∞, where A ∈ R is the same constant as above.
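[As an aside, not part of the exam text: the exam's transition matrix is not reproduced in this extract, so the coefficients below are hypothetical placeholders. The sketch only illustrates the general fact used in part (b): a two-term linear recurrence is solved by A λ^n + B μ^n, where λ and μ are the roots of the characteristic equation.]

# Illustrative sketch only: a recurrence a_n = p*a_{n-1} + q*a_{n-2} (p and q
# are HYPOTHETICAL here) has solutions A*lam**n + B*mu**n, where lam and mu
# are the roots of t**2 = p*t + q and A, B are fixed by the first two terms.
import math

p, q = 5/6, -1/9                           # hypothetical coefficients
disc = math.sqrt(p * p + 4 * q)
lam, mu = (p + disc) / 2, (p - disc) / 2   # here lam = 2/3 > mu = 1/6

a = [1.0, 0.8]                             # hypothetical alpha_0, alpha_1
for _ in range(2, 20):
    a.append(p * a[-1] + q * a[-2])

B = (a[1] - lam * a[0]) / (mu - lam)
A = a[0] - B
for n in (5, 10, 19):
    print(a[n], A * lam**n + B * mu**n)    # the two columns agree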
(c) Let S′ = {1, 2}. Using part (b) above, compute, for x, y ∈ S′,

q(x, y) = lim_{n→∞} Px(X1 = y | τ > n).
Show that, for any k ≥ 1 and x1, ..., xk ∈ S′,

lim_{n→∞} Px(X1 = x1, ..., Xk = xk | τ > n) = Px(Y1 = x1, ..., Yk = xk),
where Y = (Y0, Y1, ...) is the Markov chain on S′ with transition probabilities determined by q(x, y).
(d) Determine whether or not Y has an invariant distribution, and find the invariant distribution if it exists.
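[As an aside, not part of the exam text: the exam's transition matrix is not reproduced here, so the matrix below is a hypothetical stand-in. The sketch only illustrates how the conditioned transition probabilities q(x, y) of parts (c) and (d) can be computed numerically, both directly and via the Perron eigenvector of the restriction of P to S′ (a Doob h-transform).]

# Illustrative sketch only: HYPOTHETICAL 3-state transition matrix (not the
# one in the exam).  q(x, y) = lim_n P_x(X_1 = y | tau > n) is computed two
# ways: directly for a moderately large n, and via the Perron eigenpair of Q,
# the restriction of P to S' = {1, 2}.
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [1/3, 0.0, 2/3],
              [0.0, 0.0, 1.0]])
Q = P[:2, :2]

n = 50
alpha_n = np.linalg.matrix_power(Q, n) @ np.ones(2)        # alpha_n(x) = P_x(tau > n)
alpha_prev = np.linalg.matrix_power(Q, n - 1) @ np.ones(2)
q_direct = Q * alpha_prev[None, :] / alpha_n[:, None]

eigvals, eigvecs = np.linalg.eig(Q)
i = np.argmax(eigvals.real)
rho, h = eigvals[i].real, eigvecs[:, i].real               # Perron eigenpair of Q
q_doob = Q * h[None, :] / (rho * h[:, None])

print(q_direct)
print(q_doob)          # each row sums to 1 and the two computations agree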