Let $\{Y_k \mid k = 0, 1, \ldots\}$ denote a sequence of independent and identically
distributed (i.i.d.) discrete random variables. These random variables are governed by
the probability distribution $[p_1, p_2, \ldots, p_i, \ldots]$, where $p_i = P\{Y_k = i\}$ for all $i = 1, 2, \ldots$.
Define
$$
X_n =
\begin{cases}
0, & n = 0, \\
\displaystyle\sum_{k=1}^{n} Y_k, & n = 1, 2, \ldots
\end{cases}
$$
(a) Show that $X = \{X_n \mid n = 0, 1, 2, \ldots\}$ is a Markov chain.
(b) Compute the transition probability matrix for $X$ in terms of $\{p_i\}$.
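As an illustration (not part of the original exercise), here is a minimal Python sketch that simulates the partial-sum chain for an example distribution $\{p_i\}$ and tabulates the one-step increments $X_{n+1} - X_n$. The particular distribution, the trajectory length, and the function names are assumptions chosen only for demonstration. Since each step adds an independent draw of $Y$, the empirical increment frequencies should approach $\{p_i\}$ regardless of the current state, which is the structure underlying parts (a) and (b).

```python
import random
from collections import Counter

# Example distribution p_i = P{Y_k = i}; the values below are an assumption
# chosen only for this demonstration.
p = {1: 0.5, 2: 0.3, 3: 0.2}

def sample_Y():
    """Draw one Y_k from the example distribution {p_i} by inverse transform."""
    u = random.random()
    cumulative = 0.0
    for value, prob in p.items():
        cumulative += prob
        if u < cumulative:
            return value
    return max(p)  # guard against floating-point round-off

def simulate_X(n_steps):
    """Return the trajectory X_0, X_1, ..., X_{n_steps} of the partial-sum chain."""
    X = [0]                            # X_0 = 0 by definition
    for _ in range(n_steps):
        X.append(X[-1] + sample_Y())   # X_{n+1} = X_n + Y_{n+1}
    return X

# Tabulate the one-step increments along a long trajectory; their empirical
# distribution should be close to {p_i} no matter what state the chain visits.
trajectory = simulate_X(10_000)
increments = Counter(b - a for a, b in zip(trajectory, trajectory[1:]))
total = sum(increments.values())
for value in sorted(increments):
    print(f"relative frequency of increment {value}: {increments[value] / total:.3f}")
```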