Solutions to Stochastic Processes, Sheldon M. Ross, Second Edition (English electronic version, pdf). This classic stochastic-processes textbook has no official solution manual, so I worked out the solutions myself as a record of my own study. Some solutions were adapted from the web; the sources are numerous and often unclear, so they cannot all be cited individually. Many thanks to those authors! I hope these solutions are helpful, but no correctness or accuracy is guaranteed; comments and corrections are welcome. Excerpts and links may be used, provided that full and clear credit is given.
6.1 If \(\{Z_n, n \geq 1 \}\) is a martingale show that, for \(1 \leq k < n,\)
$$E[Z_n|Z_1, \dots, Z_k] = Z_k$$
$$\begin{align}
E[Z_n|Z_1, \dots, Z_k] &= E[E[Z_n|Z_1, \dots, Z_{n-1}]|Z_1, \dots, Z_k]\\
&= E[Z_{n-1}|Z_1, \dots, Z_k]\\
&\dots\\
&=E[Z_{k+1}|Z_1, \dots, Z_k] = Z_k
\end{align}$$
6.3 Verify that \(X_n/m^n, n\geq 1\), is a martingale when \(X_n\) is the size of the nth generation of a branching process whose mean number of offspring per individual is \(m\).
Let \(Z_i\) denote the number of offspring of the \(i\)th individual of the \(n\)th generation, so that \(X_{n+1} = \sum_{i=1}^{X_n}Z_i\) and \(E[Z_i] = m\). Then
$$\begin{align}
E[\frac{X_{n+1}}{m^{n+1}}|\frac{X_1}{m}, \dots, \frac{X_n}{m^n}]&=E[\frac{\sum_{i=1}^{X_n}Z_i}{m^{n+1}}|\frac{X_1}{m}, \dots, \frac{X_n}{m^n}] \\
&= E[\frac{mX_n}{m^{n+1}}|\frac{X_1}{m}, \dots, \frac{X_n}{m^n}] \\
&= \frac{X_n}{m^n}
\end{align}$$
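As a quick sanity check (my own addition, not part of the solution), the following Python sketch simulates a branching process and verifies empirically that \(E[X_n/m^n] = X_0 = 1\). The Poisson offspring distribution and all parameter values are arbitrary choices for illustration.

```python
import math
import random

def poisson(rng, lam):
    """One Poisson(lam) variate via Knuth's method; fine for the small means used here."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def branching_ratio(n_gens, m, seed):
    """Return X_n / m**n for one simulated branching process with Poisson(m) offspring."""
    rng = random.Random(seed)
    x = 1  # X_0 = 1: start from a single individual
    for _ in range(n_gens):
        x = sum(poisson(rng, m) for _ in range(x))
    return x / m ** n_gens

if __name__ == "__main__":
    samples = [branching_ratio(8, 1.5, seed=i) for i in range(20000)]
    print(sum(samples) / len(samples))  # close to 1, as the martingale property predicts
```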
6.4 Consider the Markov chain which at each transition either goes up 1 with probability \(p\) or down 1 with probability \(q = 1 - p\). Argue that \((q/p)^{S_n}, n \geq 1\), is a martingale.
Write \(S_{n+1} = S_n + X_{n+1}\), where \(X_{n+1}\) is the \((n+1)\)st step, independent of the past. Then
$$\begin{align}
E[(q/p)^{S_{n+1}}|(q/p)^{S_1}, \dots, (q/p)^{S_n}]&= E[(q/p)^{S_n}(q/p)^{X_{n+1}}|(q/p)^{S_1}, \dots, (q/p)^{S_n}] \\
&= (q/p)^{S_n} E[(q/p)^{X_{n+1}}] \\
&= (q/p)^{S_n}\left[p \cdot \frac{q}{p} + q \cdot \frac{p}{q}\right] = (q/p)^{S_n}
\end{align}$$
6.5 Consider a Markov chain \(\{X_n, n \geq 0\}\) with \(p_{NN} = 1\). Let \(P(i)\) denote the probability that this chain eventually enters state \(N\) given that it starts in state \(i\). Show that \(\{P(X_n), n \geq 0\}\) is a martingale.
$$\begin{align}
E[P(X_{n+1})|P(X_0), \dots, P(X_n)]&= \sum_k P_{X_n, k}P(k) \\
&=P(X_n)
\end{align}$$
by the Markov property, since conditioning on the first transition out of state \(i\) gives \(P(i) = \sum_k P_{ik}P(k)\) for every state \(i\).
6.6 Let \(X_n\) denote the size of the nth generation of a branching process, and let \(\pi_0\) denote the probability that such a process, starting with a single individual, eventually goes extinct. Show that \(\{\pi_0^{X_n}, n \geq 0\}\) is a martingale.
$$\begin{align}
E[\pi_0^{X_{n+1}}|\pi_0^{X_1}, \dots, \pi_0^{X_n}]&= \prod_{i=1}^{X_n}E[\pi_0^{Z_i}|\pi_0^{X_1}, \dots, \pi_0^{X_n}] \\
&=\pi_0^{X_n}
\end{align}$$
where \(Z_i\) is the number of offspring of the \(i\)th individual of the \(n\)th generation, and the last equality holds because the extinction probability satisfies \(E[\pi_0^{Z_i}] = \sum_j P\{Z_i = j\}\pi_0^j = \pi_0\) (extinction starting from one individual occurs iff each of its offspring's lines dies out).
6.9 A process \(\{Z_n, n \geq 1\}\) is said to be a reverse, or backwards, martingale if \(E|Z_n| < \infty\) for all \(n\) and
$$E[Z_n|Z_{n+1}, Z_{n+2}, \dots] = Z_{n+1}$$
Show that if \(X_i, i \geq 1\), are independent and identically distributed random variables with finite expectation, then \(Z_n = (X_1 + \dots + X_n)/n, n \geq 1\), is a reverse martingale.
$$\begin{align}
&E[Z_n|Z_{n+1}, Z_{n+2}, \dots]\\
=& E\left[\frac{(n+1)Z_{n+1} - X_{n+1}}{n}\Big|Z_{n+1}, Z_{n+2}, \dots \right] \\
=& E[Z_{n+1}|Z_{n+1}, Z_{n+2}, \dots] + \frac{1}{n}E\left[\frac{\sum_{i=1}^{n+1}X_i}{n+1} - X_{n+1}\Big| Z_{n+1}, Z_{n+2}, \dots\right] \\
=& Z_{n+1}
\end{align}$$
since by symmetry \(E[X_{n+1}|Z_{n+1}, Z_{n+2}, \dots] = \frac{1}{n+1}\sum_{i=1}^{n+1}E[X_i|Z_{n+1}, Z_{n+2}, \dots] = Z_{n+1}\), so the second term vanishes.
6.10 Consider successive flips of a coin having probability \(p\) of landing heads. Use a martingale argument to compute the expected number of flips until the following sequences appear:
(a) HHTTHHT
(b) HTHTHTH
(a) Imagine that just before each flip a new gambler arrives and bets 1 on the pattern, staking his entire fortune at fair odds on each successive required symbol. Let \(X_n\) denote the casino's net winnings after \(n\) flips; then \(\{X_n\}\) is a zero-mean martingale. At the stopping time \(N\) every gambler has lost his unit stake except the one who arrived at time \(N-6\), who has won \(p^{-4}q^{-3} - 1\), and the one who arrived at time \(N-2\), who has won \(p^{-2}q^{-1} - 1\) (HHT is both a prefix and a suffix of HHTTHHT). Thus
$$X_N = N-2 - (p^{-4}q^{-3} - 1) - (p^{-2}q^{-1} - 1) \\
E[N] = E[X_N] + p^{-4}q^{-3} + p^{-2}q^{-1} = p^{-4}q^{-3} + p^{-2}q^{-1} $$
(b) The overlaps of HTHTHTH with itself are HTHTHTH, HTHTH, HTH, and H, so the same argument gives
$$E[N] = p^{-4}q^{-3} + p^{-3}q^{-2} + p^{-2}q^{-1} + p^{-1} $$
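As a numerical check (my addition), the short Monte Carlo sketch below estimates \(E[N]\) for both patterns and compares it with the formulas above; at \(p = 1/2\) the formulas give 136 for (a) and 170 for (b).

```python
import random

def mean_wait(pattern, p, trials=10000, seed=0):
    """Monte Carlo estimate of the expected number of flips until `pattern` first appears."""
    rng = random.Random(seed)
    m, total = len(pattern), 0
    for _ in range(trials):
        last, n = "", 0
        while last != pattern:
            last = (last + ("H" if rng.random() < p else "T"))[-m:]  # keep only the last m flips
            n += 1
        total += n
    return total / trials

if __name__ == "__main__":
    p = q = 0.5
    print(mean_wait("HHTTHHT", p), p**-4 * q**-3 + p**-2 * q**-1)                    # both ~136
    print(mean_wait("HTHTHTH", p), p**-4*q**-3 + p**-3*q**-2 + p**-2*q**-1 + p**-1)  # both ~170
```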
6.11 Consider a gambler who at each gamble is equally likely to either win or lose 1 unit. Suppose the gambler will quit playing when his winnings are either \(A\) or \(-B, A > 0, B > 0\). Use an appropriate martingale to show that the expected number of bets is \(AB\).
Similar to Example 6.2(C): imagine two players who start with \(A\) and \(B\) coins; each time the gambler wins a bet one player gives a coin to the other, and each time he loses a bet the coin moves the other way. Let \(X_n\) and \(Y_n\) denote the players' holdings after the \(n\)th bet. Since
$$E[X_{n+1}Y_{n+1}|X_n, Y_n] = \frac{1}{2}(X_n+1)(Y_n-1) + \frac{1}{2}(X_n-1)(Y_n+1) = X_nY_n - 1,$$
the process \(M_n = X_nY_n + n\) is a martingale. The game stops at the time \(T\) when the gambler's winnings reach \(A\) or \(-B\), that is, when one player is eliminated, so \(X_TY_T = 0\). By the martingale stopping theorem, \(E[T] = E[M_T] = M_0 = AB\).
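A minimal simulation sketch (my addition) confirming that the expected number of fair bets until the winnings reach \(A\) or \(-B\) is \(AB\); the parameter values are arbitrary.

```python
import random

def mean_bets(A, B, trials=20000, seed=0):
    """Average number of fair +/-1 bets until the winnings first hit A or -B."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        s, n = 0, 0                      # s = current winnings, n = number of bets
        while -B < s < A:
            s += 1 if rng.random() < 0.5 else -1
            n += 1
        total += n
    return total / trials

if __name__ == "__main__":
    print(mean_bets(4, 7))   # should be close to A*B = 28
```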
6.12 In Example 6.2(C), find the expected number of stages until one of the players is eliminated.
Let \(X_n, Y_n, Z_n\) denote the players' fortunes after the \(n\)th stage, with \(X_0 = x, Y_0 = y, Z_0 = z\) and \(s = x + y + z\). As in Problem 6.11, each stage (a randomly chosen pair plays, and the loser gives a coin to the winner) lowers the conditional expectation of the product \(X_nY_nZ_n\) by \(s/3\), so
$$M_n = X_nY_nZ_n + ns/3$$
is a martingale. At the time \(T\) when one of the players is eliminated the product vanishes, so \(M_T = Ts/3\), and the martingale stopping theorem gives
$$E[T]s/3 = E[M_T] = E[M_0] = xyz,$$
hence \(E[T] = 3xyz/s\).
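A simulation sketch (my addition) of the three-player game, assuming, as in the argument above, that at each stage a pair of players is chosen uniformly at random and the loser of a fair game gives one coin to the winner; the starting fortunes are arbitrary.

```python
import random

def mean_stages(x, y, z, trials=20000, seed=0):
    """Average number of stages until one of the three players has no coins left."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        coins, n = [x, y, z], 0
        while min(coins) > 0:
            i, j = rng.sample(range(3), 2)   # random ordered pair: i wins, j loses
            coins[i] += 1
            coins[j] -= 1
            n += 1
        total += n
    return total / trials

if __name__ == "__main__":
    x, y, z = 3, 4, 5
    print(mean_stages(x, y, z), 3 * x * y * z / (x + y + z))   # both approximately 15
```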
6.13 Let \(Z_n = \prod_{i=1}^n X_i\), where \(X_i, i \geq 1\) are independent random variables with$$P\{X_i = 2\} = P\{X_i = 0\} = 1/2$$
Let \(N = Min\{n: Z_n = 0\}\). Is the martingale stopping theorem applicable? If so, what would you conclude? If not, why not?
Not applicable: none of the sufficient conditions of the martingale stopping theorem holds. The stopped process \(\bar Z_n = Z_{n \wedge N}\) is not uniformly bounded (it equals \(2^n\) on \(\{N > n\}\)), \(N\) is not bounded, and although \(E[N] = 2 < \infty\) the increments \(|Z_{n+1} - Z_n|\) are unbounded. Indeed the conclusion of the theorem fails here, since \(E[Z_N] = 0 \neq 1 = E[Z_1]\).
6.14 Show that the equation
$$e^{\beta} - e^{-\beta} = 2\beta e^{\beta^2/2}$$
has no solution when \(\beta \neq 0\).
(Hint: Expand in a power series.)
$$\begin{align}
&e^{\beta} - e^{-\beta} - 2\beta e^{\beta^2/2}\\
=& 2\sum_{i=0}^{\infty} \beta^{2i+1}\left(\frac{1}{(2i+1)!} - \frac{1}{i!\,2^i}\right) \end{align}$$
The \(i = 0\) coefficient is zero, and for \(i \geq 1\) we have \((2i+1)! > i!\,2^i\), so every remaining coefficient is strictly negative. Hence for \(\beta > 0\) every nonzero term is negative and for \(\beta < 0\) every nonzero term is positive, so the sum can equal zero only when \(\beta = 0\).
6.15 Let \(X\) denote the number of heads in \(n\) independent flips of a fair coin. Show that:
(a) \(P\{X- n/2 \geq a\} \leq exp\{-2a^2/n\}\)
(b) \(P\{X- n/2 \leq -a\} \leq exp\{-2a^2/n\}\)
Let \(X_i\) denote the number of heads in the first \(i\) flips, and set \(Z_0 = 0\). Then \(Z_i = X_i - i/2\) is a zero-mean martingale with \(-1/2 \leq Z_i - Z_{i-1} \leq 1/2\), so Azuma's inequality with \(\alpha = \beta = 1/2\) gives both bounds.
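As a numerical illustration (my addition), the exact binomial tail can be compared with the Azuma bound \(exp\{-2a^2/n\}\):

```python
from math import comb, exp

def binom_tail(n, k):
    """P{X >= k} for X ~ Binomial(n, 1/2)."""
    return sum(comb(n, j) for j in range(k, n + 1)) / 2 ** n

if __name__ == "__main__":
    n = 100
    for a in (5, 10, 15):
        exact = binom_tail(n, n // 2 + a)   # P{X - n/2 >= a}
        bound = exp(-2 * a * a / n)         # Azuma bound from part (a)
        print(a, round(exact, 4), round(bound, 4))
```

The exact tail probabilities are, as they must be, below the bound.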
6.16 Let \(X\) denote the number of successes in \(n\) independent Bernoulli trials, with trial \(i\) resulting in a success with probability \(p_i\). Give an upper bound for \(P\{|X - \sum_{i=1}^n p_i| \geq a\}\).
Let \(X_i\) denote the number of successes in the first \(i\) trials. Then \(Z_i = X_i - \sum_{k=1}^ip_k\) is a zero-mean martingale with \(-p_i \leq Z_i - Z_{i-1} \leq 1 - p_i\). From Azuma's inequality,
$$ P\{X - \sum_{i=1}^n p_i \geq a\} \leq exp\{-2a^2/n\} \\
P\{X - \sum_{i=1}^n p_i \leq -a\} \leq exp\{-2a^2/n\} $$
so \(P\{|X - \sum_{i=1}^n p_i| \geq a\} \leq 2exp\{-2a^2/n\}\).
6.17 Suppose that 100 balls are to be randomly distributed among 20 urns. Let \(X\) denote the number of urns that contain at least five balls. Derive an upper bound for \(P\{X \geq 15\}\).
$$E[X] = 20[1 - \sum_{k=0}^4{100 \choose k}(\frac{1}{20})^k(\frac{19}{20})^{100-k}]$$
Write \(X = h(Y_1, \dots, Y_{100})\), where \(Y_i\) denotes the urn into which ball \(i\) goes. Changing one coordinate (moving one ball to a different urn) changes \(h\) by at most 1, so by Azuma's inequality applied to the Doob martingale \(E[X|Y_1, \dots, Y_i]\),
$$P\{X - E[X] \geq a\} \leq exp\{-a^2/200\}$$
Taking \(a = 15 - E[X]\) gives \(P\{X \geq 15\} \leq exp\{-(15 - E[X])^2/200\}\).
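Numerically (my addition), \(E[X]\) and the resulting bound can be evaluated as follows:

```python
from math import comb, exp

# probability that a fixed urn receives at least five of the 100 balls
p5 = 1 - sum(comb(100, k) * (1 / 20) ** k * (19 / 20) ** (100 - k) for k in range(5))
EX = 20 * p5                   # approximately 11.3
a = 15 - EX
print(EX, exp(-a * a / 200))   # the bound is approximately 0.93
```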
6.18 Let \(p\) denote the probability that a random selection of 88 people will contain at least three with the same birthday. Use Azuma’s inequality to obtain an upper bound on \(p\). (It can be shown that \(p \approx 0.50\)).
As in Problem 6.17, let \(X\) denote the number of days of the year that are the birthday of at least three of the 88 people, so that \(p = P\{X \geq 1\}\). Then
$$E[X] = 365[1 - \sum_{i=0}^2{88 \choose i}(\frac{1}{365})^i(\frac{364}{365})^{88-i}]$$
Changing any one person's birthday changes \(X\) by at most 1, so with \(a = 1 - E[X] > 0\),
$$p = P\{X - E[X] \geq a\} \leq exp\{-a^2/176\}$$
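Evaluating this numerically (my addition):

```python
from math import comb, exp

# probability that a fixed day is the birthday of at least three of the 88 people
p3 = 1 - sum(comb(88, i) * (1 / 365) ** i * (364 / 365) ** (88 - i) for i in range(3))
EX = 365 * p3                  # approximately 0.64
a = 1 - EX
print(EX, exp(-a * a / 176))   # the bound is approximately 0.999, so it is quite weak here
```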
6.19 For binary n-vectors \(\boldsymbol x\) and \(\boldsymbol y\) (meaning that each coordinate of these vectors is either 0 or 1) define the distance between them by
$$\rho(\boldsymbol x, \boldsymbol y) = \sum_{i=1}^n |x_i – y_i|$$
(This is called the Hamming distance). Let \(A\) be a finite set of such vectors, and let \(X_1, \dots, X_n\) be independent random variables that are each equally likely to be either 0 or 1. Set $$D = min_{y \in A} \rho(\boldsymbol X, \boldsymbol y)$$
and let \(\mu = E[D]\). In terms of \(\mu\), find an upper bound for \(P\{D \geq b\}\) when \(b > \mu\)
Write \(D = h(X_1, \dots, X_n)\). Changing one coordinate of \(\boldsymbol X\) changes each distance \(\rho(\boldsymbol X, \boldsymbol y)\), and hence the minimum \(D\), by at most 1, so \(|h(x) - h(y)| \leq 1\) whenever \(x\) and \(y\) differ in at most one coordinate. Thus
$$P\{D - E[D] \geq a\} \leq exp\{-a^2/(2n)\}$$
Letting \(a = b - \mu\), we get
$$P\{D \geq b\} \leq exp\{-(b - \mu)^2/(2n)\}$$
6.21 A group of \(2n\) people, consisting of \(n\) men and \(n\) women, are to be independently distributed among \(m\) rooms. Each woman chooses room \(j\) with probability \(p_j\) while each man chooses it with probability \(q_j, j = 1, \dots, m\). Let \(X\) denote the number of rooms that will contain exactly one man and one woman.
(a) Find \(\mu = E[X]\)
(b) Bound \(P\{|X – \mu| > b\}\) for \(b > 0\)
(a) Room \(i\) contains exactly one woman with probability \(np_i(1-p_i)^{n-1}\) and, independently, exactly one man with probability \(nq_i(1-q_i)^{n-1}\), so \(E[X] = \sum_{i=1}^m np_i(1-p_i)^{n-1}\,nq_i(1-q_i)^{n-1}\).
(b) \(X\) is a function of the \(2n\) independent room choices, and changing any one person's room changes \(X\) by at most 2. Applying Azuma's inequality to the associated Doob martingale (with \(\alpha = \beta = 2\)) gives \(P\{X - \mu \geq b\} \leq exp\{-b^2/16n\}\), and the same bound holds for the other tail, so \(P\{|X - \mu| > b\} \leq 2exp\{-b^2/16n\}\).
6.22 Let \(\{X_n, n \geq 0\}\) be a Markov process for which \(X_0\) is uniform on (0, 1) and, conditional on \(X_n\),
$$X_{n+1} = \left\{\begin{array}{ll}
\alpha X_n + 1 - \alpha & \text{with probability } X_n \\
\alpha X_n & \text{with probability } 1 - X_n \\ \end{array}\right. $$
where \(0 < \alpha < 1\). Discuss the limiting properties of the sequence \(X_n, n \geq 1\).
$$E[X_{n+1}|X_n] = X_n(\alpha X_n + 1 - \alpha) + (1 - X_n)\alpha X_n = X_n$$
Thus \(X_n\) is a martingale with \(0 \leq X_n \leq 1\), so \(E[|X_n|] \leq 1\) and \(X_n\) converges with probability 1 to a limit \(X_\infty\). If the limit were some \(c \in (0,1)\), the chain would keep making jumps of size about \((1-\alpha)(1-c)\) and \((1-\alpha)c\), contradicting convergence; hence \(X_\infty\) is 0 or 1. By the martingale property \(E[X_\infty] = E[X_0] = 1/2\), so \(P\{X_\infty = 1\} = P\{X_\infty = 0\} = 1/2\).
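A simulation sketch (my addition, with an arbitrary value of \(\alpha\)) illustrating that the paths settle at 0 or 1, each about half the time:

```python
import random

def final_value(alpha, steps=2000, seed=None):
    """Run the chain for many steps starting from X_0 ~ Uniform(0, 1)."""
    rng = random.Random(seed)
    x = rng.random()
    for _ in range(steps):
        if rng.random() < x:
            x = alpha * x + 1 - alpha   # up move, probability X_n
        else:
            x = alpha * x               # down move, probability 1 - X_n
    return x

if __name__ == "__main__":
    finals = [final_value(0.7, seed=i) for i in range(2000)]
    print(sum(f > 0.5 for f in finals) / len(finals))   # fraction of paths near 1, close to 1/2
```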
6.23 An urn initially contains one white and one black ball. At each stage a ball is drawn and is then replaced in the urn along with another ball of the same color. Let \(Z_n\) denote the fraction of balls in the urn that are white after the nth replication.
(a) Show that \(\{Z_n, n \geq 1\}\) is a martingale.
(b) Show that the probability that the fraction of white balls in the urn is ever as large as 3/4 is at most 2/3.
(a) After \(n\) replications the urn contains \(n+2\) balls, \((n+2)Z_n\) of them white. Hence
$$\begin{align}
&E[Z_{n+1}|Z_1, \dots, Z_n] \\
=&Z_n\frac{(n+2)Z_n + 1}{n+3} + (1-Z_n)\frac{(n+2)Z_n}{n+3}\\
=&Z_n\end{align}$$
(b) By the maximal inequality for nonnegative martingales, using \(E[Z_n] = E[Z_1] = 1/2\),
$$P\{max(Z_1, \dots, Z_n) \geq 3/4\} \leq \frac{E[Z_n]}{3/4} = \frac{4}{3}E[Z_1] = \frac{2}{3}$$
and letting \(n \to \infty\) shows that the fraction of white balls is ever as large as 3/4 with probability at most 2/3.
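A simulation sketch (my addition) of the urn; it reports the empirical probability that the white fraction ever reaches 3/4, which is indeed below the bound of 2/3.

```python
import random

def ever_reaches(threshold=0.75, draws=2000, seed=None):
    """Simulate the urn and report whether the white fraction ever reaches `threshold`."""
    rng = random.Random(seed)
    white, total = 1, 2
    for _ in range(draws):
        if rng.random() < white / total:   # a white ball is drawn
            white += 1
        total += 1                         # the drawn ball is returned with one more of its color
        if white / total >= threshold:
            return True
    return False

if __name__ == "__main__":
    trials = 20000
    hits = sum(ever_reaches(seed=i) for i in range(trials))
    print(hits / trials)   # empirical probability, at most 2/3 by the bound above
```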
6.24 Consider a sequence of independent tosses of a coin and let \(P\{head\}\) be the probability of a head on any toss. Let \(A\) be the hypothesis that \(P\{head\} = a\) and let \(B\) be the hypothesis that \(P\{head\} = b, 0 < a,b < 1\). Let \(X_i\) denote the outcome of the ith toss and let
$$Z_n = \frac{P\{X_1, \dots, X_n|A\}}{P\{X_1, \dots, X_n|B\}}$$
Show that if \(B\) is true, then:
(a) \(Z_n\) is a martingale, and
(b) \(\lim_{n \to \infty} Z_n\) exists with probability 1.
(c) If \(b \neq a\), what is \(\lim_{n \to \infty} Z_n\)?
(a) Let \(X_i=1\) if the outcome of the ith toss is head, \(X_i = 0\) if it is tail. From independence,
$$Z_n = \frac{P\{X_1, \dots, X_n|A\}}{P\{X_1, \dots, X_n|B\}} = Z_{n-1}\frac{P\{X_n|A\}}{P\{X_n|B\}}$$ Now,
$$E[\frac{P\{X_n|A\}}{P\{X_n|B\}}] = E[\frac{a^{X_n}(1-a)^{1-X_n}}{b^{X_n}(1-b)^{1-X_n}}] = \frac{a}{b}b + \frac{1-a}{1-b}(1-b) = 1$$
(b) Since \(Z_n\) is a nonnegative martingale, from Corollary 6.4.7, the limit exists.
(c) From (a), \(Z_n = Z_{n-1}\frac{P\{X_n|A\}}{P\{X_n|B\}}\), and under \(B\) the ratio \(\frac{P\{X_n|A\}}{P\{X_n|B\}}\) equals \(a/b\) with probability \(b\) and \((1-a)/(1-b)\) with probability \(1-b\). When \(a \neq b\) neither value equals 1, so \(Z_n\) cannot converge to a finite nonzero limit; since the limit exists by (b), it must be 0.
6.25 Let \(Z_n, n \geq 1\), be a sequence of random variables such that \(Z_1 \equiv 1\) and given \(Z_1, \dots, Z_{n-1}, Z_n\) is a Poisson random variable with mean \(Z_{n-1}, n > 1\). What can we say about \(Z_n\) for \(n\) large?
Given \(Z_1, \dots, Z_{n-1}\), \(E[Z_n] = Z_{n-1}\), so \(Z_n\) is a nonnegative martingale with \(E[Z_n] = E[Z_1] = 1\), and therefore \(\lim_{n \to \infty} Z_n\) exists with probability 1. Since \(Z_n\) is integer valued, a convergent path must eventually be constant; but from any value \(k > 0\) the chain moves at each step with probability \(1 - e^{-k}k^k/k! > 0\), so it cannot stay at \(k\) forever. Hence for \(n\) large \(Z_n = 0\): the process eventually dies out with probability 1, even though \(E[Z_n] = 1\) for every \(n\).
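A simulation sketch (my addition) illustrating this: since a Poisson(\(k\)) variate is the sum of \(k\) independent Poisson(1) variates, the chain is easy to simulate, and almost every path is absorbed at 0 within a few hundred steps.

```python
import math
import random

def poisson1(rng):
    """One Poisson(1) variate via Knuth's method."""
    L, k, p = math.exp(-1.0), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def run_chain(max_n=200, seed=None):
    """Z_1 = 1; given Z_{n-1} = k, Z_n is Poisson(k), i.e. a sum of k Poisson(1) variates."""
    rng = random.Random(seed)
    z = 1
    for _ in range(max_n):
        z = sum(poisson1(rng) for _ in range(z))
        if z == 0:
            return 0
    return z

if __name__ == "__main__":
    finals = [run_chain(seed=i) for i in range(5000)]
    print(sum(f == 0 for f in finals) / len(finals))   # fraction absorbed at 0; close to 1
```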
6.26 Let \(X_1, X_2, \dots\) be independent and such that
$$P\{X_i = -1\} = 1 - 1/2^i\\
P\{X_i = 2^i - 1\} = 1/2^i, \quad i \geq 1$$
Use this sequence to construct a zero mean martingale \(Z_n\) such that \(\lim_{n \to \infty}Z_n = -\infty\) with probability 1. (Hint: Make use of the Borel-Cantelli lemma)
\(Z_n = \sum_{i=1}^nX_i\) is a martingale with mean 0, since \(E[X_i] = -(1 - 1/2^i) + (2^i - 1)/2^i = 0\). Moreover
$$\sum_{i=1}^{\infty} P\{X_i = 2^i - 1\} = \sum_{i=1}^{\infty} \frac{1}{2^i} = 1 < \infty$$
so by the Borel-Cantelli lemma, with probability 1 only finitely many of the events \(\{X_i = 2^i - 1\}\) occur. Hence \(X_i = -1\) for all sufficiently large \(i\), and therefore \(\lim_{n \to \infty}Z_n = -\infty\) with probability 1.
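A simulation sketch (my addition): despite \(E[Z_n] = 0\) for every \(n\), essentially every sample path ends far below 0, because the huge positive jumps \(2^i - 1\) that hold the mean up are far too rare to be observed.

```python
import random

def run_Z(n_steps=60, seed=None):
    """One path of Z_n = X_1 + ... + X_n with P{X_i = 2^i - 1} = 2^{-i}, P{X_i = -1} = 1 - 2^{-i}."""
    rng = random.Random(seed)
    z = 0
    for i in range(1, n_steps + 1):
        z += (2 ** i - 1) if rng.random() < 2 ** -i else -1
    return z

if __name__ == "__main__":
    finals = [run_Z(seed=s) for s in range(10000)]
    print(sum(f < -20 for f in finals) / len(finals))   # nearly all paths are already below -20
```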
A continuous-time process \(\{X(t), t \geq 0\}\) is said to be a martingale if \(E[|X(t)|] < \infty\) for all \(t\) and, for all \(s < t\),
$$E[X(t)|X(u), 0 \leq u \leq s] = X(s)$$
6.27 Let \(\{X(t), t \geq 0\}\) be a continuous-time Markov chain with infinitesimal transition rates \(q_{ij}, i \neq j\). Give the conditions on the \(q_{ij}\) that result in \(\{X(t), t \geq 0\}\) being a continuous-time martingale.
Do Problems 6.28-6.30 under the assumptions that (a) the continuous-time analogue of the martingale stopping theorem is valid, and (b) any needed regularity conditions are satisfied.
6.28 Let \(\{X(t), t \geq 0\}\) be a nonhomogeneous Poisson process with intensity function \(\lambda(t), t \geq 0\). Let \(T\) denote the time at which the nth event occurs. Show that
$$n = E[\int_0^T \lambda(t)dt]$$
6.29 Let \(\{X(t), t \geq 0\}\) be a continuous-time Markov chain that will, in finite expected time, enter an absorbing state \(N\). Suppose that \(X(0) = 0\) and let \(m_i\) denote the expected time the chain is in state \(i\). Show that, for \(j \neq 0, j\neq N\):
(a) \(E[\)number of times the chain leaves state \(j]=v_j m_j\), where \(1/v_j\) is the mean time the chain spends in \(j\) during a visit.
(b) \(E[\)number of times it enters state \(j] = \sum_{i \neq j}m_i q_{ij}\).
(c) Argue that
$$v_jm_j = \sum_{i \neq j}m_iq_{ij}, \quad j \neq 0 \\
v_0m_0 = 1 + \sum_{i \neq 0}m_iq_{i0}$$
6.30 Let \(\{X(t), t \geq 0\}\) be a compound Poisson process with Poisson rate \(\lambda\) and component distribution \(F\). Define a continuous-time martingale related to this process.