Solutions to Stochastic Processes, 2nd Edition (English electronic version) by Sheldon M. Ross. This classic stochastic processes textbook comes with no worked solutions, so I am recording my own study notes here. Some of the solutions draw on material found online; the sources are numerous and often unclear, so I cannot credit them one by one and instead thank those authors here collectively. I hope these notes are helpful. My ability is limited and correctness is not guaranteed; criticism and corrections are welcome. Please credit this source when reposting.
Solutions to Stochastic Processes, Sheldon M. Ross, Second Edition (PDF)
Since there is no official solution manual for this book, I
handcrafted the solutions myself. Some solutions were adapted from the web; most of their sources are unclear and cannot be listed explicitly. Many thanks to those authors! I hope these solutions are helpful, but no correctness or accuracy is guaranteed. Comments are welcome. Excerpts and links may be used, provided that full and clear credit is given.
7.1 Consider the following model for the flow of water in and out of a dam. Suppose that, during day \(n\), \(Y_n\) units of water flow into the dam from outside sources such as rainfall and river flow. At the end of each day, water is released from the dam according to the following rule: If the water content of the dam is greater than \(a\), then the amount \(a\) is released. If it is less than or equal to \(a\), then the total contents of the dam are released. The capacity of the dam is \(C\), and once at capacity any additional water that attempts to enter the dam is assumed lost. Thus, for instance, if the water level at the beginning of day \(n\) is \(x\), then the level at the end of the day (before any water is released) is \(\min(x + Y_n, C)\). Let \(S_n\) denote the amount of water in the dam immediately after the water has been released at the end of day \(n\). Assuming that the \(Y_n, n \geq 1\), are independent and identically distributed, show that \(\{S_n, n \geq 1\}\) is a random walk with reflecting barriers at 0 and \(C-a\).
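No solution was written for this one; here is a minimal sketch, using only the release rule stated above. If \(S_{n-1}\) is the content left after the release on day \(n-1\), then
$$S_n = \max\big(\min(S_{n-1} + Y_n, C) - a, 0\big) = \min\big(\max(S_{n-1} + (Y_n - a), 0), C - a\big)$$
so \(S_n\) evolves exactly like the random walk with i.i.d. increments \(Y_n - a\), except that it is held at \(0\) whenever the walk would drop below \(0\) and at \(C-a\) whenever it would rise above \(C-a\); that is, a random walk with reflecting barriers at \(0\) and \(C-a\).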
7.2 Let \(X_1, \dots, X_n\) be equally likely to be any of the \(n!\) permutations of \((1,2,\dots, n)\). Argue that,
$$P\{\sum_{j=1}^njX_j \leq a\} = P\{\sum_{j=1}^njX_j\geq n(n+1)^2/2 -a\}$$
Let \(S_i = \sum_{j=1}^i X_j\), so that \(S_n = n(n+1)/2\) is constant. By summation by parts, \(\sum_{j=1}^n jX_j = nS_n - \sum_{i=1}^{n-1}S_i\). Hence
$$\begin{align}
P\{\sum_{j=1}^njX_j \leq a \} &= P\{nS_n - \sum_{i=1}^{n-1}S_i \leq a\} \\
&= P\{\sum_{i=1}^nS_i \geq (n+1)S_n - a\} \\
&= P\{\sum_{j=1}^njX_j\geq n(n+1)^2/2 -a\} \end{align}$$
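The last equality deserves a word (this is my own filling-in of the step): by exchangeability, reversing the order of the \(X_j\) shows
$$\sum_{i=1}^n S_i = \sum_{j=1}^n (n+1-j)X_j \overset{d}{=} \sum_{j=1}^n jX_j, \qquad (n+1)S_n = (n+1)\cdot\frac{n(n+1)}{2} = \frac{n(n+1)^2}{2}$$
so the event \(\{\sum_{i=1}^n S_i \geq (n+1)S_n - a\}\) has the same probability as \(\{\sum_{j=1}^n jX_j \geq n(n+1)^2/2 - a\}\).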
7.3 For the simple random walk compute the expected number of visits to state \(k\).
Suppose \(p > 1/2\) and that the walk starts at state \(i\) (the case \(p < 1/2\) is symmetric, and for \(p = 1/2\) the walk is recurrent, so the expected number of visits to any state is infinite). Write \(p_{ik}\) for the probability of ever reaching \(k\) from \(i\), and \(f_{kk}\) for the probability of ever returning to \(k\) once there; for the simple random walk \(f_{kk} = 2(1-p)\).
When \(i \leq k\), the positive drift gives
$$p_{ik} = 1, \qquad E[\text{number of visits to } k] = \frac{1}{2p - 1}$$
When \(i > k\),
$$p_{ik} = \Big(\frac{1-p}{p}\Big)^{i - k}, \qquad E[\text{number of visits to } k] = \Big(\frac{1-p}{p}\Big)^{i-k}\frac{1}{2p-1}$$
(see the counting argument below).
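Spelling out the counting behind these expectations (my own added step; the original notes gave only the formulas): \(k\) is reached at all with probability \(p_{ik}\), and each visit is followed by a further visit with probability \(f_{kk}\), independently of the past, so
$$E[\text{number of visits to } k] = p_{ik}\sum_{m=0}^{\infty} f_{kk}^m = \frac{p_{ik}}{1 - f_{kk}} = \frac{p_{ik}}{2p - 1}$$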
7.4 Let \(X_1, X_2, \dots, X_n\) be exchangeable. Compute \(E[X_1|X_{(1)}, X_{(2)}, \dots, X_{(n)}]\), where \(X_{(1)} \leq X_{(2)} \leq \dots \leq X_{(n)}\) are the \(X_i\) in ordered arrangement.
Since the \(X_i\) are exchangeable, given the order statistics, \(X_1\) is equally likely to be any of the \(X_{(i)}\). Thus,
$$E[X_1|X_{(1)}, X_{(2)}, \dots, X_{(n)}] = \frac{1}{n}\sum_{i=1}^nX_{(i)}$$
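As a quick sanity check (my own example, not from the original notes), take \(n = 2\):
$$E[X_1|X_{(1)}, X_{(2)}] = \frac{1}{2}X_{(1)} + \frac{1}{2}X_{(2)} = \frac{1}{2}\big(\min(X_1, X_2) + \max(X_1, X_2)\big) = \frac{X_1 + X_2}{2}$$
which is what symmetry suggests.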
7.6 An ordinary deck of cards is randomly shuffled and then the cards are exposed one at a time. At some time before all the cards have been exposed you must say “next”, and if the next card exposed is a spade then you win and if not then you lose. For any strategy, show that at the moment you call “next” the conditional probability that you win is equal to the conditional probability that the last card is a spade. Conclude from this that the probability of winning is 1/4 for all strategies.
Let \(X_n\) indicate whether the \(n\)th card is a spade and let \(Z_n\) be the proportion of spades among the cards remaining after the \(n\)th card is exposed. Thus \(E|Z_n| < \infty\) and
$$E[Z_{n+1}|Z_1, \dots , Z_n] = \frac{(52 -n)Z_n - 1}{52 -n-1}Z_n + \frac{(52-n)Z_n}{52-n-1}(1 - Z_n) = Z_n$$
Hence \(Z_n\) is a martingale.
Note that \(X_{52} = Z_{51}\). Thus
$$\begin{align}
E[X_{n+1}|X_1, \dots, X_n]&= E[X_{n+1}|Z_1, \dots, Z_n] = Z_n\\
&= E[Z_{51}|Z_1, \dots, Z_n] = E[X_{52}|X_1, \dots, X_n]
\end{align}$$
Finally, let \(N\) be the stopping time corresponding to saying “next” under a given strategy. Since \(N \leq 51\) is bounded, the optional stopping theorem applies and
$$\begin{align}
P\{\text{Win}\} &= E[X_{N+1}] = E[E[X_{N+1}|Z_1, \dots, Z_N]] \\
&= E[Z_N] = E[Z_1] = 1/4
\end{align}$$
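A quick numerical check (my own sketch, not part of the original solution): simulate the game under one arbitrary stopping rule and confirm that the win rate is about 1/4. The threshold rule below is made up purely for illustration; by the argument above, any legal strategy should give the same answer.

```python
import random

def play_once(threshold=0.25):
    """One game: expose cards one at a time and call "next" the first time the
    fraction of spades left in the unexposed deck exceeds `threshold`
    (an arbitrary made-up rule), or when only one card remains.
    Returns 1 if the card exposed right after the call is a spade, else 0."""
    deck = [1] * 13 + [0] * 39          # 1 = spade, 0 = any other suit
    random.shuffle(deck)
    spades_seen = 0
    for n in range(52):                 # n = number of cards exposed so far
        remaining = 52 - n
        if (13 - spades_seen) / remaining > threshold or remaining == 1:
            return deck[n]              # this is the "next" card
        spades_seen += deck[n]

n_games = 100_000
wins = sum(play_once() for _ in range(n_games))
print(wins / n_games)                   # should be close to 0.25
```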
7.7 Argue that the random walk for which \(X_i\) only assumes the values \(0, \pm 1, \dots, \pm M\) and \(E[X_i] = 0\) is null recurrent.
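No solution was written here; one possible sketch (my own argument, assuming the step distribution is not degenerate at 0). Since \(E[X_i] = 0\) and the steps are bounded, the walk is recurrent (Chung-Fuchs theorem). It cannot be positive recurrent: by spatial homogeneity the expected return time to any reachable state \(j\) equals the expected return time \(E[T_0]\) to 0, so if \(E[T_0] < \infty\) the stationary probabilities would satisfy
$$\pi_j = \frac{1}{E[T_j]} = \frac{1}{E[T_0]} > 0 \quad \text{for the infinitely many reachable states } j$$
which contradicts \(\sum_j \pi_j = 1\). Hence the walk is null recurrent.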
7.8 Let \(S_n, n \geq 0\) denote a random walk for which
$$\mu = E[S_{n+1} – S_n] \neq 0$$
Let, for \(A >0, B > 0\),
$$N = \min\{n: S_n \geq A \text{ or } S_n \leq -B\}$$
Show that \(E[N] < \infty\). (Hint: Argue that there exists a value \(k\) such that \(P\{S_k > A +B\} > 0\). Then show that \(E[N] \leq kE[G]\), where \(G\) is an appropriately defined geometric random variable.)
Suppose \(\mu > 0\) (the case \(\mu < 0\) is symmetric) and let \(k > (A+B)/\mu\). Then
$$\begin{align}
0 < k\mu - A - B &= E[S_k - A - B] \\
&= E[S_k - A - B|S_k > A + B]P\{S_k > A+B\} \\&+ E[S_k - A - B|S_k \leq A + B]P\{S_k \leq A+B\}\\
&\leq E[S_k - A - B|S_k > A + B]P\{S_k > A+B\}
\end{align}$$ so \(p = P\{S_k > A +B\} > 0\) for this \(k\). For \(i \geq 0\) let \(Y_i = \sum_{j = ik+1}^{(i+1)k} X_j\); the \(Y_i\) are i.i.d. with \(P\{Y_i > A + B\} = p\), and whenever some \(Y_i\) exceeds \(A+B\) the walk must have left \((-B, A)\) by time \((i+1)k\). Hence, as spelled out below, \(E[N] \leq k/p < \infty\).
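In the notation of the hint (this is my own spelling-out of the last step):
$$G = \min\{i \geq 1: Y_{i-1} > A+B\} \sim \text{Geometric}(p), \qquad N \leq kG, \qquad E[N] \leq kE[G] = \frac{k}{p} < \infty$$
The bound \(N \leq kG\) holds because either the walk has already stopped before time \((G-1)k\), or \(S_{(G-1)k} > -B\) and then \(S_{Gk} = S_{(G-1)k} + Y_{G-1} > A\).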
7.10 In the insurance ruin problem of Section 7.4 explain why the company will eventually be ruined with probability 1 if \(E[Y] \geq cE[X]\).
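No solution was written here; a possible sketch (my own). Let \(X_i\) be the interarrival times and \(Y_i\) the claim sizes, and look at the assets just after the \(n\)th claim:
$$A_n = A + \sum_{i=1}^n (cX_i - Y_i)$$
If \(E[Y] > cE[X]\), the increments \(cX_i - Y_i\) have negative mean, so \(A_n \to -\infty\) a.s. by the strong law and ruin is certain. If \(E[Y] = cE[X]\), the increments have mean zero and (when not degenerate at 0) the random walk \(\sum_{i=1}^n (cX_i - Y_i)\) satisfies \(\liminf_n \sum_{i=1}^n (cX_i - Y_i) = -\infty\) a.s., so \(A_n\) eventually drops below 0 and the company is again ruined with probability 1.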
7.11 In the ruin problem of Section 7.4 let \(F\) denote the interarrival distribution of claims and let \(G\) be the distribution of the size of a claim. Show that \(p(A)\), the probability that a company starting with \(A\) units of assets is ever ruined, satisfies
$$p(A) = \int_0^{\infty}\int_0^{A + ct}p(A + ct -x)dG(x)dF(t) + \int_0^{\infty}\bar{G}(A+ct)dF(t)$$
Conditioning on the time of the first claim and on its size,
$$\begin{align}
p(A) &= P\{\text{ruined at first claim}\} + P\{\text{ruined after first claim}\} \\
&= \int_0^{\infty}\bar{G}(A+ct)dF(t) + \int_0^{\infty}\int_0^{A + ct}p(A + ct -x)dG(x)dF(t)
\end{align}$$
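Written out with the conditioning explicit (my own expansion; \(T\) denotes the first interarrival time):
$$p(A) = \int_0^{\infty}P\{\text{ruin}|T = t\}dF(t), \qquad P\{\text{ruin}|T = t\} = \bar{G}(A+ct) + \int_0^{A+ct}p(A+ct-x)dG(x)$$
since at the moment of the first claim the assets have grown to \(A + ct\): a claim larger than \(A+ct\) ruins the company immediately, while a claim of size \(x \leq A+ct\) restarts the problem with initial assets \(A + ct - x\).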
7.12 For a random walk with \(\mu = E[X] > 0\) argue that, with probability 1,
$$\frac{u(t)}{t} \to \frac{1}{\mu} \quad \text{as } t \to \infty$$
where \(u(t)\) equals the number of \(n\) for which \(0 \leq S_n \leq t\).
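No solution was written here; a rough sketch (my own, using only the strong law). Since \(S_n/n \to \mu > 0\) a.s., only finitely many \(S_n\) are negative, and for any \(\varepsilon > 0\), all but finitely many \(n\) satisfy
$$(\mu - \varepsilon)n \leq S_n \leq (\mu + \varepsilon)n$$
So, up to finitely many exceptional indices, \(0 \leq S_n \leq t\) holds for every \(n \leq t/(\mu+\varepsilon)\) and fails for every \(n \geq t/(\mu - \varepsilon)\). Hence \(t/(\mu+\varepsilon) + O(1) \leq u(t) \leq t/(\mu-\varepsilon) + O(1)\), and letting \(t \to \infty\) and then \(\varepsilon \downarrow 0\) gives \(u(t)/t \to 1/\mu\) with probability 1.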
7.13 Let \(S_n = \sum_{i=1}^n X_i\) be a random walk and let \(\lambda_i, i > 0\), denote the probability that a ladder height equals \(i\) — that is, \(\lambda_i = P\){first positive value of \(S_n\) equals \(i\)}.
(a) Show that if
$$P\{X_i = j\} = \begin{cases}
q & j = -1 \\
\alpha_j & j \geq 1
\end{cases} \qquad q + \sum_{j=1}^{\infty} \alpha_j = 1$$
then \(\lambda_i\) satisfies
$$\lambda_i = \alpha_i + q(\lambda_{i+1} + \lambda_1\lambda_i) \quad i > 0$$
(b) If \(P\{X_i = j\} = 1/5, j = -2,-1,0,1,2\), show that
$$\lambda_1 = \frac{1+\sqrt{5}}{3+\sqrt{5}} \quad \lambda_2 = \frac{2}{3+\sqrt{5}} $$
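No solution was written for this one; a sketch of part (a) (my own, by conditioning on the first step). If \(X_1 = i\) (probability \(\alpha_i\)), the first positive value is \(i\). If \(X_1 = -1\) (probability \(q\)), the walk sits at \(-1\), and the first value exceeding \(-1\) overshoots it by \(j\) with probability \(\lambda_j\): an overshoot of \(j = i+1\) lands at \(i > 0\) and gives the ladder height \(i\) directly, while an overshoot of \(j = 1\) lands at \(0\) and the problem starts afresh, contributing \(\lambda_1\lambda_i\). Hence
$$\lambda_i = \alpha_i + q(\lambda_{i+1} + \lambda_1\lambda_i) \quad i > 0$$
For part (b), the same first-step conditioning (now also over the step \(-2\)), combined with \(\lambda_1 + \lambda_2 = 1\) (the zero-mean walk eventually becomes positive, and it can overshoot 0 by at most 2), leads after some algebra to the stated values.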
7.14 Let \(S_n, n\geq 0\), denote a random walk in which \(X_i\) has distribution \(F\). Let \(G(t,s)\) denote the probability that the first value of \(S_n\) that exceeds \(t\) is less than or equal to \(t+s\). That is,
$$G(t,s) = P\{\text{first sum exceeding } t \text{ is } \leq t+s\}$$
Show that
$$G(t, s) = F(t + s) - F(t) + \int_{-\infty}^t G(t-y, s)dF(y)$$
Given \(X_1\), the walk \(\{S_n, n \geq 1\}\) is distributed as \(X_1\) plus an independent copy of the walk started from 0. Thus, with \(A = \{\text{first sum exceeding } t \text{ is } \leq t + s\}\),
$$\begin{align}
G(t,s) &\equiv P\{A\} = E[P\{A|X_1\}] \\
&= F(t+s) - F(t) + \int_{-\infty}^t G(t-y, s)dF(y)
\end{align}$$
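The conditional probability used above splits into three cases (my own expansion; \(y\) denotes the value of \(X_1\)):
$$P\{A|X_1 = y\} = \begin{cases}
G(t-y, s) & y \leq t \\
1 & t < y \leq t+s \\
0 & y > t+s
\end{cases}$$
and integrating against \(dF(y)\) gives the displayed equation.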