Written as the Market Returns to 3000 Points

2018 was truly a year of hardship. Eight may be the Chinese national lucky number, but it never seems to bring good fortune, and 2018 did not escape the curse that every year ending in eight is a miserable one. Throughout 2018, the question people around me asked most often was: is there any way to make money? I don't know what gave everyone the illusion that I hold some secret for turning stone into gold. My answer was always the same: "hunker down and don't move." If there is no hope of a breakthrough in the industry you know best, switching industries is clearly an even worse choice.

By the end of 2018 and the beginning of 2019, pessimism across society had reached its peak, most visibly when the Shanghai Composite broke below 2638 and fell to 2400. As a staunch, patriotic successor to the socialist cause, I had long been bearish on the Chinese economy, yet this looked to me like a clear "expectation gap." So at the start of the year, when relatives and friends again asked me how to make money, my uniform answer became: buy index funds. Economic indicators were still deteriorating and it was too early to say the economy had bottomed, but market expectations clearly had: prices already reflected the anticipated deterioration. In a state-licensed casino full of retail "chives," small investors tearfully cutting their losses after fall upon fall only amplified the effect. Asset prices reflect expectations more than present value, so as long as nothing worse than expected arrived, there was little room left to fall. As I see it, pessimism is bullish and optimism is bearish; in Buffett's words, be greedy when others are fearful and fearful when others are greedy.

More importantly, although I believe the root of the country's economic troubles lies within, and the China-US trade friction was merely the fuse, the outcome of the trade negotiations would directly determine whether domestic problems erupted all at once. Judging by Trump's style, whether shutting down the government in a standoff with Congress to demand money for the wall, or walking away from the Vietnam meeting when Kim insisted on sanctions relief before denuclearization, the delaying diplomacy we have so often used in history to ride out crises stood little chance this time. If China compromised, a truce would obviously be a short-term positive; and looking closely at the American demands, in the long run they would even push economic reform forward.

Based on these two judgments, I felt the index returning to 3000 was only a matter of time. I know recommending stocks is a thankless business, but I cannot shake my bad habit of playing the teacher, and whenever a good friend asked me sincerely I could not refuse. I myself do not trade stocks, for I am always cautious about what I cannot control. Only because I had recommended index funds to so many friends did I develop the habit of analyzing the market every day, and along the way I blindly guessed both the rally around 2700 and today's surge above 3000, which deepened some friends' illusions about me even further. I will spare you the screenshots; take all of the above as hindsight replay.

All at once, WeChat Moments and group chats were full of talk about the market: veteran investors scrambled to recover their account passwords, new investors rushed to open accounts, more than 1.5 million new investors came in, and brokerages' servers crashed. There is a joke that even the security guard at the gate has started discussing stocks: is that the top? No, the answer goes, that is only the inflection point; the top is when he starts recommending stocks to you. One cannot help wondering whether it is time for the second half of Buffett's maxim.

But quit while you are ahead: I am not making any more predictions. Behind the surge I see listed-company shareholders collectively reducing their holdings and the PMI sitting below the boom-bust line month after month, among other macro indicators. Looking at the details: the leading domestic dairy company still has to post an open letter to petition the authorities; a senior judge of the Supreme People's Court has to record a video to protect himself; how many resources do the state-owned enterprises monopolize, and with what efficiency; is the work of government organs centered on serving the leadership or serving the people: everyone has their own answers. History tells us soberly that clean politics is the premise and guarantee of economic development, and that solid progress in political reform is the real foundation of 3000, 5000, even 10000 points; without it, everything is a castle in the air. To say we can change nothing would be too pessimistic. We need not change society, and should not dream of changing it; we need only hold on to ourselves. As more and more of the post-90s generation enter the workforce and gradually become the backbone, the track of history will shift a little because of our perseverance, and after us come the post-00s and the post-10s. That, I suppose, is what the fable of the foolish old man who moved the mountains is about.

Finally, I want to borrow a passage from Wu Xiaobo's preface to his 激荡三十年 (Thirty Years of Upheaval): "Anything regarded as a miracle is hard to sustain, because it arises from a course that exceeds the ordinary. Those in its midst, those who reaped enormous benefit from it, can rarely free themselves from the experiences they will never forget. They believe it is destiny; they keep hoping that fortune will shine on them every time, that they will luck into victory every time; and in the end, all the glory withers inside its own halo."

Solutions to Stochastic Processes Ch.3

These are my collected solutions for Stochastic Processes, 2nd edition (Sheldon M. Ross). This classic textbook on stochastic processes has no official solutions manual, so I am recording my own study notes here. Some solutions draw on the web; the sources are many and often unclear, so I cannot credit them one by one, and I thank all the original authors here together. I hope these notes help. My ability is limited and correctness is not guaranteed; corrections are welcome, and please credit this source when reposting.
Solutions to Stochastic Processes Sheldon M. Ross Second Edition (pdf)
Since there is no official solution manual for this book, I handcrafted the solutions myself. Some solutions were adapted from the web; the original authors are often unclear and cannot be listed individually. Many thanks to those authors! I hope these solutions are helpful, but no correctness or accuracy is guaranteed. Comments are welcome. Excerpts and links may be used, provided that full and clear credit is given.

3.1 Is it true that:
(a) \(N(t) < n\) if and only if \(S_n > t\) ?
(b) \(N(t) \leq n\) if and only if \(S_n \geq t\) ?
(c) \(N(t) > n\) if and only if \(S_n < t\) ?


(a) True, (b)(c) False.


3.2 In defining a renewal process we suppose that \(F(\infty)\), the probability that an interarrival time is finite, equals 1. If \(F(\infty) < 1\), then after each renewal there is a positive probability \(1 – F(\infty)\) that there will be no further renewals. Argue that when \(F(\infty) < 1\) the total number of renewals, call it \(N(\infty)\), is such that \(1 + N(\infty)\) has a geometric distribution with mean \(1/(1 – F(\infty))\).


$$P\{1 + N(\infty) = k\} = F^{k-1}(\infty)(1 – F(\infty))$$


3.3 Express in words what the random variable \(X_{N(t) + 1}\) represents (Hints: It is the length of which renewal interval?). Show that$$P\{X_{N(t)+1} \geq x\} \geq \bar{F}(x).$$
Compute the above exactly when \(F(x) = 1 – e^{-\lambda x}\).


\(X_{N(t)+1}\) represents the length of the renewal interval containing \(t\), i.e., the interval from the last event at or before \(t\) to the first event after \(t\).
$$\begin{align}
P\{X_{N(t)+1} \geq x\} &= \int_0^{\infty} P\{X_{N(t)+1} \geq x | S_{N(t)} = s \}dF_{S_{N(t)}}(s)\\
&= \int_0^{\infty} P\{X_{N(t)+1} \geq x | X_{N(t) + 1} > t – s \}dF_{S_{N(t)}}(s) \\
&= \int_0^{\infty} \frac{P\{X_{N(t)+1} \geq x, X_{N(t) + 1} > t – s \}}{P\{X_{N(t) + 1} > t – s \}}dF_{S_{N(t)}}(s) \\
&= \int_0^{\infty} \frac{1 - F(\max\{x, t-s\})}{1 - F(t-s)}dF_{S_{N(t)}}(s) \\
&= \int_0^{\infty} \min\{\frac{1 - F(x)}{1 - F(t-s)}, \frac{1 - F(t-s)}{1 - F(t-s)}\}dF_{S_{N(t)}}(s) \\
&= \int_0^{\infty} \min\{\frac{1 - F(x)}{1 - F(t-s)}, 1\}dF_{S_{N(t)}}(s) \\
&\geq \int_0^{\infty} [1 - F(x)]dF_{S_{N(t)}}(s) \\
&= \bar{F}(x)
\end{align}$$
When \(F(x) = 1 - e^{-\lambda x}\) and \(x \leq t\), using \(P\{S_{N(t)} = 0\} = e^{-\lambda t}\) and \(dF_{S_{N(t)}}(s) = \lambda e^{-\lambda(t-s)}ds\) for \(0 < s \leq t\),
$$\begin{align}
P\{X_{N(t)+1} \geq x\} &= \int_0^{t-x} dF_{S_{N(t)}}(s) + \int_{t-x}^t \frac{e^{-\lambda x}}{e^{-\lambda(t-s)}}\lambda e^{-\lambda(t-s)} ds \\
&= e^{-\lambda x} + \lambda x e^{-\lambda x} \\
&= (1 + \lambda x)e^{-\lambda x}
\end{align}$$
(For \(x > t\) the same computation gives \((1 + \lambda t)e^{-\lambda x}\).)
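A quick Monte Carlo sanity check of this formula (a sketch only; the rate \(\lambda\), horizon \(t\), and threshold \(x\) below are arbitrary choices):

```python
import math
import random

# Check P{X_{N(t)+1} >= x} = (1 + lam*x) * exp(-lam*x) for x <= t (Poisson case).
lam, t, x, trials = 1.0, 10.0, 2.0, 200_000
hits = 0
for _ in range(trials):
    s = 0.0
    while True:
        gap = random.expovariate(lam)   # next interarrival time
        if s + gap > t:                 # this is the interval containing t
            hits += gap >= x
            break
        s += gap
print("simulated:", hits / trials)
print("formula  :", (1 + lam * x) * math.exp(-lam * x))
```

The simulated probability is noticeably larger than \(\bar{F}(x) = e^{-\lambda x}\), which is exactly the inspection paradox this problem illustrates.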


3.4 Prove the renewal equation
$$m(t) = F(t) + \int_0^t m(t-x)dF(x)$$


$$\begin{align}
m(t) &= E[N(t)] = \int_0^{\infty} E[N(t)|X_1 = x]dF(x) \\
&= \int_0^t E[1 + N(t-x)]dF(x) \\
&= F(t) + \int_0^t m(t-x)dF(x)
\end{align}$$


3.5 Prove that the renewal function \(m(t), 0 \leq t < \infty\) uniquely determines the interarrival distribution \(F\).



The Laplace transform of \(F_n\), the distribution of \(S_n\), is
$$\phi_{F_n}(\lambda) = \int_0^{\infty} e^{-\lambda t}dF_n(t) = E[e^{-\lambda S_n}] = [\phi_F(\lambda)]^n $$
The Laplace transform for \(m(t)\) is,
$$\begin{align}
U(\lambda) &= \int_0^{\infty} e^{-\lambda t}dm(t) \\
&= \sum_{n=1}^{\infty} \int_0^{\infty} e^{-\lambda t}dF_n(t) \\
&= \sum_{n=1}^{\infty}[\phi_F(\lambda)]^n = \frac{\phi_F(\lambda)}{1 – \phi_F(\lambda)} \\
\phi_F(\lambda) &= \frac{U(\lambda)}{1 + U(\lambda)}
\end{align}$$
The distribution \(F\) is uniquely determined by its Laplace transform \(\phi_F\), which is uniquely determined by the renewal function.


3.6 Let \(\{N(t), t \geq 0\}\) be a renewal process and suppose that for all \(n\) and \(t\), conditional on the event that \(N(t) = n\), the event times \(S_1, \dots, S_n\) are distributed as the order statistics of a set of independent uniform \((0, t)\) random variables. Show that \(\{N(t), t \geq 0\}\) is a Poisson process.
(Hint: Consider \(E[N(s)|N(t)]\) and then use the result of Problem 3.5)


$$\begin{align}
E[N(s) | N(t) = n] &= E[\sum_{i=1}^n I(U_{(i)} \leq s)] \\
&= E[\sum_{i=1}^n I(U_{i} \leq s)] \\
&= \sum_{i=1}^n P\{U_i \leq s\} = \frac{ns}{t}\\
m(s) &= E[E[N(s)|N(t)]] = \frac{s}{t}m(t)\\
\end{align}$$Since this holds for all \(0 \leq s \leq t\) and all \(t\), \(m\) is linear: \(m(s) = \lambda s\) for some constant \(\lambda > 0\). By Problem 3.5 this uniquely determines the interarrival distribution, \(F(x) = 1 - e^{-\lambda x}\), so \(\{N(t), t \geq 0\}\) is a Poisson process.


3.8 The random variables \(X_1, \dots, X_n\) are said to be exchangeable if \(X_{i_1}, \dots, X_{i_n}\) has the same joint distribution as \(X_1, \dots, X_n\) whenever \(i_1, \dots, i_n\) is a permutation of \(1, \dots, n\). That is, they are exchangeable if the joint distribution function \(P\{X_1 \leq x_1, \dots, X_n \leq x_n \}\) is a symmetric function of \(x_1, \dots, x_n\). Let \(X_1, X_2, \dots\) denote the interarrival times of a renewal process.
(a) Argue that conditional on \(N(t) = n, X_1, \dots, X_n\) are exchangeable. Would \(X_1, \dots, X_n, X_{n+1}\) be exchangeable (conditional on \(N(t) = n\)) ?
(b) Use (a) to prove that for \(n > 0\)
$$E[\frac{X_1 + \dots + X_{N(t)}}{N(t)}|N(t) = n] = E[X_1|N(t) = n].$$
(c) Prove that
$$E[\frac{X_1 + \dots + X_{N(t)}}{N(t)}|N(t) > 0] = E[X_1|X_1 < t] .$$


(a) $$\begin{align}
&P\{X_1 \leq x_1, \dots, X_n \leq x_n, N(t) = n\} \\
=& \int_{y_1 \leq x_1}\dots \int_{y_n \leq x_n} P\{X_{n+1} > t – \sum_{i=1}^n y_i\}dF(y_1)\dots dF(y_n) \\
=& \int_0^{\infty} \dots \int_0^{\infty} I\{y_1 \leq x_1, \dots, y_n \leq x_n\} \bar{F}(t - \sum_{i=1}^n y_i)dF(y_1)\dots dF(y_n) \\
\end{align}$$The integrand is a symmetric function of \(y_1, \dots, y_n\) (by Fubini's theorem the order of integration is immaterial), so the conditional joint distribution is symmetric in \(x_1, \dots, x_n\): given \(N(t) = n\), the interarrival times \(X_1, \dots, X_n\) are exchangeable.
\(X_{n+1}\) plays a different role: it is the interarrival time straddling \(t\), so its conditional distribution differs from the others', and \(X_1, \dots, X_{n+1}\) are not exchangeable.
(b) Since they’re exchangeable,
$$\begin{align}
&E[\frac{X_1 + \dots + X_{N(t)}}{N(t)}|N(t) = n]\\
=&E[\frac{X_1 + \dots + X_n}{n}|N(t) = n] \\
=&\frac{1}{n}\sum_{i=1}^{n} E[X_i|N(t) = n] \\
=& E[X_1|N(t) = n]
\end{align}$$

(c) $$\begin{align}
&E[\frac{X_1 + \dots + X_{N(t)}}{N(t)}|N(t) > 0] \\
=& \sum_{n=1}^{\infty} E[\frac{X_1 + \dots + X_{N(t)}}{N(t)}|N(t) = n]P\{N(t) = n | N(t) > 0\}\\
=& \sum_{n=1}^{\infty} E[X_1|N(t) = n]P\{N(t) = n | N(t) > 0\}\\
=& E[X_1|N(t) > 0] = E[X_1|X_1 < t]
\end{align}$$


3.9 Consider a single-server bank in which potential customers arrive at a Poisson rate \(\lambda\). However, an arrival only enters the bank if the server is free when he or she arrives. Let \(G\) denote the service distribution.
(a) At what rate do customers enter the bank?
(b) What fraction of potential customers enter the bank?
(c) What fraction of time is the server busy?


(a) This is a renewal process whose renewals occur at customers' entrances. Let \(\mu_G\) denote the mean service time; the mean interarrival time of the renewal process is \(\mu_G\) plus the mean time from a service completion until the next potential arrival, which is \(1/\lambda\) by the memoryless property. Thus the rate is \(\lambda/(1 + \lambda\mu_G)\).
(b) A renewal occurs each time a customer enters the bank. During a cycle of mean length \(\mu_G + 1/\lambda\), the mean number of potential customers is \(1 + \lambda\mu_G\), of whom exactly one enters. The fraction is \(1/(1 + \lambda\mu_G)\).
(c) \(\mu_G/(1/\lambda + \mu_G) = \lambda\mu_G/(1 + \lambda\mu_G)\)
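A simulation sketch of part (a); for concreteness it takes \(G\) exponential with mean \(\mu_G\), an assumption of the sketch only (the entry rate depends on \(G\) only through its mean):

```python
import random

# Potential customers arrive at rate lam; one enters only if the server is free.
# The long-run entry rate should be lam / (1 + lam * mu_G).
lam, mu_G, horizon = 2.0, 3.0, 500_000.0
t, busy_until, entered = 0.0, 0.0, 0
while t < horizon:
    t += random.expovariate(lam)                       # next potential arrival
    if t >= busy_until:                                # server free: enter
        entered += 1
        busy_until = t + random.expovariate(1 / mu_G)  # service time, mean mu_G
print("simulated rate:", entered / horizon)
print("formula       :", lam / (1 + lam * mu_G))
```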


3.10 Let \(X_1, X_2, \dots\) be independent and identically distributed with \(E[X_i] < \infty\). Also let \(N_1, N_2, \dots\) be independent and identically distributed stopping times for the sequence \(X_1, X_2, \dots\) with \(E[N_i] < \infty\). Observe the \(X_i\) sequentially, stopping at \(N_1\). Now start sampling the remaining \(X_i\)–acting as if the sample was just beginning with \(X_{N_1 + 1}\)–stopping after an additional \(N_2\) (Thus, for instance, \(X_1 + \dots + X_{N_1}\) has the same distribution as \(X_{N_1+1} + \dots + X_{N_1 + N_2}\)). Now start sampling the remaining \(X_i\)–again acting as if the sample was just beginning–and stop after an additional \(N_3\), and so on.
(a) Let
$$S_1 = \sum_{i=1}^{N_1}X_i, \quad S_2 = \sum_{i=N_1 + 1}^{N_1 + N_2}X_i, \quad\dots,\quad S_m = \sum_{i=N_1+\dots+N_{m-1}+1}^{N_1 + \dots + N_m}X_i$$
Use the strong law of large numbers to compute
$$\lim_{m \to \infty} (\frac{S_1 + \dots + S_m}{N_1 + \dots + N_m}).$$
(b) Writing
$$\frac{S_1 + \dots +S_m }{N_1 + \dots + N_m } = \frac{S_1 + \dots +S_m}{m}\frac{m}{N_1 + \dots + N_m},$$
derive another expression for the limit in part (a).
(c) Equate the two expression to obtain Wald’s equation.


(a) $$\lim_{m \to \infty} (\frac{S_1 + \dots + S_m}{N_1 + \dots + N_m}) = E[X] $$
(b) $$\lim_{m \to \infty} (\frac{S_1 + \dots + S_m}{N_1 + \dots + N_m}) = E[S]/E[N] $$
(c) $$E[S] = E[N]E[X]$$


3.11 Consider a miner trapped in a room that contains three doors. Door 1 leads her to freedom after two days' travel; door 2 returns her to her room after four days' journey, and door 3 returns her to her room after eight days' journey. Suppose at all times she is equally likely to choose any of the three doors, and let \(T\) denote the time it takes the miner to become free.
(a) Define a sequence of independent and identically distributed random variables \(X_1, X_2, \dots\) and a stopping time \(N\) such that
$$T = \sum_{i=1}^N X_i.$$
Note: You may have to imagine that the miner continues to randomly choose doors even after she reaches safety.
(b) Use Wald’s equation to find \(E[T]\).
(c) Compute \(E[\sum_{i=1}^N X_i|N=n]\) and note that it is not equal to \(E[\sum_{i=1}^n X_i]\).
(d) Use part (c) for a second derivation of \(E[T]\).


(a) \(P\{X=2\} = P\{X=4\} = P\{X=8\} = 1/3\), and \(N= \min\{n : X_n = 2\}\).
(b) Since \(N \sim Geometric(1/3)\), then \(E[N] = 3\). Hence \(E[T] = E[N]E[X] = 14\).
(c) $$\begin{align}
E[\sum_{i=1}^N X_i|N=n ] &= E[\sum_{i=1}^N X_i|X_1 \neq 2, \dots, X_{n-1}\neq 2, X_n = 2] \\&=(n-1)E[X_i|X_i\neq 2] + 2 = 6n - 4 \\
E[\sum_{i=1}^n X_i] &= nE[X_i] = 14n/3
\end{align}$$
(d) $$E[T] = E[E[\sum_{i=1}^N X_i|N]] = E[6N - 4] = 6E[N] - 4 = 14$$
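A minimal simulation of the miner's escape time, checking \(E[T] = 14\):

```python
import random

# Doors: 2 days to freedom, 4 or 8 days back to the room, chosen uniformly.
trials, total = 200_000, 0
for _ in range(trials):
    t = 0
    while True:
        d = random.choice([2, 4, 8])
        t += d
        if d == 2:        # door 1 reached: free
            break
    total += t
print("simulated E[T]:", total / trials)   # should be close to 14
```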


3.12 Show how Blackwell’s theorem follows from the key renewal theorem.


Let \(h(x) = 1_{\{x \in [0, a]\}}\); it is directly Riemann integrable with \(\int_0^{\infty} h(x)dx = a\), and \(\int_0^t h(t-x)dm(x) = m(t) - m(t-a)\). The key renewal theorem then gives
$$\lim_{t \to \infty} [m(t) - m(t-a)] = \frac{a}{\mu}$$


3.13 A process is in one of \(n\) states, \(1, 2, \dots, n\). Initially it is in state 1, where it remains for an amount of time having distribution \(F_1\). After leaving state 1 it goes to state 2, where it remains for a time having distribution \(F_2\). When it leaves 2 it goes to state 3, and so on. From state \(n\) it returns to 1 and starts over. Find
$$\lim_{t \to \infty} P\{\text{process is in state i at time t}\}.$$
Assume that \(H\), the distribution of time between entrances to state 1, is nonlattice and has finite mean.


Let \(\mu_i\) denote the mean of \(F_i, i = 1, 2, \dots, n\). A cycle is completed each time the process reenters state 1, so the cycle length has distribution \(H\) and mean \(\mu_H = \sum_{j=1}^n \mu_j\). Regarding the process as "on" when it is in state \(i\), the theory of alternating renewal processes gives
$$
\lim_{t \to \infty} P\{\text{process is in state i at time t}\} = \frac{\mu_i}{\mu_H} = \frac{\mu_i}{\sum_{j=1}^n \mu_j}
$$


3.14 Let \(A(t)\) and \(Y(t)\) denote the age and excess at \(t\) of a renewal process. Fill in the missing terms:
(a) \(A(t) > x \leftrightarrow 0\) events in the interval____?
(b) \(Y(t) > x \leftrightarrow 0\) events in the interval____?
(c) \(P\{Y(t) > x\} = P\{A(\ ) > \ \ \}\)
(d) Compute the joint distribution of \(A(t)\) and \(Y(t)\) for a Poisson process.


(a) \((t-x, t)\)
(b) \((t, t+x)\)
(c) \(P\{Y(t) > x\} = P\{A(t+x) \geq x \}\)
(d) \(P\{A(t) > x, Y(t) > y\} = e^{-\lambda(x+y)}\) for \(x < t\) (and 0 for \(x \geq t\), since \(A(t) \leq t\))


3.15 Let \(A(t)\) and \(Y(t)\) denote respectively the age and excess at \(t\). Find:
(a) \(P\{Y(t) > x|A(t) = s\}\).
(b) \(P\{Y(t) > x|A(t + x/2) = s\}\).
(c) \(P\{Y(t) > x|A(t + x) > s\}\) for a Poisson process.
(d) \(P\{Y(t) > x, A(t) > y\}\).
(e) If \(\mu < \infty\), show that, with probability 1, \(A(t)/t \to 0\) as \(t \to \infty\).


(a) \(\bar{F}(x+s)/\bar{F}(s)\) (formally, condition on \(S_{N(t)} = t - s\))
(b) Given \(A(t + x/2) = s\) with \(s \geq x/2\), the last event before \(t + x/2\) occurred at \(t + x/2 - s \leq t\), so \(P\{Y(t) > x|A(t + x/2) = s\} = \bar{F}(x/2+s)/\bar{F}(s)\). If \(s < x/2\), an event occurred in \((t, t + x/2]\), so the probability is 0.
(c) \(Y(t) > x\) iff there are no events in \((t, t+x]\), while \(A(t+x) > s\) iff there are no events in \((t+x-s, t+x]\). For \(s \leq x\), by independent increments \(P\{Y(t) > x|A(t + x) > s\} = P\{N(t+x-s) - N(t) = 0\} = e^{-\lambda(x-s)}\); for \(s > x\) the probability is 1.
(d) \(P\{Y(t) > x, A(t) > y\} = P\{\text{no events in } (t-y, t+x]\}\), which for the Poisson process equals \(e^{-\lambda(x+y)}\) when \(y < t\) (and is 0 when \(y \geq t\), since \(A(t) \leq t\))
(e) $$\begin{align}
\lim_{t \to \infty} \frac{A(t)}{t} &= \lim_{t \to \infty} \frac{t – S_{N(t)}}{t} \\
&= 1 – \lim_{t \to \infty} \frac{S_{N(t)}}{N(t)}\frac{N(t)}{t} \\
&= 1 – \frac{\mu}{\mu} = 0
\end{align}$$


3.16 Consider a renewal process whose interarrival distribution is the gamma distribution with parameters \((n, \lambda)\). Use Proposition 3.4.6 to show that
$$\lim_{t \to \infty} E[Y(t)] = \frac{n+1}{2\lambda}.$$
Now explain how this could have been obtained without any computations.


Since \(X \sim Gamma(n, \lambda)\), \(E[X] = n/\lambda\) and \(E[X^2] = n(n+1)/\lambda^2\), so by Proposition 3.4.6
$$\lim_{t \to \infty} E[Y(t)] = \frac{E[X^2]}{2E[X]} = \frac{n+1}{2\lambda}. $$
Without computation: a \(Gamma(n, \lambda)\) interarrival time is the sum of \(n\) independent exponential(\(\lambda\)) phases, so the renewal epochs are every \(n\)th event of a rate-\(\lambda\) Poisson process. For large \(t\), the phase in progress at time \(t\) is equally likely to be any one of the \(n\) phases, and by the memoryless property the time to the next renewal is the current phase's remaining exponential time plus the phases not yet begun. Hence
$$\lim_{t \to \infty} E[Y(t)] = \frac{1}{n}\sum_{k=1}^{n}\frac{k}{\lambda} = \frac{n+1}{2\lambda}$$
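A simulation sketch of the limiting mean excess for \(Gamma(n, \lambda)\) interarrivals (here \(n = 3\), \(\lambda = 1\), and \(t\) is taken large, all arbitrary choices):

```python
import random

n, lam, t, trials = 3, 1.0, 50.0, 50_000
total = 0.0
for _ in range(trials):
    s = 0.0
    while s <= t:   # one gamma interarrival = n exponential phases
        s += sum(random.expovariate(lam) for _ in range(n))
    total += s - t  # excess Y(t) = S_{N(t)+1} - t
print("simulated:", total / trials)
print("formula  :", (n + 1) / (2 * lam))
```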


3.18 In Problem 3.9 suppose that potential customers arrive in accordance with a renewal process having interarrival distribution \(F\). Would the number of events by time \(t\) constitute a (possibly delayed) renewal process if an event corresponds to a customer:
(a) entering the bank?
(b) leaving the bank?
What if \(F\) were exponential?


Let \(X_i\) denote the length of the \(i\)th service and let \(Y_i\) denote the time from the end of the \(i\)th service until the start of the \((i + 1)\)st service. Let \(Y_0\) denote the time until the first arrival enters the bank (and begins service). Note that \(X_i\) and \(Y_i\) may be dependent when the arrival process is not Poisson.
(a) In this case each cycle is \(Z_i = X_i + Y_i, i = 1, 2, \dots\), with \(Z_0 = Y_0\). Since \((X_i, Y_i)\) is independent of \((X_j, Y_j), j = 1, \dots, i - 1\), the \(Z_i, i \geq 1\), are i.i.d. We thus have a delayed renewal process.
(b) In this case \(Z_i = Y_{i-1}+X_i\). When \(X_i\) and \(Y_i\) are dependent, the consecutive cycles \(Z_i\) and \(Z_{i+1}\) are dependent through the pair \((X_i, Y_i)\), so we do not have a (delayed) renewal process.
If \(F\) is exponential, then (a) is still a delayed renewal process, (b) is a renewal process.


3.19 Prove Equation (3.5.3).


$$\begin{align}
&P\{S_{N(t)} \leq s\} \\
=& \sum_{n=0}^{\infty} P\{S_n \leq s, S_{n+1} > t\} \\
=& \bar{G}(t) + \sum_{n=1}^{\infty} \int_0^{\infty} P\{S_n \leq s, S_{n+1} > t | S_n = y\}dG*F_{n-1}(y) \\
=& \bar{G}(t) + \sum_{n=1}^{\infty} \int_0^s \bar{F}(t-y)dG*F_{n-1}(y) \\
=& \bar{G}(t) + \int_0^s \bar{F}(t-y)d(\sum_{n=1}^{\infty} G*F_{n-1}(y)) \\
=& \bar{G}(t) + \int_0^s \bar{F}(t-y)dm_D(y)
\end{align}$$


3.20 Consider successive flips of a fair coin.
(a) Compute the mean number of flips until the pattern HHTHHTT appears.
(b) Which pattern requires a larger expected time to occur: HHTT or HTHT?


(a) The pattern HHTHHTT has no proper prefix that is also a suffix, so the expected number of flips is \(2^7 = 128\).
(b) \(E[N_{HHTT}] = 2^4 = 16\), while HTHT overlaps itself in HT, so \(E[N_{HTHT}] = 2^4 + 2^2 = 20\). HTHT requires the larger expected time.
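Both answers follow from the standard overlap formula for a fair coin: \(E[N]\) is the sum of \(2^k\) over every \(k\) for which the length-\(k\) prefix of the pattern equals its length-\(k\) suffix. A small helper to check:

```python
def expected_flips(pattern: str) -> int:
    """E[flips until pattern appears] for a fair coin, by the overlap formula."""
    n = len(pattern)
    return sum(2 ** k for k in range(1, n + 1) if pattern[:k] == pattern[-k:])

print(expected_flips("HHTHHTT"))  # 128: no proper prefix is also a suffix
print(expected_flips("HHTT"))     # 16
print(expected_flips("HTHT"))     # 20, so HTHT takes longer than HHTT
```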


3.21 On each bet a gambler, independently of the past, either wins or loses 1 unit with respective probabilities \(p\) and \(1-p\). Suppose the gambler’s strategy is to quit playing the first time she wins \(k\) consecutive bets. At the moment she quits:
(a) find her expected winnings.
(b) find the expected number of bets she has won.


Let \(X_i\) denote the gambler's winnings on the \(i\)th bet, \(Y_i\) the indicator that she wins the \(i\)th bet, and \(N\) the number of bets until she quits. Then \(N\) is a stopping time for both sequences, and \(E[N] = \sum_{i=1}^kp^{-i}\).
(a) \(E[N]E[X] = (2p-1)\sum_{i=1}^kp^{-i} \)
(b) \(E[N]E[Y] = \sum_{i=0}^{k-1}p^{-i} \)


3.22 Consider successive flips of a coin having probability \(p\) of landing heads. Find the expected number of flips until the following sequences appear.
(a) A = HHTTHH.
(b) B = HTHTT.
Suppose now that \(p = 1/2\).
(c) Find \(P\{A \text{ occurs before } B\}\).
(d) Find the expected number of flips until either A or B occurs.


(a) \(p^{-1} + p^{-2} + p^{-4}(1-p)^{-2}\)
(b) \(p^{-2}(1-p)^{-3}\)
(c)(d) $$\begin{align}
E[N_{B|A}] &= E[N_{HTHTT}] – E[N_{H}] = p^{-2}q^{-3} – p^{-1} \\
E[N_{A|B}] &= E[N_{A}]= p^{-1} + p^{-2} + p^{-4}q^{-2} \\
E[N_{A}] &= E[M] + E[N_A – M] \\
&= E[M] + E[N_A – M|\text{B before A}](1 – P_A) \\
&= E[M] + E[N_{A|B}](1-P_A). \\
E[N_B] = E[M] + E[N_{B|A}]P_A\\
\end{align}$$where \(P_A = P\{\text{A before B}\}\), \(M=\min(N_A, N_B)\), and \(p=q=1/2\). Solving the equations above, we get
$$
P_A = \frac{E[N_B] + E[N_{A|B}] – E[N_A]}{E[N_{A|B}] + E[N_{B|A}]} = \frac{8}{25} \\
E[M] = E[N_B] – E[N_{B|A}]P_A = \frac{112}{5}\\
$$
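A simulation sketch of the race between the two patterns, checking \(P_A = 8/25\) and \(E[M] = 112/5\):

```python
import random

trials, a_wins, total_m = 200_000, 0, 0
for _ in range(trials):
    s = ""
    while True:
        s += random.choice("HT")
        if s.endswith("HHTTHH"):   # A occurs first
            a_wins += 1
            break
        if s.endswith("HTHTT"):    # B occurs first
            break
    total_m += len(s)
print("P{A before B}   :", a_wins / trials)   # ~0.32
print("E[min(N_A, N_B)]:", total_m / trials)  # ~22.4
```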


3.23 A coin having probability \(p\) of landing heads is flipped \(k\) times. Additional flips of the coin are then made until the pattern of the first \(k\) is repeated (possibly by using some of the first \(k\) flips). Show that the expected number of additional flips after the initial \(k\) is \(2^k\).


Let \(m\) denote the number of heads in the first \(k\) flips, and let \(T\) denote the number of additional flips until the pattern recurs. Then
$$\begin{align}
E[T] &= \sum_{i=0}^k E[T|m=i]P\{m=i\} \\
&= \sum_{i=0}^k p^{-i}(1-p)^{-k+i}{k \choose i} p^i(1-p)^{k-i} \\
&= 2^k
\end{align}$$


3.25 Consider a delayed renewal process \(\{N_D(t), t \geq 0\}\) whose first interarrival has distribution \(G\) and the others have distribution \(F\). Let \(m_D(t) = E[N_D(t)]\).
(a) Prove that
$$m_D(t) = G(t) + \int_0^t m(t-x)dG(x),$$
where \(m(t) = \sum_{n=1}^{\infty}F_n(t)\)
(b) Let \(A_D(t)\) denote the age at time \(t\). Show that if \(F\) is nonlattice with \(\int x^2 dF(x) < \infty\) and \(t\bar{G}(t) \to 0\) as \(t \to \infty\), then
$$E[A_D(t)] \to \frac{\int_0^{\infty} x^2dF(x)}{2\int_0^{\infty}xdF(x)}$$
(c) Show that if \(G\) has a finite mean, then \(t\bar{G}(t) \to 0\) as \(t \to \infty\).


(a) $$\begin{align}
m_D(t) &= \int_0^t E[N_D(t) | X_1 = x]dG(x)\\
&= \int_0^t [1 + m(t - x)]dG(x) \\
&= G(t) + \int_0^t m(t – x)dG(x)
\end{align}$$
(b) $$\begin{align}
&E[A_D(t)]\\
=& E[A_D(t)|S_{N(t)} = 0]\bar{G}(t) + \int_0^t E[A_D(t) | S_{N(t)} = y]\bar{F}(t-y)dm_D(y)\\
=& t\bar{G}(t) + \int_0^t (t-y)\bar{F}(t-y)dm_D(y) \\
\to& \frac{1}{\mu}\int_0^{\infty}x\bar{F}(x)dx \quad (\text{Proposition 3.5.1(v), with } h(x) = x\bar{F}(x) \text{ directly Riemann integrable and } t\bar{G}(t) \to 0) \\
=& \frac{\int_0^{\infty} x^2dF(x)}{2\int_0^{\infty}xdF(x)}
\end{align}$$
(c) $$t\bar{G}(t) = t\int_t^{\infty}dG(x) \leq \int_t^{\infty}xdG(x) \to 0 \text{ as } t \to \infty,$$
since \(\int_0^{\infty}xdG(x) = E[G] < \infty\).


3.26 Prove Blackwell’s theorem for renewal reward processes. That is, assuming that the cycle distribution is not lattice, show that, as \(t \to \infty\),
$$E[\text{reward in } (t, t+a)] \to a\frac{E[\text{reward in cycle}]}{E[\text{time of cycle}]}$$
Assume that any relevant function is directly Riemann integrable.


From Proposition 3.6.1 (ii), \(E[R(t)]/t \to \frac{E[R]}{E[X]}\). Writing \(g(t) = E[R(t)]\) and conditioning on the first cycle shows that \(g\) satisfies a renewal-type equation; applying the key renewal theorem (using the assumed direct Riemann integrability) to \(g(t+a) - g(t)\) gives
$$\begin{align}
&E[\text{reward in } (t, t+a)]\\
=& E[R(t+a)] - E[R(t)] \\
\to & a\frac{E[\text{reward in cycle}]}{E[\text{time of cycle}]}
\end{align}$$


3.28 In Example 3.6(C) suppose that the renewal process of arrivals is a Poisson process with mean \(\mu\). Let \(N^*\) denote that value of \(N\) that minimizes the long-run average cost if a train leaves whenever there are \(N\) travelers waiting. Another type of policy is to have a train depart every \(T\) time units. Compute the long-run average cost under this policy and let \(T^*\) denote the value of \(T\) that minimizes it. Show that the policy that departs whenever \(N^*\) travelers are waiting leads to a smaller average cost than the one that departs every \(T^*\) time units.


From Example 3.6(C), the long-run average cost per unit time when a train departs whenever \(N\) travelers are waiting is
$$E_1(N) = \frac{c(N-1)}{2} + \frac{K\mu}{N}$$
Minimizing over \(N\) gives \(N^* = \sqrt{2K\mu/c}\) and minimum value \(E_1^* = \sqrt{2Kc\mu} - c/2\).
Under the \(T\)-policy, the expected number waiting \(t\) time units into a cycle is \(\mu t\), so the expected waiting cost per cycle is \(c\mu T^2/2\), plus the dispatch cost \(K\):
$$E_2(T) = \frac{c\mu T}{2} + \frac{K}{T}$$
Minimizing over \(T\) gives \(T^* = \sqrt{2K/(c\mu)}\) and minimum value \(E_2^* = \sqrt{2Kc\mu}\).
Since \(c > 0\), \(E_1^* < E_2^*\): the \(N^*\)-policy has the smaller average cost.
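A numeric check with arbitrary illustrative values of \(c\), \(K\), and \(\mu\) (chosen so that \(N^*\) is an integer):

```python
import math

c, K, mu = 2.0, 9.0, 4.0   # arbitrary illustrative values

# Policy 1: dispatch when N travelers are waiting; minimize over integer N.
E1 = min(c * (N - 1) / 2 + K * mu / N for N in range(1, 1000))
# Policy 2: dispatch every T time units, optimized at T* = sqrt(2K/(c*mu)).
T_star = math.sqrt(2 * K / (c * mu))
E2 = c * mu * T_star / 2 + K / T_star

print(E1, "vs", math.sqrt(2 * K * c * mu) - c / 2)  # 11.0 vs 11.0
print(E2, "vs", math.sqrt(2 * K * c * mu))          # 12.0 vs 12.0
```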


3.29 The life of a car is a random variable with distribution \(F\). An individual has a policy of trading in his car either when it fails or reaches the age of \(A\). Let \(R(A)\) denote the resale value of an A-year-old car. There is no resale value of a failed car. Let \(C_1\) denote the cost of a new car and suppose that an additional cost \(C_2\) is incurred whenever the car fails.
(a) Say that a cycle begins each time a new car is purchased. Compute the long-run average cost per unit time.
(b) Say that a cycle begins each time a car in use fails. Compute the long-run average cost per unit time.
Note. In both (a) and (b) you are expected to compute the ratio of the expected cost incurred in a cycle to the expected time of a cycle. The answer should, of course, be the same in both parts.


(a) $$E[\text{time per cycle}] = \int_0^A xdF(x) + A(1 – F(A)) \\
E[\text{cost per cycle}] = C_1 – \bar{F}(A)R(A) + F(A)C_2\\
\lim_{t \to \infty} \frac{E[\text{total cost}]}{t} = \frac{C_1 – \bar{F}(A)R(A) + F(A)C_2 }{\int_0^A xdF(x) + A(1 – F(A)) }$$
(b) The probability that a car is used until it fails is \(F(A)\), so the number of cars in a cycle, \(N\), is geometric with parameter \(F(A)\) and \(E[N] = 1/F(A)\). Then
$$\begin{align}
E[\text{time per cycle}] &= E[(N-1)A] + \int_0^A xd\frac{F(x)}{F(A)}\\
&= \frac{A\bar{F}(A)}{F(A)} + \int_0^A xd\frac{F(x)}{F(A)} \\
E[\text{cost per cycle}] &= E[NC_1 – (N-1)R(A) + C_2]\\
&= \frac{C_1 - \bar{F}(A)R(A)}{F(A)} + C_2\end{align}$$


3.30 Suppose in Example 3.3(A) that a coin’s probability of landing heads is a beta random variable with parameters \(n\) and \(m\), that is, the probability density is
$$f(p) = Cp^{n-1}(1 – p)^{m-1} \quad 0 \leq p \leq 1.$$
Consider the policy that flips each newly chosen coin until \(m\) consecutive flips land tails, then discards that coin and does the same with a new one. For this policy show that, with probability 1, the long-run proportion of flips that land heads is 1.


Similar to 3.3(A),
$$\begin{align}
&E[\text{number of flips between successive patterns}] \\
= &\int_0^1 (\sum_{i=1}^m (1-p)^{-i}) Cp^{n-1}(1-p)^{m-1} dp \\
= &C\int_0^1 \left[\frac{p^{n-2}}{1-p} - p^{n-2}(1-p)^{m-1}\right] dp \quad \left(\sum_{i=1}^m (1-p)^{-i} = \frac{(1-p)^{-m} - 1}{p}\right)\\
=& \infty
\end{align}$$
since the subtracted term is a finite beta integral while \(\int_0^1 p^{n-2}(1-p)^{-1} dp\) diverges at \(p = 1\). As in Example 3.3(A), an infinite expected number of flips between successive patterns implies that, with probability 1, the long-run proportion of flips that land heads is 1.


3.31 A system consisting of four components is said to work whenever both at least one of components 1 and 2 work and at least one of components 3 and 4 work. Suppose that component \(i\) alternates between working and being failed in accordance with a nonlattice alternating renewal process with distributions \(F_i\) and \(G_i, i = 1, 2, 3, 4\). If these alternating renewal processes are independent, find \(\lim_{t \to \infty} P\{\text{system is working at time } t\}\).


Let \(\mu_{F_i}, \mu_{G_i}\) denote the expected working time and expected failed time of the \(i\)th component. As \(t \to \infty\), the probability that at least one of components 1 and 2 is working is
$$P_{12} = 1 – \frac{\mu_{G_1}\mu_{G_2}}{(\mu_{G_1} + \mu_{F_1})(\mu_{G_2} + \mu_{F_2})}$$
Similarly,$$P_{34} = 1 – \frac{\mu_{G_3}\mu_{G_4}}{(\mu_{G_3} + \mu_{F_3})(\mu_{G_4} + \mu_{F_4})}$$
$$\lim_{t \to \infty} P\{\text{system is working at time } t\} = P_{12}P_{34} $$


3.32 Consider a single-server queueing system having Poisson arrivals at rate \(\lambda\) and service distribution \(G\) with mean \(\mu_G\). Suppose that \(\lambda\mu_G < 1\).
(a) Find \(P_0\), the proportion of time the system is empty.
(b) Say the system is busy whenever it is nonempty (and so the server is busy). Compute the expected length of a busy period.
(c) Use part (b) and Wald’s equation to compute the expected number of customers served in a busy period.


(a) \(P_0 = 1 – \lambda\mu_G\)
(b) \(E[B] = \frac{\mu_G}{1 – \lambda\mu_G}\)
(c) \(E[C] = \frac{1}{1 – \lambda\mu_G}\)
For details, see Introduction to Probability Models by Sheldon M. Ross, 10th edition, Section 8.5.


One Day I Will Remember Love

Who will be the eternal singer of my life?

In my dim former life, destiny arranged all the wondrous notes; a pair of slender hands flutters like a dance over the tangled strings of the heart, and within the years I hear that song, familiar from a thousand years ago.

Love me, beginning from a distant time and a distant place.

In the forest where sunlight gathers, my dear, I will light a trail of flower fragrance for you. Let the rivers stand still, let the mountains sink at the speed of light; I want to spread wings with you and soar on the wind.

Or, in the fields where thoughts grow wild, I am that proud ear of rice, falling between your harvesting sickle and your broad palm, quietly listening to your hurried heartbeat, drinking in the salt of your sweat.

Smile, or weep for happiness.

The bright moon has never been sure of its own expression; only on nights when the stars have fallen asleep do I gaze at your glistening eyes, even if those small tears light my arduous trek. Minute after minute, year after year, I swim in the deep darkness of your eyes, reaching toward a tenderness within that I can never be sure of.

Let me dream, and never wake.

Those ordinary, simple scenes unfold for me again and again. Sunlight descends in great swaths over this dark night; I secretly draw out three rays of light to be strings, and play, for you. And your smile must be quietly blooming in some corner I can neither know nor reach, blooming toward me. In this cold metropolis, among these indifferent crowds, I believe no one has ever seen petals so moving, a color so holy. No one is more fortunate than I.

Then ride my shabby carriage and carry the moonlight home: back into the winds of Tang and the rains of Song, back to the Latter Lord's jade-carved balustrades and Liu Yong's dawn breeze and waning moon, back to the home that fate made poor and plain for us all our lives.

And your smile, ah, will surely spread across our bed of kapok wood. Let me dream, and wake no more.

The departing figure, forever hurried.

Just as this too-busy season so often blurs the clear meanings of spring blossom and autumn fruit, summer rain and winter snow, I cannot read your figure hurrying through time, still less dare I study your every gesture. In the sunlight I can only watch you from farther and farther away, unable to catch the shadow of your gaze, unable to touch the undefined silence deep inside you. At the moment when light and shadow exchanged their angles and their darkness, I admitted certain facts.

In the darkness I see myself weeping bitterly. On the night I lost my idol, I lost all human language with it. So many days, transparent and hard; I can see the flow of the music yet cannot pass through it.

Without action, failure has already arrived.

I want to cry, but there are no tears.

Has the candlelight gradually hidden away that radiant smile? The ebbing tide has carried off the lovely footprints on the sand. I can no longer sing, for the seagulls are flying softly by.

One day, I remembered love.

Meeting another self for the first time, strange and frightening. I do not regret it, for there is always a song drifting on into the labyrinth of the next age. On a night of swirling snow I will pass once more beneath your window hung with wind chimes and let my gaze linger again, linger on the sill sheathed in ice. Because I remembered love.

The bluebird flits back and forth through these lines of verse and finally vanishes; yet after vanishing, you left behind the traces of flight.

This prose poem comes from a running-script copybook I traced as a child. (Embarrassingly, I never practiced calligraphy in any disciplined way, yet people often praise my handwriting, though I know my real level well. In hard-pen calligraphy I like the styles of Qian Peiyun and Shen Honggen; of brush models I have seen very little, and I think making brush characters simply comfortable to look at already takes real skill. I will practice properly when I retire.) The piece is nothing special, but to this day sentences from it still come back to me; the first and third passages in particular tend to slip out of my mouth unbidden. Searching online turned up the text but no definite source, so I post it here for my own enjoyment.

Solutions to Stochastic Processes Ch.2

These are my collected solutions for Stochastic Processes, 2nd edition (Sheldon M. Ross). This classic textbook on stochastic processes has no official solutions manual, so I am recording my own study notes here. Some solutions draw on the web; the sources are many and often unclear, so I cannot credit them one by one, and I thank all the original authors here together. I hope these notes help. My ability is limited and correctness is not guaranteed; corrections are welcome, and please credit this source when reposting.
Solutions to Stochastic Processes Sheldon M. Ross Second Edition (pdf)
Since there is no official solution manual for this book, I handcrafted the solutions myself. Some solutions were adapted from the web; the original authors are often unclear and cannot be listed individually. Many thanks to those authors! I hope these solutions are helpful, but no correctness or accuracy is guaranteed. Comments are welcome. Excerpts and links may be used, provided that full and clear credit is given.

2.1 Show that Definition 2.1.1 of a Poisson process implies Definition 2.1.2.


$$\begin{align}
&\ \forall t_1 < t_2, s > 0 \\
&P\{N(t_2+s) – N(t_1+s) = n\}\\
= &P\{N(t_2) – N(t_1) = n\} = e^{-\lambda(t_2-t_1)}\frac{\lambda^n(t_2-t_1)^n}{n!}
\end{align}$$
Thus, the process has stationary increments.
$$P\{N(h) = 1\} = \lambda h + \lambda h(e^{-\lambda h} – 1)\\
\lim_{h \to 0} \frac{\lambda h(e^{-\lambda h} – 1) }{h} = 0$$
Thus, \( P\{N(h) = 1\} = \lambda h + o(h)\).
$$P\{N(h) \geq 2\} = 1 – P\{N(h) = 1\} – P\{N(h) = 0\} = o(h) $$
Thus, \( P\{N(h) \geq 2\} = o(h)\).


2.2 For another approach to proving that Definition 2.1.2 implies Definition 2.1.1:
(a) Prove, using Definition 2.1.2, that
$$P_0(t + s) = P_0(t)P_0(s)$$
(b) Use (a) to infer that the interarrival times \(X_1, X_2, \dots\) are independent exponential random variables with rate \(\lambda\).
(c) Use (b) to show that \(N(t)\) is Poisson distributed with mean \(\lambda t\).


(a) By independent and stationary increments,$$P_0(t+s) = P\{N(t) = 0, N(t+s) - N(t) = 0\} = P_0(t)P_0(s) $$
(b)$$\begin{align}
1 – F_X(t) &= P_0(t) = \lim_{\Delta t \to 0} P_0(\Delta t)^{t/\Delta t} \\
&= \lim_{\Delta t \to 0} (1 – \lambda\Delta t + o(\Delta t))^{t/\Delta t} \\
&= e^{-\lambda t}\\
F_X(t) &= 1 – e^{-\lambda t}\\
\end{align}$$
(c) From Problem 1.29, \(\sum_1^n X_i \sim \Gamma(n, \lambda)\) when the \(X_i\) are i.i.d. \(Exponential(\lambda)\) variables. Let \(f_n(x)\) denote the density of \(\sum_1^n X_i\). Thus,
$$\begin{align}
P\{N(t) = n\} &= P\{\sum_1^n X_i < t, \sum_1^{n+1} X_i > t\} \\
&= \int_0^t f_n(x)e^{-\lambda(t-x)} dx \\
&= e^{-\lambda t}\frac{(\lambda t)^n}{n!}
\end{align}$$


2.3 For a Poisson process show, for \(s < t\), that
$$P\{N(s) = k | N(t) = n\} = {n \choose k}(\frac{s}{t})^k(1 – \frac{s}{t})^{n-k}, \quad k=0, 1, \dots, n.$$


$$\begin{align}
P\{N(s)=k|N(t)=n\} &= \frac{ P\{N(s)=k, N(t)-N(s)=n-k\} }{ P\{N(t)=n\} } \\
&= \frac{ P\{N(s)=k\}P\{N(t-s)=n-k\} }{ P\{N(t)=n\} } \quad (\text{independent, stationary increments}) \\
&= {n \choose k}(\frac{s}{t})^k(1 – \frac{s}{t})^{n-k}, \quad k=0, 1, \dots, n.
\end{align}$$


2.4 Let \(\{N(t), t \geq 0\}\) be a Poisson process with rate \(\lambda\). Calculate \(E[N(t)N(t+s)]\).


$$\begin{align}
E[N(t)N(t+s)] &= E[N(t)(N(t) + [N(t+s) - N(t)])] \\
&= E[N^2(t)] + E[N(t)]E[N(t+s) - N(t)] \\
&= \sum_{n=0}^{\infty} n^2 e^{-\lambda t}\frac{(\lambda t)^n}{n!} + \lambda t \cdot \lambda s \\
&= \lambda t[\sum_{n=1}^{\infty} (n-1+1) e^{-\lambda t}\frac{(\lambda t)^{n-1}}{(n-1)!}] + \lambda^2ts \\
&= \lambda^2t(t+s) + \lambda t \\
\end{align}$$
(using independent increments in the second line). The second mixed raw moment \(E[N(s)N(t)]\) is the autocorrelation function of the process; for a Poisson process with rate \(\lambda\),
$$E[N(s)N(t)] = \lambda^2 st + \lambda \min\{s, t\}, \quad s,t\geq 0$$


2.5 Suppose that \(\{N_1(t), t \geq 0\}\) and \(\{N_2(t), t \geq 0\}\) are independent Poisson process with rates \(\lambda_1\) and \(\lambda_2\). Show that \(\{N_1(t) + N_2(t), t \geq 0\}\) is a Poisson process with rate \(\lambda_1 + \lambda_2\). Also, show that the probability that the first event of the combined process comes from \(\{N_1(t), t \geq 0\}\) is \(\lambda_1 /(\lambda_1 + \lambda_2)\), independently of the time of the event.


Let \(N(t) = N_1(t) + N_2(t)\), then
$$\begin{align}
&P\{N(t) = n\}\\= &\sum_{i=0}^n P\{N_1(t) = i\}P\{N_2(t) = n- i\} \\
=&\sum_{i=0}^n e^{-\lambda_1 t}\frac{(\lambda_1 t)^i}{i!} e^{-\lambda_2 t}\frac{(\lambda_2 t)^{n-i}}{(n-i)!} \\
=&e^{-(\lambda_1 + \lambda_2)t}\frac{(\lambda_1+\lambda_2)^nt^n}{n!}
\sum_{i=0}^n {n \choose i} (\frac{\lambda_1}{\lambda_1 + \lambda_2})^i(\frac{\lambda_2}{\lambda_1 + \lambda_2} )^{n-i} \\
=&e^{-(\lambda_1 + \lambda_2)t}\frac{(\lambda_1+\lambda_2)^nt^n}{n!}
\end{align}$$
Let \(X_i\) denote the waiting time until the first event of process \(i\); then \(X_i \sim Exponential(\lambda_i), i=1,2\). The probability that the first event of the combined process comes from process 1 is
$$P\{X_1 < X_2\} = \int_0^{\infty} \lambda_1e^{-\lambda_1 t} e^{-\lambda_2 t} dt = \frac{\lambda_1}{\lambda_1 + \lambda_2}$$


2.10 Buses arrive at a certain stop according to a Poisson process with rate \(\lambda\). If you take the bus from that stop then it takes a time \(R\), measured from the time at which you enter the bus, to arrive home. If you walk from the bus stop then it takes a time \(W\) to arrive home. Suppose that your policy when arriving at the bus stop is to wait up to a time s, and if a bus has not yet arrived by that time then you walk home.
(a) Compute the expected time from when you arrive at the bus stop until you reach home.
(b) Show that if \(W < 1/\lambda + R\) then the expected time of part (a) is minimized by letting \(s=0\); if \(W > 1/\lambda + R\) then it is minimized by letting \(s=\infty\) (that is, you continue to wait for the bus), and when \(W = 1/\lambda + R\) all values of \(s\) give the same expected time.
(c) Give an intuitive explanation of why we need only consider the case \(s=0\) and \(s=\infty\) when minimizing the expected time.


(a) $$\begin{align}
E &= (s+W)P\{N(s) = 0\} + \int_0^s (t+R)\lambda e^{-\lambda t} dt \\
&= (W-R-\frac{1}{\lambda})e^{-\lambda s} + R + \frac{1}{\lambda}
\end{align}$$
(b) When \(W < 1/\lambda + R\), the expectation is monotonically increasing in \(s\), so it is minimized at the smallest value \(s=0\).
When \(W > 1/\lambda + R\), the expectation is monotonically decreasing in \(s\), so it is minimized by letting \(s=\infty\).
When \(W = 1/\lambda + R\), the expectation equals \(R + \frac{1}{\lambda}\) for every \(s\).
(c) The interarrival times of the buses are exponentially distributed, hence memoryless: after having waited \(s\), the time until the next bus is distributed exactly as if you had just arrived at the stop. So if you would walk away after waiting some time \(s\), you should walk away immediately; and if you would not walk away immediately, you should keep waiting for the bus.


2.11 Cars pass a certain street location according to a Poisson process with rate \(\lambda\). A person wanting to cross the street at that location waits until she can see that no cars will come by in the next \(T\) time units. Find the expected time that the person waits before starting to cross. (Note, for instance, that if no cars will be passing in the first \(T\) time units then the waiting time is 0).


Condition on the first arrival time \(X\), and the total expected waiting time
$$\begin{align}
E[W] &= E[E[W|X]] \\
&= \int_0^T (E[W] + x)\lambda e^{-\lambda x} dx\\
&= E[W] + \frac{1}{\lambda} -e^{-\lambda T}(T + E[W] + \frac{1}{\lambda}) \\
E[W] &= \frac{e^{\lambda T} -1}{\lambda} – T
\end{align}$$
For the second line: if the first car arrives at time \(x \leq T\), the wait starts over by the memoryless property, so the conditional expected wait is \(E[W] + x\); if \(x > T\), the wait is 0.
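A simulation sketch of the pedestrian's waiting time (the values of \(\lambda\) and \(T\) are arbitrary):

```python
import math
import random

lam, T, trials = 1.0, 1.5, 200_000
total = 0.0
for _ in range(trials):
    w = 0.0
    while True:
        gap = random.expovariate(lam)  # time until the next car
        if gap >= T:                   # a car-free window of length T: cross now
            break
        w += gap                       # wait for this car to pass, then reassess
    total += w
print("simulated:", total / trials)
print("formula  :", (math.exp(lam * T) - 1) / lam - T)
```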


2.12 Events, occurring according to a Poisson process with rate \(\lambda\), are registered by a counter. However, each time an event is registered the counter becomes inoperative for the next \(b\) units of time and does not register any new events that might occur during that interval. Let \(R(t)\) denote the number of events that occur by time \(t\) and are registered.
(a) Find the probability that the first \(k\) events are all registered.
(b) For \(t \geq (n-1)b\), find \(P\{R(t) \geq n\}\).


(a) Since the interarrival times are exponentially distributed with parameter \(\lambda\), the first \(k\) events are all registered iff each of the \(k-1\) interarrival times exceeds \(b\), which has probability \(e^{-\lambda b(k-1)}\).
(b) By the memoryless property, \(R(t) \geq n\) if and only if \(N(t-(n-1)b) \geq n\). Thus,
$$P\{R(t) \geq n\} = P\{N(t-(n-1)b) \geq n\} = e^{-\mu}\sum_{k=n}^{\infty} \frac{\mu^k}{k!}$$
where \(\mu = \lambda(t – (n-1)b) \).


2.13 Suppose that shocks occur according to a Poisson process with rate \(\lambda\), and suppose that each shock, independently, causes the system to fail with probability \(p\). Let \(N\) denote the number of shocks that it takes for the system to fail and let \(T\) denote the time of failure. Find \(P\{N=n|T=t\}\).


The shocks can be split into two independent Poisson processes: the fatal ones \(N_1(t)\) with rate \(\lambda p\) and the harmless ones \(N_2(t)\) with rate \(\lambda (1-p)\). Given \(T = t\), we have \(N = n\) iff exactly \(n-1\) harmless shocks occurred by time \(t\). Then,
$$P\{N=n|T=t\} = P\{N_2(t) = n-1\} = e^{-\lambda(1-p)t}\frac{(\lambda(1-p)t )^{n-1}}{(n-1)!} $$


2.16 The number of trials to be performed is a Poisson random variable with mean \(\lambda\). Each trial has \(n\) possible outcomes, outcome \(i\) occurring with probability \(P_i, \sum_1^n P_i = 1\). Let \(X_j\) denote the number of outcomes that occur exactly \(j\) times, \(j=0, 1, \dots\). Compute \(E[X_j]\) and \(Var(X_j)\).


Let \(Y_i\) denote the number of occurrences of the \(i\)th outcome. From Proposition 2.3.2, \(Y_i \sim \pi(\lambda P_i)\), and the \(Y_i\) are independent (though not identically distributed). Thus \(X_j = \sum_{i=1}^n I\{Y_i = j\}\) is a sum of \(n\) independent Bernoulli indicators:
$$ E[X_j] = \sum_{i=1}^n P\{Y_i = j\} = \sum_{i=1}^n e^{-\lambda P_i} \frac{(\lambda P_i)^j}{j!} $$
$$Var(X_j) = \sum_{i=1}^n P\{Y_i = j\}(1 – P\{Y_i = j\} ) $$


2.17 Let \(X_1, X_2, \dots, X_n\) be independent continuous random variables with common density function \(f\). Let \(X_{(i)}\) denote the ith smallest of \(X_1, \dots, X_n\).
(a) Note that in order for \(X_{(i)}\) to equal \(x\), exactly \(i-1\) of the \(X\)’s must be less than \(x\), one must equal \(x\), and other \(n-i\) must all be greater than \(x\). Using this fact argue that the density function of \(X_{(i)}\) is given by
$$f_{X_{(i)}}(x) = \frac{n!}{(i-1)!(n-i)!}(F(x))^{i-1}(\bar{F}(x))^{n-i}f(x).$$
(b) \(X_{(i)}\) will be less than \(x\) if, and only if, how many of the \(X\)’s are less than \(x\)?
(c) Use (b) to obtain an expression for \(P\{X_{(i)} \leq x\}\).
(d) Using (a) and (c) establish the identity for \(0 \leq y \leq 1\)
$$\sum_{k=i}^n {n \choose k}y^k(1-y)^{n-k} = \int_0^y \frac{n!}{(i-1)!(n-i)!}x^{i-1}(1-x)^{n-i}dx.$$
(e) Let \(S_i\) denote the time of the ith event of the Poisson process \(\{N(t), t \geq 0\}\). Find$$
E[S_i|N(t)=n] = \left\{
\begin{array}{ll}
\underline{\ \ } \quad i \leq n \\
\underline{\ \ } \quad i > n \\
\end{array}
\right. \\
$$


(a) $$\begin{align}
f_{X_{(i)}}(x) &= {n \choose i} (F(x))^{i-1}(\bar{F}(x))^{n-i} {i \choose 1}f(x) \\
&= \frac{n!}{(i-1)!(n-i)!}(F(x))^{i-1}(\bar{F}(x))^{n-i}f(x)\\
\end{align}$$
(b) \(\geq i\)
(c) $$ P\{X_{(i)} \leq x\} = \sum_{k=i}^n {n \choose k}(F(x))^k(\bar{F}(x))^{n-k} $$
(d) By (a) and (c),
$$\begin{align}
\sum_{k=i}^n {n \choose k}(F(x))^k(\bar{F}(x))^{n-k} &= P\{X_{(i)} \leq x\}\\
&= \int_{-\infty}^x \frac{n!}{(i-1)!(n-i)!}(F(t))^{i-1}(\bar{F}(t))^{n-i}f(t)dt \\
&= \int_0^{F(x)} \frac{n!}{(i-1)!(n-i)!}u^{i-1}(1-u)^{n-i}du \quad (u = F(t))
\end{align}$$Setting \(y = F(x)\) gives the identity.
(e) From Theorem 2.3.1 we know the distribution of \((S_1, \dots, S_n | N(t) = n)\) is identical to \(U_{(1)}, \dots, U_{(n)}\).
Thus, when \(i \leq n\),
$$\begin{align}
E[S_i|N(t)=n] &= E[U_{(i)}] \\
&= \int_0^t x \frac{n!}{(i-1)!(n-i)!}(\frac{x}{t})^{i-1}(1 – \frac{x}{t})^{n-i}\frac{1}{t} dx \\
&= \frac{it}{n + 1} \int_0^t \frac{(n+1)!}{i!(n-i)!}(\frac{x}{t})^i(1 – \frac{x}{t})^{n-i} d \frac{x}{t} \\
&= \frac{it}{n + 1} \sum_{k=i}^n{n \choose k} (\frac{t}{t})^k(1 – \frac{t}{t})^{n-k}\\
&= \frac{it}{n + 1}
\end{align}$$
When \(i > n\): given \(N(t)=n\), by the memoryless property the time from \(t\) to the \((n+1)\)st event is exponential, so \(S_i\) is distributed as \(t\) plus the sum of \(i - n\) independent exponential variables with parameter \(\lambda\). Thus,
$$
E[S_i|N(t)=n] = \left\{
\begin{array}{ll}
\frac{it}{n + 1} \quad i \leq n \\
t + \frac{i-n}{\lambda} \quad i > n \\
\end{array}
\right. \\
$$


2.18 Let \(U_{(1)}, \dots, U_{(n)}\) denote the order statistics of a set of \(n\) uniform \((0, 1)\) random variables. Show that given \(U_{(n)} = y, U_{(1)}, \dots, U_{(n-1)}\) are distributed as the order statistics of a set of \(n-1\) uniform \((0, y)\) random variables.


$$\begin{align}
f(U_{(1)}, \dots, U_{(n-1)} | U_{(n)} = y ) &= \frac{f(u_1, \dots, u_{n-1}, y)}{f_{U_{(n)}}(y)} \\
&= \frac{n!}{n y^{n-1}} = \frac{(n-1)!}{y^{n-1}},
\end{align}$$
which is exactly the joint density of the order statistics of \(n-1\) independent uniform \((0, y)\) random variables.


2.19 Busloads of customer arrive at an infinite server queue at a Poisson rate \(\lambda\). Let \(G\) denote the service distribution. A bus contains \(j\) customers with probability \(\alpha_j, j=1, \dots\). Let \(X(t)\) denote the number of customers that have been served by time \(t\).
(a) \(E[X(t)] = ?\)
(b) Is \(X(t)\) Poisson distributed?


(a) Let \(N_j(t)\) denote the number of buses carrying \(j\) customers whose customers have all been served by time \(t\), and let \(X_j(t)\) denote the corresponding number of customers. The \(N_j(t)\) are independent Poisson variables with mean \(\lambda \alpha_j \int_0^t G(y)dy\).
$$E[X(t)] = \sum E[X_j(t)] = \sum E[E[X_j(t)|N_j(t)]] = \lambda \int_0^t G(y)dy \sum j\alpha_j $$
(b) No. For instance with \(j=2\), \(P\{X_2(h) = 2\} = P\{N_2(h) = 1\}\), which is not \(o(h)\): jumps of size greater than 1 occur with non-negligible probability.


2.20 Suppose that each event of a Poisson process with rate \(\lambda\) is classified as being either type \(1, 2, \dots, k\). If the event occurs at \(s\), then, independently of all else, it is classified as type \(i\) with probability \(P_i(s), i = 1, \dots, k, \sum_1^k P_i(s) = 1\). Let \(N_i(t)\) denote the number of type \(i\) arrivals in \([0, t]\). Show that the \(N_i(t), i = 1, \dots, k\) are independent and \(N_i(t)\) is Poisson distributed with mean \(\lambda \int_0^t P_i(s)ds\).


Similar to the proof of Proposition 2.3.2, the probability an arbitrary event occurred in \([0, t]\) is a type-i event is$$p_i = \frac{1}{t}\int_0^t P_i(s)ds$$ independently of the other events. Hence the joint distribution conditioning on \(N(t)\) is
$$\begin{align}
&P\{N_1(t) = n_1, \dots, N_k(t) = n_k\}\\
=& P\{N_1(t) = n_1, \dots, N_k(t) = n_k | N(t) = n\}P\{N(t) = n\} \\
=& \frac{n!}{n_1!\dots n_k!}p_1^{n_1}\dots p_k^{n_k}e^{-\lambda t}\frac{(\lambda t)^n}{n!}\\
=& \prod_{i=1}^k e^{-\lambda tp_i}\frac{(\lambda tp_i)^{n_i}}{n_i!}
\end{align}$$ where \(n = \sum_{i=1}^k n_i\). Since the joint distribution factors into a product over \(i\), the \(N_i(t)\) are independent, and \(N_i(t)\) is Poisson with mean \(\lambda t p_i = \lambda \int_0^t P_i(s)ds\).


2.21 Individuals enter the system in accordance with a Poisson process having rate \(\lambda\). Each arrival independently makes its way through the states of the system. Let \(\alpha_i(s)\) denote the probability that an individual is in state \(i\) a time \(s\) after it arrived. Let \(N_i(t)\) denote the number of individuals in state \(i\) at time \(t\). Show that the \(N_i(t), i \geq 1\), are independent and \(N_i(t)\) is Poisson with mean equal to
\(\lambda E\)[amount of time an individual is in state i during its first t units in the system]


As shown in Problem 2.20, the \(N_i(t), i \geq 1\), are independent Poisson with means \(\lambda \int_0^t \alpha_i(t-s)ds = \lambda \int_0^t \alpha_i(s)ds\), and the integral is exactly the expected amount of time an individual spends in state \(i\) during its first \(t\) units in the system.


2.23 For the model of Example 2.3(C), find
(a) \(Var[D(t)] \)
(b) \(Cov[D(t), D(t+s)]\)


(a) Let \(X_i = D_ie^{-\alpha(t-S_i)}\) denote the remaining damage at \(t\) from the \(i\)th event; conditional on \(N(t) = n\), the \(X_i\) are i.i.d. with
$$E[X_i] = \frac{E[D]}{\alpha t}(1 – e^{-\alpha t}) \\
E[X_i^2] = \frac{E[D^2]}{2\alpha t}(1 – e^{-2\alpha t}) $$ Hence,
$$\begin{align}
E[D^2(t)] &= E[E[D^2(t)|N(t) = n]] \\
&= E[E[\sum_{i=1}^n D_i^2e^{-2\alpha(t-S_i)} + \sum_{i \neq j} D_iD_je^{-\alpha(2t - S_i - S_j)} | N(t) = n]] \\
&= E[nE[X_i^2]] + E[n(n-1)E^2[X_i]] \\
&= E[N(t)]E[X_i^2] + E[N^2(t) - N(t)]E^2[X_i] \\
\end{align}$$ Since \(E[N(t)] = \lambda t, E[N^2(t)] = \lambda^2t^2 + \lambda t\), thus we have,
$$Var[D(t)] = E[D^2(t)] – E^2[D(t)] = \frac{\lambda E[D^2]}{2\alpha}(1 – e^{-2\alpha t})$$
(b) $$\begin{align}
&D(t+s) = D(t)e^{-\alpha s} + \tilde{D}(s) \\
&Cov[D(t), D(t+s)]\\
= &E[D(t)D(t+s)] - E[D(t)]E[D(t+s)] \\
= &E[D^2(t)e^{-\alpha s} + \tilde{D}(s)D(t)] - E[D(t)]E[D(t)e^{-\alpha s} + \tilde{D}(s)] \\
= &e^{-\alpha s} (E[D^2(t)] - E^2[D(t)]) \\
= &\frac{\lambda E[D^2]}{2\alpha}(e^{-\alpha s} - e^{-\alpha (2t + s)})
\end{align}$$where \(\tilde{D}(s)\), the damage at \(t+s\) from the events in \((t, t+s]\), is independent of \(D(t)\) and distributed as \(D(s)\).


2.25 Suppose that events occur in accordance with a Poisson process with rate \(\lambda\), and that an event occurring at time \(s\), independent of the past, contributes a random amount having distribution \(F_s, s \geq 0\). Show that \(W\), the sum of all contributions by time \(t\), is a compound Poisson random variable. That is, show \(W\) has the same distribution as \(\sum_{i=1}^N X_i\), where the \(X_i\) are independent of \(N\), a Poisson random variable. Identify the distribution of the \(X_i\) and the mean of \(N\).


Conditional on \(N(t) = n\), the unordered event times are i.i.d. uniform on \((0, t)\), so \(W\) is distributed as \(\sum_{i=1}^N X_i\) where the \(X_i\) have distribution function \(F(x) = \frac{1}{t}\int_0^t F_s(x)ds\) and \(N\) is Poisson with mean \(\lambda t\).


2.27 Compute the moment generating function of \(D(t)\) in Example 2.3(C).
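Conditioning on \(N(t)\) as in Problem 2.23: given \(N(t) = n\), the (unordered) arrival times are i.i.d. uniform on \((0, t)\), so with \(\phi_D(u) = E[e^{uD}]\) denoting the moment generating function of the damage \(D\),
$$\begin{align}
E[e^{\theta D(t)}|N(t) = n] &= \left(\frac{1}{t}\int_0^t \phi_D(\theta e^{-\alpha s})ds\right)^n \\
E[e^{\theta D(t)}] &= \exp\left\{\lambda\int_0^t [\phi_D(\theta e^{-\alpha s}) - 1]ds\right\}
\end{align}$$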



2.28 Prove Lemma 2.3.3


$$
E[Y_1+\dots +Y_n | Y_1+\dots +Y_n = y ] = y \\
nE[Y_1| Y_1+\dots +Y_n = y ] = y \quad (\text{by symmetry, as the } Y_i \text{ are i.i.d.}) \\
E[Y_1+\dots +Y_k | Y_1+\dots +Y_n = y ] = kE[Y_1| Y_1+\dots +Y_n = y ] = \frac{k}{n} y \\
$$


2.29 Complete the proof that for nonhomogeneous Poisson process \(N(t+s) -N(t)\) is Poisson with mean \(m(t+s) – m(t)\).


$$\begin{align}
P_n(s+h) &= P\{N(t+s+h) – N(t) =n\} \\
&= P_n(s)[1 – \lambda(t+s)h + o(h)] + P_{n-1}(s)[\lambda(t+s)h + o(h)] \\
P_n^{\prime}(s) + \lambda(t+s)P_n &= \lambda(t+s)P_{n-1} \\
\frac{d}{ds}(e^{\int\lambda(t+s)ds}P_n(s)) &= e^{\int\lambda(t+s)ds}\lambda(t+s)P_{n-1}(s)\\
\end{align}$$
For \(n=0\), solving directly gives \(P_0(s) = e^{-[m(t+s)-m(t)]}\). Assuming the result for \(n = k - 1\), integrating the last display shows it holds for \(n = k\); by induction, \(N(t+s) - N(t)\) is Poisson with mean \(m(t+s) - m(t)\).


2.30 Let \(T_1, T_2, \dots\) denote the interarrival times of events of a nonhomogeneous Poisson process having intensity function \(\lambda(t)\)
(a) Are the \(T_i\) independent?
(b) Are the \(T_i\) identically distributed?
(c) Find the distribution of \(T_1\).
(d) Find the distribution of \(T_2\).


(a) $$\begin{align}
P\{T_1 > t\} &= P\{N(t) = 0\} = e^{-m(t)} \\
P\{T_2 > t | T_1 = s\} &= P\{N(t+s) – N(s) = 0 | T_1 = s\} \\
&= e^{-[m(t+s) – m(s)]}
\end{align}$$Thus, \(T_i\) are not independent.
(b) No. The marginal distribution of \(T_2\) (part (d)) differs from that of \(T_1\) in general.
(c)$$F_{T_1}(t) = 1-e^{-m(t)}$$
(d) $$F_{T_2}(t) =1- \int_0^{\infty} \lambda(s)e^{-m(t+s)}ds$$


2.31 Consider a nonhomogeneous Poisson process \(\{N(t), t \geq 0\}\), where \(\lambda(t) > 0\) for all \(t\). Let
$$N^{*}(t) = N(m^{-1}(t))$$
Show that \(\{N^{*}(t), t \geq 0\}\) is a Poisson process with rate \(\lambda = 1\).


Let \(u_i = m^{-1}(t_i)\) for a sequence of points \(0 \leq t_1 < t_2 < \dots\) in \([0, \infty)\). Since \(m^{-1}\) is strictly increasing, \(0 \leq u_1 < u_2 < \dots\). By the independent increments of the nonhomogeneous Poisson process, the increments \(N^*(t_{i+1}) - N^*(t_i) = N(u_{i+1}) - N(u_i)\) over disjoint intervals are independent.
Suppose that \(t, h \in [0, \infty)\), and let \(u_1 = m^{-1}(t), u_2 = m^{-1}(t+h)\). Hence, \(N^*(t+h) – N^*(t) = N(u_2) – N(u_1)\) has the Poisson distribution with parameter \(m(u_2) – m(u_1) = h\).


2.32 (a) Let \(\{N(t), t \geq 0\}\) be a nonhomogeneous Poisson process with mean value function \(m(t)\). Given \(N(t) = n\), show that the unordered set of arrival times has the same distribution as \(n\) independent and identically distributed random variables having distribution function$$
F(x) = \left\{
\begin{array}{ll}
\frac{m(x)}{m(t)} \quad x \leq t \\
1 \quad x > t \\
\end{array}
\right. $$
(b) Suppose that workers incur accidents in accordance with a nonhomogeneous Poisson process with mean value function \(m(t)\). Suppose further that each injured person is out of work for a random amount of time having distribution \(F\). Let \(X(t)\) be the number of workers who are out of work at time \(t\). Compute \(E[X(t)]\) and \(Var(X(t))\).


(a) Similarly to the proof of Theorem 2.3.1, let \(S_i\) denote the ith event,
$$\begin{align}
&P\{t_i < S_i < t_i + h_i, i = 1, \dots, n|N(t) = n\} \\
=&\frac{\prod_i[m(t_i + h_i) - m(t_i)]e^{-m(t)}}{e^{-m(t)}[m(t)]^n/n!} = \frac{n!\prod_i[m(t_i + h_i) - m(t_i)]}{[m(t)]^n}
\end{align}$$Dividing both sides by \(h_1, \dots, h_n\) and let \(h_i \to 0\):
$$f_{s_1\dots s_n}(t_1,\dots,t_n|N(t) = n) = n!\prod[\lambda(t_i)/m(t)]$$
(b) The probability of an arbitrary injured worker is still out of work at time \(t\) is
$$p = \int_0^t \bar{F}(t-s)d\frac{m(s)}{m(t)} = \int_0^t \bar{F}(t-s)\frac{\lambda(s)}{m(t)}ds$$Thus, we have
$$\begin{align}
E[X(t)] &= E[E[X(t)|N(t)]] \\
&= m(t)\int_0^t \bar{F}(t-s)\frac{\lambda(s)}{m(t)}ds \\
&= \int_0^t \bar{F}(t-s)\lambda(s)ds \\
Var(X(t)) &= E[Var(X(t)|N(t))] + Var(E[X(t)|N(t)]) \\
&= m(t)p(1-p) + p^2Var(N(t)) = m(t)p \\
&= \int_0^t \bar{F}(t-s)\lambda(s)ds
\end{align}$$


2.33 A two-dimensional Poisson process is a process of events in the plane such that (i) for any region of area \(A\), the number of events in \(A\) is Poisson distributed with mean \(\lambda A\), and (ii) the numbers of events in nonoverlapping regions are independent. Consider a fixed point, and let \(X\) denote the distance from that point to its nearest event, where distance is measured in the usual Euclidean manner. Show that
(a) \(P\{X > t\} = e^{-\lambda\pi t^2}\).
(b) \(E[X] = 1 / (2\sqrt{\lambda})\).
(c) Let \(R_i, i \geq 1\) denote the distance from an arbitrary point to the ith closest event to it. Show that, with \(R_0 = 0, \pi R_i^2 – \pi R_{i-1}^2, i \geq 1\) are independent exponential random variables, each with rate \(\lambda\).


(a) $$P\{X > t\} = e^{-\lambda A}\frac{(\lambda A)^0}{0!} = e^{-\lambda\pi t^2} $$
(b) $$\begin{align}
E[X] &= \int_0^{\infty} t d(1 – e^{-\lambda\pi t^2} ) \\
&= -te^{-\lambda \pi t^2}|_0^{\infty} + \frac{1}{\sqrt{\lambda \pi}}\int_0^{\infty} e^{-x^2} dx \\
&= \frac{1}{2\sqrt{\lambda}}
\end{align} $$
(c) Let \(D_i = \pi R_i^2 - \pi R_{i-1}^2\). Given \(R_1, \dots, R_{i-1}\), the event \(\{D_i > x\}\) occurs iff the annulus of area \(x\) lying just outside the circle of radius \(R_{i-1}\) contains no events, so
$$
P\{D_i > x\} = e^{-\lambda x},
$$
independently of \(D_1, \dots, D_{i-1}\); that is, the \(D_i\) are independent exponential random variables with rate \(\lambda\).
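A simulation sketch of part (b); the disk radius \(R\) is an arbitrary truncation (the chance that no event falls within distance \(R\) is negligible here):

```python
import math
import random

lam, R, trials = 1.0, 3.0, 20_000

def poisson(mean: float) -> int:
    # number of unit-rate exponential gaps fitting in [0, mean] is Poisson(mean)
    k, acc = 0, random.expovariate(1.0)
    while acc < mean:
        k += 1
        acc += random.expovariate(1.0)
    return k

total = 0.0
for _ in range(trials):
    k = poisson(lam * math.pi * R * R)                          # events in disk
    radii = (R * math.sqrt(random.random()) for _ in range(k))  # uniform in disk
    total += min(radii, default=R)          # distance to the nearest event
print("simulated:", total / trials)
print("formula  :", 1 / (2 * math.sqrt(lam)))
```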


2.34 Repeat Problem 2.25 when the events occur according to a nonhomogeneous Poisson process with intensity function \(\lambda(t), t \geq 0\).


As shown in Problem 2.32, the distribution function of \(X_i\) is \(F(x) = \int_0^t F_s(x)\frac{\lambda(s)}{m(t)} ds\), the mean of \(N\) is \(m(t)\).


2.35 Let \(\{N(t), t \geq 0\}\) be a nonhomogeneous Poisson process with intensity function \(\lambda(t), t \geq 0\). However, suppose one starts observing the process at a random time \(\tau\) having distribution function \(F\). Let \(N^*(t) = N(\tau + t) – N(\tau)\) denote the number of events that occur in the first \(t\) time units of observation
(a) Does the process \(\{N^*(t), t \geq 0\}\) process independent increments?
(b) Repeat (a) when \(\{N(t), t \geq 0\}\) is a Poisson process.


(a) In general, no. Given \(\tau = s\), the increments of \(N^*\) are independent, but unconditionally the number of events observed in an early interval carries information about \(\tau\), and hence about the intensity governing later intervals, so the increments of \(N^*\) are dependent.
(b) Yes. When \(\{N(t)\}\) is a Poisson process, conditional on any value of \(\tau\) the process \(N^*\) is a Poisson process with the same rate \(\lambda\); since this conditional law does not depend on \(\tau\), \(N^*\) is itself a Poisson process and in particular has independent increments.


2.36 Let \(C\) denote the number of customers served in an \(M/G/1\) busy period. Find
(a) \(E[C]\).
(b) \(Var(C)\).


(a) \(E[C] = \frac{1}{1 – \lambda\mu_G}\)
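(b) A sketch via a branching argument: let \(K\) denote the number of arrivals during the initial service time \(S\) of the busy period. Each such arrival starts its own sub-busy period, so \(C = 1 + \sum_{i=1}^K C_i\), where the \(C_i\) are i.i.d. copies of \(C\) independent of \(K\). The variance of a random sum gives
$$Var(C) = E[K]Var(C) + Var(K)E^2[C]$$
With \(E[K] = \lambda\mu_G\) and \(Var(K) = E[Var(K|S)] + Var(E[K|S]) = \lambda\mu_G + \lambda^2\sigma_G^2\), where \(\sigma_G^2\) denotes the variance of \(G\), solving for \(Var(C)\) yields
$$Var(C) = \frac{\lambda\mu_G + \lambda^2\sigma_G^2}{(1 - \lambda\mu_G)^3}$$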


2.37 Let \(\{X(t), t \geq 0\}\) be a compound Poisson process with \(X(t) = \sum_{i=1}^{N(t)}X_i\), and suppose that the \(X_i\) can only assume a finite set of possible values. Argue that, for \(t\) large, the distribution of \(X(t)\) is approximately normal.


(i) A Poisson variable converges to a normal distribution as \(\lambda \to \infty\):
Let \(X \sim \pi(\lambda)\), then
$$\begin{align}
Y &= \frac{X-\lambda}{\sqrt{\lambda}} \\
M_Y &= E[exp\{t\frac{X-\lambda}{\sqrt{\lambda}} \}] \\
&= exp\{-t\sqrt{\lambda}\}exp\{\lambda(e^{t/\sqrt{\lambda}} - 1)\} \\
&= exp\{-t\sqrt{\lambda} + \lambda(\sum_{i=0}^{\infty} \frac{(t/\sqrt{\lambda})^i}{i!} – 1)\} \\
&= exp\{t^2/2 + t^3/(6\sqrt{\lambda}) + \dots\}\\
\lim_{\lambda \to \infty} M_Y &= exp\{t^2/2\}
\end{align}$$
Thus \(Y\) converges in distribution to \(N(0, 1)\); i.e., \(X\) is approximately \(N(\lambda, \lambda)\) for large \(\lambda\).
(ii) Since the \(X_i\) take only finitely many values \(\alpha_j\), we can write \(X(t) = \sum_j \alpha_j N_j(t)\), where the \(N_j(t)\) are independent Poisson processes with rates \(\lambda p_j\), \(p_j = P\{X_i = \alpha_j\}\). When \(t\) is large, each \(N_j(t)\) is approximately normal by (i), and a sum of finitely many independent approximately normal variables is approximately normal.
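A simulation sketch (reusing the jump distribution of Problem 2.38, with \(\lambda t = 400\), both arbitrary choices): the fraction of samples at most one standard deviation above the mean should be near \(\Phi(1) \approx 0.8413\).

```python
import random

lt, trials = 400.0, 10_000   # lambda * t, large so X(t) is roughly normal

def poisson(mean: float) -> int:
    # number of unit-rate exponential gaps fitting in [0, mean] is Poisson(mean)
    k, acc = 0, random.expovariate(1.0)
    while acc < mean:
        k += 1
        acc += random.expovariate(1.0)
    return k

mean, sd = lt * 3.0, (lt * 10.0) ** 0.5   # E[X_i] = 3, E[X_i^2] = 10
below = 0
for _ in range(trials):
    n = poisson(lt)
    x = sum(random.choices([1, 2, 3, 4], weights=[1, 2, 3, 4], k=n))
    below += x <= mean + sd
print(below / trials, "vs Phi(1) ~ 0.8413")
```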


2.38 Let \(\{X(t), t \geq 0\}\) be a compound Poisson process with \(X(t) = \sum_{i=1}^{N(t)}X_i\), and suppose that \(\lambda = 1\) and \(P\{X_i = j\} = j/10, j = 1,2,3,4\). Calculate \(P\{X(4) = 20\}\).


\(X(4)\) is a compound Poisson random variable with \(\lambda t = 4\). From Corollary 2.5.4,
$$P_{n} = \frac{4}{n} \sum_{j = 1}^{4} \frac{j^2}{10} P_{n - j}, \qquad P_0 = e^{-4},$$
and iterating the recursion up to \(n = 20\) yields \(P\{X(4) = 20\} = P_{20}\).
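The recursion is easy to iterate numerically (a sketch, starting from \(P_0 = e^{-4}\)):

```python
import math

lt = 4.0                                  # lambda * t
alpha = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}  # P{X_i = j} = j/10
P = [math.exp(-lt)]                       # P_0 = P{X(4) = 0}
for n in range(1, 21):
    P.append((lt / n) * sum(j * a * P[n - j]
                            for j, a in alpha.items() if n >= j))
print("P{X(4) = 20} =", P[20])
```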


2.39 Compute \(Cov(X(s), X(t))\) for a compound Poisson process.


Suppose \(s \leq t\), then
$$\begin{align}
Cov(X(s), X(t)) &= Cov(X(s), X(t)-X(s)) + Cov(X(s), X(s)) \\
&= Var(X(s)) = \lambda s E[X^2]
\end{align}$$ By symmetry in \(s\) and \(t\), we have
$$Cov(X(s), X(t)) = \lambda \min(s,t)E[X^2] $$


2.40 Give an example of a counting process \(\{N(t), t \geq 0\}\) that is not a Poisson process but which has the property that conditional on \(N(t) = n\) the first \(n\) event times are distributed as the order statistics from a set of \(n\) independent uniform \((0,t)\) random variables.


Conditional Poisson processes do not have independent increments, so they are not Poisson processes; yet given \(N(t) = n\) the arrival times are distributed as the order statistics of \(n\) independent uniform \((0,t)\) random variables. See the solution of Problem 2.41 in the textbook for details.


Is Price Gouging During Disasters Robbery in Disguise?

Excerpted from Principles of Economics, 7th edition, Microeconomics volume, by N. Gregory Mankiw (Chinese translation by 梁小民); "In the News" selection from Chapter 4, original article by John Carney, courtesy of CNBC.

When a disaster such as a hurricane strikes a region, increased demand or reduced supply puts upward pressure on the prices of many goods. Politicians denounce these price surges, but the following article defends them as the market's natural response.

A can of Coke for $4. A night at a hotel in downtown Brooklyn for $500. A pair of batteries for $6.99. These are just a few of the price surges that my friends and I personally ran into around Hurricane Sandy. People usually call this price gouging, and it is widespread during emergencies.

Politicians left and right agree that price gouging during a natural disaster is a terrible evil with nothing to recommend it. New York Attorney General Eric Schneiderman issued an emergency statement against "inflated prices for essential goods and services during Hurricane Sandy." New Jersey Governor Chris Christie issued a blunt warning that "price gouging will result in steep fines." The government set up hotlines for consumers to report gouging.

New Jersey's law is quite explicit: once a state of emergency has been declared, a price increase of more than 10% is deemed excessive. During Tropical Storm Irene last year, a New Jersey gas station paid a $50,000 fine for raising gasoline prices by 16%.
New York's law is even stricter. According to Attorney General Schneiderman, "any increase in the price of essential goods and services is considered gouging."
In his announcement the Attorney General said: "General business law prohibits raising the prices of basic necessities such as food, water, gas, generators, batteries and flashlights, and of services such as transportation, during natural disasters or other emergencies that disrupt the market."

These laws are built on an entirely conventional view: that it is immoral for businesses to pursue profit by exploiting a disaster. It looks wrong for business owners to make money from their neighbors' suffering. A merchant who earns more because of a catastrophe is seen as reaping unearned gains through price hikes.
A neighbor of mine in Brooklyn, speaking of battery prices at the local electronics store, put it this way: "This is robbery in disguise."

But when everyone fears a blackout and suddenly starts buying batteries or bottled water, shortages happen. Sometimes, under a sudden spike in demand, an item simply runs out. That raises a question: how do we decide which customers should get the batteries, the groceries, or the gasoline?

We could hold a lottery. Perhaps people would draw tickets at the grocery store: winners buy at normal prices, losers go hungry. Or, more likely, the losers would be forced to buy food from the winners at higher prices, because nobody buys food in order to sell it at the same price. The gougers would simply have changed from merchants into lucky ticket-holders.

We could adopt some rationing scheme, allotting everyone a share of necessities according to household need. That is what the United States did during the Second World War. The trouble is that rationing requires enormous planning, and a level of knowledge impossible to attain: the officials drawing up the scheme would have to know exactly how much of every item is available in a given area and how many people need it. If a disaster like a hurricane has just hit your city, good luck getting that information.

We could also simply sell goods first come, first served. That is in fact what anti-gouging laws encourage, and the result is familiar: people hoard, and the shelves go empty. Besides, one has to wonder why a race to the checkout counter is any fairer than a price system. Speed seems a poor proxy for fairness.

Allowing prices to rise in times of extreme demand limits over-consumption. People weigh their purchases more carefully; instead of buying batteries (or bottled water, or gas) by the dozen, perhaps they buy half as many. The result is that under extreme demand more customers end up able to buy. The market process actually produces a more equal distribution than anti-gouging laws do.

Once we understand this, it is easy to see that the merchants are not really profiting from the disaster. They profit by managing their prices, and that management in fact widens the distribution of goods and discourages hoarding, which is socially beneficial. In short, they are justly rewarded for providing an important public service.

One objection holds that with freely floating prices, legalized gouging would let the rich buy up everything while the poor get nothing. The worry is overdone. In most cases, disaster-time price increases do not actually put necessities beyond the reach of the poorest; they just strain poor households' budgets further. Compared with imposing price controls from the outset, easing those households' deficits through transfer payments is the more effective remedy.

Rather than cracking down on price gouging, we should use our experience in this crisis to begin reforming our existing counterproductive laws. When the next disaster strikes, we should hope to see more price gouging and fewer empty shelves.