Probability distributions: True False questions

True false questions

Below you can find a bunch of simple questions that are either true or false. The questions are based on Chapters 6 to 10 from Introduction to Probability by Joseph Blitzstein and Jessica Hwang (below often abbreviated as BH). The questions are meant to help you practice with the definitions and some of the theorems discussed in the book. They are not meant as a substitute for the problems of the book itself.

1. Week 1

Exercise 1:

\(X\) is a rv, \(X \in \R\). Claim: \(\supp{X} = (c, \infty) \implies \P{X\leq c} = 0\).

Solution

True.

Exercise 2:

Suppose \(X\) is a real-valued rv with \(\supp X = [0, c]\). Claim:

\begin{equation*} \V X = \E{X^2} - (\E{X})^2 \leq c \E X - (\E X)^2 = (c-\E X)\E X. \end{equation*}
Solution

True.

Exercise 3:

Let \(X\) and \(Y\) be independent rvs. Claim: \(F_{X+Y}(x, y) = F_X(x) + F_{Y}(y)\).

Solution

False. We should write \(F_{X,Y}\) rather than \(F_{X+Y}\), and since the rvs are independent, consider the product of the CDFs, not the sum.

Exercise 4:

Let \(M_{X}(s)\) be the moment generating function of some rv \(X\). Claim:

\begin{equation*} M_{X}(0) = 0. \end{equation*}
Solution

False. Recall the definition: \(M_{X}(0) = \E{e^{0\cdot X}} = \E{1} = 1\).

Exercise 5:

Let \(M_{X}(s)\) be the moment generating function of some rv \(X\). Claim:

\begin{equation*} \left(\frac{\d}{\d s}\right)^{2 }M_{X}(s)|_{s=0} = \V X + (\E{X})^{2}. \end{equation*}
Solution

True.

Exercise 6:

We have two positive rvs \(X\) and \(Y\). Claim: \(\V{X+Y} = \V X + \V Y\).

Solution

False, it’s not given that \(X\) and \(Y\) are independent.

Exercise 7:

We have two independent positive rvs \(X\) and \(Y\). Claim: \(M_{2X+Y}(s) = (M_{X}(s))^{2} M_{Y}(s)\).

Solution

False in general, because \(X\) is not independent of itself. The correct identity is \(M_{2X+Y}(s) = M_{X}(2s) M_{Y}(s)\).
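A quick numerical check, not from BH (I take \(X\) and \(Y\) standard normal, so \(M(s) = e^{s^{2}/2}\) is known in closed form; \(s=0.5\) is an arbitrary choice):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
s = 0.5
X = rng.standard_normal(1_000_000)
Y = rng.standard_normal(1_000_000)

M = lambda t: np.exp(t ** 2 / 2)           # MGF of a standard normal
print(np.mean(np.exp(s * (2 * X + Y))))    # Monte Carlo M_{2X+Y}(s), ~ 1.87
print(M(2 * s) * M(s))                     # correct identity: e^{5 s^2 / 2} ~ 1.87
print(M(s) ** 2 * M(s))                    # the claim: e^{3 s^2 / 2} ~ 1.46
\end{verbatim}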

Exercise 8:

People enter a shop such that the time \(X\) between any two consecutive customers is \(X\sim \Exp{\lambda}\) with \(\lambda=10\) per hour. Claim: \(\P{X > x} = e^{-\lambda x}\), for \(x\geq 0\).

Solution

True, see BH.5.45.

Exercise 9:

People enter a shop such that the time \(X\) between any two consecutive customers is \(X\sim \Exp{\lambda}\) with \(\lambda=10\) per hour. Assume that the interarrival times between customers are iid. Let \(N(t)\) be the number of people that enter during an interval \([0,t]\). Claim: \(N(t) \sim \Pois{\lambda}\).

Solution

False, see BH.5.45. It’s \(\sim \Pois{\lambda t}\).

Exercise 10:

People enter a shop such that the time \(X\) between any two consecutive customers is \(X\sim \Exp{\lambda}\) with \(\lambda=10\) per hour. Assume that the interarrival times between customers are iid. Let \(N(t)\) be the number of people that enter during an interval \([0,t]\). Suppose that \(T_{3}\) is the time the third person enters. Claim: \(\P{N(t) <3} = \P{T_{3}>t}\).

Solution

True. The events \(\{N(t) < 3\}\) and \(\{T_{3} > t\}\) coincide: fewer than three arrivals by time \(t\) means exactly that the third arrival occurs after \(t\).
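A simulation sketch, not from BH (the horizon \(t = 0.25\) hours is an arbitrary choice):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)
lam, t, reps = 10.0, 0.25, 200_000
gaps = rng.exponential(1 / lam, size=(reps, 20))  # 20 interarrival times suffice here
arrivals = gaps.cumsum(axis=1)                    # arrival epochs T_1, T_2, ...
N_t = (arrivals <= t).sum(axis=1)                 # customers that entered in [0, t]
T3 = arrivals[:, 2]                               # time of the third arrival
print((N_t < 3).mean(), (T3 > t).mean())          # both ~ 0.544
\end{verbatim}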

Exercise 11:

Write \(m\) for the median of the rv \(X\). Claim: the following definition is correct:

\begin{equation*} \V X := \E{X^2} - m^{2}. \end{equation*}
Solution

False, \(\E X\) need not be equal to the median \(m\), and the definition of the variance involves the mean, not the median.

Exercise 12:

Let \(A\) and \(B\) be two events. Claim: \(\E{\1{A}\1{B}} = \P{A} + \P{B} - \P{A \cup B}\)

Solution

True. Note that \(\1{A}\1{B} = \1{A \cap B}\), so the LHS equals \(\P{A \cap B}\), which matches the RHS by inclusion-exclusion.

Exercise 13:

Consider an unfair 4-sided die that shows 4 half of the time and 1 to 3 with equal probability. Let the rv \(X\) denote the thrown value of the die. Claim: \(\E{X^2} = \frac{38}{3}\)

Solution

False. Calculating using LOTUS gives \(\E{X^2} = \frac{1}{2}\, 4^{2} + \frac{1}{2\cdot 3}(1 + 4 + 9) = \frac{31}{3}\).
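A one-minute simulation check (mine, not BH's):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(2)
vals = np.array([1, 2, 3, 4])
probs = np.array([1 / 6, 1 / 6, 1 / 6, 1 / 2])
X = rng.choice(vals, size=1_000_000, p=probs)
print(np.mean(X.astype(float) ** 2))  # ~ 31/3 = 10.33, not 38/3 = 12.67
print(probs @ vals ** 2)              # exact LOTUS value
\end{verbatim}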

Exercise 14:

Let \(X\) be a degenerate rv and \(c\) an arbitrary, non-zero constant. Claim: \(\V{cX} > 0\)

Solution

False. The variance of a degenerate rv is always 0; see the warning in BH.4.1.3, where degeneracy is discussed.

Exercise 15:

Assume that \(\V{X} = \sigma^2\) and \(\E{X^2} = a^2\) both exist and are finite. Claim: \(\E{X} = \sqrt{a^2-\sigma^2}\)

Solution

False, because \(\E{X}\) can be negative: we only know \((\E{X})^2 = a^2-\sigma^2\), so \(\E{X} = \pm\sqrt{a^2-\sigma^2}\).

Exercise 16:

Given \(\V{X+Y} = \V{X} + \V{Y}\). Claim: \(X\) and \(Y\) are independent.

Solution

False. Independence is sufficient but not necessary for the equality to hold; uncorrelated but dependent rvs also satisfy it.

Exercise 17:

Let \(X\) and \(Y\) be two rvs, where \(Y\) is always equal to \(X\), and \(\V{X}>0\). Claim: \(\V{X+Y} = \V{X} + \V{Y}\)

Solution

False: if \(Y\) is always equal to \(X\), they are certainly not independent (see BH p. 172). In fact, as \(Y=X\), \(\V{X+Y} = \V{2X} = 4 \V X\). Isn't it a bit counterintuitive that when \(X\) and \(Y\) are dependent like this, the variance is larger than if they were independent?

Exercise 18:

Claim: The expectation of a continuous random variable must always be nonnegative given that the probability density function values are nonnegative (i.e., \(f(x) \geq 0\)).

Solution

False. The density values are nonnegative, but the values that the rv can attain can be negative; therefore the expectation may be negative.

Exercise 19:

Let \(X \sim \Norm{0, 1}\). We then know that \(\P{X = 0} > \P{X = 5}\).

Solution

False. For a continuous rv, \(\P{X = k} = 0\) for all \(k \in \R\), so the two probabilities are equal.

Exercise 20:

Let \(U \sim \Unif{(a,b)}\) and \((c,d) \subset (a,b)\). Then the conditional distribution of \(U\) given that \(U \in (c,d)\) is \(\Unif{(c,d)}\).

Solution

True. See BH proposition 5.2.3. Proof: for \(u\) in \((c, d)\), the conditional CDF at \(u\) is

\begin{equation*} \P{U \leq u \given U \in (c, d)} = \frac{\P{U \leq u, c < U < d}}{\P{U \in (c, d)}} = \frac{\P{c < U \leq u}}{\P{U \in (c, d)}} = \frac{(u-c)/(b-a)}{(d-c)/(b-a)} = \frac{u-c}{d-c}, \end{equation*}

which is the \(\Unif{(c,d)}\) CDF.

Exercise 21:

Let \(U \sim \Unif{(a,b)}\). The distribution of the rv \(X = c^{2} \log(d) U + e - f^{4}\) is still uniform when \(c,d, e, f \in \R^{+}\), and \(c\neq 0\), \(d\neq 1\).

Solution

True.
This still is a linear transformation: let \(c_{1} = c^{2} \log(d)\) and \(c_{2} = e - f^{4}\); then \(X = c_{1} U + c_{2}\) with \(c_{1} \neq 0\), which is uniform by BH 5.2.6.

Exercise 22:

Let \(Z \sim \Norm{0,1}\) then \(\Phi(z) = \Phi(-z)\) due to symmetry.

Solution

False. The symmetry holds for the PDF, not the CDF: \(\phi(z) = \phi(-z)\), while \(\Phi(z) = 1 - \Phi(-z)\).

Exercise 23:

Let \(Z \sim \Norm{\mu, \sigma^{2}}\) with \(\sigma>0\). Let \(X = Z \cdot \sigma^{-1} - \mu \cdot \sigma^{-1}\) then \(X \sim \Norm{0,1}\).

Solution

True. Rewriting gives \(X = \frac{Z - \mu}{\sigma}\), the standardization of \(Z\), so \(X \sim \Norm{0,1}\) by definition.

Exercise 24:

Let \(X \sim \Exp{1}\) and \(Y = \lambda X\), then \(Y \sim \Exp{\lambda}\).

Solution

False. \(\P{Y > y} = \P{X > y/\lambda} = e^{-y/\lambda}\), so \(Y \sim \Exp{\frac{1}{\lambda}}\).

Exercise 25:

Let \(X \sim \Exp{\lambda}\), then: \[ \int^{\infty}_{0} x \lambda e^{ - \lambda x} \d x = \int^{\infty}_{- \infty} x \lambda e^{ - \lambda x} \1{x> 0}\d x.\]

Solution

True.

Exercise 26:

For the Exponential distribution it holds that: \[\P{X > t + s | X > t} = \P{X > t}\]

Solution

False. It should be \(\P{X > t + s \given X > s} = \P{X > t}\); this is the memoryless property.

Exercise 27:

Let there be three cars in three different painting workshops. The painting times of the cars are independent, and start at the same time. The paint times in hours are iid and follow \(X\sim \Exp{\lambda}\), with \(\E X = 1/ \lambda = 3\). Claim: The expected time for the first two cars to be finished is 2.5 hours.

Solution

True. Due to independence and memorylessness, the expected time for the first two cars to finish is \(\E T\) with \(T = T_{1} + T_{2}\), as we only need the first two painting jobs; \(T_{1}\) is the time for the first job to finish, \(T_{2}\) is the additional time for the second job to finish. Clearly \(T_{1} = \min\{X_{1}, X_{2}, X_{3}\}\), and, after a restart (recall, memoryless), \(T_{2} = \min\{X_{1}, X_{2}\}\). Hence,

\begin{equation*} \E T = \frac{1}{3 \lambda} + \frac{1}{2\lambda} = 1 + 1.5 = 2.5. \end{equation*}

For reference, see BH 5.6.3 and 5.6.5.
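A simulation sketch supporting the 2.5 hours (my own check, not from BH):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(3)
X = rng.exponential(3.0, size=(1_000_000, 3))  # three iid painting times, mean 3 hours
T = np.sort(X, axis=1)[:, 1]                   # epoch at which two cars are done
print(T.mean())                                # ~ 2.5
\end{verbatim}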

Exercise 28:

Let \(A\), \(B\) be two arbitrary events. Claim: \(\P{A|B}=\frac{\P{A \cap B}}{\P{B}}\).

Solution

False. The condition \(\P{B}>0\) is missing.

Exercise 29:

Let \(A\), \(B\) be events s.t. \(\P{A},\P{B}>0\). Claim: \(\P{A|B}\P{B}=\P{B|A}\P{A}\).

Solution

True.

Exercise 30:

Let \(A\), \(B\), \(C\) be events s.t. \(\P{A \cap B}>0\). Claim:

\begin{equation*} \frac{\P{B \cap C | A}}{\P{B|A}}=\frac{\P{A \cap C|B}}{\P{A|B}} \end{equation*}
Solution

True. Both sides are equal to \(\P{C|A,B}\).

Exercise 31:

Let \(A\), \(B\), \(C\) be events s.t. \(\P{C}>0\). Claim:

\begin{equation*} \P{A \cap B | C} + \P{A \cup B | C}=\P{A|C}+\P{B|C} \end{equation*}
Solution

True.

Exercise 32:

Let \(A\) and \(B\) be two disjoint events with positive probability. Claim: \(A\) and \(B\) are dependent.

Solution

True. \(\P{A\given B} = 0 \neq \P{A}\). If \(A\) and \(B\) are disjoint, they can only be independent if \(\P{A}=0\) or \(\P{B}=0\).

Exercise 33:

Suppose \(A_i\) for \(i=1, 2, \dots, n\) are independent indicator rvs. Claim: \(\sum_{i=1}^n A_i\) has a Binomial distribution.

Solution

False. They also need to have identical distributions.

Exercise 34:

Let \(X\) and \(Y\) be independent. Claim: for any functions \(f, g\) it holds that \(g(X)\) is independent of \(f(Y)\).

Solution

True.

Exercise 35:

Claim: a discrete random variable requires that the number of outcomes is finite.

Solution

False. It must have a countable number of outcomes, but not necessarily finite; for example, the support can be \(\N\).

Exercise 36:

We have an urn with \(w>0\) white balls and \(b>0\) black balls. We pick, without replacement, \(n\leq w+b\) balls from the urn. Then the number \(X\) of white balls picked from the urn is hypergeometric, i.e., \(X\sim\HGeom{w, b, n}\).

Claim: if \(X \sim \HGeom{3, 5, 2}\), then \(\P{X = 3} = 0\).

Solution

True. \(3\) is not in the support of \(X\), which is \(\{0, 1, 2\}\).

Exercise 37:

Given numbers \(c, d \in \R\), possibly the same. Take \(X\equiv c\), i.e., \(\P{X=c} = 1\), and \(Y\equiv d\). Claim: \(X\) and \(Y\) are independent.

Solution

True. \(\P{X=c, Y=d} = 1 = \P{X=c}\P{Y=d}\), \(\P{X=c, Y\neq d} = 0 = \P{X=c} \P{Y\neq d}\), etc.

2. Week 2

Exercise 38:

\(X\sim \Geo{p}\). Take \(s\) such that \(e^{s}q < 1\).

\begin{align*} M_{X}(s) &\stackrel{1}= \E{e^{sX}} \stackrel{2}= \sum_{k=1}^{\infty} p q^{k} e^{sk} \\ &\stackrel{3}= p \sum_{k=1}^{\infty} (e^{s}q)^{k} \stackrel{4}= p (\sum_{k=0}^{\infty} (e^{s}q)^{k}-1) \\ & \stackrel{5}= \frac p {1+e^{s}q} - p. \end{align*}

Claim: more than one of these steps is incorrect.

Solution

True. Steps 2 and 5 are incorrect. Step 2: the sum should start at \(k=0\); step 5: the plus in the denominator should be a minus.

Exercise 39:

For two strictly positive rvs \(X\) and \(Y\), let \(f_{X,Y}(x, y) = \frac{xy}{x^{2}+y^2}\). Claim: since

\begin{align*} f_{X,Y}(x, y) = \frac{xy}{x^{2}+y^2} = \frac{x}{\sqrt{x^{2}+y^2}} \frac{y}{\sqrt{x^{2}+y^2}}, \end{align*}

the rvs \(X\) and \(Y\) are independent.

Solution

False. For independence of continuous rvs, the joint PDF should factor into a function of \(x\) alone times a function of \(y\) alone, as in \(f_{X,Y}(x,y) = f_{X}(x) f_{Y}(y)\). The two factors on the RHS of the claim are not of this type: both involve \(x^{2}+y^{2}\).

Exercise 40:

The joint density of \(X\) and \(Y\) is given by \(f_{X,Y}(x,y) = Ce^{-(x + 2y)}\1{x\geq 0}\1{y\geq 0}\). Claim: \(X\) and \(Y\) are iid because \(f_{X,Y}(x, y) = C e^{-x}e^{-2y}\).

Solution

False. The densities \(f_{X}\) and \(f_{Y}\) are not the same.

Variations on this type of question:

  1. Yes, because
  2. No, these rvs are identical, hence dependent.
  3. Yes, because \(X\) and \(Y\) are independent and identical.
  4. Yes, because \(X\) and \(Y\) are independent and identically distributed.
Exercise 41:

Claim: \(p(k) = \frac{1-x}{1-x^{n+1}} x^k\) with \(k \in \{0, \dots, n\}\) is a valid PMF for any \(x\).

Solution

False. For example, the expression is undefined at \(x = 1\) (it reads \(0/0\)), and for \(x < 0\) some of the values \(p(k)\) are negative.

Exercise 42:

Let \(X \sim \Geo{p}\). Claim: all of the following steps are correct.

\begin{equation*} \P{X = k \given X \geq n} = \frac{\P{X=k, X \geq n}}{\P{X \geq n}} = \frac{\1{k \geq n} \P{X = k}}{\P{X \geq n}} = \1{k \geq n} pq^{k-n} \end{equation*}

Thus,

\begin{align*} \E{X \given X \geq n} &= \sum_{k=0}^\infty k \1{k \geq n} pq^{k-n}\\ &= p \sum_{k=n}^\infty kq^{k-n}\\ &= (1 - q) \big(n q^0 + (n+1) q^1 + (n+2) q^2 + \dots\big)\\ &= \big(n q^0 + (n+1) q^1 + (n+2) q^2 + \dots\big) - \big(n q^1 + (n+1)q^2 + (n+2)q^3 + \dots\big)\\ &= n + q^1 + q^2 + \dots\\ &= n + \frac{1}{1-q} - 1\\ &= n + \frac{q}{p}\\ \end{align*}
Solution

True. Of course, the memoryless property of the geometric distribution can be used to find the answer too.
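A simulation check, not from BH. Note that numpy's geometric counts trials starting at 1, while BH's \(\Geo{p}\) starts at 0, hence the \(-1\); \(p\) and \(n\) are arbitrary choices:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(4)
p, n = 0.3, 4
X = rng.geometric(p, size=2_000_000) - 1  # BH's Geom(p): support 0, 1, 2, ...
print(X[X >= n].mean())                   # ~ n + q/p = 4 + 0.7/0.3 = 6.33
\end{verbatim}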

Exercise 43:

Claim: \(M(t) = 2^{-n} \sum_{k=0}^n \binom{n}{k} e^{tk}\) is a valid expression for the MGF of a \(\Bin{n, 0.5}\) distribution.

Solution

True. Apply the binomial theorem: \((e^t + 1)^n = \sum_{k=0}^n \binom{n}{k} e^{tk}\).

Exercise 44:

Claim: This contains an error:

\begin{equation*} e \stackrel 1 = \lim_{n\to\infty} \big(1+n^{-1}\big)^n \stackrel{2} = \lim_{n\to\infty} \sum^n_{k=0} n^{-k}. \end{equation*}
Solution

True, step 2 is wrong: the binomial theorem gives \((1+n^{-1})^n = \sum_{k=0}^n \binom{n}{k} n^{-k}\), so the binomial coefficients are missing. Compare also the Taylor series for \(e^{x}\).

Exercise 45:

Claim: suppose \(f(x) = a x + b\) with \(a\neq 0\), then there is a \(c\) such that \(c e^{-(f(x))^2}\) is the pdf of a normal distribution.

Solution

True. Completing the square shows that \(c\, e^{-(ax+b)^{2}}\) is the \(\Norm{-b/a,\, 1/(2a^{2})}\) density, with \(c = |a|/\sqrt{\pi}\).

Exercise 46:

Claim: \(M_X(s)=e^{-(s-1)^2/2}\) could be a valid MGF for some rv \(X\).

Solution

False. Note that \(M_X(s)=\E{e^{sX}}\), s.t. \(M_X(0)=1\) must always hold.

Exercise 47:

Let \(X\) and \(Y\) be two independent rvs such that \(\E{e^{s X}}\) and \(\E{e^{s Y}}\) are well defined for \(s\) in an open interval around \(0\). Claim:

\begin{align*} M_{X-Y}(s)=M_X(s)M_Y(-s) \end{align*}
Solution

True. \(M_{X-Y}(s)=\E{e^{s(X-Y)}}=\E{e^{sX}}\E{e^{-sY}}=M_X(s)M_Y(-s)\).

Exercise 48:

Given \(X\sim \Pois{\lambda}\), \(Y \sim \Pois{\mu}\), \(X\) and \(Y\) are independent, and \(\E{e^{tX}} = e^{-\lambda}\sum^{\infty}_{k=0} \frac{(\lambda e^t)^k}{k!}\). Claim: \(M_{X+Y}(t) = e^{(\lambda \mu)(e^t-1)}\).

Solution

False. Summing the series gives \(\E{e^{tX}} = e^{-\lambda} e^{\lambda e^{t}} = e^{\lambda(e^{t}-1)}\), and by independence \(M_{X+Y}(t) = M_X(t) \cdot M_Y(t) = e^{\lambda(e^t-1)} e^{\mu(e^t-1)} = e^{(\mu + \lambda)(e^t-1)}\); the exponent involves \(\mu + \lambda\), not \(\lambda\mu\).
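A numerical sketch separating the two exponents (the parameters \(\lambda=1.5\), \(\mu=2\), \(t=0.3\) are my own arbitrary choices):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(5)
lam, mu, t = 1.5, 2.0, 0.3
X = rng.poisson(lam, 1_000_000)
Y = rng.poisson(mu, 1_000_000)
print(np.mean(np.exp(t * (X + Y))))          # Monte Carlo M_{X+Y}(t), ~ 3.40
print(np.exp((lam + mu) * (np.exp(t) - 1)))  # correct: exponent (lam + mu), ~ 3.40
print(np.exp((lam * mu) * (np.exp(t) - 1)))  # the claim: exponent lam * mu, ~ 2.86
\end{verbatim}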

Exercise 49:

Given \(X_1\sim \Norm{\mu_1, \sigma_1^2}\) with \(M_{X_1}(t) = e^{\mu_1t + \frac{1}{2}\sigma_1^2t^2}\), and \(X_2\sim \Norm{\mu_2, \sigma_2^2}\). Claim: \(M_{X_1+X_2}(t) = e^{(\mu_1 + \mu_2)t +\frac{1}{2}(\sigma_1^2 + \sigma_2^2)t^2}\).

Solution

False. It is not given that \(X_1\) and \(X_2\) are independent.

3. Week 3

Exercise 50:

Suppose the rv \(X\) has PDF \(f_{X}(x) = Ax^{-s} \1{x\geq 1}\) for \(s\in (1, 2)\) where \(A\) is the normalization constant. Claim: \(\E X = \infty\).

Solution

True: \(\E X = A\int_{1}^{\infty} x^{1-s} \d x = \infty\), since \(1-s \geq -1\) for \(s \in (1,2)\).
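A sketch showing the divergence in simulation (mine; here \(s=1.5\), and I sample via the inverse CDF \(F^{-1}(u) = (1-u)^{-1/(s-1)}\)):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(6)
s = 1.5
for n in [10 ** 3, 10 ** 5, 10 ** 7]:
    X = rng.uniform(size=n) ** (-1 / (s - 1))  # U and 1-U have the same law
    print(n, X.mean())                         # the sample means keep growing
\end{verbatim}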

Exercise 51:

Claim: LOTP on a discrete sample space \(S\) states that \(P(B) = \sum_{i=1}^{n} \P{B|A_{i}} \P{A_{i}}\), where \(\{A_{i}\}\) is a set of non-overlapping subsets of \(S\).

Solution

False in general, because it’s not given that the subsets cover \(S\).

Exercise 52:

Let \(X\) be a rv on \(\R\) and \(g\) a function from \(\R^{2}\) to \(\R\). Claim: \(g(X)\) is a rv.

Solution

False. \(g\) needs two arguments, as it maps \(\R^{2}\) to \(\R\).

Exercise 53:

Let \(X\sim \FS{p}\) with \(p\in (0, 1)\), \(q=1-p\). Claim: \(\E X = 1+q\E X \implies \E X = 1/p\).

Solution

True.

Exercise 54:

Claim: according to 2D LOTUS: if \(g\) is a function such that \(g(x, y) \in \R\), and \(X, Y\) two real-valued rvs, then

\begin{equation*} \E{g(X,Y)} = \int_{-\infty}^{\infty} \int_{-\infty} ^{\infty} g(x,y) f_{X}(x)f_Y(y) \d x \d y. \end{equation*}
Solution

False. \(X\) and \(Y\) need not be independent, as is suggested here.

Exercise 55:

Let \(L = \min\{X_{i} : i =1, \ldots, n\}\) with \(\{X_{i}\}\) a set of iid rvs. Claim:

\begin{equation*} \P{L\leq x} = (\P{X_{1} \leq x})^{n}. \end{equation*}
Solution

False. This identity holds for the maximum; for the minimum, reverse the direction of the inequalities: \(\P{L > x} = (\P{X_{1} > x})^{n}\).

Exercise 56:

Let \(X, Y\) be two continuous rvs with joint distribution \(F_{X,Y}(x,y)\). Write \(\partial_{x}\) for \(\partial/\partial x\). Claim: \(f_{X}(x) = \partial_{x} F_{X,Y}(x,y)\).

Solution

False. Why is this claim nonsense? Hint: the LHS does not depend on \(y\), but the RHS does.

Exercise 57:

Claim: for two continuous rvs \(X, Y\) with joint PDF \(f_{X,Y}(x,y)\) it holds that \(f_{Y|X}(y|x) = F_{X,Y}(x,y)/F_{X}(x)\).

Solution

False. It should be \(f_{Y|X}(y|x) = f_{X,Y}(x,y)/f_{X}(x)\): conditional densities involve PDFs, not CDFs.

Exercise 58:

Let \(X\) be a discrete rv on the numbers \(\{a_{i}\}_{i=1}^{\infty}\) with \(a_{i} \in \R\). Claim: the PMF of \(X\) can be found by \(f_{X}(x) = F_{X}'(x)\) for \(x\in \{a_{i}\}_{i=1}^{\infty}\).

Solution

False. The CDF \(F_{X}\) is not differentiable at the jump points \(a_{i}\); the PMF is given by the jump sizes of \(F_{X}\), not by its derivative.

Exercise 59:

Let \(X \sim \Norm{0,1}\), \(Y=X^2\). Claim: for the density of \(Y\), \(f_{Y}(0) = 0\), and on \(y>0\), \[f_Y(y)= \frac{\phi(\sqrt{y}) + \phi(-\sqrt{y})}{2\sqrt{y}}\] by the change of variables formula.

Solution

True. Consider the set \(A = \{x : x^{2} = y\}\), the inverse image of \(y\). The change of variables formula says

\begin{equation*} f_Y(y)= \sum_{x_{i} \in A} f_{X}(x_{i}) \left|\frac{\d y}{\d x}(x_{i})\right|^{-1}, \end{equation*}

if \(\d y/\d x (x_{i}) \neq 0\) for all \(x_{i} \in A\). As it is given that \(y>0\), this condition is satisfied.
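A numerical check of the resulting density (my sketch; the \(y\) values are arbitrary):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(7)
phi = lambda z: np.exp(-z ** 2 / 2) / np.sqrt(2 * np.pi)  # standard normal pdf
Y = rng.standard_normal(2_000_000) ** 2
for y in [0.5, 1.0, 2.0]:
    claimed = (phi(np.sqrt(y)) + phi(-np.sqrt(y))) / (2 * np.sqrt(y))
    empirical = np.mean(np.abs(Y - y) < 0.01) / 0.02      # local histogram height
    print(y, claimed, empirical)                          # the two columns agree
\end{verbatim}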

Exercise 60:

Let \(X \sim \Norm{0,1}\), \(Y=X^2\). Claim: The change of variables formula tells us that: \[f_Y(y)=f_X(x)\left|\frac{dx}{dy}\right|=\phi(x)\left|\frac{1}{2x}\right|=\phi(\sqrt{y})\frac{1}{2\sqrt{y}},\quad y>0.\]

Solution

False. \(Y=X^2\) is not a strictly increasing (or decreasing) function, so the change of variables formula cannot be applied.

Exercise 61:

Let \(X \sim \Gamm{n,\lambda}\), then \(\E{X^{k}} = \frac{n+k-1}{\lambda} \E{X^{k-1}}\) for \(k \in \N = \{0, 1, 2, \ldots\}\). Claim: for \(c\in \N\),

\begin{equation*} \E{X^{c}} = \frac{(n+c-1)!}{(n-1)!\lambda^{c}}. \end{equation*}
Solution

True.

Exercise 62:

Let \(X\) and \(Y\) be independent discrete rvs, \(T = X + Y\). Claim: \[F_T(t) = \sum_x F_X(x - t) p_X(x)\]

Solution

False, it should be \(F_Y(t - x)\). Variations:

  1. True
  2. False with \(p_X(t)\)
  3. False with \(\sum_x p_Y(t - x) p_X(x)\)
  4. False with \(\sum_{x=0}^t F_Y(t - x) p_X(x)\)
Exercise 63:

Let \(X_1, \cdots, X_n\) be iid continuous rvs with CDF \(F\). Let \(N_{x} \sim \Bin{n,F(x)}\). Claim: The CDF of the \(j\)th order statistic can be written as \(P(X_{(j)} \leq x) = P(N_{x} \geq j)\).

Solution

True. See BH p. 400.
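A Monte Carlo check (my sketch, with \(X_i \sim \Unif{(0,1)}\) so that \(F(x) = x\); the values of \(n\), \(j\), \(x\) are arbitrary):

\begin{verbatim}
import numpy as np
from math import comb

rng = np.random.default_rng(8)
n, j, x = 7, 3, 0.4
U = rng.uniform(size=(500_000, n))
lhs = (np.sort(U, axis=1)[:, j - 1] <= x).mean()               # P(X_(j) <= x)
rhs = sum(comb(n, k) * x ** k * (1 - x) ** (n - k) for k in range(j, n + 1))
print(lhs, rhs)                                                # both ~ 0.58
\end{verbatim}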

4. Week 4

Exercise 64:

Claim: \(\E{Y|A} = \sum_{y=0}^{\infty} y \P{Y=y|A}\) if \(Y\) is discrete.

Solution

False: what if \(Y\) can take negative values? Then the sum must run over all of \(Y\)'s support, not just \(y \geq 0\).

Exercise 65:

Claim: \(\E{Y|A} = \sum_{y=-\infty}^{\infty} y \P{Y=y|A} = 0\) if \(\P{A}=0\) and \(Y\) is discrete.

Solution

False: the definition in BH requires that \(\P{A}>0\).

Exercise 66:

Let \(X\) be a continuous rv with PDF \(f_{X}(x) > 0\) on \(x\in[a, b]\) and \(Y\) discrete. Claim: when \(x\in [a, b]\), \(\E{Y|X=x} = \sum_{y=-\infty}^{\infty} y \frac{f_{X,Y}(x,y)}{f_{X}(x)}\).

Solution

True.

Exercise 67:

Let \(X_{(i)}, i = 1, \ldots, n\) be the order statistic of \(\{X_{i}, i=1, \ldots, n\}\). Claim:

\begin{equation*} \P{X_{(j)}\leq x} = \sum_{k=0}^{n}{n \choose k} F(x)^k (1-F(x))^{n-k}. \end{equation*}
Solution

False, check BH theorem 8.6.3: the sum should start at \(k=j\). It's easy to check: the LHS depends on \(j\) but the RHS does not (in fact the RHS equals \(1\) by the binomial theorem), so the claim cannot be true.

Exercise 68:

Let \(X_{(i)}, i = 1, \ldots, n\) be the order statistic of the continuous rvs \(\{X_{i}, i=1, \ldots, n\}\). Claim:

\begin{equation*} f_{(j)}(x) \d x = n f(x) \d x {n \choose j} F(x)^j (1-F(x))^{n-j}. \end{equation*}
Solution

False, see BH theorem 8.6.4: \(f_{(j)}(x) \d x = n f(x) \d x {n-1 \choose j-1} F(x)^{j-1} (1-F(x))^{n-j}\).

Exercise 69:

Claim: It is a good idea to conceptualize the order statistic as a set rather than as a list.

Solution

False. Elements in a set are not ordered, elements in a sequence or list are.

Exercise 70:

Let \(X_{(i)}, i = 1, \ldots, n\) be the order statistic of the continuous iid rvs \(\{X_{i}, i=1, \ldots, n\}\). Claim: \(\E{X_{(i)} | X_{(j)}=x} \leq x\) for any \(i\leq j\).

Solution

True.

Exercise 71:

The continuous rvs \(\{X_{i}\}\) with support on \((0, \infty)\) are iid, and \(S_n=\sum_{i=1}^n X_{i}\). Claim: for some \(x\) and \(n\),

\begin{equation*} \E{X_{n}| S_{n-1} = x} = S_n/n. \end{equation*}
Solution

False. The condition is on some given \(x\), so the LHS depends on \(x\); however, on the RHS there is no \(x\). Besides this, we condition on \(S_{n-1}\), that is, on outcomes of \(X_{1}, X_{2}, \ldots, X_{n-1}\), but \(X_{n}\) is independent of these rvs.

Exercise 72:

Take \(g(x) = \E{Y|X=x}\). Define the conditional expectation of the rv \(Y\) given \(X\) as \(g(X)\), and write it as \(\E{Y|X}\). Claim: this is one of the most important definitions in probability.

Solution

True. This is just a bonus to stress the importance of the definition of conditional expectation.

Exercise 73:

Let \(X\) be a continuous rv with support \((0, \infty)\). Claim: for \(x>0\),

\begin{equation*} \E{X|X>x}>\E{X}. \end{equation*}
Solution

True.

Exercise 74:

Let \(X \sim \Exp{\lambda}\); we write \(f_{X}\) for the density of \(X\). Claim: all steps in the following lines are correct.

\begin{align*} f_{X}(x|X>s) &= \frac{\P{X>s|X=x}f_{X}(x)}{\P{X>s}} = \frac{ \1{x > s} \lambda e^{-\lambda x}}{e^{-\lambda s}}. \\ & \implies \\ \E{X|X>s} &= \int^{\infty}_{s} x \lambda e^{-\lambda(x-s)} dx\\ &= \int^{\infty}_{0}(x + s) \lambda e^{-\lambda x} dx \\ &= \int^{\infty}_{0}x \lambda e^{-\lambda x} dx + \int^{\infty}_{0} s \lambda e^{-\lambda x} dx. \end{align*}
Solution

True.
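The last two integrals evaluate to \(1/\lambda\) and \(s\), so \(\E{X|X>s} = s + 1/\lambda\). A simulation check (mine; \(\lambda = 2\), \(s = 1.5\) are arbitrary):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(9)
lam, s = 2.0, 1.5
X = rng.exponential(1 / lam, size=5_000_000)
print(X[X > s].mean(), s + 1 / lam)  # both ~ 2.0
\end{verbatim}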

Exercise 75:

Let \(X\) and \(Y\) be two rvs. Claim: if all is well defined and finite, \(\E{X|Y} = c\), where \(c\) is some constant.

Solution

False: \(\E{X|Y}\) is a function of \(Y\), hence itself a random variable.

Exercise 76:

Let X be an rv and A an event. Claim: \(\E{X\mid\1{A}} = \E{X\mid A}\)

Solution

False. One way to see this is to note that the LHS is a rv, while the RHS is a number. In fact, \(\E{X\mid\1{A} = 1} = \E{X\mid A}\). An alternative question: is \(\E{X\1{A}} = \E{X\mid A} \P{A}\)?

5. Week 5

Exercise 77:

You may assume that, when \(T=X+Y\), \(f_{X,T}(x,t) = f_{X,Y}(x, t-x)\), even when the positive rvs \(X\) and \(Y\) are not independent.

Claim: \(f_T(t) = \int_{0}^{t} f_{X, Y}(x,t-x) \d x\).

Solution

True.


Exercise 78:

You may assume that, when \(T=X+Y\), \(f_{X,T}(x,t) = f_{X,Y}(x, t-x)\).

Claim: When the rvs \(X, Y\) have support equal to \(\R\) and are independent,

\begin{equation*} f_T(t) = \int_{0}^{t} f_{Y}(t-x) f_X(x) \d x. \end{equation*}
Solution

False. Here the supports are all of \(\R\), so \(X\) and \(Y\) are not positive and the integration bounds should be \(-\infty\) and \(\infty\).

Exercise 79:

The positive rvs \(X\) and \(Y\) are independent, and \(T=XY\).

Claim: \(f_T(t) = \int_{0}^{t} f_{Y}(t/x) f_X(x) \d x\).

Solution

False. The integration bounds are not correct, which is easy to see: \(x\) should range over the support of \(X\), i.e., \((0, \infty)\). The harder part is that the Jacobian is missing: \(f_T(t) = \int_{0}^{\infty} f_{Y}(t/x) f_X(x) \frac{1}{x} \d x\). Even when students don't know this, they can guess, based on the 1D case, that there must be a function to compensate for the change in \(f_{Y}\), because we divide \(t\) by \(x\).

Exercise 80:

Claim: \(\E{h(Y)Y | X} = h(Y) \E{Y|X}\).

Solution

False. "Taking out what is known" applies to functions of the conditioning rv \(X\), not to functions of \(Y\).

Exercise 81:

Claim: \(\E{h(Y)|Y} = h(Y)\E{1|Y} = h(Y)\cdot 1 = h(Y)\).

Solution

True.

Exercise 82:

Let \(g(x) =\E{Y|X=x} = \sum_{y=-\infty}^{\infty} y \P{Y=y|X=x}\). Claim: the following rv

\begin{equation*} g(X) = \sum_{y=-\infty}^{\infty} y \P{Y=y|X=X} \end{equation*}

is well-defined.

Solution

False. The final condition \(X=X\) is nonsense: the event \(\{X = X\}\) always holds. The rv \(g(X)\) is obtained by plugging \(X\) into the function \(g\), not by substituting \(X\) inside the conditional probability.

Exercise 83:

Claim: \(\V{X|Y} = \E{X^2|Y} - (\E{X|Y})^{2}\).

Solution

True.

Exercise 84:

Claim: \(\V{\E{X|Y}} = \E{X^2|Y} - (\E{X|Y})^{2}\).

Solution

False. The first term of the RHS should be \(\E{(\E{X|Y})^{2}}\). The second term is also wrong: it must be \((\E{X})^{2}\), since \(\E{\E{X|Y}} = \E{X}\) by Adam's law. Another way to see that the claim is false: the LHS takes the variance of the rv \(\E{X|Y}\), so the LHS is a number, while the RHS still depends on \(Y\), hence is a rv.

Exercise 85:

We have two rvs \(X\) and \(Y\), such that \(X\) is independent of \(Y\). Claim: the fact that \(X\) is independent of \(Y\) does not imply that \(Y\) is independent of \(X\).

Solution

False. Independence is symmetric.

Exercise 86:

Claim: if \(X\) is independent from \(Y\), then \(\E{\V{Y|X}} = \V Y\).

Solution

True. \(\V{Y|X}\) is a rv, while \(\V{Y}\) is a number; by taking the expectation of \(\V{Y|X}\) we reduce the rv to a number. The book states, as a property, that \(\E{Y|X} = \E{Y}\) when \(Y, X\) are independent; we use the same property here for the variance.

Exercise 87:

Let \(N \sim \Pois{\lambda}\), \(X_j\) be i.i.d. rvs with mean \(\mu\). Claim: \(\E{\sum_{j=1}^NX_j}=\lambda\mu\).

Solution

True, provided \(N\) is independent of the \(X_j\) (as in BH): by Adam's law, \(\E{\sum_{j=1}^N X_j} = \E{\E{\sum_{j=1}^N X_j \given N}} = \E{N \mu} = \lambda\mu\).
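A simulation sketch (my own choices: \(X_j\) exponential with mean \(\mu\), and \(N\) independent of the \(X_j\)):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(10)
lam, mu, reps = 3.0, 2.0, 400_000
N = rng.poisson(lam, reps)             # one N per replication
X = rng.exponential(mu, N.sum())       # all X_j's, concatenated across replications
owner = np.repeat(np.arange(reps), N)  # replication index of each X_j
S = np.bincount(owner, weights=X, minlength=reps)  # the random sums
print(S.mean(), lam * mu)              # both ~ 6.0
\end{verbatim}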

Exercise 88:

Claim: \(\E{Z \mid \E{X \mid Y}}\) is a non-degenerate rv and a function of \(Y\).

Solution

False. It is not given that \(X\), \(Y\) and \(Z\) are dependent. If \(Z\) is independent of \(X\) and \(Y\), then \(\E{Z \mid \E{X \mid Y}} = \E{Z}\), which is degenerate.

Exercise 89:

Claim: \(\E{\E{Y|X,Z}|Z} = \E{Y|X}\).

Solution

False. By Adam's law with extra conditioning, \(\E{\E{Y|X,Z}|Z} = \E{Y|Z}\).

Exercise 90:

For any rvs \(X\) and \(Y\), we claim that all of the next steps are correct:

\begin{align*} \V{Y-\E{Y|X}} &= \E{(Y-\E{Y|X})^2} - \left(\E{Y-\E{Y|X}}\right)^2 \\ &= \E{(Y-\E{Y|X})^2} \\ &= \E{\E{(Y-\E{Y|X})^2|X}} \\ &= \E{\V{Y|X}}. \end{align*}
Solution

True, see BH p. 435. Note that \(\E{Y-\E{Y|X}} = 0\) by Adam's law, and \(\V{Y|X} = \E{(Y-\E{Y|X})^2 \given X}\).

6. Week 6

Exercise 91:

Eve’s law says that \(V(X) = \E{\V{X|N}} + \V{\E{X|N}}\).

Claim: \(\E{\V{X|N}}\) is called the in-between group variation.

Solution

False. \(\E{\V{X|N}}\) is the within-group variation; \(\V{\E{X|N}}\) is the between-group variation.

Exercise 92:

Eve’s law says that \(V(X) = \E{\V{X|N}} + \V{\E{X|N}}\).

Claim: \(\V{\E{X|N}}\) is called the explained variance.

Solution

True.

Exercise 93:

Write \(g(X) = \E{Y|X}\) for two rvs \(X, Y\). Claim: This derivation is correct:

\begin{equation*} \V{\E{Y|X}} = \E{(g(X))^2} - (\E{g(X)})^{2} = \E{(g(X))^2} - (\E{Y})^{2}. \end{equation*}
Solution

True.

Exercise 94:

Claim: The inequality of Cauchy-Schwarz says that \(\E{(XY)^{2}} \geq \E X \E Y\).

Solution

False. Cauchy-Schwarz says that \(|\E{XY}| \leq \sqrt{\E{X^2}\E{Y^2}}\).

Exercise 95:

Claim: This is correct: \(\E{X} \leq \E{X\1{X\geq 0}}\).

Solution

True.

Exercise 96:

Let \(g\) be a function that is concave and convex at the same time, and such that \(g(0) = 0\). Claim: by Jensen’s inequality: \(\E{g(X)} = g(\E{X})\).

Solution

True. A function that is both concave and convex is affine, \(g(x) = ax + b\), and \(g(0) = 0\) forces \(b = 0\). For such \(g\), \(\E{g(X)} = a\E{X} = g(\E{X})\).

Exercise 97:

Claim: The following reasoning is correct. For any \(a\geq 0\),

\begin{equation*} \E{|X|} \geq \E{a \1{|X|\geq a}} = a \P{|X| \geq a}. \end{equation*}
Solution

True.

Exercise 98:

The set \(\{X_i : i = 1, 2, \ldots\}\) forms a set of iid rvs such that \(X_i\in \{0, 1\}\) and \(\P{X_i=1} = 1/2\) for all \(i\). Take \(A=\{\lim_{n\to\infty} n^{-1}\sum_{i=1}^{n}X_i = 1\}\).

Claim: The strong law of large numbers implies that \(A=\varnothing\).

Solution

False. \(A\) has probability zero, but it need not be empty: the outcome in which every \(X_i = 1\) is an element of \(A\).

Exercise 99:

The set \(\{X_i : i = 1, 2, \ldots\}\) forms a set of iid rvs such that \(X_i\in \{0, 1\}\) and \(\P{X_i=1} = 1/2\) for all \(i\). Take \(A=\{\lim_{n\to\infty} n^{-1}\sum_{i=1}^{n}X_i = 1\}\).

Claim: The strong law of large numbers says that \(\P{A} = 0\).

Solution

True.

Exercise 100:

The set \(\{X_i : i = 1, 2, \ldots\}\) forms a set of iid rvs. Claim: The weak law of large numbers states that:

\begin{equation*} \forall \delta, \epsilon > 0: \exists m > 0 : \forall n > m: \P{|\bar X_{n}-\mu| > \epsilon} < \delta, \end{equation*}

where \(\mu=\E{X_{i}}\) and \(\bar X_n = n^{-1}\sum_{i=1}^{n}X_{i}\).

Solution

True.

Exercise 101:

Let \(X_i \sim \Exp{\lambda}\) with \(Y_i = 2X_i\), \(i=1,2,\dots\) Claim: \(\P{\bar{X}_n\bar{Y}_n \xrightarrow{} \frac{2}{\lambda^2}}=1\).

Solution

True. By the SLLN, \(\bar{X}_n \to 1/\lambda\) and \(\bar{Y}_n \to 2/\lambda\) with probability 1, hence \(\bar{X}_n\bar{Y}_n \to 2/\lambda^2\).
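A quick look at the running products (my sketch; \(\lambda = 0.5\) is arbitrary):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(11)
lam = 0.5
X = rng.exponential(1 / lam, size=1_000_000)
Y = 2 * X
n = np.arange(1, X.size + 1)
prod = (X.cumsum() / n) * (Y.cumsum() / n)  # running products of the averages
print(prod[[999, 99_999, 999_999]])         # -> 2 / lam**2 = 8.0
\end{verbatim}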

Exercise 102:

Claim: the Chernoff bound is always tighter than the Chebyshev bound, and both are always tighter than the Markov bound.

Solution

False. The bounds use different information about the rv, and none of them is uniformly tightest.

Exercise 103:

Claim: the equation \(\E{|X|} = |\E{X}|\) holds only for random variables with a strictly positive support.

Solution

False. Every random variable with nonnegative support satisfies this equation, zero included (and so does every rv with nonpositive support).

Exercise 104:

Let \(X_1, X_2, \cdots\) be iid fair coin tosses. Let \(\Bar{X}_n\) be the fraction of heads after \(n\) tosses. Claim: By SLLN \(\bar X_n \rightarrow \frac{1}{2}\) as \(n\to \infty\) with probability 1.

Solution

True.

7. Week 7

Exercise 105:

The central limit theorem in its approximation form states that for rvs \(X_i\) with mean \(\mu\) and variance \(\sigma^2\): \[ n^{-1}\sum_{i=1}^n X_i = \bar X_n \stackrel{\cdot}\sim \Norm{\mu, \sigma^2 \sqrt{n}} \]

Solution

False. First, the \(X_i\)’s need to be independent. Second, the variance of \(\bar X_{n}\) should be \(\sigma^2/n\).
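A simulation of the corrected statement (my sketch, with \(X_i \sim \Exp{1}\), so \(\mu = \sigma^2 = 1\)):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(12)
n, reps = 400, 200_000
Xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
print(Xbar.mean(), Xbar.var())  # ~ mu = 1 and sigma^2 / n = 1/400 = 0.0025
z = np.sqrt(n) * (Xbar - 1.0)   # standardized means, approximately N(0, 1)
print((z < 1.0).mean())         # ~ Phi(1) = 0.8413
\end{verbatim}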

Exercise 106:

Let \(V \sim \chi^2_n\). Claim: For large \(n\), \(V \stackrel{\cdot}\sim \Norm{n, 2n}\).

Solution

True. By definition \(V\) is a sum of \(n\) iid \(\chi^2_1\) rvs, each with mean 1 and variance 2, so the CLT gives the normal approximation.

Exercise 107:

Let \(\{X_{n}\}\) be a sequence of i.i.d. rvs with an unknown distribution but with finite mean \(\mu\) and variance \(\sigma^{2}\). If the average \(\bar{X}_{n} \to \Norm{\mu, \sigma^{2}/n}\) as \(n \to \infty\), then \(X_{n}\) must be normally distributed for all \(n\).

Solution

False. Given a sequence of iid rvs \(\{X_{n}\}\) with finite mean and variance, \(\bar{X}_{n} \to \Norm{\mu, \sigma^{2}/n}\) as \(n \to \infty\) by the CLT. This holds for any such sequence, not only for normally distributed \(X_{n}\).

Exercise 108:

Let \(X_{i} \sim \Pois{\lambda}\), and iid, then \[\left( \frac{ \sum^{n}_{i=1} X_{i}}{\sqrt{n} \sqrt{\lambda}} - \sqrt{n}\sqrt{\lambda} \right) \to \Norm{0,1} \quad \text{as } n \to \infty\].

Solution

True. By the CLT, \(\sqrt{n} \left( \frac{\bar{X}_{n} - \lambda}{\sqrt{\lambda}} \right) \to \Norm{0,1}\) as \(n \to \infty\), and
\[\sqrt{n} \left( \frac{\bar{X}_{n} - \lambda}{\sqrt{\lambda}} \right) = \sqrt{n} \left( \frac{ \frac{1}{n} \sum^{n}_{i=1} X_{i} }{\sqrt{\lambda}} - \sqrt{\lambda} \right) = \frac{ \sum^{n}_{i=1} X_{i}}{\sqrt{n}\sqrt{\lambda}} - \sqrt{n}\sqrt{\lambda}.\]

Exercise 109:

Let \(T_n = \frac{Z}{\sqrt{V_n/n}}\), with \(Z\sim \Norm{0,1}\), \(V_n\sim \chi_n^2\), and \(Z\) and \(V_n\) independent. Claim: because by the SLLN \(\frac{V_n}{n}\rightarrow \E{Z_1^2} = 1\) with probability 1, \(T_n\) approaches the standard normal distribution.

Solution

True, see BH p. 480.

Exercise 110:

Let \(Z_1^2 + \cdots + Z_n^2 = V \sim \chi^2_n\), where \(Z_1, \cdots ,Z_n \sim \Norm{0,1}\) are iid, and given \(\E{Z_1^4} = 3\). Claim: \(\V{V} = 2n^2\)

Solution

False. \(\V{V} = n\V{Z_1^2} = n\left(\E{Z_1^4} - (\E{Z_1^2})^2\right) = 2n\).

Exercise 111:

Let \(\bar{Z}_n=\frac{1}{n}\sum_{i=1}^nZ_i\) where \(Z_i\sim\Norm{0,1}\), \(i=1,\dots,n\). Claim: \(\bar{Z}_n\sim\Norm{0,\frac{1}{n}}\), s.t. \(n\bar{Z}_n^2\sim\chi^2_1\).

Solution

True. \(\sqrt{n}\bar{Z}_n\sim\Norm{0,1}\implies n\bar{Z}_n^2\sim\chi^2_1\).

Exercise 112:

Let \(Z_i \sim \Norm{0,1}\), \(i=1,\dots,n\) with \(V_n=\sum_{i=1}^nZ_i^2\). Claim: \(\frac{Z_1}{\sqrt{V_n/n}} \sim t_n\).

Solution

False. \(Z_1\) and \(V_n\) are dependent.