These are a selection of worked problems from Theoretical Statistics (Keener 2010), Chapter One.

Problem 1.2
Let $$B \subset \mathbb{N} = \{ 1,2, \dots \}$$, and \begin{equation*} \mu(B) = \lim_{ n \to \infty } \frac{ \# [ B \cap \{ 1, \dots, n \} ] }{n}, \end{equation*} whenever this limit exists.
[a]
Here $$E$$, $$O$$, and $$S$$ denote the even numbers, the odd numbers, and the perfect squares, respectively. Then \begin{gather*} \mu(E) = \mu(O) = \frac{1}{2} \\ \mu(S) = 0 \end{gather*}
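A quick numerical sanity check of part [a] (illustrative only, not part of the solution; `density` is a hypothetical helper name):

```python
# Empirical densities #[B ∩ {1,...,n}]/n for the evens and the squares.
def density(indicator, n):
    """Compute #[B ∩ {1, ..., n}] / n for the set B given by `indicator`."""
    return sum(1 for k in range(1, n + 1) if indicator(k)) / n

n = 10**5
d_even = density(lambda k: k % 2 == 0, n)               # evens E -> 0.5
d_square = density(lambda k: round(k**0.5)**2 == k, n)  # squares S -> near 0
print(d_even, d_square)
```

The density of the squares shrinks like $$1/\sqrt{n}$$, consistent with $$\mu(S) = 0$$.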
[b]
\begin{align*} \mu(A \cup B) & = \lim_{ n \to \infty } \frac{ \# [ (A \cup B) \cap \{ 1, \dots, n \} ] }{n} \\ & = \lim_{ n \to \infty } \frac{ \# [ ( A \cap \{ 1, \dots, n \} ) \cup ( B \cap \{ 1, \dots, n \} ) ] }{n} \end{align*} Since $$A$$ and $$B$$ are disjoint, so too are their respective intersections with $$\{ 1, \dots, n \}$$, so the cardinality above is additive. Because $$\mu(A)$$ and $$\mu(B)$$ both exist, the limit splits into a sum of limits: \begin{align*} \mu(A \cup B) & = \lim_{ n \to \infty } \frac{ \# [ ( A \cap \{ 1, \dots, n \} ) ] + \# [ ( B \cap \{ 1, \dots, n \} ) ] }{n} \\ & = \mu(A) + \mu(B) \end{align*}
[c]

$$\mu$$ is not a measure; it fails countable additivity.

$$\mu(\mathbb{N}) = 1$$. In contrast, for any singleton $$\{k\}$$, $$\mu(\{k\}) = 0$$. The disjoint union of the singletons equals $$\mathbb{N}$$, so countable additivity would require $$\sum_{k \in \mathbb{N}} \mu(\{k\}) = \mu(\mathbb{N})$$. But the left-hand side is zero, while the right-hand side equals 1, a contradiction.

Problem 1.3

Suppose $$\mu$$ is a measure with $$\mu ((x,2x]) = \sqrt{x}$$ for every $$x > 0$$.

Decompose $$(0,1]$$ into a countable, disjoint union of sets: $$(0.5, 1], (0.25, 0.5], (0.125, 0.25], \dots$$. Each piece has the form $$(x, 2x]$$ with $$x = 1/2^i$$, so by countable additivity, \begin{align*} \mu((0,1]) &= \sum_{i=1}^{\infty} \sqrt{ \frac{1}{2^i} } = \sum_{i=1}^{\infty} \left( \frac{1}{\sqrt{2}} \right)^i \\ &= \frac{1/\sqrt{2}}{1 - 1/\sqrt{2}} = \frac{1}{\sqrt{2} - 1} \\ &= \sqrt{2} + 1 \end{align*}
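A numerical sanity check on the geometric sum (illustrative only, not part of the solution):

```python
import math

# Partial sums of sum_i sqrt(1/2^i) should approach sqrt(2) + 1,
# since this is a geometric series with ratio 1/sqrt(2).
partial = sum(math.sqrt(1 / 2**i) for i in range(1, 80))
print(partial, math.sqrt(2) + 1)
```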

Problem 1.5

Let $$\mu$$ be a measure on $$(\mathcal{X}, \mathcal{A})$$; A, B sets in $$\mathcal{A}$$; fix A and define \begin{equation*} \nu(B) = \mu( A \cap B) \end{equation*}

To show that $$\nu$$ is a measure on $$(\mathcal{X}, \mathcal{A})$$:
• Mapping range and domain: $$\nu$$ maps each $$B \in \mathcal{A}$$ into $$[0, \infty]$$, since $$\mu$$ does; in particular, $$\nu(\varnothing) = \mu(A \cap \varnothing) = \mu(\varnothing) = 0$$.

• Countable additivity: suppose $$B_1, B_2, \dots$$ are disjoint and $$B = \cup_i B_i$$. Then, \begin{align*} \nu(B) &= \nu( \cup_i B_i ) \\ &= \mu( A \cap (\cup_i B_i) ) \\ &= \mu( \cup_i (A \cap B_i) ) \\ &= \sum_i \mu(A \cap B_i) \\ &= \sum_i \nu(B_i) \end{align*} where we use the fact that $$B_i \cap B_j = \varnothing$$ implies $$(B_i \cap A) \cap (B_j \cap A) = \varnothing$$. That is, in the third line above, we have a union of disjoint sets, and countable additivity holds via $$\mu$$.
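A finite toy illustration of this additivity (not part of the proof; the sets and names are made up), using counting measure on a small ground set:

```python
# Counting measure on subsets of {1,...,10}: mu(S) = |S|, so
# nu(B) = mu(A ∩ B) = |A ∩ B|.  Check additivity over disjoint sets.
A = {1, 2, 3, 4, 5}

def nu(B):
    return len(A & B)

B1, B2 = {1, 2, 9}, {3, 10}          # disjoint
assert B1 & B2 == set()
print(nu(B1 | B2), nu(B1) + nu(B2))  # both equal 3
```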

Problem 1.15
$$\mu$$ is a measure on subsets of $$\mathbb{N} = \{1,2,\dots\}$$ defined as \begin{equation*} \mu(\{n, n+1, \dots\}) = \frac{n}{2^n}, \end{equation*} and let $$S_n(r) = 1 + r + r^2 + \dots + r^n = \frac{1-r^{n+1}}{1-r}$$.

By additivity of $$\mu$$, $$\mu(\{i\}) = \mu(\{i, i+1, \dots\}) - \mu(\{i+1, i+2, \dots\}) = \frac{i}{2^i} - \frac{i+1}{2^{i+1}}$$. Thus, \begin{align*} \int x d\mu(x) &= \sum_{i=1}^\infty i \mu(\{i\}) \\ &= \sum_{i=1}^\infty i \left( \frac{i}{2^i} - \frac{i+1}{2^{i+1}} \right) \\ &= \sum_{i=1}^\infty i^2 r^i - \sum_{i=1}^\infty i\cdot(i+1) r^{i+1} \end{align*} where $$r = \frac{1}{2}$$.

Note: \begin{align*} U_n &= r \frac{d}{dr} \left[ r \frac{dS_n(r)}{dr} \right] = r + 4 r^2 + 9r^3 + \dots + n^2 r^n \\ &= \frac{-n^2r^{n+3} + (2n^2 + 2n -1)r^{n+2} - (n^2+2n + 1)r^{n+1} + r^2 + r}{(1-r)^3} \end{align*} and \begin{align*} V_n &= r^2 \frac{d^2 S_n(r)}{dr^2} = 2 r^2 + 3 \cdot 2 r^3 + \dots + n \cdot (n-1) r^n \\ &= \frac{-r^{n+3}(n^2-n) + r^{n+2}(2n^2-2)-r^{n+1}(n^2+n) + 2r^2}{(1-r)^3} \end{align*} Hence, for $$0 < r < 1$$, \begin{align*} S_\infty \left( r \right) &= \frac{1}{1-r} \\ U_\infty \left( r \right) &= \frac{r^2 + r}{(1-r)^3} \\ V_\infty \left( r \right) &= \frac{2r^2}{(1-r)^3} \end{align*}

Then, since $$\sum_{i=1}^\infty i^2 r^i = U_\infty(r)$$ and, reindexing with $$k = i+1$$, $$\sum_{i=1}^\infty i(i+1) r^{i+1} = \sum_{k=2}^\infty k(k-1) r^k = V_\infty(r)$$, \begin{align*} \int x d\mu(x) &= U_\infty \left( \frac{1}{2} \right) - V_\infty \left( \frac{1}{2} \right) \\ &= 6 - 4 = 2 \end{align*}
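A numerical sketch (illustrative only, not part of the solution) confirming the closed forms and the final value:

```python
# With r = 1/2: U_inf = (r^2 + r)/(1-r)^3 = 6, V_inf = 2r^2/(1-r)^3 = 4,
# and sum_i i * mu({i}) should equal 6 - 4 = 2.
N = 200

def mu_i(i):
    """mu({i}) = mu({i, i+1, ...}) - mu({i+1, i+2, ...})."""
    return i / 2**i - (i + 1) / 2**(i + 1)

integral = sum(i * mu_i(i) for i in range(1, N))
U = sum(i * i * 0.5**i for i in range(1, N))
V = sum(k * (k - 1) * 0.5**k for k in range(2, N))
print(integral, U, V)  # ≈ 2, 6, 4
```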

Problem 1.17 [geometric random variable]

A geometric random variable has mass function given by: \begin{equation*} p(x) = P(X=x) = \theta (1 - \theta)^x,\quad x \in \{0,1,2,\dots \} \end{equation*} where $$\theta \in (0, 1)$$.

\begin{align*} P(X \mbox{ is even}) &= P(X=0) + P(X=2) + P(X=4) + \cdots \\ &= \theta \left[ 1 + (1-\theta)^2 + (1-\theta)^4 + \cdots \right] \\ &= \theta \left[ \frac{1}{1 - (1-\theta)^2} \right] = \frac{\theta}{\theta(2-\theta)} \\ & = \frac{1}{2-\theta} \end{align*}
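A direct numerical check of the closed form (illustrative only; the value of $$\theta$$ is an arbitrary choice):

```python
# Sum the geometric pmf theta*(1-theta)^x over even x and compare
# against the closed form 1/(2 - theta).
theta = 0.37
p_even = sum(theta * (1 - theta)**x for x in range(0, 2000, 2))
print(p_even, 1 / (2 - theta))
```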

Problem 1.20 [Poisson random variable]

A Poisson random variable has mass function given by: \begin{equation*} p(x) = P(X=x) = \frac{\lambda^x e^{-\lambda}}{x!},\quad x \in \{0,1,2,\dots \} \end{equation*} where $$\lambda > 0$$.

Recall the following Taylor series expansions: \begin{gather*} e^{\lambda} = 1 + \lambda + \frac{\lambda^2}{2!} + \frac{\lambda^3}{3!} + \cdots \\ e^{-\lambda} = 1 - \lambda + \frac{\lambda^2}{2!} - \frac{\lambda^3}{3!} + \cdots \end{gather*}

Now, \begin{align*} P(X \mbox{ is even}) &= P(X=0) + P(X=2) + P(X=4) + \cdots \\ &= e^{-\lambda} \left[ 1 + \frac{\lambda^{2}}{2!} + \frac{\lambda^{4}}{4!} + \cdots \right] \\ &= e^{-\lambda} \left[\frac{e^{\lambda} + e^{-\lambda}}{2} \right] \\ &= \frac{1 + e^{-2\lambda}}{2} \end{align*}
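The same kind of numerical check applies here (illustrative only; the value of $$\lambda$$ is an arbitrary choice):

```python
import math

# Sum the Poisson pmf over even x and compare against (1 + e^(-2*lam))/2.
lam = 2.5
p_even = sum(math.exp(-lam) * lam**x / math.factorial(x)
             for x in range(0, 150, 2))
print(p_even, (1 + math.exp(-2 * lam)) / 2)
```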

Problem 1.22

The solution is given in the appendix, but it is worth recounting here, as it illustrates a useful method.

Let $$\mathcal{E} = (0,1)$$ and let $$\mathcal{B}$$ be the Borel subsets of $$\mathcal{E}$$; $$P(A)$$ is the length of $$A$$ for $$A \in \mathcal{B}$$. Define the random variable $$X$$ by \begin{equation*} X(e) = \min(e, 0.5) \end{equation*}

The key is to consider the event $$\{ X \in B \}$$ for some $$B \in \mathcal{B}$$: \begin{align*} \{ e : X(e) \in B \} &= \begin{cases} B \cap \left( 0, 1/2 \right) & 1/2 \notin B \\ \left(B \cap \left( 0, 1/2 \right) \right) \cup \left[ 1/2, 1 \right) & 1/2 \in B \end{cases} \end{align*} and then apply the probability measure $$P$$ (here written as Lebesgue measure $$\lambda$$) to the preimage: \begin{align*} P( X \in B ) &= \lambda \left( B \cap \left( 0,1/2 \right) \right) + \lambda \left( \left[ 1/2, 1 \right) \right) 1_B \left( 1/2 \right) \\ &= \lambda \left( B \cap \left( 0,1/2 \right) \right) + 1/2 \cdot 1_B \left( 1/2 \right) \end{align*}
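A grid-based numerical sketch with a concrete choice $$B = (0.3, 0.6]$$ (illustrative only, not part of the solution): here $$1/2 \in B$$, so the formula gives $$\lambda((0.3, 0.5)) + 1/2 = 0.2 + 0.5 = 0.7$$.

```python
# Approximate P(X in B) for B = (0.3, 0.6] by evaluating X(e) = min(e, 0.5)
# on a fine grid of e in (0, 1); the fraction of grid points landing in B
# estimates the probability under the uniform (length) measure.
n = 10**5
hits = sum(1 for k in range(n) if 0.3 < min((k + 0.5) / n, 0.5) <= 0.6)
print(hits / n)  # expect 0.7
```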