Chapter 3 Numerical Sequences and Series
- Part A: Exercise 1 – Exercise 14
- Part B: Exercise 15 – Exercise 17
- Part C: Exercise 18 – Exercise 25
Exercise 15
(By Dan kyp44 Whitman)
Theorem 3.22 For $a_n \in \mathbb R^k$, the series $\sum a_n$ converges if and only if for every $\epsilon > 0$ there is an integer $N$ such that \[ \left| \sum_{k=n}^m a_k \right| \leq \epsilon \] for all $m \geq n \geq N$.
Proof: $(\to)$ Suppose that $\sum a_n$ converges. Then by definition the sequence $\{s_n\}$ of partial sums converges, where $s_n = \sum_{k=1}^n a_k$. It follows from Theorem~3.11a that $\{s_n\}$ is a Cauchy sequence, noting that this is true in any metric space. So consider any $\epsilon > 0$. Then, since $\{s_n\}$ is Cauchy, there is an integer $N$ such that for every $m \geq N$ and $n \geq N$, $|s_m - s_n| < \epsilon$.
So let $M = N+1$ and consider any $m \geq n \geq M$. Also let $l = n-1$, from which it follows immediately that $n = l+1$. Clearly then $m \geq l \geq N$ so that $|s_m - s_l| < \epsilon$. We then have \[ \left| \sum_{k=n}^m a_k \right| = \left| \sum_{k=l+1}^m a_k \right| = \left| \sum_{k=1}^m a_k - \sum_{k=1}^l a_k \right| = |s_m - s_l| \leq \epsilon \]as required.
$(\leftarrow)$ Suppose that for every $\epsilon > 0$ there is an integer $N$ such that \[ \left| \sum_{k=n}^m a_k \right| \leq \epsilon \]for all $m \geq n \geq N$ and consider any $\epsilon > 0$. Then obviously there is an integer $N$ for which\[ \left| \sum_{k=n}^m a_k \right| \leq \frac{\epsilon}{2} \] for every $m \geq n \geq N$. So consider any $m \geq N$ and $n \geq N$. We can assume that $m \geq n$ without loss of generality. If $m = n$ then we have simply $$|s_m - s_n| = |s_m - s_m| = |0| = 0 < \epsilon.$$If $m > n$ let $l = n+1$. It then follows that $m \geq l \geq N$ so that \[ \left| \sum_{k=l}^m a_k \right| \leq \frac{\epsilon}{2} \,. \] We then have\[ |s_m - s_n| = \left| \sum_{k=1}^m a_k - \sum_{k=1}^n a_k \right| = \left| \sum_{k=n+1}^m a_k \right| = \left| \sum_{k=l}^m a_k \right| \leq \frac{\epsilon}{2} < \epsilon \,. \]Thus we have shown that $\{s_n\}$ is a Cauchy sequence. Then by Theorem~3.11c, $\{s_n\}$ converges since we are in $\mathbb R^k$. It follows then by definition that $\sum a_n$ converges.
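The Cauchy criterion can be illustrated numerically. Below is a minimal sketch (the helper `block_sum` and the chosen series are our own, not part of the exercise) contrasting a convergent geometric series, whose tail blocks $\left|\sum_{k=n}^m a_k\right|$ shrink below any $\epsilon$, with the divergent harmonic series, whose blocks $\sum_{k=N+1}^{2N} 1/k$ never drop below $1/2$.

```python
def block_sum(a, n, m):
    """Return |sum_{k=n}^{m} a(k)|, the quantity bounded in Theorem 3.22."""
    return abs(sum(a(k) for k in range(n, m + 1)))

geometric = lambda k: 1.0 / 2 ** k   # sum converges
harmonic = lambda k: 1.0 / k         # sum diverges

# Tail blocks of the convergent series can be made arbitrarily small...
print(block_sum(geometric, 20, 40))  # about 2**-19
# ...while harmonic blocks of the form (N+1, 2N) stay above 1/2:
print(block_sum(harmonic, 21, 40))
```

The second block exceeds $1/2$ because each of its $20$ terms is at least $1/40$, which is exactly why the harmonic series fails the Cauchy criterion.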
Theorem 3.23: For $a_n \in \mathbb R^k$, if $\sum a_n$ converges then $\lim_{n \to \infty} a_n = 0$.
Proof: Suppose that $\sum a_n$ converges and consider any $\epsilon > 0$. Then by Theorem~3.22 there is an integer $N$ such that \[ \left| \sum_{k=n}^m a_k \right| \leq \frac{\epsilon}{2} \]for all $m \geq n \geq N$. So consider any $n \geq N$ and let $m = n$. Then clearly $m \geq n \geq N$ is true so that \[|a_n - 0| = |a_n| = \left|\sum_{k=n}^n a_k\right| = \left|\sum_{k=n}^m a_k\right| \leq \frac{\epsilon}{2} < \epsilon \,, \]thereby showing by definition that $\lim_{n \to \infty} a_n = 0$.
Theorem 3.25a: For $a_n \in \mathbb R^k$ and $c_n \in \mathbb R$, if $|a_n| \leq c_n$ for all $n \geq N_0$ where $N_0$ is a fixed integer, and if $\sum c_n$ converges, then $\sum a_n$ converges.
Proof: The proof is the same as that in the text since Theorem 3.22 applies to sums in $\mathbb R^k$ and the triangle inequality is also true there.
Theorem 3.33: For $a_n \in \mathbb R^k$, given $\sum a_n$, put $\alpha = \limsup_{n \to \infty} \sqrt[n]{|a_n|}$. Then
(a) if $\alpha < 1$ then $\sum a_n$ converges
(b) if $\alpha > 1$ then $\sum a_n$ diverges
(c) if $\alpha = 1$, then the test gives no information.
Proof: The proof is again the same as that given in the text since for each part:
(a) The comparison test (Theorem 3.25a) is valid in $\mathbb R^k$.
(b) Theorem 3.23 is valid in $\mathbb R^k$.
(c) The series given in the text still provide counterexamples since they are in $\mathbb R^1$.
Theorem 3.34: For $a_n \in \mathbb R^k$, the series $\sum a_n$
(a) converges if $\limsup_{n \to \infty} \frac{|a_{n+1}|}{|a_n|} < 1$
(b) diverges if $\frac{|a_{n+1}|}{|a_n|} \geq 1$ for all $n \geq N_0$, where $N_0$ is some fixed integer. (Since the $a_n$ are vectors, the quotient $a_{n+1}/a_n$ is undefined, so we use the ratio of norms, assuming $a_n \neq 0$.)
Proof: Similar to the root test, the proof is the same as that given in the text since for each part:
(a) The comparison test (Theorem 3.25a) is valid in $\mathbb R^k$.
(b) Theorem 3.23 is valid in $\mathbb R^k$.
Theorem 3.42: For $a_n \in \mathbb R^k$ and $b_n \in \mathbb R$ suppose that
(a) the partial sums $A_n$ of $\sum a_n$ form a bounded sequence;
(b) $b_0 \geq b_1 \geq b_2 \geq \cdots$;
(c) $\lim_{n \to \infty} b_n = 0$.
Then $\sum a_n b_n$ converges.
Proof: The proof given in the text still holds since Theorem 3.41 clearly holds for $a_n \in \mathbb R^k$ as does the Cauchy criterion (Theorem 3.22).
Theorem 3.45: For $a_n \in \mathbb R^k$, if $\sum a_n$ converges absolutely, then $\sum a_n$ converges.
Proof: The proof is the same since the Cauchy criterion (Theorem 3.22) and the triangle inequality both hold in $\mathbb R^k$.
Theorem 3.47: For $a_n \in \mathbb R^k$ and $b_n \in \mathbb R^k$, if $\sum a_n = A$ and $\sum b_n = B$, then $\sum (a_n + b_n) = A + B$, and $\sum c a_n = cA$ for any fixed $c \in \mathbb R$.
Proof: The proofs are the same as those given in the text since the limit rules used (Theorem 3.3 parts a and b) are trivially shown to hold in $\mathbb R^k$.
Theorem 3.55: If $\sum a_n$ is a series of elements in $\mathbb R^k$ that converges absolutely, then every rearrangement of $\sum a_n$ converges, and they all converge to the same sum.
Proof: The proof is the same since the Cauchy criterion (Theorem~3.22) holds in $\mathbb R^k$.
Exercise 16
(By Dan kyp44 Whitman)
(a) First we show that $x_n > \sqrt\alpha$ for all $n \geq 1$ by induction. The $n=1$ case is obvious since $x_1 > \sqrt\alpha$ by construction. So then assume that $x_n > \sqrt\alpha$. We then have
\begin{align*}
x_n &> \sqrt\alpha \\
x_n - \sqrt\alpha &> 0 \\
(x_n - \sqrt\alpha)^2 &> 0 && \text{(since both sides $\geq 0$)}\\
x_n^2 – 2x_n\sqrt\alpha + \alpha &> 0 \\
x_n^2 + \alpha &> 2x_n\sqrt\alpha \\
\frac{x_n^2 + \alpha}{x_n} &> 2\sqrt\alpha && \text{(since $x_n > \sqrt\alpha > 0$)}\\
x_n + \frac{\alpha}{x_n} &> 2\sqrt\alpha \\
\frac{1}{2}\left(x_n + \frac{\alpha}{x_n}\right) &> \sqrt\alpha \\
x_{n+1} &> \sqrt\alpha
\end{align*} by definition. Therefore $x_n^2 > \alpha$ for any $n \geq 1$. From this we can easily show that $\{x_n\}$ is decreasing monotonically: \[ x_{n+1} = \frac{1}{2}\left(x_n + \frac{\alpha}{x_n}\right) < \frac{1}{2}\left(x_n + \frac{x_n^2}{x_n}\right) = \frac{1}{2}\left(x_n + x_n\right) = x_n \,. \]Now since $\{x_n\}$ is monotonic and bounded (bounded above by $x_1$ and bounded below by $\sqrt\alpha$), it converges by Theorem 3.14. So suppose that $\lim_{n \to \infty} x_n = x$. Then clearly it must be that $\lim_{n \to \infty} x_{n+1} = \lim_{n \to \infty} x_n = x$ also. So we have\[
\lim_{n \to \infty} x_{n+1} =
\lim_{n \to \infty} \frac{1}{2}\left(x_n + \frac{\alpha}{x_n}\right) =
\frac{1}{2}\left(\lim_{n \to \infty} x_n + \frac{\alpha}{\lim_{n \to \infty} x_n}\right) =
\frac{1}{2}\left(x + \frac{\alpha}{x}\right) = x \,,
\]where we have used Theorem 3.3. This gives $x^2 = \alpha$, and since $x \geq \sqrt\alpha > 0$, taking the positive root yields $x = \lim_{n \to \infty} x_n = \sqrt\alpha$.
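The recursion in this exercise is Newton's method applied to $x^2 - \alpha = 0$. The following sketch (the function name and parameter choices are ours, for illustration only) checks the two facts proved above: the iterates decrease monotonically and stay above $\sqrt\alpha$.

```python
import math

def sqrt_iterates(alpha, x1, n):
    """Return [x_1, ..., x_n] for the recursion x_{k+1} = (x_k + alpha/x_k) / 2."""
    xs = [x1]
    for _ in range(n - 1):
        x = xs[-1]
        xs.append((x + alpha / x) / 2)
    return xs

# With alpha = 3 and x_1 = 2 > sqrt(3), the iterates decrease monotonically
# toward sqrt(3) while remaining above it, as proved in part (a).
xs = sqrt_iterates(alpha=3.0, x1=2.0, n=5)
assert all(a > b for a, b in zip(xs, xs[1:]))      # monotonically decreasing
assert all(x > math.sqrt(3) - 1e-12 for x in xs)   # bounded below by sqrt(3)
print(xs)
```

After only five iterations the last iterate already agrees with $\sqrt 3$ to within double precision, foreshadowing the quadratic error bound of part (b).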
(b) We have simply\[
\epsilon_{n+1} = x_{n+1} - \sqrt\alpha =
\frac{1}{2}\left(x_n + \frac{\alpha}{x_n}\right) - \sqrt\alpha =
\frac{1}{2x_n}(x_n^2 + \alpha) - \sqrt\alpha =
\frac{1}{2x_n}(x_n^2 + \alpha - 2x_n\sqrt\alpha)
\]\[
= \frac{(x_n - \sqrt\alpha)^2}{2x_n} =
\frac{\epsilon_n^2}{2x_n} <
\frac{\epsilon_n^2}{2\sqrt\alpha}
\]since $x_n > \sqrt\alpha$. So then, letting $\beta = 2\sqrt\alpha$, we claim that
\[ \epsilon_{n+1} < \beta\left(\frac{\epsilon_1}{\beta}\right)^{2^n} \]
for all $n \geq 1$. A simple proof by induction shows that this is indeed the case.
For $n=1$ we have\[
\epsilon_{n+1} = \epsilon_2 < \frac{\epsilon_1^2}{\beta} =
\beta\frac{\epsilon_1^2}{\beta^2} =
\beta\left(\frac{\epsilon_1}{\beta}\right)^2 =
\beta\left(\frac{\epsilon_1}{\beta}\right)^{2^1} =
\beta\left(\frac{\epsilon_1}{\beta}\right)^{2^n} \,.
\]Now assume that
\[ \epsilon_{n+1} < \beta\left(\frac{\epsilon_1}{\beta}\right)^{2^n} \,. \]Then we have
\[
\epsilon_{n+2} < \frac{\epsilon_{n+1}^2}{\beta} <
\frac{1}{\beta}\left[\beta \left(\frac{\epsilon_1}{\beta}\right)^{2^n}\right]^2 =
\frac{1}{\beta}\left[\beta^2 \left(\frac{\epsilon_1}{\beta}\right)^{2^n \cdot 2}\right] =
\beta\left(\frac{\epsilon_1}{\beta}\right)^{2^{n+1}} \,.
\](c) For $\alpha = 3$ and $x_1 = 2$ we first show that $\epsilon_1 / \beta < 1/10$ without using decimal approximations for $\sqrt\alpha = \sqrt3$, starting with the obvious:
\[ 100 < 108 = 36 \cdot 3 \]\[ \sqrt{100} < \sqrt{36}\sqrt{3} \]\[ 10 < 6\sqrt{3} \]\[ 10 - 5\sqrt{3} < \sqrt{3} \]\[ 20 - 10\sqrt{3} < 2\sqrt{3} \]\[ 10(2 - \sqrt{3}) < 2\sqrt{3} \]\[ \frac{2 - \sqrt{3}}{2\sqrt{3}} < \frac{1}{10} \]\[ \frac{x_1 - \sqrt\alpha}{2\sqrt\alpha} < \frac{1}{10} \]\[ \frac{\epsilon_1}{\beta} < \frac{1}{10} \,. \]We also have\[ 12 < 16 \]\[ 4 \cdot 3 < 16 \]\[ \sqrt{4}\sqrt{3} < \sqrt{16} \]\[ 2\sqrt{3} < 4 \]\[ \beta < 4 \,, \]from which it follows that\[
\epsilon_5 < \beta \left(\frac{\epsilon_1}{\beta}\right)^{2^4} <
4 \left(\frac{\epsilon_1}{\beta}\right)^{16} <
4 \left(\frac{1}{10}\right)^{16} =
4 \cdot 10^{-16} \,.
\]Likewise\[
\epsilon_6 < \beta \left(\frac{\epsilon_1}{\beta}\right)^{2^5} <
4 \left(\frac{\epsilon_1}{\beta}\right)^{32} <
4 \left(\frac{1}{10}\right)^{32} =
4 \cdot 10^{-32} \,.
\]
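These bounds can be checked numerically. The sketch below (our own; not part of the exercise) uses Python's `decimal` module with 60 significant digits, since $\epsilon_6 \sim 10^{-32}$ is far below what double-precision floats can resolve.

```python
from decimal import Decimal, getcontext

# 60 significant digits: epsilon_6 ~ 10^-32 is invisible at double precision.
getcontext().prec = 60
alpha = Decimal(3)
sqrt3 = alpha.sqrt()

x = Decimal(2)            # x_1 = 2
eps = [x - sqrt3]         # eps[n-1] = epsilon_n = x_n - sqrt(3)
for _ in range(5):
    x = (x + alpha / x) / 2
    eps.append(x - sqrt3)

print(eps[4], eps[5])     # epsilon_5 and epsilon_6
assert eps[4] < Decimal("4e-16")
assert eps[5] < Decimal("4e-32")
```

The actual errors turn out to be considerably smaller than the stated bounds ($\epsilon_5 \approx 10^{-18}$, $\epsilon_6 \approx 10^{-37}$), which is consistent: the bounds above are deliberately coarse.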
Exercise 17
(By Dan kyp44 Whitman)
(a) First we show that $x_n > \sqrt\alpha$ implies that $0 < x_{n+1} < \sqrt\alpha$. Since clearly $\sqrt\alpha > 1$ (since $\alpha > 1$) we have \begin{align*}
0 < 1 < \sqrt\alpha &< x_n \\
0 < \alpha &< x_n^2 \\
0 < \alpha(\alpha - 1) &< x_n^2(\alpha - 1) && \text{(since $\alpha - 1 > 0$)} \\
0 < \alpha^2 - \alpha &< \alpha x_n^2 - x_n^2 \\
0 < x_n^2 + \alpha < \alpha^2 + x_n^2 &< \alpha x_n^2 + \alpha \\
0 < 2\alpha x_n < \alpha^2 + 2\alpha x_n + x_n^2 &< \alpha x_n^2 + 2\alpha x_n + \alpha \\
0 < (\alpha + x_n)^2 &< \alpha (x_n + 1)^2 \\
0 < \alpha + x_n &< \sqrt\alpha (x_n + 1) && \text{(since all sides are $\geq 0$)} \\
0 < \frac{\alpha + x_n}{x_n + 1} &< \sqrt\alpha && \text{(since $x_n+1>0$)} \\
0 < x_{n+1} &< \sqrt\alpha \,.
\end{align*}Identical reasoning with the inequalities reversed shows that $0 < x_n < \sqrt\alpha$ implies that $x_{n+1} > \sqrt\alpha$. Since $x_1 > \sqrt\alpha$, it follows from these by induction that $x_n > \sqrt\alpha$ if $n$ is odd and $0 < x_n < \sqrt\alpha$ if $n$ is even.
Now for any $n \geq 1$ we have \begin{gather}
x_{n+2} = \frac{\alpha + x_{n+1}}{1 + x_{n+1}} =
\frac{\alpha + \frac{\alpha + x_n}{1 + x_n}}{1 + \frac{\alpha + x_n}{1 + x_n}} =
\frac{\alpha(1 + x_n) + \alpha + x_n}{1 + x_n + \alpha + x_n} =
\frac{2\alpha + \alpha x_n + x_n}{\alpha + 2x_n + 1} \label{eqn:ex3.17-1}
\end{gather} So supposing that $n$ is odd we have \begin{align*}
\sqrt\alpha &< x_n \\
\alpha &< x_n^2 && \text{(since both sides are $\geq 0$)} \\
2\alpha &< 2x_n^2 \\
2\alpha + \alpha x_n + x_n &< 2x_n^2 + \alpha x_n + x_n = x_n (2x_n + \alpha + 1) \\
\frac{2\alpha + \alpha x_n + x_n}{\alpha + 2x_n + 1} &< x_n && \text{(since clearly $\alpha + 2x_n + 1 > 0$)} \\
x_{n+2} &< x_n \,.
\end{align*}Thus we have shown that the subsequence of odd indices is monotonically decreasing as required.
(b) An identical derivation as that above with the direction reversed shows that the subsequence of even indices is monotonically increasing.
(c) Consider the subsequence of odd indices, i.e. $\{x_{k_n}\}$ where $k_n = 2n - 1$ for $n \geq 1$. We have shown above that this subsequence is bounded (above by $x_1$ and below by $\sqrt\alpha$) and is monotonic. Therefore by Theorem~3.14 it converges. So let $x = \lim_{n \to \infty} x_{k_n}$. Then also clearly $\lim_{n \to \infty} x_{k_{n+1}} = \lim_{n \to \infty} x_{k_n} = x$. So since $$k_{n+1} = 2(n+1) - 1 = 2n + 1 = (2n-1)+2 = k_n + 2$$ by \eqref{eqn:ex3.17-1} we have \[
\lim_{n \to \infty} x_{k_{n+1}} =
\lim_{n \to \infty} \frac{2\alpha + \alpha x_{k_n} + x_{k_n}}{\alpha + 2x_{k_n} + 1} =
\frac{2\alpha + \alpha \lim_{n \to \infty} x_{k_n} + \lim_{n \to \infty} x_{k_n}}{\alpha + 2 \lim_{n \to \infty} x_{k_n} + 1} =
\frac{2\alpha + \alpha x + x}{\alpha + 2x + 1} = x \,.
\]Cross-multiplying and simplifying gives $x^2 = \alpha$, and since $x \geq \sqrt\alpha > 0$ it follows that $x = \lim_{n \to \infty} x_{k_n} = \sqrt\alpha$. A similar argument for the subsequence of even indices, i.e. $\{x_{l_n}\}$ where $l_n = 2n$ for $n \geq 1$, shows that it also converges to $\sqrt\alpha$.
So consider any $\epsilon > 0$. Then there is an $N$ where $|x_{k_n} - \sqrt\alpha| < \epsilon$ for all $n \geq N$. Likewise there is an $M$ where $|x_{l_n} - \sqrt\alpha| < \epsilon$ for all $n \geq M$. So let $L = \max(k_N, l_M)$ and consider any $n \geq L$. If $n$ is odd then there is an $m \geq 1$ where $n = 2m-1 = k_m$. We then have \[ n = 2m-1 \geq L \geq k_N = 2N-1\]\[ 2m \geq 2N \] \[ m \geq N \] so that $|x_n - \sqrt\alpha| = |x_{k_m} - \sqrt\alpha| < \epsilon$. A similar argument shows that $|x_n - \sqrt\alpha| < \epsilon$ in the case when $n$ is even as well. Thus we have shown that $\lim_{n \to \infty} x_n = \sqrt\alpha$ by definition.
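The behavior proved in parts (a)–(c) can be observed directly. The sketch below (function name and parameters are ours, for illustration) checks that the odd-indexed iterates sit above $\sqrt\alpha$ and decrease, while the even-indexed ones sit below and increase, with both subsequences closing in on $\sqrt\alpha$.

```python
import math

def cf_iterates(alpha, x1, n):
    """Return [x_1, ..., x_n] for the recursion x_{k+1} = (alpha + x_k) / (1 + x_k)."""
    xs = [x1]
    for _ in range(n - 1):
        x = xs[-1]
        xs.append((alpha + x) / (1 + x))
    return xs

xs = cf_iterates(alpha=3.0, x1=2.0, n=12)
s = math.sqrt(3.0)
assert all(x > s for x in xs[0::2])                    # odd-indexed terms above sqrt(3)
assert all(x < s for x in xs[1::2])                    # even-indexed terms below sqrt(3)
assert all(a > b for a, b in zip(xs[0::2], xs[2::2]))  # odd subsequence decreases
assert all(a < b for a, b in zip(xs[1::2], xs[3::2]))  # even subsequence increases
print(xs[-1], s)
```

The iterates oscillate around $\sqrt 3$ while the error shrinks by a roughly constant factor per step, which is the linear convergence quantified in part (d).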
(d)
Lemma 3.17.1: If $s_n \to 0$ for a real sequence $\{s_n\}$ then also\[ \lim_{n \to \infty} (s_n)^n = 0 \,. \]Proof: Consider any $\epsilon > 0$. If $\epsilon \geq 1$ then there is an $N \geq 1$ such that $|s_n| < 1$ for any $n \geq N$. So consider any $n \geq N$. We then have \[ |s_n^n - 0| = |s_n^n| = |s_n|^n < 1^n = 1 \leq \epsilon \,. \]If $\epsilon < 1$ then $\epsilon^n \leq \epsilon$ for $n \geq 1$. Also there is an $N \geq 1$ such that $|s_n| < \epsilon$ for all $n \geq N$. So consider any $n \geq N$, noting that also $n \geq N \geq 1$ so that \[ |s_n^n - 0| = |s_n^n| = |s_n|^n < \epsilon^n \leq \epsilon\,. \] So since $\epsilon$ was arbitrary we have shown that $(s_n)^n \to 0$.
Lemma 3.17.2: For $0 < \xi < 1$ and positive real $a$ \[ \lim_{n \to \infty} \xi^{2^n} a^n = 0\,. \]Proof: First consider any $n \geq 2$. Then by the Binomial Theorem we have\[
2^n = (1+1)^n = \sum_{k=0}^n \binom{n}{k} =
\binom{n}{0} + \binom{n}{1} + \binom{n}{2} + \cdots >
\binom{n}{2} = \frac{n!}{2!(n-2)!} =
\frac{n(n-1)}{2} \,.
\]Since $\xi < 1$ and $2^n > n(n-1) / 2$ it follows that for $n \geq 2$\begin{gather}
0 < \xi^{2^n}a^n < \xi^\frac{n(n-1)}{2} a^n \,. \label{eqn:ex3.17-2}\end{gather}But we have
\begin{gather}
\xi^\frac{n(n-1)}{2} a^n = \left(\xi^\frac{n-1}{2} a\right)^n \,, \label{eqn:ex3.17-3}
\end{gather}and clearly, since $0 < \xi < 1$ so that $0 < \sqrt\xi < 1$ and $\xi^\frac{n-1}{2} = \left(\sqrt\xi\right)^{n-1}$, by Theorem 3.20e\[ \lim_{n\to\infty} \xi^\frac{n-1}{2} a = a \lim_{n\to\infty} \xi^\frac{n-1}{2} = 0 \]so that by Lemma 3.17.1\[ \lim_{n\to\infty} \left(\xi^\frac{n-1}{2} a\right)^n = 0 \,. \]The desired result then follows from \eqref{eqn:ex3.17-2}, \eqref{eqn:ex3.17-3}, and Theorem 3.19.
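A quick numerical check of Lemma 3.17.2 (our own sketch, with arbitrarily chosen $\xi = 0.9$ and $a = 50$): the sequence can grow substantially at first, but the doubly exponential decay of $\xi^{2^n}$ eventually overwhelms the merely geometric growth of $a^n$.

```python
# xi^(2^n) * a^n for n = 1, ..., 11: rises at first, then collapses to 0,
# because 2^n eventually dwarfs the linear exponent n.
xi, a = 0.9, 50.0
vals = [xi ** (2 ** n) * a ** n for n in range(1, 12)]
print(vals[0], max(vals), vals[-1])
```

Here the sequence peaks above $10^7$ before crashing far below machine epsilon, illustrating why the lemma needs the quadratic lower bound $2^n > n(n-1)/2$ rather than a cruder linear one.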
To start the main argument let $\{y_n\}$ denote the sequence from Exercise~3.16 and $\{x_n\}$ denote the sequence from this exercise, setting $y_1 = x_1 > \sqrt\alpha$. We also define $\delta_n = y_n - \sqrt\alpha$ and $\epsilon_n = |x_n - \sqrt\alpha|$, noting that the absolute value for $\epsilon_n$ is necessary since it was shown in part a that $x_n < \sqrt\alpha$ for even $n$.
First we show that $\delta_n \to 0$, which is straightforward given that it was shown in Exercise~3.16a that $y_n \to \sqrt\alpha$:\[
\lim_{n \to \infty} \delta_n =
\lim_{n \to \infty} (y_n - \sqrt\alpha) =
\lim_{n \to \infty} y_n - \sqrt\alpha =
\sqrt\alpha - \sqrt\alpha = 0 \,.
\]Next we show that there is an $N \geq 1$ where $\delta_N / \beta < 1$, where again $\beta = 2\sqrt\alpha$. This follows immediately from the fact that $\delta_n \to 0$ since this implies that there is an $N \geq 1$ such that $|\delta_n - 0| = |\delta_n| = \delta_n < \beta$ for all $n \geq N$, since $\beta > 0$. In particular $\delta_N < \beta$, from which it immediately follows that $\delta_N / \beta < 1$ since $\beta > 0$.
Now in Exercise~3.16b it was shown that $\delta_{n+1} < \delta_n^2 / \beta$ for all $n \geq 1$. Similar to the derivation that follows in Exercise~3.16b we claim that $\delta_n \leq \beta (\delta_N / \beta)^{2^{n-N}}$ for all $n \geq N$, and to the surprise of no one we show this by induction. For $n=N$ we have
\[
\delta_n = \delta_N \leq \delta_N =
\beta \frac{\delta_N}{\beta} = \beta \left(\frac{\delta_N}{\beta}\right)^1 =
\beta \left(\frac{\delta_N}{\beta}\right)^{2^0} =
\beta \left(\frac{\delta_N}{\beta}\right)^{2^{N-N}} =
\beta \left(\frac{\delta_N}{\beta}\right)^{2^{n-N}} \,.
\]Now assume that $\delta_n \leq \beta (\delta_N / \beta)^{2^{n-N}}$. We then have\[
\delta_{n+1} \leq \frac{\delta_n^2}{\beta} \leq
\frac{1}{\beta}\left[\beta \left(\frac{\delta_N}{\beta}\right)^{2^{n-N}}\right]^2 =
\frac{\beta^2}{\beta} \left(\frac{\delta_N}{\beta}\right)^{2^{n-N+1}} =
\beta \left(\frac{\delta_N}{\beta}\right)^{2^{(n+1)-N}} \,.
\]Turning our attention to $\{x_n\}$, we first claim that $\epsilon_1 > \epsilon_2$. To see this we have
\begin{align*}
\sqrt\alpha < x_1 &< 2 + x_1 \\
\sqrt\alpha - 1 &< 1 + x_1 \\
(x_1 - \sqrt\alpha)(\sqrt\alpha - 1) &< (1 + x_1)(x_1 - \sqrt\alpha) && \text{(since $x_1 - \sqrt\alpha > 0$)} \\
\sqrt\alpha + \sqrt\alpha x_1 - \alpha - x_1 &< (1 + x_1)(x_1 - \sqrt\alpha) \\
\sqrt\alpha(1 + x_1) - (\alpha + x_1) &< (1 + x_1)(x_1 - \sqrt\alpha) \\
\sqrt\alpha - \frac{\alpha + x_1}{1 + x_1} &< x_1 - \sqrt\alpha && \text{(since $1 + x_1 > 0$)} \\
\sqrt\alpha - x_2 &< x_1 - \sqrt\alpha \\
\epsilon_2 &< \epsilon_1 \,.
\end{align*}We also claim that $\epsilon_{n+1} \geq \epsilon_n \gamma$ for any $n \geq 1$, where we have let\[ \gamma = \frac{\sqrt\alpha - 1}{1 + x_1} \,. \]To see this consider any $n \geq 1$. If $n$ is odd then we have
\begin{align*}
\epsilon_{n+1} &= |x_{n+1} - \sqrt\alpha| = \sqrt\alpha - x_{n+1} && \text{(since $n+1$ is even)} \\
&= \sqrt\alpha - \frac{\alpha + x_n}{1 + x_n} = \frac{\sqrt\alpha + \sqrt\alpha x_n - \alpha - x_n}{1+ x_n} \\
&= \frac{(\sqrt\alpha - x_n)(1 - \sqrt\alpha)}{1 + x_n} = \frac{(x_n - \sqrt\alpha)(\sqrt\alpha - 1)}{1 + x_n} \\
&= \epsilon_n \frac{\sqrt\alpha - 1}{1 + x_n} \geq \epsilon_n \frac{\sqrt\alpha - 1}{1 + x_1} = \epsilon_n \gamma\,, && \text{(since $n$ is odd)}
\end{align*}where we have used the fact that $x_n \leq x_1$ for all odd $n \geq 1$ since the odd indexed sequence decreases monotonically as shown in part a. On the other hand if $n$ is even then
\begin{align*}
\epsilon_{n+1} &= |x_{n+1} - \sqrt\alpha| = x_{n+1} - \sqrt\alpha && \text{(since $n+1$ is odd)} \\
&= \frac{\alpha + x_n}{1 + x_n} - \sqrt\alpha = \frac{\alpha + x_n - \sqrt\alpha - \sqrt\alpha x_n}{1+ x_n} \\
&= \frac{(\sqrt\alpha - x_n)(\sqrt\alpha - 1)}{1 + x_n} = \epsilon_n \frac{\sqrt\alpha - 1}{1 + x_n} && \text{(since $n$ is even)} \\
&\geq \epsilon_n \frac{\sqrt\alpha - 1}{1 + x_1} = \epsilon_n \gamma\,,
\end{align*} where we have used the fact that $x_n < \sqrt\alpha < x_1$ for even $n$ as shown in part a. So from this we can easily show that $\epsilon_n \geq \epsilon_1 \gamma^{n-1}$ for all $n \geq 1$ by induction. For $n=1$ we have
\[ \epsilon_n = \epsilon_1 \geq \epsilon_1 = \epsilon_1 \cdot 1 = \epsilon_1 \gamma^0 = \epsilon_1 \gamma^{1-1} = \epsilon_1 \gamma^{n-1} \,. \]Now assume that $\epsilon_n \geq \epsilon_1 \gamma^{n-1}$. Then we have \[ \epsilon_{n+1} \geq \epsilon_n \gamma \geq \epsilon_1 \gamma^{n-1} \gamma = \epsilon_1 \gamma^{n-1+1} = \epsilon_1 \gamma^{(n+1)-1} \,. \]
Since $0 < \delta_N / \beta < 1$ and $\gamma > 0$, Lemma~3.17.2 implies that there is an $M \geq 1$ where \[ \left(\frac{\delta_N}{\beta}\right)^{2^m} \left(\frac{1}{\gamma}\right)^m < \gamma^{N-1}\frac{\epsilon_1}{\beta} \]for all $m \geq M$, noting that the right hand side is positive. So then let $L = N + M$ and consider an arbitrary $n \geq L$. Letting $m = n - N$ we clearly have $n = m + N$ so that \[ n \geq L \]\[ m + N \geq N + M \]\[ m \geq M \,. \]Therefore we have
\begin{align*}
\left(\frac{\delta_N}{\beta}\right)^{2^m} \left(\frac{1}{\gamma}\right)^m &< \gamma^{N-1}\frac{\epsilon_1}{\beta} \\
\left(\frac{\delta_N}{\beta}\right)^{2^m} &< \gamma^{m + N - 1} \frac{\epsilon_1}{\beta} && \text{(since $\gamma^m > 0$)} \\
\beta \left(\frac{\delta_N}{\beta}\right)^{2^m} &< \epsilon_1 \gamma^{m + N - 1} && \text{(since $\beta > 0$)} \\
\beta \left(\frac{\delta_N}{\beta}\right)^{2^{n - N}} &< \epsilon_1 \gamma^{n - N + N - 1} \\
\beta \left(\frac{\delta_N}{\beta}\right)^{2^{n - N}} &< \epsilon_1 \gamma^{n-1}
\end{align*}But since $n \geq L = N + M \geq N \geq 1$, by what was shown before and transitivity we have
\[ \delta_n \leq \beta\left(\frac{\delta_N}{\beta}\right)^{2^{n - N}} < \epsilon_1 \gamma^{n-1} \leq \epsilon_n \,. \]Thus we have shown that for all $n \geq L$
\[ \delta_n < \epsilon_n\,. \]This shows that $\{y_n\}$ converges more rapidly than $\{x_n\}$.
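As a closing illustration (our own sketch, not part of the exercise), running both recursions side by side from the same starting point $x_1 = y_1 = 2$ with $\alpha = 3$ shows the quadratically convergent Exercise 16 iterate racing far ahead of the linearly convergent Exercise 17 iterate within a few steps.

```python
import math

alpha, s = 3.0, math.sqrt(3.0)
y = x = 2.0                        # common starting point y_1 = x_1 = 2
for n in range(2, 8):
    y = (y + alpha / y) / 2        # Exercise 16: delta_n = y_n - sqrt(alpha)
    x = (alpha + x) / (1 + x)      # Exercise 17: eps_n = |x_n - sqrt(alpha)|
    print(n, abs(y - s), abs(x - s))

assert abs(y - s) < abs(x - s)     # delta_n < eps_n, as proved above
```

By $n = 7$ the Exercise 16 error is already at the limit of double precision, while the Exercise 17 error is still around $10^{-4}$, matching the conclusion $\delta_n < \epsilon_n$ for large $n$.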