1. Solution: We give a counterexample. Define $T \in \mathcal{L}(\mathbb{R}^2)$ by

$$ \begin{aligned} Te_1 = e_1\\ Te_2 = -e_2 \end{aligned} $$ where $e_1, e_2$ is the standard basis of $\mathbb{R}^2$. The matrix of $T$ with respect to this same basis is

$$ \begin{pmatrix}1 & 0\\0 & -1\end{pmatrix}, $$ which equals its transpose, so $T$ is self-adjoint. Moreover, the basis $\frac{1}{\sqrt{2}}(e_1 + e_2), \frac{1}{\sqrt{2}}(e_1 - e_2)$ is orthonormal and

$$ \begin{aligned} \left\langle T\left(\frac{1}{\sqrt{2}}(e_1 + e_2)\right), \frac{1}{\sqrt{2}}\left(e_1 + e_2\right) \right\rangle &= 0\\ \left\langle T\left(\frac{1}{\sqrt{2}}(e_1 - e_2)\right), \frac{1}{\sqrt{2}}\left(e_1 - e_2\right) \right\rangle &= 0, \end{aligned} $$ but $T$ is not positive because

$$ \langle Te_2, e_2 \rangle = \langle -e_2, e_2 \rangle = -1. $$
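As a quick numerical sanity check (not part of the solution itself), the sketch below verifies the three claimed facts about this counterexample: the matrix is symmetric, $\langle Tb, b\rangle = 0$ for both vectors of the rotated orthonormal basis, and $\langle Te_2, e_2\rangle = -1$.

```python
import math

# Matrix of T from the solution: Te1 = e1, Te2 = -e2 (standard basis of R^2).
T = [[1.0, 0.0], [0.0, -1.0]]

def apply(M, v):
    """Multiply a 2x2 matrix by a vector in R^2."""
    return [M[0][0]*v[0] + M[0][1]*v[1], M[1][0]*v[0] + M[1][1]*v[1]]

def inner(u, v):
    """Standard inner product on R^2."""
    return u[0]*v[0] + u[1]*v[1]

# T is self-adjoint: its matrix equals its transpose.
assert T[0][1] == T[1][0]

# On the rotated orthonormal basis, <Tb, b> = 0 for both basis vectors.
s = 1/math.sqrt(2)
b1, b2 = [s, s], [s, -s]
print(inner(apply(T, b1), b1))  # 0.0
print(inner(apply(T, b2), b2))  # 0.0

# Yet T is not positive: <Te2, e2> = -1.
e2 = [0.0, 1.0]
print(inner(apply(T, e2), e2))  # -1.0
```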

2. Solution: Since $T$ is a positive operator on $V$, we have \begin{equation}\label{7CP2.1} \langle T(v-w),v-w\rangle\ge 0. \end{equation}On the other hand, \[Tv=w\quad\text{ and } \quad Tw=v\]imply that $T(v-w)=w-v$, hence \begin{equation}\label{7CP2.2}\langle T(v-w),v-w\rangle=-\langle v-w,v-w\rangle\le 0.\end{equation}Therefore $\langle v-w,v-w\rangle=0$ by (\ref{7CP2.1}) and (\ref{7CP2.2}), i.e. $v=w$.

3. Solution: For all $u \in U$, we have

$$ \langle T|_U u, u \rangle = \langle Tu, u \rangle = \langle u, Tu \rangle = \langle u, T|_U u \rangle. $$ Thus, $T|_U$ is self-adjoint. Furthermore,

$$ \langle T|_U u, u \rangle = \langle Tu, u \rangle \ge 0, $$ which shows that $T|_U$ is positive.

4. Solution: By 7.6 (c) and (e), we have $$(TT^*)^*=(T^*)^*T^*=TT^*,\quad (T^*T)^*=T^*(T^*)^*=T^*T.$$Hence both $TT^*$ and $T^*T$ are self-adjoint.

On the other hand, for any $v\in V$, we have\[\langle T^*Tv,v\rangle =\langle Tv,(T^*)^*v\rangle=\langle Tv,Tv\rangle\geqslant 0.\]Hence $T^*T$ is a positive operator.

Similarly, for any $w\in W$, we have\[\langle TT^*w,w\rangle =\langle T^*w,T^*w\rangle\geqslant 0.\]Hence $TT^*$ is a positive operator.

5. Solution: Suppose $T$ and $S$ are positive operators on $V$; then $T^*=T$ and $S^*=S$. Therefore, we have $$(T+S)^*=T^*+S^*=T+S.$$Hence $T+S$ is self-adjoint.

Again, since $T$ and $S$ are positive operators on $V$, for any $v\in V$ we have $\langle Tv,v\rangle \geqslant 0$ and $\langle Sv,v\rangle \geqslant 0$. Thus\[\langle(T+S)v,v \rangle=\langle Tv,v\rangle+\langle Sv,v\rangle\geqslant 0.\]Therefore $T+S$ is a positive operator.

6. Solution: Since $T$ is positive, it follows from 7.35 (a) $\iff$ (d) that there exists a self-adjoint operator $S$ such that $S^2=T$. For any positive integer $k$, we have $(S^k)^*=(S^*)^k=S^k$ by 7.6 (e) and the equality $S=S^*$. Hence $S^k$ is self-adjoint.

Note that we also have $(S^k)^2=(S^2)^k=T^k$, hence $T^k$ has a self-adjoint square root. It follows from 7.35 (a) $\iff$ (d) again that $T^k$ is positive.
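For diagonal positive operators the identity $(S^k)^2 = T^k$ can be seen concretely. The sketch below (a sample check, not part of the solution) takes a diagonal positive $T$, forms its self-adjoint square root $S$ entrywise, and verifies that $S^k$ squares to $T^k$.

```python
import math

# A sample positive operator: diagonal with nonnegative entries,
# so its self-adjoint square root S is the entrywise square root.
T_diag = [4.0, 9.0, 0.25]
S_diag = [math.sqrt(t) for t in T_diag]

k = 5
Tk = [t**k for t in T_diag]
Sk = [s**k for s in S_diag]

# (S^k)^2 == T^k entrywise, so S^k is a self-adjoint square root of T^k.
for a, b in zip(Tk, Sk):
    assert abs(a - b*b) < 1e-9
print("T^k = (S^k)^2")
```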

7. Solution: Suppose $\langle Tv,v\rangle >0$ for every $v\in V$ with $v\ne 0$. If $T$ is not invertible, there must exist a nonzero $u\in V$ such that $Tu=0$, hence $\langle Tu,u\rangle =0$ for $u\ne 0$. Therefore we get a contradiction, which in turn implies that $T$ is invertible.

Conversely, suppose $T$ is invertible. Since $T$ is positive, it follows from 7.35 (a) $\iff$ (d) that there exists a self-adjoint operator $S$ such that $S^2=T$. Because $T$ is injective, so is $S$. Hence for every $v\in V$ with $v\ne 0$, we have $Sv\ne 0$. Moreover, since $S$ is self-adjoint (and $Sv\ne 0$), we have$$\langle Tv,v\rangle =\langle S^2v,v\rangle =\langle Sv,Sv\rangle >0.$$

8. Solution: If $\langle \cdot,\cdot\rangle_T$ is an inner product on $V$, then for any $v\in V$ we have $$\langle Tv,v\rangle =\langle v,v\rangle_T\geqslant 0.$$Hence $T$ is positive. Moreover, for any nonzero $v\in V$ we have $$\langle Tv,v\rangle =\langle v,v\rangle_T>0.$$It follows from Problem 7 that $T$ is invertible.

Conversely, suppose that $T$ is an invertible positive operator. We show that $\langle \cdot,\cdot\rangle_T$ is an inner product on $V$ by checking definition 6.3.

Positivity: Since $T$ is positive, we have $$\langle v,v\rangle_T=\langle Tv,v\rangle \geqslant 0.$$ Definiteness: If $v=0$, then $$\langle v,v\rangle_T=\langle Tv,v\rangle =0.$$If $\langle v,v\rangle_T=0$, then since $T$ is an invertible positive operator on $V$, it follows from Problem 7 that $v=0$.

Additivity, homogeneity, and conjugate symmetry can be checked directly without any difficulty.

9. Solution: Let $e_1,e_2$ be an orthonormal basis of $\mathbb F^2$ and $\theta\in[0,2\pi)$; define $T_\theta\in\mathcal L(\mathbb F^2)$ by\[T_\theta e_1=\cos\theta e_1+\sin\theta e_2,\quad T_\theta e_2=\sin \theta e_1-\cos\theta e_2.\]Note that $$\cos\theta=\langle T_\theta e_1,e_1\rangle=\langle e_1,(T_\theta)^*e_1\rangle,$$ $$\sin\theta=\langle T_\theta e_2,e_1\rangle=\langle e_2,(T_\theta)^*e_1\rangle,$$ hence $(T_\theta)^*e_1=\cos\theta e_1+\sin\theta e_2$. Similarly, note that $$\sin\theta=\langle T_\theta e_1,e_2\rangle=\langle e_1,(T_\theta)^*e_2\rangle,$$ $$-\cos\theta=\langle T_\theta e_2,e_2\rangle=\langle e_2,(T_\theta)^*e_2\rangle,$$ hence $(T_\theta)^*e_2=\sin\theta e_1-\cos\theta e_2$. Thus $T_\theta=(T_\theta)^*$, which implies that $T_\theta$ is self-adjoint. Also note that \begin{align*}& (T_\theta)^2e_1=T_\theta(\cos\theta e_1+\sin\theta e_2)\\ =&\cos\theta(\cos\theta e_1+\sin\theta e_2)+\sin\theta(\sin \theta e_1-\cos\theta e_2)\\=&(\cos^2\theta+\sin^2\theta)e_1=e_1,\end{align*} \begin{align*}& (T_\theta)^2e_2=T_\theta(\sin\theta e_1-\cos\theta e_2)\\ =&\sin\theta(\cos\theta e_1+\sin\theta e_2)-\cos\theta(\sin \theta e_1-\cos\theta e_2)\\=&(\cos^2\theta+\sin^2\theta)e_2=e_2,\end{align*}so $(T_\theta)^2=\mathrm{id}$. Therefore we have infinitely many self-adjoint square roots of $\mathrm{id}$.

The construction comes from the following idea. Let $T$ be self-adjoint such that $T^2=\mathrm{id}$, and let $\begin{pmatrix}a & b\\ c & d\end{pmatrix}$ be the matrix of $T$ with respect to an orthonormal basis of $\mathbb F^2$. Since $T$ is self-adjoint, we have $$\begin{pmatrix}a & b\\ c & d\end{pmatrix}=\begin{pmatrix}\bar a & \bar c\\ \bar b & \bar d\end{pmatrix}.$$ Hence $a,d\in\mathbb R$ and $b=\bar c$. If $T^2=\mathrm{id}$, then $a^2+bc=1$, $ab+bd=0$, and $d^2+bc=1$. Hence we can take $a=-d$ and $b=c$, which gives $a^2+b^2=1$. Taking $a=\cos\theta$ and $b=\sin\theta$, we get the construction.
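As a numerical sanity check of this construction (an illustration, not part of the solution), the snippet below builds the matrix of $T_\theta$ for several sample angles and confirms that each one is symmetric and squares to the identity.

```python
import math

def T_theta(theta):
    """Matrix of T_theta with respect to the orthonormal basis e1, e2."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, s], [s, -c]]

def matmul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

for theta in [0.0, 0.3, 1.0, 2.5]:
    M = T_theta(theta)
    assert M[0][1] == M[1][0]  # symmetric, hence self-adjoint
    M2 = matmul(M, M)
    # (T_theta)^2 should be the identity, up to rounding.
    assert abs(M2[0][0] - 1) < 1e-12 and abs(M2[1][1] - 1) < 1e-12
    assert abs(M2[0][1]) < 1e-12 and abs(M2[1][0]) < 1e-12

print("every T_theta is a self-adjoint square root of the identity")
```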

10. Solution: “(a)$\Longrightarrow$(b)” If $S$ is an isometry, so is $S^*$ by 7.42 (g)$\iff$(a). It follows from 7.42 (a)$\iff$(b) that $$\langle S^*u,S^*v\rangle =\langle u,v\rangle$$ for all $u,v\in V$.

“(b)$\Longrightarrow$(c)” If $e_1,\cdots,e_m$ is an orthonormal basis of $V$, we have $\langle e_i,e_j\rangle =\delta_{ij}$. Since $$\langle S^*u,S^*v\rangle =\langle u,v\rangle$$ for all $u,v\in V$, we have\[\langle S^*e_i,S^*e_j\rangle=\langle e_i,e_j\rangle =\delta_{ij}.\]Hence $S^*e_1,\cdots,S^*e_m$ is an orthonormal basis of $V$.

“(c)$\Longrightarrow$(d)” is trivial.

“(d)$\Longrightarrow$(a)” It follows from 7.42 (a)$\iff$(d) that $S^*$ is an isometry. So is $S$ by 7.42 (g)$\iff$(a) since $(S^*)^*=S$ from 7.6 (c).

11. Solution: Let $e_1, e_2, e_3$ and $f_1, f_2, f_3$ be orthonormal bases of $\mathbb{F}^3$ consisting of eigenvectors of $T_1$ and $T_2$, respectively, corresponding to the eigenvalues $2, 5, 7$.

Define $S$ by

$$ Se_j = f_j $$ for $j = 1, 2, 3$. One easily checks that $S$ is an isometry (using the Pythagorean Theorem). Then, because $S^{-1} = S^*$ (by 7.42), we have $S^*f_j = e_j$. Thus $$ T_1e_1 = 2e_1 = S^*(2f_1) = S^*(T_2f_1) = S^*T_2Se_1. $$ Similarly $T_1e_2 = S^*T_2Se_2$ and $T_1e_3 = S^*T_2Se_3$. Therefore $T_1 = S^*T_2S$.
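This construction is easy to check numerically. The sketch below (an illustration with a sample choice of bases, not part of the solution) takes $T_1$ diagonal in the standard basis, builds $T_2$ with the same eigenvalues $2, 5, 7$ but a rotated eigenbasis $f_1, f_2, f_3$, sets $Se_j = f_j$, and confirms $S^*T_2S = T_1$.

```python
import math

def matmul(A, B):
    """Product of two square matrices of the same size."""
    n = len(A)
    return [[sum(A[i][k]*B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    return [list(row) for row in zip(*A)]

# T1: eigenvalues 2, 5, 7 with the standard basis as eigenvectors.
D = [[2, 0, 0], [0, 5, 0], [0, 0, 7]]

# Columns of F form a sample orthonormal basis f1, f2, f3 (a rotation).
c, s = math.cos(0.7), math.sin(0.7)
F = [[c, -s, 0], [s, c, 0], [0, 0, 1]]

# T2 = F D F^T has eigenvalues 2, 5, 7 with eigenvectors f_j.
T2 = matmul(matmul(F, D), transpose(F))

# S e_j = f_j, so the matrix of S is F; S is an isometry (orthogonal).
S = F
T1_rec = matmul(matmul(transpose(S), T2), S)  # S* T2 S

for i in range(3):
    for j in range(3):
        assert abs(T1_rec[i][j] - D[i][j]) < 1e-12
print("S* T2 S == T1")
```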

12. Solution: Let $e_1, e_2, e_3, e_4$ denote an orthonormal basis of $\mathbb{F}^4$. Define $T_1, T_2 \in \mathcal{L}(\mathbb{F}^4)$ by

$$ \begin{aligned} T_1e_1 &= 2e_1\\ T_1e_2 &= 2e_2\\ T_1e_3 &= 5e_3\\ T_1e_4 &= 7e_4\\ \\ T_2e_1 &= 2e_1\\ T_2e_2 &= 5e_2\\ T_2e_3 &= 5e_3\\ T_2e_4 &= 7e_4. \end{aligned} $$ Then both $T_1$ and $T_2$ are self-adjoint (the matrices equal their transposes) and $2, 5, 7$ are their eigenvalues. Suppose by contradiction that $S$ is an isometry on $V$ such that $T_1 = S^*T_2S$. Let $v \in V$ be the vector that $S$ maps to $e_2$. Then

$$ T_1v = S^*T_2Sv = S^*T_2e_2 = 5S^*e_2 = 5v. $$

Therefore $v \in E(T_1, 5) = \operatorname{span}(e_3)$. Let also $w \in V$ be the vector that $S$ maps to $e_3$. Note that $v, w$ is linearly independent, because $e_2, e_3$ is linearly independent. Then

$$ T_1w = S^*T_2Sw = S^*T_2e_3 = 5S^*e_3 = 5w. $$ Therefore $w \in E(T_1, 5) = \operatorname{span}(e_3)$. But this is a contradiction, because we can’t have a linearly independent list of length $2$, $v, w$, in a $1$-dimensional vector space, $\operatorname{span}(e_3)$. Hence, there does not exist such $S$.

Notice that it wasn’t necessary to require $S$ to be an isometry; we just needed to suppose, by contradiction, the existence of an invertible $S$ such that $T_1 = S^{-1}T_2S$. This $S$ does not exist. Since the desired isometry would satisfy the same property (because the adjoint of an isometry equals its inverse), it follows that there cannot exist such an isometry. The key idea here is that the eigenspaces of $T_1$ and $T_2$ don’t fit.
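The eigenspace mismatch can be stated as a multiplicity count. The snippet below (a sanity check, not part of the solution) compares the eigenvalue multisets of $T_1$ and $T_2$: the eigenvalue sets agree, but the multiplicity of $5$ differs, and similar operators must have equal multiplicities.

```python
from collections import Counter

# Diagonal entries = eigenvalues of T1 and T2 from the solution.
T1_eigs = [2, 2, 5, 7]
T2_eigs = [2, 5, 5, 7]

# Same set of eigenvalues ...
assert set(T1_eigs) == set(T2_eigs) == {2, 5, 7}

# ... but different multiplicities: dim E(T1, 5) != dim E(T2, 5).
print(Counter(T1_eigs)[5], Counter(T2_eigs)[5])  # 1 2

# Similar operators have equal eigenvalue multiplicities, so no invertible S
# (in particular, no isometry) can satisfy T1 = S^{-1} T2 S.
assert Counter(T1_eigs) != Counter(T2_eigs)
```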

13. Solution: It is false. Let $e_1, \dots, e_n$ be an orthonormal basis of $V$ and define $Se_j = e_1$ for $j = 1, \dots, n$. Then $\|Se_j\| = 1$ for each $j$, but $S$ is clearly not invertible; therefore $S$ is not an isometry (7.42 requires isometries to be invertible).
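Concretely (a small check of this counterexample, with a hypothetical dimension $n = 3$): $S$ preserves the norm of every basis vector, yet it sends $e_1 - e_2$ to $0$, so it is neither injective nor norm-preserving on all of $V$.

```python
import math

n = 3  # hypothetical dimension; the argument works for any n >= 2

def S(v):
    """S e_j = e_1 for every j, extended linearly: S(v) = (v_1 + ... + v_n) e_1."""
    return [sum(v)] + [0.0]*(n - 1)

def norm(v):
    return math.sqrt(sum(x*x for x in v))

# ||S e_j|| = 1 for every basis vector ...
for j in range(n):
    e = [0.0]*n
    e[j] = 1.0
    assert norm(S(e)) == 1.0

# ... but S is not an isometry: it kills e_1 - e_2.
v = [1.0, -1.0, 0.0]
print(norm(v), norm(S(v)))  # sqrt(2) and 0.0
```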

14. Solution: In the exercise, $T$ was already shown to be self-adjoint. So $-T$ is also self-adjoint. Note that $$1, \cos x, \cos 2x, \dots, \cos nx, \sin x, \sin 2x, \dots, \sin nx$$ is a basis of $V$ consisting of eigenvectors of $-T$ whose corresponding eigenvalues are all nonnegative. Thus by 7.35 $-T$ is positive.

## David

28 Mar 2022

For #11, aren't T1 and T2 equal? They have the same eigenvalues by assumption and the same eigenvectors per theorem 7.21. By theorem 7.22, for a normal operator the eigenvectors corresponding to distinct eigenvalues are orthogonal, thus we can form a single basis for V consisting of eigenvectors of T1 and T2, and with respect to this, T1 and T2 have the same diagonal matrix. Thus T1 = T2. From here it's trivial to show that there exists an S such that the product of S and S* must be I, i.e. there exists an isometry S that solves the problem.

## David

4 Dec 2021

7C3 alternate: Since T is positive, it is self-adjoint and has all nonnegative eigenvalues (7.35B). Since U is invariant under T, T|_U is also self-adjoint by 7.28. Thus, to show that T|_U is positive we need only show that T|_U has all nonnegative eigenvalues (7.35B again). Since T|_U is just T restricted to a subspace of V, its eigenvalues, if it has any, are a subset of the eigenvalues of T and are therefore all nonnegative, and because it's self-adjoint it must have at least one. Thus T|_U is positive.

## Zheng Chen

27 Feb 2020

Question! How do we use the fact that U is invariant in Q3?

## Zheng Chen

27 Feb 2020

Also, for Q3, what if v \in V but v \notin W, then Tv may not even make sense.

## Zheng Chen

27 Feb 2020

Sorry, this is my mistake, please ignore this one.

## Linearity

27 Feb 2020

It was used in the equality $T|_Uu=Tu$. First, $T|_Uu$ is well-defined on $U$, and second, $T|_Uu=Tu$.

## David

4 Dec 2021

You can use 7.28 with the fact that U is invariant under T to show that T|_U is self-adjoint.

## Mike

19 Nov 2019

7C11) Such orthonormal bases might not exist. According to the Spectral Theorem, T1 and T2 being normal guarantees orthonormal bases of eigenvectors only if F = C. For F = R, it requires T1 and T2 to be self-adjoint.

## Marcel Ackermann

24 Mar 2018

7C11) Let u_2, u_5, u_7 be the corresponding eigenvectors of T_1, and v_2, v_5, v_7 those of T_2. Define S by S u_i = v_i. We want to show that T_1 u_i = S* T_2 S u_i. So: T_1 u_i = S* T_2 S u_i <=> i u_i = S* T_2 v_i <=> i u_i = S* (i v_i) <=> u_i = S* v_i. According to 7.42d, S is an isometry iff there is an orthonormal basis e_i of V such that the S e_i are orthonormal, which we have. So S^-1 = S*.

## Marcel Ackermann

21 Sep 2017

7C6) Alternative: for k even, write $k=2m$; then $$\langle T^kv,v\rangle=\langle T^mv,T^mv\rangle\geq 0$$ ($T$ is positive, so it is self-adjoint; the inequality is by positivity of the inner product), hence $T^k$ is positive.

For k odd, write $k=2m+1$; then $$\langle T^kv,v\rangle=\langle T(T^mv),T^mv\rangle\geq 0$$ (because $T$ is positive), hence $T^k$ is positive.

## Célio Passos

18 Sep 2017

In the forward direction of Exercise 8, you have to show that T is self-adjoint before claiming it is positive.

## Marcel Ackermann

29 Aug 2017

7C3) By definition $T|_U$ is an operator on $U$. Is it positive? I.e., is $\langle T|_Uu,u\rangle\geq 0$ for every $u\in U$? Clearly yes, because $T$ is positive (even more elements than the restriction fulfill the property).

## Mohammad Rashidi

30 Aug 2017

Thanks

## Marcel Ackermann

30 Jul 2017

7C1) Counterexample: Let $T(x_1, x_2) = (-x_2, -x_1)$. It is self-adjoint: $\langle T(v_1, v_2), (w_1, w_2)\rangle = \langle (-v_2, -v_1), (w_1, w_2)\rangle = -v_2w_1 - v_1w_2 = \langle (v_1, v_2), T(w_1, w_2)\rangle$, and $\langle Te_j, e_j\rangle = 0$ for the standard basis. Yet T is not a positive operator: $\langle T(1,1), (1,1)\rangle = -2$.

## Adol

24 Apr 2017

#13 is in the solutions manual you refer us to commonly; it's in chapter 7 #21.

## Matt Lundy

17 Mar 2016

What are your goals for this site? I would be happy to contribute some solutions if you are interested.

## Hunter

25 Mar 2016

Please do this! There are a lot of missing solutions and this site has helped me tremendously!