Note that it is possible that one of the eigenvalues $a_j = 0$, in which case the term $a_j \hat{P}_{a_j} = 0$ may be omitted from the spectral decomposition.  However, the completeness relation $\sum_j \hat{P}_{a_j} = \hat{I}$ only holds if we include the projector onto the eigenspace with eigenvalue zero.
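
For concreteness, here is a minimal numerical sketch of the discrete case, assuming NumPy is available (the matrix below is an arbitrary illustrative choice with a zero eigenvalue): it builds the projectors $\hat{P}_{a_j}$ onto the eigenspaces and checks both the completeness relation and the spectral decomposition.

<code python>
import numpy as np

# A Hermitian matrix chosen (for illustration) to have eigenvalues 0, 2 and 3
A = np.array([[1, 1, 0],
              [1, 1, 0],
              [0, 0, 3]], dtype=float)

# eigh returns real eigenvalues and orthonormal eigenvectors for Hermitian matrices
eigvals, eigvecs = np.linalg.eigh(A)

# Group eigenvectors into projectors P_a = sum_k |a,k><a,k|, one per distinct eigenvalue
projectors = {}
for a, v in zip(np.round(eigvals, 10), eigvecs.T):
    projectors[a] = projectors.get(a, np.zeros_like(A)) + np.outer(v, v.conj())

# Completeness sum_a P_a = I holds only if the a = 0 projector is included
print(np.allclose(sum(projectors.values()), np.eye(3)))           # True

# Spectral decomposition A = sum_a a * P_a (the a = 0 term contributes nothing)
print(np.allclose(sum(a * P for a, P in projectors.items()), A))  # True
</code>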
  
Note that if the operator $\hat{A}$ has a continuous spectrum on an interval $a_{\mathrm{min}} < a < a_{\mathrm{max}}$, then we would have to write orthogonality as $\hat{P}_a\hat{P}_{a'} = \delta(a - a') \hat{P}_a$, completeness as
\[\int_{a_{\mathrm{min}}}^{a_{\mathrm{max}}} \hat{P}_{a} \, \mathrm{d}a = \hat{I},\]
and the spectral decomposition as
\[\hat{A} = \int_{a_{\mathrm{min}}}^{a_{\mathrm{max}}} a \hat{P}_{a} \, \mathrm{d}a.\]
Even more generally, $\hat{A}$ might have a discrete set $a_1,a_2,\ldots$ of eigenvalues within some interval as well as a continuous set $a_{\mathrm{min}} < a < a_{\mathrm{max}}$ in a disjoint interval, and then we would have to write the spectral decomposition as
\[\hat{A} = \sum_j a_j \hat{P}_{a_j} + \int_{a_{\mathrm{min}}}^{a_{\mathrm{max}}} a \hat{P}_{a} \, \mathrm{d}a.\]
  
In these notes, we will prove the finite-dimensional case.  That proof is fairly straightforward, but involved, so it has [[the_spectral_theorem|its own page]].  On this page, we will prove a small part of the theorem.

**Theorem**
The eigenvectors corresponding to distinct eigenvalues of a Hermitian operator $\hat{A}$ are orthogonal.

**Proof**
Let $a \neq b$ be distinct eigenvalues of $\hat{A}$ with eigenvectors $\ket{a}$ and $\ket{b}$.  Then we can calculate $\sand{b}{\hat{A}}{a}$ in two different ways.  First, acting with $\hat{A}$ to the right,
\[\sand{b}{\hat{A}}{a} = \bra{b} \left ( \hat{A} \ket{a}\right ) = a \braket{b}{a}.\]
Second, using hermiticity and acting with $\hat{A}$ to the left,
\[\sand{b}{\hat{A}}{a} = \sand{b}{\hat{A}^{\dagger}}{a} = \left ( \bra{b}\hat{A}^{\dagger} \right ) \ket{a} = b^* \braket{b}{a} = b\braket{b}{a},\]
where the last step follows because the eigenvalues of a Hermitian operator are real.

Equating the two expressions gives $a \braket{b}{a} = b\braket{b}{a}$, which we can rearrange to
\[(a-b) \braket{b}{a} = 0.\]
In order to satisfy this equation, it must be the case that either $(a-b) = 0$ or $\braket{b}{a} = 0$, but we have assumed that $a$ and $b$ are distinct eigenvalues, so $(a-b) \neq 0$ and hence $\braket{b}{a} = 0$.
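
As a quick numerical sanity check of this result, here is a sketch assuming NumPy, with a randomly generated Hermitian matrix as an illustrative example: eigenvectors belonging to distinct eigenvalues come out orthogonal.

<code python>
import numpy as np

rng = np.random.default_rng(0)

# A random Hermitian matrix A = M + M^dagger; its eigenvalues are generically all distinct
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M + M.conj().T

eigvals, eigvecs = np.linalg.eigh(A)      # real eigenvalues, eigenvectors as columns

# Overlaps <b|a> for every pair of eigenvectors form the identity matrix,
# i.e. eigenvectors with distinct eigenvalues are orthogonal (and each is normalised)
overlaps = eigvecs.conj().T @ eigvecs
print(np.allclose(overlaps, np.eye(4)))   # True
</code>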

====== Consequences of the Spectral Theorem ======

One of the most important consequences of the spectral theorem is that any vector can be written in an eigenbasis of a normal operator, and, in that basis, the action of the operator is easy to compute.

Let $\ket{a_j,1},\ket{a_j,2},\cdots$ be an orthonormal basis for the eigenspace corresponding to eigenvalue $a_j$.  Orthogonality and completeness imply that $\ket{a_1,1},\ket{a_1,2},\cdots,\ket{a_2,1},\ket{a_2,2},\cdots$ is a complete orthonormal basis for the whole Hilbert space:
\[\braket{a_j,k}{a_n,m} = \delta_{jn}\delta_{km},\qquad\qquad\qquad \sum_{j,k} \proj{a_j,k} = \hat{I}.\]
This implies that any vector $\ket{\psi}$ can be written as
\[\ket{\psi} = \sum_{j,k} b_{jk} \ket{a_j,k},\]
where $b_{jk} = \braket{a_j,k}{\psi}$.

In this basis, the action of $\hat{A}$ can be computed as follows:
\[
  \hat{A}\ket{\psi} = \sum_{j,k} b_{jk} \hat{A}\ket{a_j,k} = \sum_{j,k} b_{jk} a_j \ket{a_j,k}.
\]
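
A minimal numerical sketch of both statements, assuming NumPy (the Hermitian matrix and the vector $\ket{\psi}$ are arbitrary illustrative choices): the coefficients $b_{jk}$ are computed as inner products, $\ket{\psi}$ is reconstructed from them, and $\hat{A}\ket{\psi}$ is obtained by scaling each coefficient by its eigenvalue.

<code python>
import numpy as np

# An arbitrary Hermitian matrix and an arbitrary vector |psi>, for illustration only
A = np.array([[2, 1, 0],
              [1, 2, 0],
              [0, 0, 5]], dtype=float)
psi = np.array([1.0, 2.0, -1.0])

eigvals, eigvecs = np.linalg.eigh(A)    # columns of eigvecs are the eigenbasis |a_j,k>

# Expansion coefficients b_{jk} = <a_j,k|psi>
b = eigvecs.conj().T @ psi

# |psi> = sum_{j,k} b_{jk} |a_j,k>
print(np.allclose(eigvecs @ b, psi))                    # True

# A|psi> = sum_{j,k} a_j b_{jk} |a_j,k>
print(np.allclose(eigvecs @ (eigvals * b), A @ psi))    # True
</code>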

Another way of saying the same thing is to note that the matrix representation of $\hat{A}$ in the $\ket{a_j,k}$ basis is diagonal.  The matrix elements are:
\[\sand{a_j,k}{\hat{A}}{a_n,m} = a_n \braket{a_j,k}{a_n,m} = a_n \delta_{jn}\delta_{km},\]
and if we choose the ordering of basis vectors $\ket{a_1,1},\ket{a_1,2},\cdots,\ket{a_2,1},\ket{a_2,2},\cdots$, this means that the matrix is
\[
\left ( \begin{array}{cccccc}
a_1 & 0 & \cdots & 0 & 0 & \cdots \\
0 & a_1 & \cdots & 0 & 0 & \cdots \\
\vdots & \vdots & \ddots & \vdots & \vdots & \ddots \\
0 & 0 & \cdots & a_2 & 0 & \cdots \\
0 & 0 & \cdots & 0 & a_2 & \cdots \\
\vdots & \vdots & \ddots & \vdots & \vdots & \ddots
\end{array} \right )
\]
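
In matrix language, this is the statement that conjugating by the matrix whose columns are the eigenvectors diagonalizes $\hat{A}$; here is a short sketch assuming NumPy, reusing the same illustrative matrix as above.

<code python>
import numpy as np

A = np.array([[2, 1, 0],
              [1, 2, 0],
              [0, 0, 5]], dtype=float)    # illustrative Hermitian matrix
eigvals, eigvecs = np.linalg.eigh(A)

# Matrix elements <a_j,k| A |a_n,m> in the eigenbasis: V^dagger A V
A_in_eigenbasis = eigvecs.conj().T @ A @ eigvecs

# The result is diagonal, with the eigenvalues along the diagonal
print(np.allclose(A_in_eigenbasis, np.diag(eigvals)))   # True
</code>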

Finally, in section 2.iii.2, we showed that a projection operator $\hat{P}$ is Hermitian, $\hat{P}^{\dagger} = \hat{P}$, and idempotent, $\hat{P}^2 = \hat{P}$.  We can now use the spectral theorem to prove the converse: any Hermitian, idempotent operator is a projector.

Let $\ket{\psi}$ be a nonzero eigenvector of $\hat{P}$.  Since $\hat{P}$ is Hermitian, the corresponding eigenvalue, $a$, must be real.
\begin{equation}
\label{proj1}
\hat{P} \ket{\psi} = a\ket{\psi}.
\end{equation}
Now apply $\hat{P}$ to both sides again:
\[\hat{P}^2 \ket{\psi} = a\hat{P}\ket{\psi}.\]
On the left hand side, we will use idempotency, $\hat{P}^2 = \hat{P}$, and on the right hand side we will use the eigenvalue equation $\hat{P} \ket{\psi} = a\ket{\psi}$ again.  This gives
\begin{equation}
\label{proj2}
\hat{P} \ket{\psi} = a^2 \ket{\psi}.
\end{equation}

Comparing equations \eqref{proj1} and \eqref{proj2}, we have $a^2 = a$, or $a(a-1)=0$. Since $a$ is real, this means that either $a=0$ or $a=1$.

This means that $\hat{P}$ has (at most) two eigenspaces.  Let $\hat{P}_0$ be the projector onto the $a=0$ eigenspace and $\hat{P}_1$ the projector onto the $a=1$ eigenspace.  By the spectral decomposition:
\[\hat{P} = 0\hat{P}_0 + 1\hat{P}_1 = \hat{P}_1.\]
Since $\hat{P}_1$ is a projector, this implies that $\hat{P}$ is a projector.  In fact, it is the projector onto its eigenspace corresponding to the eigenvalue $1$.
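
A numerical illustration of this converse, assuming NumPy (the matrix below is a hypothetical example constructed to be Hermitian and idempotent): its eigenvalues are only $0$ and $1$, and it coincides with the projector built from its $a=1$ eigenvectors.

<code python>
import numpy as np

# A Hermitian, idempotent matrix: projection onto the direction (1,1,0)/sqrt(2) (illustrative)
P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.5, 0.0],
              [0.0, 0.0, 0.0]])

print(np.allclose(P.conj().T, P))   # Hermitian: True
print(np.allclose(P @ P, P))        # idempotent: True

eigvals, eigvecs = np.linalg.eigh(P)
print(np.round(eigvals, 10))        # only the eigenvalues 0 and 1 appear

# Projector onto the a = 1 eigenspace, built from its orthonormal eigenvectors
ones = eigvecs[:, np.isclose(eigvals, 1.0)]
P1 = ones @ ones.conj().T
print(np.allclose(P, P1))           # True: P is the projector onto its a = 1 eigenspace
</code>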

{{:question-mark.png?nolink&50|}}
====== In Class Activities ======

  - Show that, if $\hat{A}^{-1}$ exists and $a$ is an eigenvalue of $\hat{A}$, then $\frac{1}{a}$ is an eigenvalue of $\hat{A}^{-1}$.
  - Show that, if $\hat{A}$ is unitary, i.e. $\hat{A}^{\dagger}\hat{A} = \hat{I}$, and $a$ and $b$ are distinct eigenvalues of $\hat{A}$ with eigenvectors $\ket{a}, \ket{b}$, then $\braket{b}{a} = 0$.

HINT: Compute $\sand{b}{\hat{A}^{\dagger}\hat{A}}{a}$ in two different ways: once using $\hat{A}^{\dagger}\hat{A} = \hat{I}$, and once using
\[\hat{A}\ket{a} = a\ket{a}, \qquad\qquad \hat{A}\ket{b} = b\ket{b}.\]
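
These are left as exercises, but both claims can be sanity-checked numerically. The sketch below assumes NumPy and uses arbitrary illustrative matrices (a symmetric positive-definite $A$ for the first activity and a constructed unitary $U$ for the second); it is a check, not a proof.

<code python>
import numpy as np

rng = np.random.default_rng(1)

# Activity 1: if a is an eigenvalue of A, then 1/a is an eigenvalue of A^{-1}
M = rng.standard_normal((3, 3))
A = M @ M.T + np.eye(3)                        # symmetric positive definite, hence invertible
a = np.linalg.eigvalsh(A)                      # eigenvalues of A (ascending)
a_inv = np.linalg.eigvalsh(np.linalg.inv(A))   # eigenvalues of A^{-1} (ascending)
print(np.allclose(np.sort(1.0 / a), a_inv))    # True

# Activity 2: eigenvectors of a unitary operator with distinct eigenvalues are orthogonal
H = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
_, V = np.linalg.eigh(H + H.conj().T)          # V is unitary (orthonormal columns)
phases = np.exp(1j * np.array([0.4, 1.3, 2.7]))
U = V @ np.diag(phases) @ V.conj().T           # unitary, with three distinct eigenvalues

w, X = np.linalg.eig(U)                        # eigenvectors of U as unit-norm columns
print(np.allclose(X.conj().T @ X, np.eye(3)))  # True: distinct eigenvalues => orthogonal
</code>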