Commutators

The commutator of two operators $\hat{A}$ and $\hat{B}$ is \[\boxed{[\hat{A},\hat{B} ] = \hat{A}\hat{B}-\hat{B}\hat{A}.}\]

Two operators commute if \[[\hat{A},\hat{B}] = 0,\] or, equivalently \[\hat{A}\hat{B} = \hat{B}\hat{A}.\]

Clearly, an operator always commutes with itself: $[\hat{A},\hat{A}] = 0$.
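To see a pair of operators that do *not* commute, here is a quick numerical sketch using NumPy (an assumption of these notes' toolchain; the Pauli matrices are used here simply as convenient $2 \times 2$ Hermitian matrices):

```python
import numpy as np

# The Pauli matrices: convenient 2x2 Hermitian matrices for examples.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def commutator(A, B):
    """[A, B] = AB - BA."""
    return A @ B - B @ A

# sigma_x and sigma_y do not commute: [sx, sy] = 2i sz.
assert np.allclose(commutator(sx, sy), 2j * sz)

# But every operator commutes with itself.
assert np.allclose(commutator(sx, sx), np.zeros((2, 2)))
```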

A little less obviously, if $\hat{A}$, $\hat{B}$ and $\hat{A}\hat{B}$ are all Hermitian then $\hat{A}$ and $\hat{B}$ commute. Here is the proof: \begin{align*} [\hat{A},\hat{B}] & = \hat{A}\hat{B} - \hat{B}\hat{A} \\ & = \left ( \hat{A} \hat{B}\right )^{\dagger} - \hat{B}\hat{A}, \qquad \text{because }\hat{A}\hat{B}\text{ is Hermitian} \\ & = \hat{B}^{\dagger}\hat{A}^{\dagger} - \hat{B}\hat{A} \\ & = \hat{B}\hat{A} - \hat{B}\hat{A},\qquad \text{because }\hat{A}\text{ and }\hat{B}\text{ are Hermitian} \\ & = 0. \end{align*}
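The theorem can be sanity-checked numerically. The sketch below (assuming NumPy; the matrices are illustrative choices, not anything canonical) exhibits a pair of diagonal, hence commuting, Hermitian matrices whose product is Hermitian, and also the contrapositive: two Hermitian matrices that fail to commute must have a non-Hermitian product.

```python
import numpy as np

def is_hermitian(M):
    """Check M = M^dagger up to floating-point tolerance."""
    return np.allclose(M, M.conj().T)

# Two diagonal (hence Hermitian) matrices: A, B and AB are all Hermitian,
# so by the theorem [A, B] must vanish.
A = np.diag([1.0, 2.0]).astype(complex)
B = np.diag([3.0, 4.0]).astype(complex)
assert is_hermitian(A) and is_hermitian(B) and is_hermitian(A @ B)
assert np.allclose(A @ B - B @ A, 0)

# Contrapositive: sigma_x and sigma_y are Hermitian but do not commute,
# so their product cannot be Hermitian.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
assert not is_hermitian(sx @ sy)
```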

Properties of Commutators

The following properties of commutators all follow straightforwardly from the definition. You will prove two of them in an in-class activity. These properties are very useful when solving problems involving commutators, because they let you break down commutators of complicated operators into simpler ones. You should print out this list and stick it on your fridge.

  • $[\hat{A},\hat{B}] = -[\hat{B},\hat{A}]$
  • $[\hat{A},\hat{B} + \hat{C} + \hat{D} + \cdots] = [\hat{A},\hat{B}] + [\hat{A},\hat{C}] + [\hat{A},\hat{D}] + \cdots$
  • $[\hat{A},\hat{B}]^{\dagger} = [\hat{B}^{\dagger},\hat{A}^{\dagger}]$
  • $[\hat{A},\hat{B}\hat{C}] = [\hat{A},\hat{B}] \hat{C} + \hat{B} [\hat{A},\hat{C}]$
  • $[\hat{A}\hat{B},\hat{C}] = \hat{A}[\hat{B},\hat{C}] + [\hat{A},\hat{C}]\hat{B}$
  • The Jacobi identity: $\left [ \hat{A}, \left [ \hat{B},\hat{C} \right ]\right ] + \left [ \hat{B}, \left [ \hat{C},\hat{A} \right ]\right ] + \left [ \hat{C}, \left [ \hat{A},\hat{B} \right ]\right ] = 0$

Of these, the identities $[\hat{A},\hat{B}\hat{C}] = [\hat{A},\hat{B}] \hat{C} + \hat{B} [\hat{A},\hat{C}]$ and $[\hat{A}\hat{B},\hat{C}] = \hat{A}[\hat{B},\hat{C}] + [\hat{A},\hat{C}]\hat{B}$ turn out to be very useful in calculations. The way to remember them is that the operator that is on the left of the product comes out to the left and the operator that is on the right comes out to the right. For example, in $[\hat{A},\hat{B}\hat{C}]$, $\hat{B}$ is on the left of the product $\hat{B}\hat{C}$, so the term with $\hat{B}$ outside the commutator will be $\hat{B} [\hat{A},\hat{C}]$. Similarly, $\hat{C}$ is on the right of the product $\hat{B}\hat{C}$, so it will come out on the right of the commutator in the term $[\hat{A},\hat{B}] \hat{C}$. The same holds for the product $\hat{A}\hat{B}$ in the identity for $[\hat{A}\hat{B},\hat{C}]$.
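Since these identities hold for arbitrary operators, they can be spot-checked on random matrices. Here is a minimal sketch (assuming NumPy; the helper names are my own, not standard API):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hermitian(n):
    """A random n x n Hermitian matrix (illustrative helper)."""
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return M + M.conj().T

def comm(A, B):
    """[A, B] = AB - BA."""
    return A @ B - B @ A

A, B, C = (random_hermitian(4) for _ in range(3))

# [A, BC] = [A, B] C + B [A, C]: B comes out on the left, C on the right.
assert np.allclose(comm(A, B @ C), comm(A, B) @ C + B @ comm(A, C))

# [AB, C] = A [B, C] + [A, C] B: A comes out on the left, B on the right.
assert np.allclose(comm(A @ B, C), A @ comm(B, C) + comm(A, C) @ B)

# The Jacobi identity.
assert np.allclose(
    comm(A, comm(B, C)) + comm(B, comm(C, A)) + comm(C, comm(A, B)), 0
)
```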

Uncertainty Relations Between Operators

We will now rigorously prove a version of the uncertainty relation known as the Robertson Uncertainty Relation.

We have already introduced the idea of an expectation value of a Hermitian operator. Let $\Expect{\hat{A}}$ and $\Expect{\hat{B}}$ be the expectation values of $\hat{A}$ and $\hat{B}$, i.e. \[\Expect{\hat{A}} = \sand{\psi}{\hat{A}}{\psi}, \qquad \Expect{\hat{B}} = \sand{\psi}{\hat{B}}{\psi}.\] Note that these expressions are valid when $\ket{\psi}$ is normalized, i.e. $\braket{\psi}{\psi} = 1$, which we assume throughout this section.

Recall that if these operators represent physical quantities like position, momentum or energy, then these are expectation values in the sense of probability theory. Similarly, we can define the standard deviations $\Delta A$ and $\Delta B$ in the usual way \[\Delta A = \sqrt{\Expect{\hat{A}^2} - \Expect{\hat{A}}^2},\qquad \Delta B = \sqrt{\Expect{\hat{B}^2} - \Expect{\hat{B}}^2},\] where \[\Expect{\hat{A}^2} = \sand{\psi}{\hat{A}^2}{\psi}, \qquad \Expect{\hat{B}^2} = \sand{\psi}{\hat{B}^2}{\psi}.\]

To prove the uncertainty relation we will use the Aharonov-Vaidman identity which states that if $\ket{\psi}$ is a normalized vector and $\hat{A}$ is a Hermitian operator then \[\boxed{\hat{A} \ket{\psi} = \Expect{\hat{A}}\ket{\psi} + \Delta A \ket{\psi^{\perp}},}\] where $\ket{\psi^{\perp}}$ is a normalized vector that is orthogonal to $\ket{\psi}$.

Note that Yakir Aharonov is a professor at Chapman who believes that the Aharonov-Vaidman identity ought to appear in quantum mechanics textbooks. One of the reasons is that it leads to a simple proof of the Robertson uncertainty relation. Since the identity does not appear in any undergraduate quantum mechanics textbooks, what you are learning here is a unique Chapman way of doing things. You will prove the Aharonov-Vaidman identity in an in-class activity.
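The identity determines $\ket{\psi^{\perp}}$ explicitly: solving for it gives $\ket{\psi^{\perp}} = (\hat{A} - \Expect{\hat{A}})\ket{\psi}/\Delta A$. The following sketch (assuming NumPy; the random operator and state are illustrative, and we assume $\ket{\psi}$ is not an eigenstate of $\hat{A}$ so that $\Delta A \neq 0$) checks the identity numerically:

```python
import numpy as np

rng = np.random.default_rng(1)

# A random Hermitian operator and a random normalized state (illustrative).
M = rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5))
A = M + M.conj().T
psi = rng.normal(size=5) + 1j * rng.normal(size=5)
psi /= np.linalg.norm(psi)

expA = np.vdot(psi, A @ psi).real        # <A>
expA2 = np.vdot(psi, A @ A @ psi).real   # <A^2>
dA = np.sqrt(expA2 - expA**2)            # Delta A

# Solve the identity for |psi_perp>: (A - <A>)|psi> / Delta A.
psi_perp = (A @ psi - expA * psi) / dA

# |psi_perp> is normalized and orthogonal to |psi> ...
assert np.isclose(np.linalg.norm(psi_perp), 1.0)
assert abs(np.vdot(psi, psi_perp)) < 1e-10

# ... and A|psi> = <A>|psi> + Delta A |psi_perp>.
assert np.allclose(A @ psi, expA * psi + dA * psi_perp)
```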

From the Aharonov-Vaidman identity, we have \begin{align*} \hat{A}\ket{\psi} & = \Expect{\hat{A}}\ket{\psi} + \Delta A \ket{\psi^{\perp}_A}, \\ \hat{B}\ket{\psi} & = \Expect{\hat{B}}\ket{\psi} + \Delta B \ket{\psi^{\perp}_B}. \end{align*}

Taking the inner product of the first equation with the second, and using the Hermiticity of $\hat{A}$ together with the orthogonality relations $\braket{\psi}{\psi^{\perp}_A} = \braket{\psi}{\psi^{\perp}_B} = 0$, gives the first line below; repeating the calculation with $\hat{A}$ and $\hat{B}$ interchanged gives the second: \begin{align*} \sand{\psi}{\hat{A}\hat{B}}{\psi} & = \Expect{\hat{A}}\Expect{\hat{B}} + \Delta A \Delta B \braket{\psi^{\perp}_A}{\psi^{\perp}_B} \\ \sand{\psi}{\hat{B}\hat{A}}{\psi} & = \Expect{\hat{A}}\Expect{\hat{B}} + \Delta A \Delta B \braket{\psi^{\perp}_B}{\psi^{\perp}_A}. \end{align*}

Subtracting these two equations gives \[ \sand{\psi}{(\hat{A}\hat{B}-\hat{B}\hat{A})}{\psi} = \Delta A \Delta B \left (\braket{\psi^{\perp}_A}{\psi^{\perp}_B} - \braket{\psi^{\perp}_B}{\psi^{\perp}_A} \right ), \] or, equivalently, \[ \Expect{[\hat{A},\hat{B}]} = \Delta A \Delta B \left (\braket{\psi^{\perp}_A}{\psi^{\perp}_B} - \braket{\psi^{\perp}_B}{\psi^{\perp}_A} \right ). \]

Since $\braket{\psi^{\perp}_B}{\psi^{\perp}_A}$ is the complex conjugate of $\braket{\psi^{\perp}_A}{\psi^{\perp}_B}$, we can rewrite this as \[ \Expect{[\hat{A},\hat{B}]} = 2i \Delta A \Delta B \mathrm{Im} \left ( \braket{\psi^{\perp}_A}{\psi^{\perp}_B} \right ). \]

Taking the absolute value of both sides and rearranging gives \[ \Delta A \Delta B \Abs{\mathrm{Im} \left ( \braket{\psi^{\perp}_A}{\psi^{\perp}_B} \right )} = \frac{1}{2} \Abs{\Expect{[\hat{A},\hat{B}]}}. \]

Because $\ket{\psi^{\perp}_A}$ and $\ket{\psi^{\perp}_B}$ are unit vectors, the Cauchy-Schwarz inequality gives $0 \leq \QProb{\psi^{\perp}_A}{\psi^{\perp}_B} \leq 1$, and hence the absolute value of the imaginary part of $\braket{\psi^{\perp}_A}{\psi^{\perp}_B}$ is also bounded between $0$ and $1$. Therefore, \[ \boxed{\Delta A \Delta B \geq \frac{1}{2} \Abs{\Expect{[\hat{A},\hat{B}]}}}, \] which is the Robertson Uncertainty Relation.
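Because the relation holds for every Hermitian pair and every normalized state, it can be tested on random inputs. A minimal sketch, assuming NumPy (the helper names are my own):

```python
import numpy as np

rng = np.random.default_rng(2)

def random_hermitian(n):
    """A random n x n Hermitian matrix (illustrative helper)."""
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return M + M.conj().T

def stdev(op, psi):
    """Standard deviation of a Hermitian operator in the state |psi>."""
    mean = np.vdot(psi, op @ psi).real
    mean_sq = np.vdot(psi, op @ op @ psi).real
    return np.sqrt(mean_sq - mean**2)

n = 6
A, B = random_hermitian(n), random_hermitian(n)
psi = rng.normal(size=n) + 1j * rng.normal(size=n)
psi /= np.linalg.norm(psi)

# Robertson: Delta A * Delta B >= |<[A, B]>| / 2.
lhs = stdev(A, psi) * stdev(B, psi)
rhs = 0.5 * abs(np.vdot(psi, (A @ B - B @ A) @ psi))
assert lhs >= rhs
```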

This is an example of a preparation uncertainty relation. If you prepare a quantum system in the state $\ket{\psi}$, then you can contemplate measuring the physical quantity represented by $\hat{A}$ or the physical quantity represented by $\hat{B}$. Then, $\Delta A$ is the standard deviation of the probability distribution you would predict for $\hat{A}$ and $\Delta B$ is the standard deviation of the probability distribution you would predict for $\hat{B}$. Note that you cannot actually perform both measurements on the system, as performing one changes what you would predict for the other, so, to use some philosophical jargon, the two measurements are counterfactual alternatives.

The Case of Position and Momentum

If we take the special case of $\hat{A} = \hat{x}$ and $\hat{B} = \hat{p}$ then using $[\hat{x},\hat{p}] = i\hbar$ (understood as $i\hbar$ times the identity operator, so that its expectation value in any normalized state is $i\hbar$) gives, \[\frac{1}{2} \Abs{\Expect{[\hat{x},\hat{p}]}} = \frac{1}{2} \Abs{i\hbar} = \frac{\hbar}{2}.\]

Hence, we have \[\Delta x \Delta p \geq \frac{\hbar}{2},\] which is a version of the Heisenberg uncertainty relation (for the case of preparation uncertainty).
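A Gaussian wave function saturates this bound, and that is easy to check on a grid. The sketch below (assuming NumPy, working in units where $\hbar = 1$, and with an arbitrary width parameter $\sigma$) discretizes a Gaussian, approximates $\hat{p} = -i\hbar\,d/dx$ by finite differences, and finds $\Delta x \Delta p \approx \hbar/2$:

```python
import numpy as np

hbar = 1.0    # units where hbar = 1 (a choice, not part of the notes)
sigma = 0.7   # position-space width parameter (arbitrary choice)

# Discretized Gaussian wave function on a grid.
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize on the grid

def expect(op_psi):
    """<psi| op |psi> approximated as a Riemann sum on the grid."""
    return np.sum(np.conj(psi) * op_psi).real * dx

# Delta x: for this Gaussian it should come out equal to sigma.
mean_x = expect(x * psi)
delta_x = np.sqrt(expect(x**2 * psi) - mean_x**2)

# p = -i hbar d/dx via finite differences.
dpsi = np.gradient(psi, dx)
d2psi = np.gradient(dpsi, dx)
mean_p = expect(-1j * hbar * dpsi)
delta_p = np.sqrt(expect(-hbar**2 * d2psi) - mean_p**2)

# The product should be approximately hbar / 2.
print(delta_x * delta_p)
```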

In-Class Activities

  1. Using the definition $[\hat{A},\hat{B}] = \hat{A}\hat{B} - \hat{B}\hat{A}$, prove the identity \[[\hat{A},\hat{B}\hat{C}] = \hat{B}[\hat{A},\hat{C}] + [\hat{A},\hat{B}]\hat{C}.\]
  2. Prove that commutators satisfy the Jacobi identity: \[\left [ \hat{A}, \left [ \hat{B},\hat{C} \right ]\right ] + \left [ \hat{B}, \left [ \hat{C},\hat{A} \right ]\right ] + \left [ \hat{C}, \left [ \hat{A},\hat{B} \right ]\right ] = 0.\]
  3. The Aharonov-Vaidman identity: If $\ket{\psi}$ is normalized and $\hat{A}$ is Hermitian, prove that \[\hat{A} \ket{\psi} = \Expect{\hat{A}}\ket{\psi} + \Delta A \ket{\psi^{\perp}},\] where $\ket{\psi^{\perp}}$ is normalized and orthogonal to $\ket{\psi}$. HINTS:
    • For any states $\ket{\psi}$ and $\ket{\phi}$, we can always write \[\ket{\phi} = a\ket{\psi} + b\ket{\psi^{\perp}},\] where $\ket{\psi^{\perp}}$ is a normalized vector orthogonal to $\ket{\psi}$, and $a$ and $b$ are scalars. Apply this to $\ket{\phi} = \hat{A}\ket{\psi}$. Can you think of a way of determining $a$ and $b$?
    • Try taking the inner product of $\ket{\psi}$ and $\hat{A}\ket{\psi}$.
    • Try taking the inner product of $\hat{A}\ket{\psi}$ with itself.