# Chemistry - How to convert from spin orbitals to spatial orbitals in the Hartree-Fock approximation?

Technical Note: This page makes heavy use of MathJax; give it time to load.

$ %some shortcuts \newcommand{\op}[1]{\mathbf{#1}} \newcommand{\ve}[1]{\mathbf{#1}} \newcommand{\id}[1]{\mathrm{#1}} \newcommand{\bra}[1]{\left\langle#1\right|} \newcommand{\ket}[1]{\left|#1\right\rangle} \newcommand{\bracket}[2]{\left\langle#1\middle|#2\right\rangle} \newcommand{\diff}{\mathrm{d}} \newcommand{\eps}[1]{\varepsilon_{#1}} $

### Preamble

I need to get this out of the way first. While I think the exercises in Szabo-Ostlund are a great way to learn the maths and to understand how lazy chemists and physicists are when it comes to writing, I consider them all painful and of no particular pedagogical value. I don't think it is particularly useful to explain everything with the dihydrogen model and leave the generalisation to the reader. From my personal point of view, it should be the other way around. With that out of the way, I admire you for working through this manuscript.

It took me quite some time to get into this topic again, and I went through several pencils and a lot of paper on it. As you mentioned, you have been doing the same. You also found that the same exercise is given again on S.O. page 352 (exercise 6.8). Because you asked for hints, I will provide the key step as a single formula first, so that you have the chance of finding the rest without any help. At the end of the post I will explain the reasoning, too.

### The Goal

Let us reconsider what our exercise was:

Exercise 2.18(p. 85) In Chapter 6, where we consider perturbation theory, we show that the leading correction to the Hartree-Fock ground state energy is $$\begin{align} E_0^{(2)} &= \frac{1}{4} \sum_{abrs} \frac{\left|\bra{ab}\ket{rs}\right|^2} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} \end{align}$$ Show that for a closed-shell system (where $\eps{i}=\eps{\bar{\imath}}$) this becomes $$\begin{align} E_0^{(2)} &= \sum_{a,b=1}^{N/2}\quad \sum_{r,s=N/2+1}^{K} \frac{\bracket{ab}{rs} \left(2 \bracket{rs}{ab} -\bracket{rs}{ba}\right)} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} \end{align}$$

And because it is the same but different:

Exercise 6.8(p 352) Derive Eqs. (6.73) and (6.74) starting with Eq. (6.72). $$\begin{align} E_0^{(2)} &= \frac{1}{4} \sum_{abrs} \frac{\left|\bra{ab}\ket{rs}\right|^2} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} \tag{6.72}\\ E_0^{(2)} &= \frac{1}{2} \sum_{abrs} \frac{\bracket{ab}{rs}\bracket{rs}{ab}} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} -\frac{1}{2} \sum_{abrs} \frac{\bracket{ab}{rs}\bracket{rs}{ba}} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} \tag{6.73}\\ E_0^{(2)} &= 2 \sum_{abrs}^{N/2} \frac{\bracket{ab}{rs}\bracket{rs}{ab}} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} - \sum_{abrs}^{N/2} \frac{\bracket{ab}{rs}\bracket{rs}{ba}} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} \tag{6.74}\\ \end{align}$$

*Annotation: It is actually just simplified Dirac notation. There is nothing especially "physicist" about it. And even though they state otherwise, I consider this lazy. (p. 350)*

The notation $\bracket{ij}{kl}$ refers to the electron ordering 1,2,1,2.

What SO calls the chemists' notation is better referred to as charge-cloud notation, as the electron ordering is 1,1,2,2. It is usually denoted $\left[ij\middle|kl\right]$.

Therefore $\bracket{ij}{kl}=\left[ik\middle|jl\right]$.
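If you want to convince yourself of this translation rule numerically, here is a minimal sketch of my own (not from S.O.): real "orbitals" are random vectors on a 1D grid, and an arbitrary symmetric model kernel stands in for $r_{12}^{-1}$, which is all the index bookkeeping cares about.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 60)
K = 1.0 / (np.abs(x[:, None] - x[None, :]) + 1.0)   # symmetric model kernel
f = rng.standard_normal((4, x.size))                 # four random real "orbitals"

def phys(i, j, k, l):
    """<ij|kl>: electron ordering 1,2,1,2 (electron 1 carries i and k)."""
    return (f[i] * f[k]) @ K @ (f[j] * f[l])

def chem(i, j, k, l):
    """[ij|kl]: electron ordering 1,1,2,2 (electron 1 carries i and j)."""
    return (f[i] * f[j]) @ K @ (f[k] * f[l])

# <ij|kl> = [ik|jl]
assert np.isclose(phys(0, 1, 2, 3), chem(0, 2, 1, 3))
```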

First we need the antisymmetrised integral $$\bra{ij}\ket{kl} = \bracket{ij}{kl}-\bracket{ij}{lk}.$$

We also need to consider that indices early in the alphabet, $a,b,\dots\in\mathbb{N}$, refer to occupied orbitals, hence $a,b,\dots\leq N$ with $N$ being the total number of electrons, while $r,s,\dots\in\mathbb{N}$ with $r,s,\dots>N$ belong to virtual orbitals.

It is further implied that sums without an explicit upper limit run over spin orbitals, with $N$ as the upper bound for occupied indices and an arbitrary number $K$ (the total number of basis functions) for virtual ones.

A sum with multiple indices refers to as many sum signs. $$\sum_{ijkl} \text{terms} = \sum_i\sum_j\sum_k\sum_l \text{terms}$$

### Regular two electron integrals

I have no idea what this actually means, so I simply made the assumption that we only consider real functions. A lot of integrals then become equal: $$\bracket{ij}{kl} = \bracket{kj}{il} = \bracket{il}{kj} = \bracket{kl}{ij} = \bracket{ji}{lk} = \bracket{li}{jk} = \bracket{jk}{li} = \bracket{lk}{ji} $$
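One can spot-check these eight equalities with a toy model (entirely my own construction): random real functions on a 1D grid, with a symmetric kernel standing in for $r_{12}^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 80)
K = 1.0 / (np.abs(x[:, None] - x[None, :]) + 1.0)   # symmetric model kernel
f = rng.standard_normal((4, x.size))                 # four random real "orbitals"

def phys(i, j, k, l):
    """<ij|kl> in physicist ordering (1,2,1,2), evaluated on the grid."""
    return (f[i] * f[k]) @ K @ (f[j] * f[l])

ref = phys(0, 1, 2, 3)
# The seven non-trivial permutations from the list above, for (i,j,k,l)=(0,1,2,3):
for p in [(2, 1, 0, 3), (0, 3, 2, 1), (2, 3, 0, 1),
          (1, 0, 3, 2), (3, 0, 1, 2), (1, 2, 3, 0), (3, 2, 1, 0)]:
    assert np.isclose(phys(*p), ref)
```

The first three equalities hold for any kernel; the electron-relabelling ones additionally need $K(x,y)=K(y,x)$, which $r_{12}^{-1}$ of course satisfies.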

First of all we decode the shorthand and expand the square $$\begin{align} E_0^{(2)} &= \frac{1}{4} \sum_{abrs} \frac{\left|\bra{ab}\ket{rs}\right|^2} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} \\ E_0^{(2)} &= \frac{1}{4} \sum_{abrs} \frac{\left|\bracket{ab}{rs}-\bracket{ab}{sr}\right|^2} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} \tag{1}\\ E_0^{(2)} &= \frac{1}{4} \sum_{a=1}\sum_{b=1}\sum_{r=N+1}\sum_{s=N+1} \\&\qquad\qquad \left\{ \frac{\bracket{ab}{rs}^2}{\eps{a} +\eps{b} -\eps{r} -\eps{s}} + \frac{\bracket{ab}{sr}^2}{\eps{a} +\eps{b} -\eps{r} -\eps{s}} -2\frac{\bracket{ab}{rs}\bracket{ab}{sr}}{\eps{a} +\eps{b} -\eps{r} -\eps{s}} \right\} \tag{2}\\ \end{align}$$

We can further prove that
$$\begin{multline}
\frac{1}{4} \sum_{a=1}\sum_{b=1}\sum_{r=N+1}\sum_{s=N+1}
\frac{\bracket{ab}{rs}^2}{\eps{a} +\eps{b} -\eps{r} -\eps{s}}\\
=
\frac{1}{4} \sum_{a=1}\sum_{b=1}\sum_{r=N+1}\sum_{s=N+1}
\frac{\bracket{ab}{sr}^2}{\eps{a} +\eps{b} -\eps{r} -\eps{s}}\tag{3}
\end{multline}$$
because $a,b$ both start at $1$ and $r,s$ both start at $N+1$, i.e. they run over the same ranges, so under relabelling the integral permutations become identical. I will explain this at the end with an example. *This is the key.* Therefore we can further reduce to
$$\begin{align}
E_0^{(2)} &=
\frac{1}{4} \sum_{a=1}\sum_{b=1}\sum_{r=N+1}\sum_{s=N+1}
2\frac{\bracket{ab}{rs}^2}{\eps{a} +\eps{b} -\eps{r} -\eps{s}}
\\&\qquad
- \frac{1}{4} \sum_{a=1}\sum_{b=1}\sum_{r=N+1}\sum_{s=N+1}
2\frac{\bracket{ab}{rs}\bracket{ab}{sr}}{\eps{a} +\eps{b} -\eps{r} -\eps{s}}
.\tag{4}\\
\end{align}$$
We use the symmetry of the real integrals, $\bracket{ab}{rs}=\bracket{rs}{ab}$ and $\bracket{ab}{sr}=\bracket{rs}{ba}$, and find (6.73)
$$\begin{align}
E_0^{(2)} &=
\frac{1}{2} \sum_{abrs}
\frac{\bracket{ab}{rs}\bracket{rs}{ab}}
{\eps{a} +\eps{b} -\eps{r} -\eps{s}}
-\frac{1}{2} \sum_{abrs}
\frac{\bracket{ab}{rs}\bracket{rs}{ba}}
{\eps{a} +\eps{b} -\eps{r} -\eps{s}}
\tag{5 $\equiv$ 6.73}\\
\end{align}$$
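If you want to verify this step without trusting the algebra, here is a brute-force sketch (my own construction, with arbitrary toy sizes): a random tensor carrying the eight-fold symmetry of real integrals stands in for $\bracket{pq}{rs}$, and (6.72) and (6.73) are summed explicitly.

```python
import numpy as np

rng = np.random.default_rng(4)
N, K = 2, 4                              # occupied / total spin orbitals (toy sizes)

# Random real integrals <pq|rs> with the eight-fold symmetry from above.
g = rng.standard_normal((K, K, K, K))
g = g + g.transpose(1, 0, 3, 2)          # <pq|rs> = <qp|sr>  (relabel electrons)
g = g + g.transpose(2, 3, 0, 1)          # <pq|rs> = <rs|pq>  (real functions)
g = g + g.transpose(2, 1, 0, 3)          # <pq|rs> = <rq|ps>  (real functions)

eps = np.concatenate([rng.uniform(-2, -1, N), rng.uniform(1, 2, K - N)])
occ, virt = range(N), range(N, K)

# (6.72): quarter sum over squared antisymmetrised integrals
lhs = 0.25 * sum(
    (g[a, b, r, s] - g[a, b, s, r]) ** 2 / (eps[a] + eps[b] - eps[r] - eps[s])
    for a in occ for b in occ for r in virt for s in virt
)
# (6.73): the two halves, still in spin orbitals
rhs = sum(
    (0.5 * g[a, b, r, s] * g[r, s, a, b] - 0.5 * g[a, b, r, s] * g[r, s, b, a])
    / (eps[a] + eps[b] - eps[r] - eps[s])
    for a in occ for b in occ for r in virt for s in virt
)
assert np.isclose(lhs, rhs)
```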

### Closed shell

This is what you tried first; so did I. This is where the short notation gently kicks you in the derrière, pardon my French.

Let us consider the definitions for spin orbitals and spatial orbitals: \begin{align} \chi_1(\ve{x}_1) &=\psi_1(\ve{r}_1)\alpha(s_1)\\ \chi_2(\ve{x}_2) &=\psi_1(\ve{r}_2)\beta(s_2)\\ \chi_3(\ve{x}_3) &=\psi_2(\ve{r}_3)\alpha(s_3)\\ \chi_4(\ve{x}_4) &=\psi_2(\ve{r}_4)\beta(s_4)\\ \vdots &\qquad\vdots\\ \chi_{N-1}(\ve{x}_{N-1}) &=\psi_{N/2}(\ve{r}_{N-1})\alpha(s_{N-1})\\ \chi_{N}(\ve{x}_{N}) &=\psi_{N/2}(\ve{r}_{N})\beta(s_{N})\\ \vdots &\qquad\vdots\\ \end{align}

Since $a,b,\dots,r,s,\dots$ are arbitrary natural numbers it does not really matter which index we choose, but it can lead to great confusion if on the left side $a$ refers to a different number than on the right side.

Hence I choose, for a completely arbitrary number $酒$ of spin functions $\chi$, to represent them as products of a spatial function $\psi$ with a spin component $\alpha,\beta$. The set of spatial functions is therefore half the size of the complete set. $$\sum_{a}^{(酒)} \chi_a = \sum_{c}^{(酒/2)}\psi_c\alpha + \sum_{c}^{(酒/2)}\psi_c\beta,$$ We are basically just introducing new names for the indices to avoid confusion. I will now shorten $\psi_c\alpha$ to $c$ and $\psi_c\beta$ to $\bar{c}$, and this applies analogously to all other kinds of indices. Therefore we write symbolically: $$\sum_{a}^{(酒)} a = \sum_{c}^{(酒/2)}c + \sum_{\bar{c}}^{(酒/2)}\bar{c}$$
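The bookkeeping, where every spatial index $c$ spawns one $\alpha$ and one $\beta$ spin orbital, can be sketched as follows (the even/odd indexing convention is my own):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4                                    # number of spatial orbitals (酒/2)
v = rng.standard_normal(2 * n)           # one value per spin orbital chi_a

# Convention: spin orbitals 2c (alpha) and 2c+1 (beta) derive from spatial c.
total = sum(v[a] for a in range(2 * n))
alpha_part = sum(v[2 * c] for c in range(n))
beta_part = sum(v[2 * c + 1] for c in range(n))
assert np.isclose(total, alpha_part + beta_part)
```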

Now we need to expand equation (5) in this formalism. We will then see that it simplifies quite a lot. Without spoiling the surprise: the number of integrals that survive spin integration is exactly twice as large in the first term as in the second, just because the first term is symmetric.

In general we know that any integral where electron one has $\alpha$ spin on the left side and $\beta$ spin on the right side (or vice versa) will vanish upon spin integration, and the same holds for electron two. Therefore the integrals of the form $$ \bracket{\bar{i}j}{kl}; \bracket{i\bar{j}}{kl}; \bracket{ij}{\bar{k}l}; \bracket{ij}{k\bar{l}};\\ \bracket{\bar{i}\bar{j}}{kl}; \bracket{ij}{\bar{k}\bar{l}}; \bracket{\bar{i}j}{k\bar{l}}; \bracket{i\bar{j}}{\bar{k}l};\\ \bracket{i\bar{j}}{\bar{k}\bar{l}}; \bracket{\bar{i}j}{\bar{k}\bar{l}}; \bracket{\bar{i}\bar{j}}{k\bar{l}}; \bracket{\bar{i}\bar{j}}{\bar{k}l}; $$ will all vanish.
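These selection rules are easy to mechanise. Here is a tiny sketch of my own: representing $\alpha$ and $\beta$ as orthonormal two-vectors, the spin part of $\bracket{ij}{kl}$ factorises into one overlap per electron, and the patterns above indeed give zero.

```python
import numpy as np

alpha, beta = np.array([1.0, 0.0]), np.array([0.0, 1.0])

def spin_factor(s_i, s_j, s_k, s_l):
    """Spin part of <ij|kl>: electron 1 pairs (i,k), electron 2 pairs (j,l)."""
    return (s_i @ s_k) * (s_j @ s_l)

# Surviving combinations: matching spins for (i,k) and for (j,l).
assert spin_factor(alpha, alpha, alpha, alpha) == 1.0
assert spin_factor(alpha, beta, alpha, beta) == 1.0
# A sample of the twelve vanishing patterns listed above:
assert spin_factor(beta, alpha, alpha, alpha) == 0.0   # < i-bar j | k l >
assert spin_factor(alpha, alpha, beta, alpha) == 0.0   # < i j | k-bar l >
assert spin_factor(beta, beta, alpha, alpha) == 0.0    # < i-bar j-bar | k l >
```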

I will provide the full terms at the end, so you can check your own attempt, but here is equation (5) with all non-vanishing terms. Note that we need to include the limits of the sums here, because we have expanded the spin orbitals into spatial orbitals. Because we have a closed-shell system, the number of electrons $N$ is even, while the total number of basis functions $M$ could be anything. Please also note that I am not including the starting numbers for the indices, because it would look a bit ugly. Hence it is implied by the notation that $a,b,c,d=1$, $r,s=N+1$ and $t,u=N/2+1$. (If we're lazy, we go the whole nine yards.) $$\begin{align} E_0^{(2)} &= \frac{1}{2} \sum_{abrs} \frac{\bracket{ab}{rs}\bracket{rs}{ab}} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} -\frac{1}{2} \sum_{abrs} \frac{\bracket{ab}{rs}\bracket{rs}{ba}} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} \tag{5 $\equiv$ 6.73}\\ &=\phantom{-}\Bigg( \frac{1}{2} \sum_{cd}^{N/2}\sum_{tu}^M \frac{\bracket{cd}{tu}\bracket{tu}{cd}} {\eps{c} +\eps{d} -\eps{t} -\eps{u}} +\frac{1}{2} \sum_{\bar{c}d}^{N/2}\sum_{\bar{t}u}^M \frac{\bracket{\bar{c}d}{\bar{t}u}\bracket{\bar{t}u}{\bar{c}d}} {\eps{\bar{c}} +\eps{d} -\eps{\bar{t}} -\eps{u}} \\&\phantom{=-\Bigg(} +\frac{1}{2} \sum_{c\bar{d}}^{N/2}\sum_{t\bar{u}}^M \frac{\bracket{c\bar{d}}{t\bar{u}}\bracket{t\bar{u}}{c\bar{d}}} {\eps{c} +\eps{\bar{d}} -\eps{t} -\eps{\bar{u}}} +\frac{1}{2} \sum_{\bar{c}\bar{d}}^{N/2}\sum_{\bar{t}\bar{u}}^M \frac{\bracket{\bar{c}\bar{d}}{\bar{t}\bar{u}} \bracket{\bar{t}\bar{u}}{\bar{c}\bar{d}}} {\eps{\bar{c}} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{\bar{u}}} \Bigg)\\&\phantom{=} -\left( \frac{1}{2} \sum_{cd}^{N/2}\sum_{tu}^M \frac{\bracket{cd}{tu}\bracket{tu}{dc}} {\eps{c} +\eps{d} -\eps{t} -\eps{u}} +\frac{1}{2} \sum_{\bar{c}\bar{d}}^{N/2}\sum_{\bar{t}\bar{u}}^M \frac{\bracket{\bar{c}\bar{d}}{\bar{t}\bar{u}} \bracket{\bar{t}\bar{u}}{\bar{d}\bar{c}}} {\eps{\bar{c}} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{\bar{u}}} \right)\tag{6} \end{align}$$

Now we can get rid of the spin and notice that all integrals in the first parenthesis are equal, as well as those in the second. $$\begin{align} E_0^{(2)} &= 2 \sum_{cd}^{N/2}\sum_{tu}^M \frac{\bracket{cd}{tu}\bracket{tu}{cd}} {\eps{c} +\eps{d} -\eps{t} -\eps{u}} - \sum_{cd}^{N/2}\sum_{tu}^M \frac{\bracket{cd}{tu}\bracket{tu}{dc}} {\eps{c} +\eps{d} -\eps{t} -\eps{u}} \tag{7}\end{align}$$

When we now relabel $c,d\to a,b$ and $t,u\to r,s$ we obtain (6.74), and when we factor $\bracket{ab}{rs}$ out of both terms we have our desired equation for exercise 2.18. $$\begin{align} E_0^{(2)} &= \sum_{ab}^{N/2}\quad \sum_{rs}^{M} \frac{\bracket{ab}{rs} \left(2 \bracket{rs}{ab} -\bracket{rs}{ba}\right)} {\eps{a} +\eps{b} -\eps{r} -\eps{s}} \end{align}$$
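To close the loop, here is a brute-force numerical check, entirely my own construction with arbitrary toy sizes: random real spatial integrals with the eight-fold symmetry, spin-restricted orbital energies, and a comparison of the spin-orbital formula (6.72) against the final closed-shell formula.

```python
import numpy as np

rng = np.random.default_rng(0)
n_occ, n_virt = 1, 2                    # toy spatial orbital counts (my choice)
M = n_occ + n_virt

# Random real chemist-notation integrals [ij|kl] with the full 8-fold symmetry.
g = rng.standard_normal((M, M, M, M))
g = g + g.transpose(1, 0, 2, 3)         # [ij|kl] = [ji|kl]
g = g + g.transpose(0, 1, 3, 2)         # [ij|kl] = [ij|lk]
g = g + g.transpose(2, 3, 0, 1)         # [ij|kl] = [kl|ij]

def phys(i, j, k, l):
    """Spatial physicist-notation integral <ij|kl> = [ik|jl]."""
    return g[i, k, j, l]

# Spin-restricted energies: spin orbitals 2p and 2p+1 share eps[p].
eps = np.concatenate([rng.uniform(-2, -1, n_occ), rng.uniform(1, 2, n_virt)])

def phys_spin(P, Q, R, S):
    """Spin-orbital <PQ|RS>: vanishes unless spins match pairwise (1,1) and (2,2)."""
    if P % 2 != R % 2 or Q % 2 != S % 2:
        return 0.0
    return phys(P // 2, Q // 2, R // 2, S // 2)

occ_s, virt_s = range(2 * n_occ), range(2 * n_occ, 2 * M)
E2_spin = 0.25 * sum(                   # (6.72) in spin orbitals
    (phys_spin(a, b, r, s) - phys_spin(a, b, s, r)) ** 2
    / (eps[a // 2] + eps[b // 2] - eps[r // 2] - eps[s // 2])
    for a in occ_s for b in occ_s for r in virt_s for s in virt_s
)

occ, virt = range(n_occ), range(n_occ, M)
E2_spatial = sum(                       # exercise 2.18 / (6.74) in spatial orbitals
    phys(a, b, r, s) * (2 * phys(r, s, a, b) - phys(r, s, b, a))
    / (eps[a] + eps[b] - eps[r] - eps[s])
    for a in occ for b in occ for r in virt for s in virt
)

assert np.isclose(E2_spin, E2_spatial)
```

The random tensor is not a physical set of integrals, but the derivation only ever uses spin integration plus the permutational symmetry of real integrals, so the check exercises exactly those two ingredients.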

### Additional notes

After doing this exercise (I also spent quite a few hours on it), I still have the feeling I have not learned anything new. Surely, I now understand the lazy notation better. I can also see why introducing the restriction to closed-shell orbitals is quite a good idea from a computational point of view. But I knew that before, and it seems more than logical when you consider that you cut your complete basis in half.

Well, since I like crosswords and the like, I actually enjoyed thinking about this. Apart from that, I don't think this exercise furthers one's understanding of quantum chemistry.

While I did some reading along the way, I came across a book on the shelf of my sensei which also has a focus on implementation. I guess it doesn't hurt to check it out, if you have access to it: David B. Cook; Handbook of Computational Chemistry; Oxford University Press: Oxford, New York, Tokyo, 1998. (New edition 2005 at Dover Publications.)

### Proof (deduction) of (3)

Recall what I used earlier; note that I have already dropped the factor $\frac{1}{4}$ on both sides:
$$\small
\sum_{a=1}\sum_{b=1}\sum_{r=N+1}\sum_{s=N+1}
\frac{\bracket{ab}{rs}^2}{\eps{a} +\eps{b} -\eps{r} -\eps{s}}
= \sum_{a=1}\sum_{b=1}\sum_{r=N+1}\sum_{s=N+1}
\frac{\bracket{ab}{sr}^2}{\eps{a} +\eps{b} -\eps{r} -\eps{s}}\tag{3'}
$$
I am no mathematician. The only way I knew how to tackle this was to actually expand each term (completely). And yes, that is quite some work.

To make things a little easier, we can reduce the terms a little further, because the denominator on both sides is equal: $a,b,r,s\in\mathbb{N}$ with $a,b\in[1,N]$ and $r,s\in[N+1,M]$, and $N,M$ are the same on the left and the right side.

Therefore we "simply" need to show that
$$
\sum_{a=1}^N\sum_{b=1}^N\sum_{r=N+1}^M\sum_{s=N+1}^M
\bracket{ab}{rs}^2
= \sum_{a=1}^N\sum_{b=1}^N\sum_{r=N+1}^M\sum_{s=N+1}^M
\bracket{ab}{sr}^2\tag{A1}
$$
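Before (or instead of) expanding everything by hand, one can convince oneself numerically that (A1) is just a relabelling of the dummy indices $r\leftrightarrow s$. The sketch below uses a random tensor with no symmetry assumed at all (the sizes are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
N, M = 2, 4                             # occupied spin orbitals 0..N-1, virtuals N..M-1
F = rng.standard_normal((M, M, M, M))   # stands in for <ab|rs>; no symmetry assumed

occ, virt = range(N), range(N, M)
lhs = sum(F[a, b, r, s] ** 2 for a in occ for b in occ for r in virt for s in virt)
rhs = sum(F[a, b, s, r] ** 2 for a in occ for b in occ for r in virt for s in virt)
assert np.isclose(lhs, rhs)             # swapping r and s only reorders the terms
```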

Now we again need to simplify our notation or we will not get anywhere. For $a=1$ we simply write $1$, and for $r=N+1$ we also simply write $1$, where the position of the number implies whether it belongs to the first set or the second. (I can't believe I am actually writing this.) In the integral we will from now on use a semicolon to separate the functions. Since there are so many squares and we don't need them, we also drop those. Let's make a practical example: $$\text{For } a=1,\; b=2,\; r=N+1,\; s=N+2:\quad \bracket{ab}{rs}^2:=\bracket{1;2}{1;2}$$

Let's tackle the left side first: $$\small\begin{align} \sum_{a=1}^N\sum_{b=1}^N\sum_{r=N+1}^M\sum_{s=N+1}^M \bracket{ab}{rs}^2 &= \begin{array}{l} \color{\green}{ \phantom{+} \bracket{1;1}{1;1} +\bracket{2;1}{1;1} +\bracket{1;2}{1;1} +\bracket{2;2}{1;1} +\cdots+\bracket{N;N}{1;1}}\\ \color{\red}{ +\bracket{1;1}{2;1} +\bracket{2;1}{2;1} +\bracket{1;2}{2;1} +\bracket{2;2}{2;1} +\cdots+\bracket{N;N}{2;1}}\\ \color{\navy}{ +\bracket{1;1}{1;2} +\bracket{2;1}{1;2} +\bracket{1;2}{1;2} +\bracket{2;2}{1;2} +\cdots+\bracket{N;N}{1;2}}\\ \color{\green}{ +\bracket{1;1}{2;2} +\bracket{2;1}{2;2} +\bracket{1;2}{2;2} +\bracket{2;2}{2;2} +\cdots+\bracket{N;N}{2;2}}\\ +\qquad\vdots\\ +\bracket{1;1}{M;M} +\bracket{2;1}{M;M} +\bracket{1;2}{M;M} +\bracket{2;2}{M;M} +\cdots+\bracket{N;N}{M;M}\\ \end{array} \end{align}$$

This doesn't really help, because we also need the right side. $$\small\begin{align} \sum_{a=1}^N\sum_{b=1}^N\sum_{r=N+1}^M\sum_{s=N+1}^M \bracket{ab}{sr}^2 &= \begin{array}{l} \color{\green}{ \phantom{+} \bracket{1;1}{1;1} +\bracket{2;1}{1;1} +\bracket{1;2}{1;1} +\bracket{2;2}{1;1} +\cdots+\bracket{N;N}{1;1}}\\ \color{\navy}{ +\bracket{1;1}{1;2} +\bracket{2;1}{1;2} +\bracket{1;2}{1;2} +\bracket{2;2}{1;2} +\cdots+\bracket{N;N}{1;2}}\\ \color{\red}{ +\bracket{1;1}{2;1} +\bracket{2;1}{2;1} +\bracket{1;2}{2;1} +\bracket{2;2}{2;1} +\cdots+\bracket{N;N}{2;1}}\\ \color{\green}{ +\bracket{1;1}{2;2} +\bracket{2;1}{2;2} +\bracket{1;2}{2;2} +\bracket{2;2}{2;2} +\cdots+\bracket{N;N}{2;2}}\\ +\qquad\vdots\\ +\bracket{1;1}{M;M} +\bracket{2;1}{M;M} +\bracket{1;2}{M;M} +\bracket{2;2}{M;M} +\cdots+\bracket{N;N}{M;M}\\ \end{array} \end{align}$$

Now you can probably already guess from the colour coding where I am going with this. We can easily see that the two sums are equivalent, since $a,b$ and $r,s$ run over the same indices in both cases. The only difference is the order in which the integrals appear. The green lines are identical in both cases (and zero, by the way), while the red and navy ones just interchange positions.

### Full expansion of (6)

Note that I actually appreciate how insane this is. I am including this, because I noticed that your expansion was incomplete. It still leads to the correct conclusion, because the terms you did not include are all vanishing. $$%\require{cancel} \newcommand{\cancel}[1]{\color{\red}{#1}} \begin{align} E_0^{(2)} &= \phantom{-}\frac{1}{2}\Bigg( %all alpha \sum_{cd}^{N/2}\sum_{tu}^M \frac{\bracket{cd}{tu}\bracket{tu}{cd}} {\eps{c} +\eps{d} -\eps{t} -\eps{u}} \\&\phantom{=-\frac{1}{2}\Bigg(} %three alpha one beta + \sum_{\bar{c}d}^{N/2}\sum_{tu}^M \frac{\cancel{\bracket{\bar{c}d}{tu}} \cancel{\bracket{tu}{\bar{c}d}}} {\eps{\bar{c}} +\eps{d} -\eps{t} -\eps{u}} + \sum_{c\bar{d}}^{N/2}\sum_{tu}^M \frac{\cancel{\bracket{c\bar{d}}{tu}} \cancel{\bracket{tu}{c\bar{d}}}} {\eps{c} +\eps{\bar{d}} -\eps{t} -\eps{u}} \\&\phantom{=-\frac{1}{2}\Bigg(} + \sum_{cd}^{N/2}\sum_{\bar{t}u}^M \frac{\cancel{\bracket{cd}{\bar{t}u}} \cancel{\bracket{\bar{t}u}{cd}}} {\eps{c} +\eps{d} -\eps{\bar{t}} -\eps{u}} + \sum_{cd}^{N/2}\sum_{t\bar{u}}^M \frac{\cancel{\bracket{cd}{t\bar{u}}} \cancel{\bracket{t\bar{u}}{cd}}} {\eps{c} +\eps{d} -\eps{t} -\eps{\bar{u}}} \\&\phantom{=-\frac{1}{2}\Bigg(} %two alpha two beta + \sum_{\bar{c}\bar{d}}^{N/2}\sum_{tu}^M \frac{\cancel{\bracket{\bar{c}\bar{d}}{tu}} \cancel{\bracket{tu}{\bar{c}\bar{d}}}} {\eps{\bar{c}} +\eps{\bar{d}} -\eps{t} -\eps{u}} + \sum_{c\bar{d}}^{N/2}\sum_{\bar{t}u}^M \frac{\cancel{\bracket{c\bar{d}}{\bar{t}u}} \cancel{\bracket{\bar{t}u}{c\bar{d}}}} {\eps{c} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{u}} \\&\phantom{=-\frac{1}{2}\Bigg(} + \sum_{cd}^{N/2}\sum_{\bar{t}\bar{u}}^M \frac{\cancel{\bracket{cd}{\bar{t}\bar{u}}} \cancel{\bracket{\bar{t}\bar{u}}{cd}}} {\eps{c} +\eps{d} -\eps{\bar{t}} -\eps{\bar{u}}} + \sum_{\bar{c}d}^{N/2}\sum_{\bar{t}u}^M \frac{\bracket{\bar{c}d}{\bar{t}u}\bracket{\bar{t}u}{\bar{c}d}} {\eps{\bar{c}} +\eps{d} -\eps{\bar{t}} -\eps{u}} \\&\phantom{=-\frac{1}{2}\Bigg(} + \sum_{c\bar{d}}^{N/2}\sum_{t\bar{u}}^M 
\frac{\bracket{c\bar{d}}{t\bar{u}}\bracket{t\bar{u}}{c\bar{d}}} {\eps{c} +\eps{\bar{d}} -\eps{t} -\eps{\bar{u}}} + \sum_{\bar{c}d}^{N/2}\sum_{t\bar{u}}^M \frac{\cancel{\bracket{\bar{c}d}{t\bar{u}}} \cancel{\bracket{t\bar{u}}{\bar{c}d}}} {\eps{\bar{c}} +\eps{d} -\eps{t} -\eps{\bar{u}}} \\&\phantom{=-\frac{1}{2}\Bigg(} %one alpha three beta + \sum_{c\bar{d}}^{N/2}\sum_{\bar{t}\bar{u}}^M \frac{\cancel{\bracket{c\bar{d}}{\bar{t}\bar{u}}} \cancel{\bracket{\bar{t}\bar{u}}{c\bar{d}}}} {\eps{c} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{\bar{u}}} + \sum_{\bar{c}d}^{N/2}\sum_{\bar{t}\bar{u}}^M \frac{\cancel{\bracket{\bar{c}d}{\bar{t}\bar{u}}} \cancel{\bracket{\bar{t}\bar{u}}{\bar{c}d}}} {\eps{c} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{\bar{u}}} \\&\phantom{=-\frac{1}{2}\Bigg(} + \sum_{\bar{c}\bar{d}}^{N/2}\sum_{t\bar{u}}^M \frac{\cancel{\bracket{\bar{c}\bar{d}}{t\bar{u}}} \cancel{\bracket{t\bar{u}}{\bar{c}\bar{d}}}} {\eps{\bar{c}} +\eps{\bar{d}} -\eps{t} -\eps{\bar{u}}} + \sum_{\bar{c}\bar{d}}^{N/2}\sum_{\bar{t}u}^M \frac{\cancel{\bracket{\bar{c}\bar{d}}{\bar{t}u}} \cancel{\bracket{\bar{t}u}{\bar{c}\bar{d}}}} {\eps{\bar{c}} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{u}} \\&\phantom{=-\frac{1}{2}\Bigg(} %all beta + \sum_{\bar{c}\bar{d}}^{N/2}\sum_{\bar{t}\bar{u}}^M \frac{\bracket{\bar{c}\bar{d}}{\bar{t}\bar{u}} \bracket{\bar{t}\bar{u}}{\bar{c}\bar{d}}} {\eps{\bar{c}} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{\bar{u}}} \Bigg)\\ &\phantom{=} -\frac{1}{2}\Bigg( %all alpha \sum_{cd}^{N/2}\sum_{tu}^M \frac{\bracket{cd}{tu}\bracket{tu}{dc}} {\eps{c} +\eps{d} -\eps{t} -\eps{u}} \\&\phantom{=-\frac{1}{2}\Bigg(} %three alpha one beta + \sum_{\bar{c}d}^{N/2}\sum_{tu}^M \frac{\cancel{\bracket{\bar{c}d}{tu}} \cancel{\bracket{tu}{d\bar{c}}}} {\eps{\bar{c}} +\eps{d} -\eps{t} -\eps{u}} + \sum_{c\bar{d}}^{N/2}\sum_{tu}^M \frac{\cancel{\bracket{c\bar{d}}{tu}} \cancel{\bracket{tu}{\bar{d}c}}} {\eps{c} +\eps{\bar{d}} -\eps{t} -\eps{u}} \\&\phantom{=-\frac{1}{2}\Bigg(} + \sum_{cd}^{N/2}\sum_{\bar{t}u}^M 
\frac{\cancel{\bracket{cd}{\bar{t}u}} \cancel{\bracket{\bar{t}u}{dc}}} {\eps{c} +\eps{d} -\eps{\bar{t}} -\eps{u}} + \sum_{cd}^{N/2}\sum_{t\bar{u}}^M \frac{\cancel{\bracket{cd}{t\bar{u}}} \cancel{\bracket{t\bar{u}}{dc}}} {\eps{c} +\eps{d} -\eps{t} -\eps{\bar{u}}} \\&\phantom{=-\frac{1}{2}\Bigg(} %two alpha two beta + \sum_{\bar{c}\bar{d}}^{N/2}\sum_{tu}^M \frac{\cancel{\bracket{\bar{c}\bar{d}}{tu}} \cancel{\bracket{tu}{\bar{d}\bar{c}}}} {\eps{\bar{c}} +\eps{\bar{d}} -\eps{t} -\eps{u}} + \sum_{c\bar{d}}^{N/2}\sum_{\bar{t}u}^M \frac{\cancel{\bracket{c\bar{d}}{\bar{t}u}} \bracket{\bar{t}u}{\bar{d}c}} {\eps{c} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{u}} \\&\phantom{=-\frac{1}{2}\Bigg(} + \sum_{cd}^{N/2}\sum_{\bar{t}\bar{u}}^M \frac{\cancel{\bracket{cd}{\bar{t}\bar{u}}} \cancel{\bracket{\bar{t}\bar{u}}{dc}}} {\eps{c} +\eps{d} -\eps{\bar{t}} -\eps{\bar{u}}} + \sum_{\bar{c}d}^{N/2}\sum_{\bar{t}u}^M \frac{\bracket{\bar{c}d}{\bar{t}u} \cancel{\bracket{\bar{t}u}{d\bar{c}}}} {\eps{\bar{c}} +\eps{d} -\eps{\bar{t}} -\eps{u}} \\&\phantom{=-\frac{1}{2}\Bigg(} + \sum_{c\bar{d}}^{N/2}\sum_{t\bar{u}}^M \frac{\bracket{c\bar{d}}{t\bar{u}} \cancel{\bracket{t\bar{u}}{\bar{d}c}}} {\eps{c} +\eps{\bar{d}} -\eps{t} -\eps{\bar{u}}} + \sum_{\bar{c}d}^{N/2}\sum_{t\bar{u}}^M \frac{\cancel{\bracket{\bar{c}d}{t\bar{u}}} \bracket{t\bar{u}}{d\bar{c}}} {\eps{\bar{c}} +\eps{d} -\eps{t} -\eps{\bar{u}}} \\&\phantom{=-\frac{1}{2}\Bigg(} %one alpha three beta + \sum_{c\bar{d}}^{N/2}\sum_{\bar{t}\bar{u}}^M \frac{\cancel{\bracket{c\bar{d}}{\bar{t}\bar{u}}} \cancel{\bracket{\bar{t}\bar{u}}{\bar{d}c}}} {\eps{c} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{\bar{u}}} + \sum_{\bar{c}d}^{N/2}\sum_{\bar{t}\bar{u}}^M \frac{\cancel{\bracket{\bar{c}d}{\bar{t}\bar{u}}} \cancel{\bracket{\bar{t}\bar{u}}{d\bar{c}}}} {\eps{c} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{\bar{u}}} \\&\phantom{=-\frac{1}{2}\Bigg(} + \sum_{\bar{c}\bar{d}}^{N/2}\sum_{t\bar{u}}^M \frac{\cancel{\bracket{\bar{c}\bar{d}}{t\bar{u}}} 
\cancel{\bracket{t\bar{u}}{\bar{d}\bar{c}}}} {\eps{\bar{c}} +\eps{\bar{d}} -\eps{t} -\eps{\bar{u}}} + \sum_{\bar{c}\bar{d}}^{N/2}\sum_{\bar{t}u}^M \frac{\cancel{\bracket{\bar{c}\bar{d}}{\bar{t}u}} \cancel{\bracket{\bar{t}u}{\bar{d}\bar{c}}}} {\eps{\bar{c}} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{u}} \\&\phantom{=-\frac{1}{2}\Bigg(} %all beta + \sum_{\bar{c}\bar{d}}^{N/2}\sum_{\bar{t}\bar{u}}^M \frac{\bracket{\bar{c}\bar{d}}{\bar{t}\bar{u}} \bracket{\bar{t}\bar{u}}{\bar{d}\bar{c}}} {\eps{\bar{c}} +\eps{\bar{d}} -\eps{\bar{t}} -\eps{\bar{u}}} \Bigg)\tag{6} \end{align}$$

I indicated the vanishing elements in red.

If there are any remaining questions, don't hesitate and visit the associated chat. *(If this chat is no longer available, go to the main chat and ask there, please.)*

And because that was quite tedious, have a coffee stain to go:

Image courtesy of Roger Karlsson (http://www.free-photo-gallery.org/photos/coffee-stain/), obtained from Flickr.
*(I assume this is now quite close to my longest proof.)*