A Determinantal Identity.

TL;DR:

In this note, we discuss a determinantal identity and proceed to prove it. En route, we also show that the square of a number of the form $(a^3 + b^3 + c^3 - 3 a b c)$ is also of the same form.

Target Audience:

High-school students, puzzle enthusiasts.

Prerequisites:

Familiarity with determinants and their basic properties; the complex cube roots of unity also make a brief appearance.

Background:

This identity in determinants appears as a problem in Hall & Knight's algebra textbook, Higher Algebra (problem 290 in the Miscellaneous Examples section, on page 522). The book indicates that the problem originally appeared in a mathematics exam at Trinity College, Cambridge.

As a corollary, we are able to show that the square of a number of the form $(a^3 + b^3 + c^3 - 3 a b c)$ is again of the same form.

Why:

The Story:

For this problem, we will do the following: state the original identity, understand its LHS (Subproblem 1) and its RHS (Subproblem 2), try to pull the two closer together (Subproblem 3), and then, after a strategy overhaul via the cube roots of unity, assemble the final proof and a corollary.

The original problem

Let us state the original identity that appears in Hall & Knight:

Show that
$\mathrm{det}\begin{pmatrix}yz - x^2&zx - y^2&xy - z^2\\ zx - y^2&xy - z^2&yz - x^2\\xy - z^2&yz - x^2&zx - y^2\end{pmatrix} = \mathrm{det}\begin{pmatrix}r^2&u^2&u^2\\u^2&r^2&u^2\\u^2&u^2&r^2\end{pmatrix}\tag{1}$
where $r^2 = x^2 + y^2 + z^2$ and $u^2 = xy + yz + zx$.
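
As a quick sanity check of (1), here is a minimal sketch in Python using sympy (my own script, purely illustrative); it verifies the identity symbolically but, of course, explains nothing:

from sympy import symbols, Matrix, expand

x, y, z = symbols('x y z')
r2 = x**2 + y**2 + z**2   # r^2
u2 = x*y + y*z + z*x      # u^2

lhs = Matrix([[y*z - x**2, z*x - y**2, x*y - z**2],
              [z*x - y**2, x*y - z**2, y*z - x**2],
              [x*y - z**2, y*z - x**2, z*x - y**2]]).det()
rhs = Matrix([[r2, u2, u2],
              [u2, r2, u2],
              [u2, u2, r2]]).det()

print(expand(lhs - rhs))  # prints 0, confirming identity (1)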

Note that this can surely be proven by painstakingly expanding all the determinants and multiplying all the products out; but that is not the solution we are aspiring to. We are looking for a cleaner, methodical solution that will hopefully shed some light on why such an identity is true.

The very first thing to do upon seeing an identity of this sort is to appreciate its sheer symmetry. Such mathematical beauty never fails to remind me of Blake's immortal lines in "The Tyger":

Tyger Tyger, burning bright,
In the forests of the night;
What immortal hand or eye,
Could frame thy fearful symmetry?


Armed with this Tyger, let us proceed to tease apart the symmetry of the identity above.

In order to understand the symmetry, let us consider the determinant on the LHS of the identity. Substituting variables $a = yz - x^2, b = zx - y^2, c = xy - z^2$, the LHS can be seen to be:

$\mathrm{det}\begin{pmatrix}a&b&c\\b&c&a\\c&a&b\end{pmatrix}$.

Subproblem 1: Understand the LHS

To warm up on the problem, we will first show the following, which we call the

Basic Identity:

$\mathrm{det}\begin{pmatrix}a&b&c\\b&c&a\\c&a&b\end{pmatrix} = -(a^3 + b^3 + c^3 - 3 abc)$.

Let us take a moment to admire how pretty this factoid is. The form on the RHS (sans the sign) i.e. $a^3 + b^3 + c^3 - 3 abc$ is quite common in mathematics (inequalities etc.). For instance, the AM-GM inequality for 3 quantities is equivalent to the statement that $a^3 + b^3 + c^3 - 3 abc \geqslant 0$ (for $a, b, c \geqslant 0$).
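
To see this equivalence (a small worked step of my own, not from the original text), put $a = \sqrt[3]{p}$, $b = \sqrt[3]{q}$, $c = \sqrt[3]{r}$ for non-negative $p, q, r$; then
$a^3 + b^3 + c^3 - 3abc \geqslant 0 \iff p + q + r \geqslant 3\sqrt[3]{pqr} \iff \frac{p + q + r}{3} \geqslant \sqrt[3]{pqr},$
which is exactly the AM-GM inequality for the three quantities $p, q, r$.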

How do we prove the Basic Identity? Well - it is not too hard to compute the determinant by (almost) brute force, but one may simplify it first using the familiar row and column properties of determinants, as follows. Replace the first column by the sum of all the columns of the matrix (recall that this does not change the value of the determinant), and then take the common $(a + b + c)$ factor out.

$\mathrm{det}\begin{pmatrix}a&b&c\\b&c&a\\c&a&b\end{pmatrix} = \mathrm{det}\begin{pmatrix}a + b + c&b&c\\a + b + c&c&a\\a + b + c&a&b\end{pmatrix} = (a + b + c) \mathrm{det}\begin{pmatrix}1&b&c\\1&c&a\\1&a&b\end{pmatrix}$.

At this point, I leave it to the reader to complete the proof - noting also that $a^3 + b^3 + c^3 - 3 abc = (a + b + c)(a^2 + b^2 + c^2 - ab - bc - ca)$.
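
For readers who would like that last step spelled out (feel free to skip this and work it out yourself): expanding the smaller determinant along its first column gives
$\mathrm{det}\begin{pmatrix}1&b&c\\1&c&a\\1&a&b\end{pmatrix} = (bc - a^2) - (b^2 - ac) + (ab - c^2) = -(a^2 + b^2 + c^2 - ab - bc - ca),$
so the full determinant equals $-(a + b + c)(a^2 + b^2 + c^2 - ab - bc - ca) = -(a^3 + b^3 + c^3 - 3abc)$, as claimed.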

So this gives us some "contextual" information about the LHS.

How about the RHS?

Subproblem 2: Understand the RHS

Here again, guided by considerations of symmetry, we will claim that:

$\mathrm{det}\begin{pmatrix}r^2&u^2&u^2\\u^2&r^2&u^2\\u^2&u^2&r^2\end{pmatrix} = {\mathrm{det}\begin{pmatrix}x&y&z\\z&x&y\\y&z&x\end{pmatrix}}^2$ where $r^2 = x^2 + y^2 + z^2$ and $u^2 = xy + yz + zx$.

How did we jump to such a claim? Well, one has to be familiar with the following multiplicative property of determinants: $\mathrm{det}(AB) = \mathrm{det}(A)\mathrm{det}(B)$. Knowing this, there is a conscious effort to mould the determinant on the left of this claim into something simpler (in this case, the square of something simpler).

Now, we notice that with $A = \begin{pmatrix}x&y&z\\z&x&y\\y&z&x\end{pmatrix}$ and $B = A^{T} = \begin{pmatrix}x&z&y\\y&x&z\\z&y&x\end{pmatrix}$ (the transpose, which has the same determinant as $A$), we have that $AB = A A^{T} = \begin{pmatrix}x^2 + y^2 + z^2&xy + yz + zx&zx + xy + yz\\zx + xy + yz &x^2 + y^2 + z^2&yz + zx + xy \\yz + zx + xy&zx + xy + yz &x^2 + y^2 + z^2\end{pmatrix}$.
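
If you want to see this product without grinding through the multiplication by hand, here is a throwaway sympy sketch (again my own, purely illustrative):

from sympy import symbols, Matrix, expand

x, y, z = symbols('x y z')
A = Matrix([[x, y, z], [z, x, y], [y, z, x]])

print(A * A.T)                               # r^2 on the diagonal, u^2 everywhere else
print(expand(A.det()**2 - (A * A.T).det()))  # prints 0: det(A A^T) = det(A)^2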

Also note that ${\mathrm{det}\begin{pmatrix}x&y&z\\z&x&y\\y&z&x\end{pmatrix}}^2$ $={\mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix}}^2$.

What happened here? The second and third rows of the matrix got swapped - while this changes the sign of the determinant, there is a square on top, so the squares of the two determinants are still the same.

Summarize so far:

Having simplified the LHS and the RHS, let us take stock of what the problem wants us to prove. We have now transformed (1), and what remains is to show that:
$\mathrm{det}\begin{pmatrix}a&b&c\\b&c&a\\c&a&b\end{pmatrix} = {\mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix}}^2 \tag{1'}$,
where $a = yz - x^2, b = zx - y^2, c = xy - z^2$.
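
As a quick numerical sanity check of (1') (my own example, using the Basic Identity to evaluate both determinants): take $x = 1, y = 2, z = 3$, so that $a = yz - x^2 = 5$, $b = zx - y^2 = -1$, $c = xy - z^2 = -7$. The LHS is then $-(a^3 + b^3 + c^3 - 3abc) = -(125 - 1 - 343 - 105) = 324$, while the RHS is ${(-(1 + 8 + 27 - 18))}^2 = (-18)^2 = 324$.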

Subproblem 3: Pulling the LHS and RHS closer to each other

Now that we have a good handle on both the RHS as well as the LHS, the task of pulling them closer to each other appears less daunting.

In fact, the proof in Subproblem 2 gives us the following

Approach (that does not work):

What we want:
We would ideally like to manipulate the two matrices $\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix}$ being multiplied so that neither determinant changes, while the product of the two matrices becomes exactly what we want, i.e. the matrix $\begin{pmatrix}yz - x^2&zx - y^2&xy - z^2\\ zx - y^2&xy - z^2&yz - x^2\\xy - z^2&yz - x^2&zx - y^2\end{pmatrix}$.

Why it might work:
After all, such a product (see Subproblem 2 above) did give us $\begin{pmatrix}x^2 + y^2 + z^2&xy + yz + zx&zx + xy + yz\\zx + xy + yz &x^2 + y^2 + z^2&yz + zx + xy \\yz + zx + xy&zx + xy + yz &x^2 + y^2 + z^2\end{pmatrix}$, right?

The main crux is how to get the negative signs, and to get them at the appropriate entries. Here's an example manipulation.

Note that going from the matrix $\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix}$ to the matrix $\begin{pmatrix}x&y&z\\z&x&y\\y&z&x\end{pmatrix}$ changes the sign of the determinant - this is because the $2^{nd}$ and the $3^{rd}$ rows have been swapped. We can restore the sign of the determinant by negating the second row:

$\mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix} = \mathrm{det}\begin{pmatrix}x&y&z\\-z&-x&-y\\y&z&x\end{pmatrix}$

Taking $ A = \begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix}$ and $B = \begin{pmatrix}x&y&z\\-z&-x&-y\\y&z&x\end{pmatrix}$ we see that $AB = \begin{pmatrix} x^2&z^2&2xz- y^2\\ 2xy - z^2&y^2&x^2\\ y^2&2yz-x^2&z^2\end{pmatrix}$

This appears somewhat promising, enough for us to try many different substitutions and tricks using the determinants. At times in this pursuit, it felt like we were really close to the actual form desired. However, despite numerous methodical efforts, I was unable to prove the identity via this approach.

If someone succeeds using this approach, I would very much like to know in the comments below.

For now, it was time to rethink our strategy.

Strategy Overhaul

Since most of the current problem deals with the properties of the interesting matricial form $\begin{pmatrix}a&b&c\\b&c&a\\c&a&b\end{pmatrix}$, let us take another look at the Basic Identity (restated in terms of $x, y, z$ in place of $a, b, c$):
$\mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix} = -(x^3 + y^3 + z^3 - 3 xyz) = (-x)^3 + (-y)^3 + (-z)^3 - 3(-x)(-y)(-z)$.

and fiddle around with other proofs of it. We are backtracking, but this is not a deterministic process in mathematics - there is always an element of luck in backtracking to the right place.

This is where complex numbers enter the stage! We will view this determinant as a function of the three variables $x, y$ and $z$; in fact it is a homogeneous polynomial of degree $3$. So if we are able to find $3$ distinct linear factors of this determinantal polynomial, we would essentially be done: the product of the three factors would equal the determinant up to a constant (which will turn out to be just a sign)!

Well, we already know one factor of the polynomial, namely $(x + y + z)$; see the proof above (with the necessary mapping of variables $a, b, c \rightarrow x, y, z$). Now, let $\omega$ be a complex cube root of unity, so that the cube roots of $1$ are $1, \omega, \omega^2$ (where $1$ is the only real root, while $\omega, \omega^2$ are complex). Recall the two properties of these cube roots that we will use repeatedly: $\omega^3 = 1$ and $1 + \omega + \omega^2 = 0$.

Another way to prove the Basic Identity is to prove that
$\mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix} = -(x + y + z) (x + y\omega + z\omega^2) (x + y\omega^2 + z\omega) \tag{2}$
and recall that
$x^3 + y^3 + z^3 - 3 xyz = (x + y + z) (x + y\omega + z\omega^2) (x + y\omega^2 + z\omega) \tag{3}.$

Compiling the results, we have: $(x + y + z) (x + y\omega + z\omega^2) (x + y\omega^2 + z\omega) = - \mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix} = \mathrm{det}\begin{pmatrix}-x&-y&-z\\-y&-z&-x\\-z&-x&-y\end{pmatrix} \tag{4}$.
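
If you would like to check (3) mechanically before tackling (2), here is a small sympy sketch (again my own, purely illustrative), with $\omega$ written out explicitly as $-\tfrac{1}{2} + \tfrac{\sqrt{3}}{2}i$:

from sympy import symbols, sqrt, I, Rational, expand

x, y, z = symbols('x y z')
w = Rational(-1, 2) + sqrt(3)*I/2   # a primitive complex cube root of unity

product = (x + y + z)*(x + y*w + z*w**2)*(x + y*w**2 + z*w)
print(expand(product - (x**3 + y**3 + z**3 - 3*x*y*z)))  # prints 0, confirming (3)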

I invite the reader to attempt proving (2) above; the proof involves some cute manipulation of determinants. For completeness, the proof is given below.

Note that the determinant in (2) has $(x + y + z)$ as a factor, as we saw earlier. So proving (2) boils down to proving that the other two expressions on the RHS are also factors of the LHS. Thus, for instance, we want to prove that the complex expression $(x + y\omega + z\omega^2)$ is also a factor of the above determinantal polynomial. To do this, multiply the second column of the matrix by $\omega$, the third column by $\omega^2$ and (to balance it out) divide by $\omega \times \omega^2 = \omega^3 = 1$. We get:

$ \mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix} = \mathrm{det}\begin{pmatrix}x&y\omega&z\omega^2\\y&z\omega&x\omega^2\\z&x\omega&y\omega^2\end{pmatrix}$.

Again, using $\omega^3 = 1$ in the RHS above, and now multiplying the second row by $\omega$ and the third row by $\omega^2$, we see $\mathrm{det}\begin{pmatrix}x&y\omega&z\omega^2\\y&z\omega&x\omega^2\\z&x\omega&y\omega^2\end{pmatrix} = \mathrm{det}\begin{pmatrix}x&y\omega&z\omega^2\\y\omega&z\omega^2&x\\z\omega^2&x&y\omega\end{pmatrix}$.

Now, as before, replace the first column by the sum of all the columns: every row of the last matrix sums to $(x + y\omega + z\omega^2)$, so this expression comes out as a common factor. Hence $(x + y\omega + z\omega^2)$ is also a factor of the determinantal polynomial. Analogously, we may see that $(x + y\omega^2 + z\omega)$ is a factor too, and it is easy to check that the three linear factors are pairwise distinct. We can thereby conclude that the product of all three of these factors must divide the determinant; since both are of degree $3$, they can differ only by a constant. Now, checking the coefficient of $x^3$ on both sides (or set $y = z = 0, x = 1$, and evaluate the expressions on both sides), we arrive at the conclusion that:

$ \mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix} = -(x + y + z) (x + y\omega + z\omega^2) (x + y\omega^2 + z\omega)$.

The Final Solution

Although it may not feel like it, we are really close to our final solution armed with these identities (2)-(4). From identity (2), we can claim that $ {\mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix}}^2 = {(x + y + z)}^2 {(x+ y\omega + z\omega^2)}^2 {(x + y\omega^2 + z\omega)}^2$.

Now regroup the factors of the RHS above as follows: $[(x + y + z)(x + y\omega + z\omega^2)][(x + y\omega + z\omega^2)(x + y\omega^2 + z\omega)] [(x + y\omega^2 + z\omega)(x + y + z)]$.

Consider the first square bracket: $[(x + y + z)(x + y\omega + z\omega^2)]$. Expanding this out and simplifying using that $(1 + \omega + \omega^2) = 0$, we get

$(x + y + z)(x + y\omega + z\omega^2)$ $= (x^2 + y^2\omega + z^2\omega^2) + xy(1 + \omega) +xz(1 + \omega^2) + yz(\omega + \omega^2) $
$= (x^2 + y^2\omega + z^2\omega^2) - xy \omega^2 - xz\omega - yz$
$= (x^2 - yz) + (y^2 - zx)\omega + (z^2 - xy)\omega^2 $.

Now the pattern is clear: the reader is invited to check the other two square brackets to see that the products yield the results as indicated below.

Thus we have that $ {\mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix}}^2 $
$= [(x^2 - yz) + (y^2 - zx)\omega + (z^2 - xy)\omega^2][(x^2 - yz) + (y^2 - zx)\omega^2 + (z^2 - xy)\omega] [(x^2 - yz) + (y^2 - zx) + (z^2 - xy)]$

Using identity (4) above (with $x^2 - yz$, $y^2 - zx$, $z^2 - xy$ in place of $x$, $y$, $z$)

$=\mathrm{det}\begin{pmatrix}yz - x^2&zx - y^2&xy - z^2\\ zx - y^2&xy - z^2&yz - x^2\\xy - z^2&yz - x^2&zx - y^2\end{pmatrix}$
thereby proving (1'), and hence (1).

QED.

Corollary:

The identity $ {\mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix}}^2 = \mathrm{det}\begin{pmatrix}yz - x^2&zx - y^2&xy - z^2\\ zx - y^2&xy - z^2&yz - x^2\\xy - z^2&yz - x^2&zx - y^2\end{pmatrix}$ yields that ${(x^3 + y^3 + z^3 - 3 xyz)}^2 = (\alpha^3 + \beta^3 + \gamma^3 - 3\alpha\beta\gamma)$ where $\alpha = x^2 - yz, \beta = y^2 - zx, \gamma = z^2 - xy$.
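
To spell the deduction out (a short step of my own, using the Basic Identity on both sides): the left-hand side equals ${\left(-(x^3 + y^3 + z^3 - 3xyz)\right)}^2 = {(x^3 + y^3 + z^3 - 3xyz)}^2$, while the right-hand side, by the Basic Identity applied to the entries $a = yz - x^2 = -\alpha$, $b = zx - y^2 = -\beta$, $c = xy - z^2 = -\gamma$, equals
$-\left((-\alpha)^3 + (-\beta)^3 + (-\gamma)^3 - 3(-\alpha)(-\beta)(-\gamma)\right) = \alpha^3 + \beta^3 + \gamma^3 - 3\alpha\beta\gamma.$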

Thus, the square of a number of the form $x^3 + y^3 + z^3 - 3 xyz$ is also of the same form!
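
A concrete instance (with the same numbers as in our earlier sanity check): for $x = 1, y = 2, z = 3$ we have $x^3 + y^3 + z^3 - 3xyz = 1 + 8 + 27 - 18 = 18$, and indeed $18^2 = 324 = \alpha^3 + \beta^3 + \gamma^3 - 3\alpha\beta\gamma$ with $\alpha = x^2 - yz = -5$, $\beta = y^2 - zx = 1$, $\gamma = z^2 - xy = 7$, since $-125 + 1 + 343 + 105 = 324$.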

Takeaways

The circulant-style determinant $\mathrm{det}\begin{pmatrix}x&y&z\\y&z&x\\z&x&y\end{pmatrix}$ factors neatly over the cube roots of unity as $-(x + y + z)(x + y\omega + z\omega^2)(x + y\omega^2 + z\omega)$. This single factorization proves both the Trinity College identity (1) and the fact that the square of a number of the form $x^3 + y^3 + z^3 - 3xyz$ is again of that form.

Created 23 March 2017.