Does this formula have a rigorous meaning, or is it merely formal?

But there is a commutative ring available, along the lines of what Mariano says. If $k$ is a field and $V$ is a vector space, then $k \oplus V$ is a commutative ring: a scalar times a scalar, a scalar times a vector, and a vector times a scalar are all what you think they are. The only missing part is a vector times a vector, and you can just set that to zero. The dot product is then a special bilinear form on this algebra. In this formalism, I think that everything that you wrote makes sense.
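This construction is concrete enough to compute with. Here is a minimal sketch (the class name, the use of NumPy, and the permutation-sum determinant are my own choices, not from the answer) of the ring $\mathbb{R} \oplus \mathbb{R}^3$ with vector-times-vector set to zero; expanding the usual $3 \times 3$ determinant with the basis vectors in the first row then recovers the cross product:

```python
import itertools
import numpy as np

class RV:
    """Element s + v of the commutative ring k (+) V (here k = R, V = R^3),
    with the product of two vectors declared to be zero."""
    def __init__(self, s=0.0, v=(0.0, 0.0, 0.0)):
        self.s, self.v = float(s), np.asarray(v, dtype=float)

    def __add__(self, other):
        return RV(self.s + other.s, self.v + other.v)

    def __mul__(self, other):
        # scalar*scalar and scalar*vector as usual; vector*vector := 0
        return RV(self.s * other.s, self.s * other.v + other.s * self.v)

def perm_sign(p):
    """Sign of a permutation given as a tuple: (-1)^(number of inversions)."""
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p)) for j in range(i + 1, len(p)))

def det3(m):
    """Determinant of a 3x3 matrix over the ring, as a permutation sum."""
    total = RV()
    for p in itertools.permutations(range(3)):
        term = RV(perm_sign(p))
        for i in range(3):
            term = term * m[i][p[i]]
        total = total + term
    return total

b, c = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0)
basis_row = [RV(v=e) for e in np.eye(3)]       # i, j, k as ring elements
m = [basis_row, [RV(bi) for bi in b], [RV(ci) for ci in c]]
print(det3(m).v)                               # equals np.cross(b, c)
```

Note that the scalar part of the result is zero: each term of the expansion contains exactly one basis vector, so the determinant lands in the $V$ summand, as it should.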


Theo says in a comment that "even better", one should work over $\Lambda^*(V)$, the exterior algebra over $V$. The motivation is that this algebra is supercommutative. I considered mentioning this solution, and on reflection I really should have, because it arises in important formulas. For example, the Gauss formula for the linking number between two knots $K_1, K_2 \subseteq \mathbb{R}^3$ is: $$\mathrm{lk}(K_1,K_2) = \int_{K_1 \times K_2} \frac{\det \begin{bmatrix} \vec{x} - \vec{y} \\ d\vec{x} \\ d\vec{y} \end{bmatrix}}{4\pi |\vec{x} - \vec{y}|^3}$$ $$= \int_{K_1 \times K_2} \frac{\det \begin{bmatrix} x_1 - y_1 & x_2 - y_2 & x_3 - y_3 \\ dx_1 & dx_2 & dx_3 \\ dy_1 & dy_2 & dy_3 \end{bmatrix}}{4\pi |\vec{x} - \vec{y}|^3}.$$ The right way to write and interpret this formula is indeed as a determinant in the exterior algebra of differential forms. For one thing, it makes it easy to generalize Gauss's formula to higher dimensions.
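As a sanity check on Gauss's formula (not part of the original answer), one can discretize the double integral for an explicit Hopf link. The choice of curves, the step size, and the identification of the $3 \times 3$ determinant with a scalar triple product are my own; the sums approximate the two line integrals by the trapezoid rule:

```python
import numpy as np

# Two linked circles (a Hopf link): K1 in the xy-plane, K2 in the xz-plane
N = 200
s = np.linspace(0, 2 * np.pi, N, endpoint=False)
h = 2 * np.pi / N

x  = np.stack([np.cos(s), np.sin(s), np.zeros(N)], axis=1)        # K1(s)
dx = np.stack([-np.sin(s), np.cos(s), np.zeros(N)], axis=1) * h   # K1'(s) ds
y  = np.stack([1 + np.cos(s), np.zeros(N), np.sin(s)], axis=1)    # K2(t)
dy = np.stack([-np.sin(s), np.zeros(N), np.cos(s)], axis=1) * h   # K2'(t) dt

lk = 0.0
for i in range(N):
    r = x[i] - y                          # x - y against every point of K2
    # det [r; dx; dy] is the scalar triple product r . (dx x dy)
    num = np.einsum('nj,nj->n', r, np.cross(dx[i], dy))
    lk += np.sum(num / (4 * np.pi * np.linalg.norm(r, axis=1) ** 3))

print(round(lk))                          # +/-1 for a Hopf link
```

Since the integrand is smooth and periodic in both variables, the trapezoid rule converges rapidly, and the sum is already very close to $\pm 1$ (the sign depends on the orientations chosen for the two circles).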

However, supercommutative is not the same as commutative, and this type of determinant has fewer properties, and different properties, than a determinant over a commutative ring. Such a determinant has a broken symmetry: you get a different answer if you order the factors in each term by rows than if you order them by columns. (I am using row ordering.) Indeed, the row-ordered determinant can be non-zero even if it has repeated rows. To give two examples: the determinant in the generalized Gauss formula has repeated rows, and the standard volume form in $\mathbb{R}^n$ is $$\omega = \frac{\det ( d\vec{x}, d\vec{x}, \ldots, d\vec{x} )}{n!}.$$
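To make the broken symmetry concrete, here is a toy implementation of the exterior algebra (entirely my own encoding: elements are dicts mapping sorted tuples of generator indices for $dx_1, \dots, dx_n$ to coefficients), showing that the row-ordered determinant with two equal rows of $1$-forms is $2\,dx_1 \wedge dx_2$ rather than zero, in agreement with the volume-form formula for $n = 2$:

```python
import itertools

def wedge(a, b):
    """Product in the exterior algebra; elements are dicts mapping
    sorted tuples of anticommuting generator indices to coefficients."""
    out = {}
    for ka, ca in a.items():
        for kb, cb in b.items():
            if set(ka) & set(kb):
                continue                      # dx_i ^ dx_i = 0
            m, sign = list(ka) + list(kb), 1
            for i in range(len(m)):           # bubble sort, tracking the sign
                for j in range(len(m) - 1 - i):
                    if m[j] > m[j + 1]:
                        m[j], m[j + 1] = m[j + 1], m[j]
                        sign = -sign
            key = tuple(m)
            out[key] = out.get(key, 0) + sign * ca * cb
    return {k: c for k, c in out.items() if c}

def row_det(rows):
    """Row-ordered determinant: in each term the factors are multiplied
    in row order, top to bottom."""
    n = len(rows)
    total = {}
    for p in itertools.permutations(range(n)):
        sign = (-1) ** sum(p[i] > p[j]
                           for i in range(n) for j in range(i + 1, n))
        term = {(): sign}
        for i in range(n):
            term = wedge(term, rows[i][p[i]])
        for k, c in term.items():
            total[k] = total.get(k, 0) + c
    return {k: c for k, c in total.items() if c}

# In R^2: the row of 1-forms [dx_1, dx_2], repeated twice
dX = [{(1,): 1}, {(2,): 1}]
print(row_det([dX, dX]))     # {(1, 2): 2}: nonzero despite repeated rows
```

Dividing by $2!$ gives exactly $dx_1 \wedge dx_2$, the standard volume form, as in the formula above.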

Happily, for Dick's question, you can truncate the exterior algebra at degree 1, which is exactly what I did. This truncation is both supercommutative and commutative.


Back in the 19th century, when people experimented extensively with determinants, they might have interpreted the above definition of $B\times C$ in terms of quaternions. If $i$, $j$, and $k$ denote the imaginary basis elements of $\mathbb H$ and $${\mathbf x}=x_1i+x_2j+x_3k,$$ $${\mathbf y}=y_1i+y_2j+y_3k$$ are pure imaginary elements of $\mathbb H$, then the vector part $\Im(\mathbf{xy})$ of the Hamilton product $\mathbf{xy}$ is given by the determinant

$$\Im(\mathbf{xy})=\Im(\mathbf{x})\times \Im(\mathbf{y})=\begin{vmatrix} i & j & k \\ x_1 & x_2 & x_3 \\ y_1 & y_2 & y_3 \end{vmatrix}.$$
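This identity is easy to check numerically. A small sketch (the componentwise Hamilton product below is standard, but the helper name is mine): for pure imaginary quaternions, the vector part of $\mathbf{xy}$ is the cross product and the real part is $-\mathbf{x}\cdot\mathbf{y}$:

```python
import numpy as np

def hamilton(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) components."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

xv, yv = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])
prod = hamilton(np.concatenate([[0.0], xv]), np.concatenate([[0.0], yv]))
print(prod[1:])    # vector part: equals np.cross(xv, yv)
print(prod[0])     # real part: equals -np.dot(xv, yv)
```

So for pure imaginary $\mathbf{x}, \mathbf{y}$ one has $\mathbf{xy} = -\mathbf{x}\cdot\mathbf{y} + \mathbf{x}\times\mathbf{y}$, which packages the dot and cross products into a single associative multiplication.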

There is a note by Sir Arthur Cayley where he introduces the notion of a quaternion determinant. He mentions several identities of the form

$$ \begin{vmatrix} {\mathbf x} & {\mathbf x} \\ {\mathbf y} & {\mathbf y} \end{vmatrix} = -2 \begin{vmatrix} i & j & k \\ x_1 & x_2 & x_3 \\ y_1 & y_2 & y_3 \end{vmatrix} $$ and $$ \begin{vmatrix} {\mathbf x} & {\mathbf x} & {\mathbf x} \\ {\mathbf y} & {\mathbf y} & {\mathbf y} \\ {\mathbf z} & {\mathbf z} & {\mathbf z} \end{vmatrix} = -2 \begin{vmatrix} 3 & i & j & k \\ x_0 & x_1 & x_2 & x_3 \\ y_0 & y_1 & y_2 & y_3 \\ z_0 & z_1 & z_2 & z_3 \end{vmatrix} $$ where $\mathbf x$, $\mathbf y$, $\mathbf z$ are arbitrary quaternions $${\mathbf x}=x_0+x_1i+x_2j+x_3k, \mbox{ etc.}$$


I guess I'm not sure about the difference between "rigorous" and "formal". To me, $|\cdots|$ can be viewed as defining the exterior product of $n$ vectors in an $n$-dimensional vector space. So if you leave the first row blank, then what you have is a linear functional: it sends a vector to the exterior product of that vector with the $n-1$ given ones. If you fill in the blank row with the basis (that you're writing everything with respect to) $e_1, \dots, e_n$, then clearly taking the dot product of the resulting vector with an arbitrary $n$th vector gives the exterior product of all $n$ vectors. It follows that this vector is just the Hodge star of the exterior product of the original $n-1$ vectors. But this is just a formal discussion, right?
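That description can be turned into a direct computation. A sketch (the function name and the cofactor-expansion route are my own): fill the blank row with the standard basis and expand along it, which gives the Hodge star of the wedge of $n-1$ vectors in $\mathbb{R}^n$; for $n = 3$ this reduces to the ordinary cross product:

```python
import numpy as np

def blank_row_vector(rows):
    """Expand det [e_1 ... e_n; rows] along the (blank) first row:
    component i is (-1)^i times the minor with column i deleted.
    Equivalently, the Hodge star of the wedge of the n-1 rows in R^n."""
    rows = np.asarray(rows, dtype=float)
    n = rows.shape[1]
    return np.array([
        (-1) ** i * np.linalg.det(np.delete(rows, i, axis=1))
        for i in range(n)
    ])

b, c = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
v = blank_row_vector([b, c])
print(v)                                   # equals np.cross(b, c)
print(v @ np.array(b), v @ np.array(c))    # both ~0: v is orthogonal to b and c
```

The orthogonality in the last line is the repeated-row property in disguise: dotting $v$ with one of the input vectors fills the blank row with a copy of an existing row, so the determinant vanishes. The same function works unchanged for $n - 1$ vectors in $\mathbb{R}^n$ for any $n$.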