Why isn't 'intuitive' vector multiplication useful?

The componentwise product is occasionally useful; in fact, it has a name (the Hadamard product). But it is less fundamentally important than the dot product because it does not have a nice geometric interpretation, it is not invariant under rotations and translations, it does not generalize nicely to an abstract vector space, and probably for other reasons too. One reason the dot product is so useful is that we constantly want to write a vector as a linear combination of some other vectors, and the dot product helps us do exactly that (at least in the case where the other vectors form an orthonormal set, when each coefficient is simply a dot product).
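
To make that last point concrete, here is a minimal NumPy sketch (the particular orthonormal basis and the vector $v$ are arbitrary choices, purely for illustration): for an orthonormal set $e_1, e_2$, the dot products $v \cdot e_i$ are exactly the coefficients needed to write $v$ as a linear combination, while the Hadamard product plays no comparable role.

```python
import numpy as np

# An orthonormal basis of R^2 (the standard basis rotated by 45 degrees;
# any orthonormal set would do, this one is just for illustration).
e1 = np.array([1.0, 1.0]) / np.sqrt(2)
e2 = np.array([-1.0, 1.0]) / np.sqrt(2)

v = np.array([3.0, -2.0])

# The dot product reads off the coefficients of v in this basis ...
c1, c2 = v @ e1, v @ e2

# ... and reassembling the linear combination recovers v exactly.
print(np.allclose(c1 * e1 + c2 * e2, v))   # True

# The Hadamard (componentwise) product gives nothing comparable here.
print(v * e1)
```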


If one equips $F^n$ with the usual vector addition and component-wise multiplication, then it becomes a bona fide commutative ring, namely the direct product $F\times F\times\cdots\times F$ of $n$ copies of the ring (in fact field) $F$. If you also want scalar multiplication by elements of $F$ in the picture, then this ring becomes an $F$-algebra.
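
As a small sanity check of that ring structure (a sketch only, taking $F=\mathbb{R}$, $n=3$, and NumPy's elementwise `*` as the component-wise product):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
c = np.array([7.0, 8.0, 9.0])
one = np.ones(3)   # the multiplicative identity (1, 1, ..., 1)

# Commutative, associative, unital, and distributive over addition:
# the axioms of a commutative ring.
print(np.allclose(a * b, b * a))
print(np.allclose((a * b) * c, a * (b * c)))
print(np.allclose(a * one, a))
print(np.allclose(a * (b + c), a * b + a * c))

# Compatibility with scalar multiplication by elements of F,
# which is what makes this ring an F-algebra.
s = 2.5
print(np.allclose((s * a) * b, s * (a * b)))
```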

So this approach is not unheard of. It just does not sit well in the context where linear algebra is studied.

One of the main characteristics of linear algebra is the high degree of symmetry that the objects it studies (namely vector spaces) possess. In the case of a finite-dimensional space, one can choose almost any collection of the correct number (the dimension $d$) of vectors and use it as a basis to establish an isomorphism with a standard vector space $F^d$; it suffices to avoid the linearly dependent families, which (among families of $d$ vectors) are quite rare. The fact that so many choices give essentially the same result, and that the space therefore has a huge number of symmetries (isomorphisms of the space onto itself), is really a distinguishing feature of linear algebra. If you weaken the condition of working over a field to working over a ring (even a very nice one like a PID), then you lose this overwhelming symmetry (though some symmetry does remain), and this radically changes the subject. Even groups, which are the essence of symmetry itself, do not have the kind of symmetry that vector spaces have.
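
The rarity of linearly dependent families is easy to see numerically; here is a rough sketch (the dimension, distribution, and trial count are arbitrary choices): a family drawn from a continuous distribution is linearly dependent with probability zero, so essentially every random family of $d$ vectors in $\mathbb{R}^d$ is a basis.

```python
import numpy as np

rng = np.random.default_rng(0)
d, trials = 5, 10_000

# Draw many random families of d vectors in R^d (as the rows of a d x d
# matrix) and count how often they fail to be a basis, i.e. have rank < d.
dependent = sum(
    np.linalg.matrix_rank(rng.standard_normal((d, d))) < d
    for _ in range(trials)
)
print(f"linearly dependent families: {dependent} out of {trials}")  # expect 0
```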

(On the other hand, note that when working over a skew field, also known as a division ring, the symmetry properties are preserved, and it is quite reasonable for a beginning course in Linear Algebra to work over skew fields. But as far as I know nobody except Bourbaki or R. Godement actually does that.)

So here is why the component-wise product is really not useful in linear algebra: it completely breaks the symmetry property of vector spaces. Unlike nonzero vectors in $F^n$, nonzero elements of the ring $F\times F\times\cdots\times F$ come in many kinds (there are, for instance, invertible elements, zero divisors, and idempotents), and consequently there are very few symmetries of the ring $F\times F\times\cdots\times F$ (though permutations of the factors still count). By comparison, the introduction of a scalar product does break the symmetry somewhat, but to a lesser extent: instead of the freedom of all bases, one has to restrict to orthonormal bases to get the same results among all choices. And in addition, in some applications (notably from physics) the symmetry is already reduced in the situation one intends to study (nature singles out a specific, meaningful inner product).
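
To see these different kinds of elements in the smallest interesting case (taking $F=\mathbb{R}$ and $n=2$ just for concreteness, and writing $\ast$ for the componentwise product):
$$(2,3)\ast(\tfrac12,\tfrac13)=(1,1), \qquad (1,0)\ast(0,1)=(0,0), \qquad (1,0)\ast(1,0)=(1,0),$$
so $(2,3)$ is invertible, $(1,0)$ and $(0,1)$ are zero divisors, and $(1,0)$ is moreover idempotent. No symmetry of the ring can send $(1,0)$ to an invertible element such as $(1,1)$, since ring automorphisms preserve units and zero divisors; this is the loss of symmetry described above.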


Vector addition is useful because it has physical and geometric interpretations (like the combination of two or more forces in physics and the "parallelogram rule"). The dot product is useful because it is related to length and angle, through the formulas $\|v\|^2 = v \cdot v$ and $\|u\|\,\|v\|\cos \theta = u \cdot v$. The cross product is useful because it produces orthogonal vectors (and it also has a geometric interpretation -- $\|u \times v\|$ is the area of the parallelogram spanned by $u$ and $v$).
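
A quick numerical check of those identities (a sketch using NumPy; the specific vectors are arbitrary picks, chosen so that the angle and the area are known in advance):

```python
import numpy as np

# u and v make an angle of 45 degrees and span a parallelogram of area 1,
# which gives independent values to check the formulas against.
u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0])

# ||v||^2 = v . v
print(np.isclose(np.linalg.norm(v) ** 2, v @ v))                     # True

# u . v = ||u|| ||v|| cos(theta), with theta = 45 degrees here
print(np.isclose(u @ v,
                 np.linalg.norm(u) * np.linalg.norm(v) * np.cos(np.pi / 4)))  # True

# u x v is orthogonal to both factors, and its length is the parallelogram area.
w = np.cross(u, v)
print(np.isclose(w @ u, 0.0), np.isclose(w @ v, 0.0))                # True True
print(np.isclose(np.linalg.norm(w), 1.0))                            # True
```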

Does the multiplication you're describing have any interpretations like these? I would guess not.