Is hermiticity a basis-dependent concept?

The relation $$ \langle Ay | x \rangle = \langle y | A x \rangle \text{ for all } x,y \in \text{Domain of } A\tag1$$ makes no reference to any basis at all, so it is indeed basis-independent.
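In finite dimensions this is easy to sanity-check numerically. Here is a minimal sketch (assuming numpy; `np.vdot` conjugates its first argument, matching an inner product that is conjugate-linear in the first slot):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random Hermitian matrix: B + B^dagger is Hermitian by construction.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T

# Two arbitrary vectors (in finite dimensions the domain is all of C^4).
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# Check <Ay, x> == <y, Ax> up to floating-point error.
print(np.isclose(np.vdot(A @ y, x), np.vdot(y, A @ x)))  # True
```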


In fact, this definition, which seems pretty strange when you first meet it, arises precisely out of a desire to make things basis-independent. The particular observation that sparks the definition is this:

Let $V$ be a complex vector space with inner product $⟨·,·⟩$, let $\beta=\{v_1,\ldots,v_n\}$ be an orthonormal basis for $V$, and let $A:V\to V$ be a linear operator with matrix representation $A_{ij}$ over $\beta$. If this matrix representation is hermitian, i.e. if $$A_{ji}^*=A_{ij}\tag2$$ holds when $A$ is represented over any single orthonormal basis, then $(2)$ holds for all such orthonormal bases.
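To illustrate this observation numerically (a sketch, assuming numpy): the columns of any unitary matrix $Q$ form an orthonormal basis, and $Q^\dagger A Q$ is the representation of the same operator over that basis.

```python
import numpy as np

rng = np.random.default_rng(1)

# A Hermitian matrix, i.e. the representation over the standard basis.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T

# QR of a random complex matrix yields a random unitary Q.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))

# Representation of the same operator over the new orthonormal basis.
A_new = Q.conj().T @ A @ Q

# Hermiticity survives the change of orthonormal basis.
print(np.allclose(A_new, A_new.conj().T))  # True
```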

(Similarly, for a real vector space simply remove the complex conjugate.)

Now this is a weird property: it makes explicit mention of a basis, and yet it is basis-independent. Surely there must be some invariant way to define this property without any reference to a basis at all? Well, yes: it's the original statement in $(1)$.


To see how we build the invariant statement out of the matrix-based property, it's important to keep in mind what the matrix elements are: they are the coefficients over $\beta$ of the action of $A$ on that basis, i.e. they let us write $$ Av_j = \sum_i A_{ij}v_i.$$ Moreover, in an inner product space, the coefficients of a vector on any orthonormal basis are easily found to be the inner products of the vector with the basis: if $v=\sum_j c_j v_j$, then taking the inner product of $v$ with $v_i$ gives you $$\langle v_i,v\rangle = \sum_j c_j \langle v_i,v_j\rangle = \sum_j c_j \delta_{ij} = c_i,$$ which then means that you can always write $$v=\sum_i \langle v_i,v \rangle v_i.$$ (Note that if $V$ is a complex inner product space I'm taking $⟨·,·⟩$ to be linear in the second component and conjugate-linear in the first one.)
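This coefficient-extraction identity is also easy to check numerically (a sketch, assuming numpy; the columns of a random unitary $Q$ play the role of the orthonormal basis $\{v_i\}$):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Columns of Q form a random orthonormal basis {v_1, ..., v_n}.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

v = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# c_i = <v_i, v>; np.vdot conjugates its first argument.
c = np.array([np.vdot(Q[:, i], v) for i in range(n)])

# Reassemble: v = sum_i <v_i, v> v_i.
print(np.allclose(Q @ c, v))  # True
```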

If we then apply this to the action of $A$ on the basis, we arrive at $$ Av_j = \sum_i A_{ij}v_i = \sum_i \langle v_i, Av_j\rangle v_i, \quad\text{i.e.}\quad A_{ij} = \langle v_i, Av_j\rangle,$$ since the matrix coefficients are unique. We have, then, a direct relation between matrix elements and inner products, and this looks particularly striking when we use this language to rephrase our property $(2)$ above: the matrix for $A$ over $\beta$ is hermitian if and only if $$ A_{ji}^* = \langle v_j, Av_i\rangle^* = \langle v_i, Av_j\rangle = A_{ij}, $$ and if we use the conjugate symmetry $\langle u,v\rangle^* = \langle v,u\rangle$ of the inner product, this reduces to $$ \langle Av_i, v_j\rangle = \langle v_i, Av_j\rangle. \tag 3 $$ Now, here is where the magic happens: this expression is exactly the same as the invariant property $(1)$ that we wanted, only specialized to $x,y$ set to members of the given basis. This means, for one, that $(1)$ implies $(2)$, so that's one half of the equivalence done.
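Here is the same dictionary between matrix elements and inner products in code (a sketch, under the same numpy conventions as above):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

# A Hermitian operator and a random orthonormal basis (columns of Q).
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

# A_ij = <v_i, A v_j>, assembled element by element.
A_rep = np.array([[np.vdot(Q[:, i], A @ Q[:, j]) for j in range(n)]
                  for i in range(n)])

# (2): the representation over this basis is again hermitian.
print(np.allclose(A_rep, A_rep.conj().T))  # True

# (3): <A v_i, v_j> = <v_i, A v_j> on basis vectors.
print(np.isclose(np.vdot(A @ Q[:, 0], Q[:, 1]),
                 np.vdot(Q[:, 0], A @ Q[:, 1])))  # True
```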

In addition to this, there's a second bit of magic we need to use: the equation in $(3)$ is completely (bi)linear in both of the basis vectors involved, and this immediately means that it extends to any two vectors in the space. This is a bit of a heuristic statement, but it is easy to implement: if $x=\sum_j x_j v_j$ and $y=\sum_i y_i v_i$, then we have \begin{align} \langle A y, x\rangle & = \left\langle A \sum_i y_i v_i, \sum_j x_j v_j\right\rangle && \\ & = \sum_i \sum_j y_i^* x_j \langle A v_i, v_j\rangle &&\text{by linearity} \\ & = \sum_i \sum_j y_i^* x_j \langle v_i, Av_j\rangle &&\text{by }(3) \\ & = \left\langle \sum_i y_i v_i, A\sum_j x_j v_j\right\rangle &&\text{by linearity} \\ & = \langle y, A x\rangle,&& \end{align} and this shows that you can directly build the invariant statement $(1)$ out of its restricted-to-a-basis version, $(3)$, which is itself a direct rephrasing of the matrix hermiticity condition $(2)$.
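The double sum in this derivation can be traced step by step numerically (again a sketch, assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4

B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T  # Hermitian operator
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

# Arbitrary vectors expanded over the basis: x = sum_j x_j v_j, y = sum_i y_i v_i.
xc = rng.standard_normal(n) + 1j * rng.standard_normal(n)
yc = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x, y = Q @ xc, Q @ yc

# The middle line of the derivation: sum_ij y_i^* x_j <v_i, A v_j>.
double_sum = sum(yc[i].conj() * xc[j] * np.vdot(Q[:, i], A @ Q[:, j])
                 for i in range(n) for j in range(n))

# It equals both <Ay, x> and <y, Ax>, reproducing (1).
print(np.isclose(double_sum, np.vdot(A @ y, x)))  # True
print(np.isclose(double_sum, np.vdot(y, A @ x)))  # True
```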

Pretty cool, right?


The definition that you have cited is indeed basis-independent as it only makes reference to the inner product $\langle\cdot,\cdot\rangle$ and the domain of $A$, neither of which is basis-dependent.

Note that "symmetric" in your above sense and "self-adjoint" in the broader sense are connected by the Hellinger-Toeplitz theorem, which says that a symmetric operator defined on the full Hilbert space is bounded, and hence self-adjoint. This in turn means that what physicists mean by "self-adjoint" or "Hermitian" is in fact your notion of "symmetric"; operators like the Hamiltonian are usually not defined over the whole Hilbert space, since they are symmetric but unbounded.
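A standard illustration of that last point (my example, not something from your question): take $P=-i\,\frac{\mathrm d}{\mathrm dx}$ on $L^2(\mathbb R)$. For smooth, rapidly decaying $\psi,\phi$, integration by parts gives $$\langle P\psi,\phi\rangle = \int \overline{(-i\psi')}\,\phi\,\mathrm dx = \int \overline{\psi}\,(-i\phi')\,\mathrm dx = \langle \psi,P\phi\rangle,$$ so $P$ is symmetric on that domain; but $P$ is unbounded, so by Hellinger-Toeplitz it cannot be extended to all of $L^2(\mathbb R)$, and indeed a generic $\psi\in L^2$ has no derivative in $L^2$.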

This has been argued to lead to a sort of mathematical incompleteness of quantum mechanics (warning: PDF, warning: philosophy); however, it doesn't impact most day-to-day applications in physics and isn't even mentioned in most references on the subject. Here's a set of lecture notes which mentions it.


Symmetric operators are usually employed when working on real vector spaces, whereas Hermitian operators are usually employed when working on complex vector spaces.

In finite dimensions, the associated matrix is symmetric in the first case ($a_{ij}=a_{ji}$ for all $i,\,j$), whereas it is equal to its conjugate transpose in the second case ($a_{ij}=\overline{a_{ji}}$ for all $i,\,j$).
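A quick numerical check of both conditions (a sketch, assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(5)

# Real symmetric: a_ij = a_ji.
R = rng.standard_normal((3, 3))
S = R + R.T
print(np.allclose(S, S.T))  # True

# Complex Hermitian: a_ij = conj(a_ji), i.e. equal to its conjugate transpose.
C = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = C + C.conj().T
print(np.allclose(H, H.conj().T))  # True

# A consequence: the diagonal of a Hermitian matrix is real.
print(np.allclose(H.diagonal().imag, 0))  # True
```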

In both cases, the property (of being symmetric / Hermitian) is independent of the choice of basis, but it does depend on the choice of inner product: the scalar product in the first case, the Hermitian product in the second.