Expected determinant of a random NxN matrix

If $N \ge 2$, then the expected value is $0$ since interchanging two rows preserves the distribution but negates the determinant.
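
For what it's worth, this is easy to confirm by brute force for tiny $N$ (a minimal sketch, assuming NumPy is available):

```python
# Average det(A) over all 2^(N*N) matrices with independent 0/1 entries.
# The average is 0 for every N >= 2, matching the symmetry argument above.
from itertools import product
import numpy as np

for N in (2, 3):
    dets = [np.linalg.det(np.array(bits, float).reshape(N, N))
            for bits in product((0, 1), repeat=N * N)]
    print(N, sum(dets) / len(dets))   # essentially 0, up to rounding error
```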


As everyone above has pointed out, the expected value is $0$.

I expect that the original poster might have wanted to know how big the determinant typically is. A good way to approach this is to compute the root mean square $\sqrt{E\left((\det A)^2\right)}$, so that there is no cancellation. Here $A$ is an $n \times n$ matrix whose entries are independent, each equal to $0$ or $1$ with probability $1/2$.

Now, $E\left((\det A)^2\right)$ is the sum over all pairs $v$ and $w$ of permutations in $S_n$ of $$(-1)^{\ell(v) + \ell(w)} (1/2)^{2n-\# \{ i : v(i) = w(i) \}},$$ since $E\left(a_{i,v(i)} a_{i,w(i)}\right)$ is $1/2$ when $v(i) = w(i)$ and $1/4$ otherwise.

Group together pairs $(v,w)$ according to $u := w^{-1} v$. For each $u$ there are $n!$ such pairs, the sign satisfies $(-1)^{\ell(v) + \ell(w)} = (-1)^{\ell(u)}$, and $v(i) = w(i)$ exactly when $u(i) = i$, so we want to compute $$(n!) \sum_{u \in S_n} (-1)^{\ell(u)} (1/2)^{2n-\# (\text{fixed points of }u)}$$

This is $(n!)^2/2^{2n}$ times the coefficient of $x^n$ in $$e^{2x-x^2/2+x^3/3 - x^4/4 + \cdots} = e^x (1+x),$$ by the exponential formula: in the exponent, each fixed point contributes $2x$ and each $k$-cycle with $k \ge 2$ contributes $(-1)^{k-1} x^k/k$.

The coefficient of $x^n$ in $e^x(1+x)$ is $1/n! + 1/(n-1)!$, so $\sqrt{E\left((\det A)^2\right)}$ is $$\sqrt{\frac{(n!)^2}{2^{2n}} \left(\frac{1}{n!} + \frac{1}{(n-1)!} \right)} = \frac{\sqrt{(n+1)!}}{2^n}.$$
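
Both ends of this computation are easy to sanity-check by brute force for small $n$; here is a sketch, assuming NumPy and entries that are $0$ or $1$ with probability $1/2$ as above:

```python
# (1) Check that sum over u in S_n of sign(u) * 2^{#fixed points of u} = n + 1,
#     i.e. n! times the coefficient of x^n in e^x (1 + x).
# (2) Check E[(det A)^2] = (n+1)!/4^n by enumerating all 0-1 matrices,
#     so that sqrt(E[(det A)^2]) = sqrt((n+1)!)/2^n.
from itertools import permutations, product
from math import factorial, sqrt
import numpy as np

def sign(p):
    # Sign of a permutation in one-line notation, via its inversion count.
    inv = sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))
    return -1 if inv % 2 else 1

for n in (2, 3, 4):
    perm_sum = sum(sign(p) * 2 ** sum(p[i] == i for i in range(n))
                   for p in permutations(range(n)))
    print(n, perm_sum, n + 1)                                      # (1): equal

for n in (2, 3):
    m2 = np.mean([np.linalg.det(np.array(bits, float).reshape(n, n)) ** 2
                  for bits in product((0, 1), repeat=n * n)])
    print(n, sqrt(m2), sqrt(factorial(n + 1)) / 2 ** n)            # (2): equal
```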


It is a little more convenient to work with random $(-1,+1)$ matrices. A little bit of Gaussian elimination shows that the determinant of a random $n \times n$ $(-1,+1)$ matrix has the same distribution, up to sign, as $2^{n-1}$ times the determinant of a random $(n-1) \times (n-1)$ $(0,1)$ matrix. (Note, for instance, that Turán's calculation of the second moment ${\bf E} \det(A_n)^2$ is simpler for $(-1,+1)$ matrices than for $(0,1)$ matrices: it is just $n!$. It is also clearer why the determinant is distributed symmetrically around the origin.)
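
Both observations can be checked exhaustively for small $n$ (a sketch assuming NumPy; the relation to $(0,1)$ matrices is read here as an equality of the distributions of $|\det|$):

```python
# (a) E[det(A_n)^2] = n! for a random n x n (-1,+1) matrix A_n.
# (b) The multiset of |det| values over all n x n (-1,+1) matrices equals
#     2^(n-1) times the |det| values over all (n-1) x (n-1) (0,1) matrices,
#     each repeated 2^(2n-1) times (the number of ways to choose the signs
#     of the first row and column).
from itertools import product
from math import factorial
import numpy as np

def all_dets(entries, n):
    # Determinant of every n x n matrix with entries drawn from `entries`.
    return [round(np.linalg.det(np.array(bits, float).reshape(n, n)))
            for bits in product(entries, repeat=n * n)]

for n in (2, 3):
    pm = all_dets((-1, 1), n)        # 2^(n^2) sign matrices
    zo = all_dets((0, 1), n - 1)     # 2^((n-1)^2) zero-one matrices
    print(n, sum(d * d for d in pm) / len(pm), factorial(n))        # (a): equal
    lhs = sorted(abs(d) for d in pm)
    rhs = sorted(2 ** (n - 1) * abs(d) for d in zo
                 for _ in range(2 ** (2 * n - 1)))
    print(n, lhs == rhs)                                            # (b): True
```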

The logarithm $\log |\det(A_n)|$ of the determinant of a random $n \times n$ $(-1,+1)$ matrix is known to be $\log \sqrt{n!} + O( \sqrt{n \log n} )$ with probability $1-o(1)$; see this paper of Vu and myself. A more precise result should be that the logarithm is asymptotically normally distributed with mean $\log \sqrt{(n-1)!}$ and variance $2 \log n$. This result was claimed by Girko; the proof is unfortunately not quite complete, but the result is still likely to be true.
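
A quick Monte Carlo sketch of the first statement (assuming NumPy; the matrix size `n = 200` and the number of samples below are arbitrary choices):

```python
# Sample random n x n (-1,+1) matrices and look at the normalized error
# (log|det(A_n)| - log sqrt(n!)) / sqrt(n log n), which the quoted bound
# says is O(1) with probability 1 - o(1); empirically it is quite small.
from math import lgamma, log, sqrt
import numpy as np

rng = np.random.default_rng(0)
n, trials = 200, 1000
errors = []
for _ in range(trials):
    A = rng.choice([-1.0, 1.0], size=(n, n))
    s, logabsdet = np.linalg.slogdet(A)       # log|det| without overflow
    if s != 0:                                # skip the (rare) singular samples
        errors.append((logabsdet - 0.5 * lgamma(n + 1)) / sqrt(n * log(n)))

print(np.mean(errors), np.max(np.abs(errors)))   # both small
```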