Is a sinc-distance matrix positive semidefinite?

No, the matrix is not necessarily positive semidefinite. For any $k$, there is a regular simplex of $k+1$ points in $\mathbb R^k$ all at distance $\frac{3\pi}2$ from each other. The corresponding matrix has $1$ on the diagonal and $\sin(3\pi/2)/(3\pi/2)=-2/(3\pi)$ elsewhere, and its eigenvalues are $1+2/(3\pi)$ (with multiplicity $k$) and $1-2k/(3\pi)$ (with multiplicity $1$), so for $k\gt\frac{3\pi}2$, i.e. $k\ge5$, the matrix is indefinite.
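
For readers who want to see this numerically, here is a small NumPy sketch (an illustration only, not part of the argument) of the simplex matrix above for several values of $k$:

```python
import numpy as np

# Sinc value at the common pairwise distance 3*pi/2: sin(3*pi/2)/(3*pi/2) = -2/(3*pi).
off = np.sin(1.5 * np.pi) / (1.5 * np.pi)

for k in range(2, 8):
    # Matrix for the k+1 simplex vertices in R^k, all at mutual distance 3*pi/2:
    # ones on the diagonal, -2/(3*pi) everywhere else.
    S = np.full((k + 1, k + 1), off)
    np.fill_diagonal(S, 1.0)
    # Smallest eigenvalue is 1 - 2k/(3*pi), which turns negative once k >= 5.
    print(k, np.linalg.eigvalsh(S).min())
```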


In the arXiv paper "Schoenberg matrices of radial positive definite functions and Riesz sequences in $L^2(\mathbb{R}^n)$" by L. Golinskii, M. Malamud and L. Oridoroga, Definition 1.1 states

Let $n \in \mathbb{N}$. A real-valued and continuous function $f$ on $\mathbb{R}_+ = [0,\infty)$ is called a radial positive definite function if, for an arbitrary finite set $\{x_1,\ldots,x_m\}$, $x_k \in \mathbb{R}^n$, and $(\xi_1,\ldots,\xi_m) \in \mathbb{C}^m$, $$\sum_{k,j = 1}^{m}f(|x_k-x_j|)\xi_k\overline{\xi}_j \ge 0.$$

Note that this definition is equivalent to saying that the matrix $S \in \mathbb{R}^{m \times m}$ defined by $S(k,j) = f(\|x_k-x_j\|)$ is positive semidefinite for any choice of points $x_1,\ldots,x_m \in \mathbb{R}^n$.

Also, Theorem 1.2 states the following:

A function $f$ with $f(0) = 1$ belongs to $\Phi_n$ if and only if there exists a probability measure $\nu$ on $\mathbb{R}_+$ such that $$f(r) = \int_{0}^{\infty}\Omega_n(rt)\,\nu(dt), \qquad r \in \mathbb{R}_+,$$ where $$\Omega_n(s) := \Gamma(q+1)\left(\dfrac{2}{s}\right)^qJ_q(s) = \sum_{j = 0}^{\infty}\dfrac{\Gamma(q+1)}{j!\,\Gamma(j+q+1)}\left(-\dfrac{s^2}{4}\right)^j, \qquad q = \dfrac{n}{2}-1,$$ and $J_q$ is the Bessel function of the first kind of order $q$.

(In their paper $\Phi_n$ denotes the set of radial positive definite functions on $\mathbb{R}^n$.)

Theorem 1 in "Metric Spaces and Completely Monotone Functions" by I. J. Schoenberg gives the proof of this result.

By applying this theorem for $n = 3$ and $\nu = \delta_1$, you get that $f(r) = \Omega_3(r) = \dfrac{\sin r}{r}$ is a radial positive definite function in $\mathbb{R}^3$. In other words, for any choice of $m$ points $x_1,\ldots,x_m \in \mathbb{R}^3$, the matrix $S \in \mathbb{R}^{m \times m}$ defined by $S(k,j) = \dfrac{\sin(\|x_k-x_j\|)}{\|x_k-x_j\|}$ is positive semidefinite.
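
The following NumPy sketch (a numerical spot-check, not a proof) generates random point sets in $\mathbb{R}^3$ and confirms that the smallest eigenvalue of the sinc distance matrix never drops below zero beyond round-off:

```python
import numpy as np

rng = np.random.default_rng(0)

def sinc_distance_matrix(X):
    # S(k, j) = sin(||x_k - x_j||) / ||x_k - x_j||, with S(k, k) = 1.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # np.sinc(t) = sin(pi t)/(pi t), so np.sinc(D / pi) = sin(D)/D with 1 on the diagonal.
    return np.sinc(D / np.pi)

worst = 0.0
for _ in range(1000):
    X = rng.uniform(-5, 5, size=(8, 3))   # 8 random points in R^3
    lam_min = np.linalg.eigvalsh(sinc_distance_matrix(X)).min()
    worst = min(worst, lam_min)
print(worst)  # stays at roughly -1e-12 or above, consistent with positive semidefiniteness
```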


It should be clear that if you choose $m$ points $x_1,\ldots,x_m$ in $\mathbb{R}$ or $\mathbb{R}^2$, then $S$ is still guaranteed to be positive semidefinite, since such points can be embedded isometrically in $\mathbb{R}^3$. However, that is not the case if the points are chosen in $\mathbb{R}^n$ where $n \ge 4$. joriki's answer gives a nice counterexample for $n \ge 5$. Here is an ugly counterexample (found by randomly generating points in MATLAB) with $6$ points in $\mathbb{R}^4$:

$x_1 = (-0.7,0.8,-0.3,-0.4)$

$x_2 = (0.8,-0.8,-0.9,0.7)$

$x_3 = (-0.1,0.0,0.1,-0.1)$

$x_4 = (-0.8,-0.7,-0.1,0.2)$

$x_5 = (0.2,-0.2,0.5,-0.9)$

$x_6 = (0.4,0.8,0.8,0.9)$

You can check that the $6 \times 6$ sinc distance matrix $S$ for these points has an eigenvalue of $\approx -0.0024103$, and thus $S$ is not positive semidefinite.
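
If you want to reproduce the computation, here is a short NumPy sketch (the reported eigenvalue is the one quoted above):

```python
import numpy as np

# The six points in R^4 listed above.
X = np.array([
    [-0.7,  0.8, -0.3, -0.4],
    [ 0.8, -0.8, -0.9,  0.7],
    [-0.1,  0.0,  0.1, -0.1],
    [-0.8, -0.7, -0.1,  0.2],
    [ 0.2, -0.2,  0.5, -0.9],
    [ 0.4,  0.8,  0.8,  0.9],
])

D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
S = np.sinc(D / np.pi)          # sin(D)/D, with 1 on the diagonal
print(np.linalg.eigvalsh(S))    # smallest eigenvalue approx -0.0024103, as reported above
```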


Here is another approach (still incomplete).

For $x_{1}, \ldots, x_{N} \in \mathbb{R}^{d}$, let $\lVert\cdot\rVert_{d}$ denote the Euclidean norm on $\mathbb{R}^{d}$. We have $$S(i,j) = \frac{\sin\left(\lVert x_{i} - x_{j} \rVert_{d}\right)}{\lVert x_{i} - x_{j} \rVert_{d}} = \prod_{k=1}^{\infty}\cos\left(\frac{\lVert x_{i} - x_{j} \rVert_{d}}{2^{k}}\right),$$ where the last equality is Viète's product formula (see eq. (18) here). By the Schur product theorem (together with the fact that a limit of positive semidefinite matrices is positive semidefinite), the OP's question would follow if, for every fixed $k \in \mathbb{N}$, the matrix $$C_{k}(i,j) = \cos\left(\frac{\lVert x_{i} - x_{j} \rVert_{d}}{2^{k}}\right)$$ were positive semidefinite. That, in turn, is the same as asking whether $\cos(\cdot)$ is a radial positive definite function on $\mathbb{R}^d$. It is easy to verify that the answer is yes for $d = 1$: $\cos(|x-y|) = \cos x\cos y + \sin x\sin y$, so the matrix is a Gram matrix. Some time back I remember seeing that for higher $d$ the answer is generally no, as joriki pointed out, following integral-transform results due to I. J. Schoenberg, but I cannot find the exact reference for the cosine result now. Somebody more familiar with the radial basis function and approximation theory literature may be able to pinpoint the result.
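
For what it's worth, here is an elementary numerical check (my own example, not from the references I am trying to recall) that $\cos(\cdot)$ already fails to be a radial positive definite function for $d = 2$, so the factor-by-factor argument cannot work beyond $d = 1$: three vertices of an equilateral triangle of side $\pi$ give a cosine matrix with a negative eigenvalue.

```python
import numpy as np

# Equilateral triangle of side length pi in R^2.
X = np.array([
    [0.0, 0.0],
    [np.pi, 0.0],
    [np.pi / 2, np.pi * np.sqrt(3) / 2],
])

D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
C = np.cos(D)                   # 1 on the diagonal, cos(pi) = -1 elsewhere
print(np.linalg.eigvalsh(C))    # eigenvalues 2, 2, -1: C is not positive semidefinite
```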