If a matrix $A \in \mathbb{R}^{N\times N}$ is both row and column diagonally dominant, will it satisfy $(x^{2p-1})^T A x \geq 0, p \geq 1$?

The inequality in question is a direct consequence of the first part of the theorem below if we put $y=x^{2p-1}$ (the power taken entrywise). For convenience, we call a matrix $A\in M_n(\mathbb R)$ doubly dominant if it has a nonnegative diagonal and is diagonally dominant in every row and every column, and we call it perfectly dominant if $a_{kk}=\sum_{j\ne k}|a_{kj}|=\sum_{i\ne k}|a_{ik}|$ for each $k$.
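For concreteness, both definitions are easy to check numerically. The following sketch (the function names are mine, not from the text) tests double and perfect dominance with NumPy:

```python
import numpy as np

def is_doubly_dominant(A, tol=1e-12):
    """Nonnegative diagonal, diagonally dominant in every row and column."""
    A = np.asarray(A, dtype=float)
    d = np.diag(A)
    off = np.abs(A) - np.diag(np.abs(d))      # |A| with the diagonal zeroed out
    return bool(np.all(d >= -tol)
                and np.all(d >= off.sum(axis=1) - tol)    # row dominance
                and np.all(d >= off.sum(axis=0) - tol))   # column dominance

def is_perfectly_dominant(A, tol=1e-12):
    """a_kk equals the off-diagonal absolute row sum and column sum for every k."""
    A = np.asarray(A, dtype=float)
    d = np.diag(A)
    off = np.abs(A) - np.diag(np.abs(d))
    return bool(np.all(np.abs(d - off.sum(axis=1)) <= tol)
                and np.all(np.abs(d - off.sum(axis=0)) <= tol))
```

For instance, $\begin{pmatrix}2&-1\\-1&2\end{pmatrix}$ is doubly but not perfectly dominant, while $\begin{pmatrix}1&-1\\-1&1\end{pmatrix}$ is perfectly dominant.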

Theorem. Let $A\in M_n(\mathbb R)$ be doubly dominant and let $x,y\in\mathbb R^n$. Then $y^TAx\ge0$ when

  1. $|y_{\sigma(1)}|\ge\cdots\ge|y_{\sigma(n)}|$ and $|x_{\sigma(1)}|\ge\cdots\ge|x_{\sigma(n)}|$ for some permutation $\sigma$, and
  2. $y_ix_i\ge0$ for each $i$.

If, in addition, $A$ is perfectly dominant and all of its off-diagonal entries are non-positive, then $y^TAx$ is also nonnegative when $y_{\rho(1)}\ge\cdots\ge y_{\rho(n)}$ and $x_{\rho(1)}\ge\cdots\ge x_{\rho(n)}$ for some permutation $\rho$.
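With $y=x^{2p-1}$ taken entrywise, $|y_i|=|x_i|^{2p-1}$ is monotone in $|x_i|$, so any $\sigma$ sorting $|x|$ also sorts $|y|$, and $y_ix_i=x_i^{2p}\ge0$; both conditions hold automatically. A randomized sanity check of the first part (a sketch, not part of the proof; the way random doubly dominant matrices are generated here is my own choice):

```python
import numpy as np

def min_quadratic(trials=1000, seed=0):
    """Smallest value of (x^(2p-1))^T A x over random doubly dominant A."""
    rng = np.random.default_rng(seed)
    worst = np.inf
    for _ in range(trials):
        n = int(rng.integers(2, 7))
        A = rng.uniform(-1, 1, (n, n))
        np.fill_diagonal(A, 0.0)
        off = np.abs(A)
        # a_ii = max(row off-sum, column off-sum) forces double dominance.
        np.fill_diagonal(A, np.maximum(off.sum(axis=1), off.sum(axis=0)))
        x = rng.standard_normal(n)
        for p in (1, 2, 3):
            y = x ** (2 * p - 1)       # entrywise odd power
            worst = min(worst, y @ A @ x)
    return worst
```

Every trial should come out nonnegative (up to rounding), consistent with the theorem.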

Proof. Given any doubly dominant $A$, we may define a directed graph $G$ without self-loops such that for every $i\ne j$, there is an edge from node $i$ to node $j$ if and only if $a_{ij}\ne0$. Note that the structure of $G$ depends solely on the off-diagonal entries of $A$: we do not use the diagonal entries to construct any self-loop, even when $a_{ii}\ne0$.
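As a sketch of this construction (the helper name is mine), the edge set of $G$ can be read directly off the off-diagonal entries:

```python
import numpy as np

def offdiag_graph(A):
    """Edges i -> j of G: one per nonzero off-diagonal entry; no self-loops."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    return {(i, j) for i in range(n) for j in range(n)
            if i != j and A[i, j] != 0}
```

For example, `offdiag_graph([[5, 1, 0], [0, 3, -2], [0, 0, 4]])` yields `{(0, 1), (1, 2)}`; the diagonal never contributes an edge.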

Every doubly dominant matrix $A$ can be written in the form of $D+\sum_{k=1}^mA_k$, where $D$ is a nonnegative diagonal matrix and each $A_k$ is a doubly dominant matrix whose graph is either a cycle or an acyclic path. This can be done recursively.

First, suppose $G$ contains some cycle $C$. Without loss of generality, assume that $C$ is $1\to2\to\cdots\to L\to1$. Let $m=\min\{|a_{12}|,\,|a_{23}|,\ldots,|a_{L-1,L}|,\,|a_{L1}|\}$ and let $B$ be the matrix whose only nonzero off-diagonal entries are $b_{ij}=m\operatorname{sign}(a_{ij})$ for each edge $i\to j$ in $C$ and whose only nonzero diagonal entries are $b_{11}=\cdots=b_{LL}=m$. Then $B$ is perfectly dominant and every nonzero off-diagonal entry of $B$ has the same sign as its counterpart in $A$. Therefore $A-B$ is doubly dominant, but it has fewer nonzero entries than $A$. So, if we replace $A$ by $A-B$ and continue in this manner, we will eventually reduce $A$ to a doubly dominant matrix whose graph is acyclic.

Now suppose $G$ is acyclic. Consider a path $P$ in $G$ of maximum length. Without loss of generality, assume that $P$ is $1\to2\to\cdots\to L$. Then we must have $a_{Lj}=0$ for all $j<L$ (otherwise $L\to j\to\cdots\to L$ is a cycle), $a_{Lj}=0$ for all $j>L$ (otherwise $1\to \cdots\to L\to j$ is a longer path than $P$) and $a_{i1}=0$ for all $i>1$ (otherwise $i\to1\to\cdots\to L$ is a longer path than $P$). In other words, all off-diagonal entries on the first column and the $L$-th row of $A$ are zero.

Similar to the way we remove cycles from $A$, let $m=\min\{|a_{12}|,\,|a_{23}|,\ldots,|a_{L-1,L}|\}$ and $B$ be the matrix whose only nonzero off-diagonal entries are $b_{ij}=m\operatorname{sign}(a_{ij})$ for each edge $i\to j$ in $P$ and whose only nonzero diagonal entries are $b_{11}=\cdots=b_{LL}=m$. Then $B$ is doubly dominant by construction. Since every nonzero off-diagonal entry of $B$ has the same sign as its counterpart in $A$, and all off-diagonal entries on the first column and the $L$-th row of $A$ are zero, $A-B$ is also doubly dominant. Again, as $A-B$ has fewer nonzero entries than $A$, if we replace $A$ by $A-B$ and continue in this manner, we will eventually reduce $A$ to a doubly dominant matrix whose graph is empty. Hence $A$ becomes a nonnegative diagonal matrix and our recursion stops.
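The recursion described above can be sketched in code (a rough implementation under the stated assumptions; all helper names are mine): repeatedly strip a cycle while one exists, then strip longest paths, until only a nonnegative diagonal remains.

```python
import numpy as np

def find_cycle(adj, n):
    """Return a cycle [v0, ..., v0] found by DFS, or None if the graph is acyclic."""
    color = [0] * n                     # 0 = new, 1 = on stack, 2 = finished
    stack = []
    def dfs(u):
        color[u] = 1
        stack.append(u)
        for v in adj[u]:
            if color[v] == 1:           # back edge: close the cycle
                return stack[stack.index(v):] + [v]
            if color[v] == 0:
                cyc = dfs(v)
                if cyc:
                    return cyc
        color[u] = 2
        stack.pop()
        return None
    for s in range(n):
        if color[s] == 0:
            cyc = dfs(s)
            if cyc:
                return cyc
    return None

def longest_path(adj, n):
    """Longest path in an acyclic graph, by memoized DFS."""
    memo = {}
    def best(u):
        if u not in memo:
            tails = [best(v) for v in adj[u]]
            memo[u] = [u] + (max(tails, key=len) if tails else [])
        return memo[u]
    return max((best(s) for s in range(n)), key=len)

def decompose(A):
    """Write A = D + sum(parts): D nonnegative diagonal, each part a
    doubly dominant matrix whose graph is a single cycle or path."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    parts = []
    while True:
        adj = {i: [j for j in range(n) if j != i and A[i, j] != 0]
               for i in range(n)}
        if all(not adj[i] for i in range(n)):
            break                        # graph empty: A is now the diagonal D
        nodes = find_cycle(adj, n) or longest_path(adj, n)
        edges = list(zip(nodes, nodes[1:]))
        m = min(abs(A[i, j]) for i, j in edges)
        B = np.zeros_like(A)
        for i, j in edges:
            B[i, j] = m * np.sign(A[i, j])
        for i in set(nodes):
            B[i, i] = m
        A -= B                           # at least one edge vanishes each round
        parts.append(B)
    return A, parts
```

For instance, $\begin{pmatrix}2&-1&0\\0&2&1\\-1&0&2\end{pmatrix}$ has the single cycle $1\to2\to3\to1$ and decomposes into the identity plus one cycle matrix with $m=1$.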

This shows that the $A$ in question equals $D+\sum_{k=1}^mA_k$, where $D$ is a nonnegative diagonal matrix and each $A_k$, up to reindexing, has the form $$ A_k=m\begin{pmatrix}1&s_1\\ &1&s_2\\ &&\ddots&\ddots\\ &&&\ddots&s_{L-1}\\ s_L&&&&1\end{pmatrix}\oplus0,\tag{3} $$ where $m>0$, $s_1,s_2,\ldots,s_{L-1}\in\{1,-1\}$ and $s_L\in\{0,1,-1\}$ (the graph of $A_k$ is a cycle if $s_L=\pm1$ or an acyclic path if $s_L=0$). With this reindexing, we see that $$\begin{aligned} \frac{1}{m}y^TA_kx &=\sum_{i=1}^Ly_ix_i+\sum_{\text{cyc}}s_iy_ix_{i+1}\\ &\ge\sum_{i=1}^Ly_ix_i-\sum_{\text{cyc}}|y_i||x_{i+1}|\\ &\ge\sum_{i=1}^Ly_ix_i-\sum_{\text{cyc}}|y_i||x_i|\quad\text{(by the rearrangement inequality and condition 1)}\\ &=0\quad\text{(by condition 2, since $y_ix_i=|y_i||x_i|$)}. \end{aligned}$$ Since $y^TDx=\sum_id_{ii}y_ix_i$ is also nonnegative (by condition 2), we conclude that $y^TAx\ge0$. This proves the first part of the theorem.
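The inequality for the elementary blocks $(3)$ can also be checked numerically (a sketch; `cycle_block` is my naming). Below we take $\sigma=\mathrm{id}$, i.e. both magnitude sequences are already sorted in decreasing order, with matching signs so that $y_ix_i\ge0$:

```python
import numpy as np

def cycle_block(s, m=1.0):
    """The L x L matrix of (3): m on the diagonal, m*s_i on the superdiagonal,
    m*s_L in the bottom-left corner (s_L = 0 gives the acyclic-path case)."""
    L = len(s)
    A = m * np.eye(L)
    for i in range(L - 1):
        A[i, i + 1] = m * s[i]
    A[L - 1, 0] = m * s[-1]
    return A

rng = np.random.default_rng(1)
worst = np.inf
for _ in range(2000):
    L = int(rng.integers(2, 8))
    s = rng.choice([-1.0, 1.0], size=L)
    s[-1] = rng.choice([-1.0, 0.0, 1.0])
    sgn = rng.choice([-1.0, 1.0], size=L)        # shared signs: y_i x_i >= 0
    x = sgn * np.sort(rng.random(L))[::-1]       # |x_1| >= ... >= |x_L|
    y = sgn * np.sort(rng.random(L))[::-1]       # |y_1| >= ... >= |y_L|
    worst = min(worst, y @ cycle_block(s) @ x)
```

Every value of $y^TA_kx$ encountered should be nonnegative up to rounding.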

For the second part, if $A$ is perfectly dominant and all of its off-diagonal entries are non-positive, then in the decomposition $A=D+\sum_{k=1}^mA_k$ above, the graph of each $A_k$ must be a cycle with $s_1=s_2=\cdots=s_L=-1$ in $(3)$, and $D$ must be zero. Indeed, let $X$ denote the matrix that remains after the removal of all cycles; $X$ is still perfectly dominant. If its graph were non-empty, we could assume (by reindexing if necessary) that it contains an acyclic path $1\to2\to\cdots\to L$ of maximum length, and our previous argument shows that all off-diagonal entries on the first column and the $L$-th row of $X$ are zero. But then $X$ is not perfectly dominant, a contradiction. Thus the graph of $X$ is empty once all cycles are removed, and since $X$ is perfectly dominant, $X$ must then be zero. Hence $D=0$ and each $A_k$ represents a cycle. It now follows from the rearrangement inequality that $\frac{1}{m}y^TA_kx=\sum_{i=1}^Ly_ix_i-\sum_{\text{cyc}}y_ix_{i+1}\ge0$ when $A_k$ takes the form of $(3)$ with all $s_i=-1$. Hence $y^TAx\ge0$.
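As a concrete instance (my example, not from the text): the circulant matrix $I-P$, with $P$ the cyclic shift, is perfectly dominant with non-positive off-diagonal entries, and $y^T(I-P)x\ge0$ whenever $x$ and $y$ are sorted the same way, with no sign condition on their entries:

```python
import numpy as np

n = 6
P = np.roll(np.eye(n), 1, axis=1)     # permutation: P[i, (i+1) % n] = 1
A = np.eye(n) - P                     # a single cycle with all s_i = -1 in (3)

rng = np.random.default_rng(2)
worst = np.inf
for _ in range(1000):
    x = np.sort(rng.standard_normal(n))[::-1]   # decreasing, mixed signs allowed
    y = np.sort(rng.standard_normal(n))[::-1]
    worst = min(worst, y @ A @ x)     # = sum y_i x_i - sum y_i x_{(i+1) mod n}
```

The nonnegativity of every sampled value is exactly the rearrangement inequality at work.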


The answer is yes.

Let $B = \frac{1}{2}\left(A+A^T\right)$. Then $B$ is a symmetric matrix. Also, for all $i=1,\ldots,N$, we have

$$\begin{align*}\sum\limits_{j\ne i}\left|b_{i,j}\right| &= \frac{1}{2}\sum\limits_{j\ne i}\left|a_{i,j}+a_{j,i}\right| \\ &\le \frac{1}{2}\left(\sum\limits_{j\ne i}\left|a_{i,j}\right| + \sum\limits_{j\ne i}\left|a_{j,i}\right|\right) \quad (\text{triangle inequality}) \\ &\le \frac{1}{2}\left(a_{i,i}+ a_{i,i}\right) \quad (\text{row and column dominance}) \\ &= a_{i,i} \\ &= b_{i,i}. \end{align*} $$

Hence $B$ is a real symmetric matrix that is diagonally dominant and has non-negative diagonal entries. By Gershgorin's circle theorem, every eigenvalue of $B$ is non-negative, so $B$ is positive semi-definite and $\mathbf{x}^T B\mathbf{x}\ge 0$ for all $\mathbf{x}\in \Bbb{R}^N$. Since $\mathbf{x}^T B\mathbf{x} = \frac{1}{2}\mathbf{x}^T(A+A^T)\mathbf{x} = \mathbf{x}^T A\mathbf{x}$, we have $\mathbf{x}^T A\mathbf{x}\ge 0$ for all $\mathbf{x}\in \Bbb{R}^N$.
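A quick numerical illustration of this argument (a sketch; the random construction of doubly dominant matrices is my own choice):

```python
import numpy as np

rng = np.random.default_rng(3)
for _ in range(200):
    n = 5
    A = rng.uniform(-1, 1, (n, n))
    np.fill_diagonal(A, 0.0)
    off = np.abs(A)
    # a_ii = max(row off-sum, column off-sum) forces double dominance.
    np.fill_diagonal(A, np.maximum(off.sum(axis=1), off.sum(axis=0)))
    B = (A + A.T) / 2                 # symmetric, diagonally dominant, diag >= 0
    assert np.linalg.eigvalsh(B).min() >= -1e-9   # positive semi-definite
    x = rng.standard_normal(n)
    assert abs(x @ B @ x - x @ A @ x) < 1e-9      # the quadratic forms agree
```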