On the existence of a symmetric positive definite matrix

Extend $u_1=\frac{u}{\|u\|}$ to an orthonormal basis $\{u_1,u_2,\ldots,u_n\}$ of $\mathbb R^n$. Let $a=\langle v,u_1\rangle$ and $$ V=\pmatrix{\frac{1}{\sqrt{a}}v&u_2&\cdots&u_n}. $$ Since $\langle v,u_1\rangle>0$, $v$ does not lie in $\operatorname{span}\{u_2,\ldots,u_n\}$, so the columns of $V$ are linearly independent. Thus $V$ is nonsingular and $$ Q=\frac{1}{\|u\|}VV^T $$ is symmetric positive definite. Also, since $V^Tu_1=\left(\frac{1}{\sqrt a}\langle v,u_1\rangle,0,\ldots,0\right)^T=\sqrt a\,e_1$ where $e_1=(1,0,\ldots,0)^T$, we get $Qu=VV^Tu_1=V(\sqrt{a}\,e_1)=v$.
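As a quick numerical sanity check, here is a small sketch of this construction in $\mathbb R^2$; the vectors `u`, `v` below are hypothetical examples satisfying $\langle u,v\rangle>0$:

```python
import math

u = [3.0, 4.0]
v = [1.0, 2.0]          # <u, v> = 11 > 0, as required

norm_u = math.hypot(*u)
u1 = [u[0] / norm_u, u[1] / norm_u]      # u1 = u / ||u||
u2 = [-u1[1], u1[0]]                     # completes an orthonormal basis of R^2
a = v[0] * u1[0] + v[1] * u1[1]          # a = <v, u1> > 0

# Columns of V: v / sqrt(a) and u2
V = [[v[0] / math.sqrt(a), u2[0]],
     [v[1] / math.sqrt(a), u2[1]]]

# Q = (1/||u||) V V^T  -- entry (i, j) is the dot product of rows i and j of V
Q = [[sum(V[i][k] * V[j][k] for k in range(2)) / norm_u for j in range(2)]
     for i in range(2)]

Qu = [Q[0][0] * u[0] + Q[0][1] * u[1],
      Q[1][0] * u[0] + Q[1][1] * u[1]]
print(Qu)               # ≈ v
```

For a $2\times2$ symmetric matrix, positive trace and positive determinant already certify positive definiteness, which is easy to check on the printed `Q` as well.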


Not a full answer, but it might help. A real symmetric matrix is positive definite precisely when all of its eigenvalues are positive. Consider the vector $$ v = \begin{bmatrix} a \\ b \end{bmatrix} $$ where $a>0$ and $b>0$, and the vector $$u = \begin{bmatrix} c \\ d\end{bmatrix}$$ where $c > 0$ and $d > 0$. Then by your equation

$$\begin{bmatrix} a \\ b \end{bmatrix} = Q \begin{bmatrix} c \\ d\end{bmatrix}$$

The eigenvalues must all be positive, so construct the diagonal matrix $$ Q = \begin{bmatrix} \frac{a}{c} & 0 \\ 0 &\frac{b}{d} \end{bmatrix} $$

Then the eigenvalues of $Q$ are $\frac{a}{c}$ and $\frac{b}{d}$, which are always positive, so a positive definite matrix exists. It is easy to check that any vectors $v$ and $u$ whose entries are all $>0$ admit such a diagonal positive definite $Q$. Note that this does not prove that diagonal matrices are the only valid answers, but it is an easy way to form $Q$.
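A minimal numerical sketch of this diagonal construction, with hypothetical positive entries:

```python
# Hypothetical positive vectors: v = (a, b), u = (c, d)
a, b = 2.0, 5.0
c, d = 4.0, 1.0

# Diagonal Q with positive eigenvalues a/c and b/d on the diagonal
Q = [[a / c, 0.0],
     [0.0, b / d]]

Qu = [Q[0][0] * c + Q[0][1] * d,
      Q[1][0] * c + Q[1][1] * d]
print(Qu)  # [2.0, 5.0], i.e. exactly v
```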


A more tedious approach:

If $u,v$ are collinear then $Q={\|v\| \over \|u\|} I$ will suffice (the condition $\langle u,v\rangle>0$ forces $v={\|v\| \over \|u\|}u$ in this case), so we can suppose that $u,v$ are linearly independent.

To start with, suppose $u,v \in \mathbb{R}^2$, let $e=(1,1)^T, f=(1,-1)^T$ and choose the (unique) rotation $R$ such that $R( \hat{u}+\hat{v} ) = \lambda e$ for some $\lambda >0$, where $\hat{u}={u \over \|u\|}, \hat{v}={v \over \|v\|}$.

Let $u'=R\hat{u}, v'=R\hat{v}$. Note that $u',v'$ are unit vectors, $u'+v'$ is a positive multiple of $e$ and $\langle u', v' \rangle >0$.

I claim that $u',v' >0$ (that is, all components strictly positive).

Since $u',v'$ are unit vectors, $u'-v'$ is orthogonal to $u'+v'$, so we can write $u'=\lambda e + t f, v'=\lambda e - t f$, with $\lambda >0$. Since $\langle u', v' \rangle = 2(\lambda^2 - t^2) >0$ we have $\lambda > |t|$. Hence $u'=(\lambda+t, \lambda-t)^T >0$ and similarly $v'=(\lambda-t, \lambda+t)^T>0$.

Now let $D = {\| v \| \over \|u\|}\operatorname{diag}({v'_1 \over u'_1}, {v'_2 \over u'_2})$. Then $DRu = \|u\| D u' = \|v\| v'$, hence $R^TDRu = \|v\| R^Tv' = \|v\|\hat v = v$, and $R^TDR$ is symmetric positive definite because the diagonal entries of $D$ are positive.
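The 2-D construction above can be sketched numerically as follows; `u`, `v` are hypothetical examples with $\langle u,v\rangle>0$ and linearly independent:

```python
import math

def rot(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matvec(M, x):
    return [M[0][0] * x[0] + M[0][1] * x[1],
            M[1][0] * x[0] + M[1][1] * x[1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Hypothetical example: <u, v> > 0 and u, v linearly independent
u = [3.0, 1.0]
v = [1.0, 2.0]

uh = [x / math.hypot(*u) for x in u]          # u-hat
vh = [x / math.hypot(*v) for x in v]          # v-hat
w = [uh[0] + vh[0], uh[1] + vh[1]]            # u-hat + v-hat

# Rotation R taking w to a positive multiple of e = (1, 1)
theta = math.atan2(1.0, 1.0) - math.atan2(w[1], w[0])
R = rot(theta)

up, vp = matvec(R, uh), matvec(R, vh)
assert min(up) > 0 and min(vp) > 0            # the claim: u', v' > 0

s = math.hypot(*v) / math.hypot(*u)
D = [[s * vp[0] / up[0], 0.0],
     [0.0, s * vp[1] / up[1]]]
Rt = [[R[0][0], R[1][0]], [R[0][1], R[1][1]]]
Q = matmul(Rt, matmul(D, R))                  # symmetric positive definite

Qu = matvec(Q, u)
print(Qu)                                     # ≈ v
```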

Now suppose $u,v \in \mathbb{R}^n$ with $n >2$.

Extend $u,v$ with $b_3,\ldots,b_n$ so as to get a basis for $\mathbb{R}^n$, and then use Gram–Schmidt to orthonormalise it. Let $U$ be the matrix whose columns are this orthonormal basis; note that $U^Tu=\|u\|e_1$ and that $U^Tu,U^Tv$ span $\operatorname{sp}\{ e_1 , e_2 \}$.

Apply the above technique to the top two components of $U^Tu,U^Tv$ (the other components are zero) to get the $R^TDR$ matrix. Then form the matrix $A=\begin{bmatrix} R^TDR & 0 \\ 0 & I\end{bmatrix}$ and note that $AU^Tu = U^T v$.

Finally let $Q= U A U^T$, which is symmetric positive definite, and $Qu = UAU^Tu = U(U^Tv) = v$.
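The whole $n$-dimensional assembly can be sketched numerically as below, in $\mathbb R^3$ with hypothetical vectors. For brevity this sketch solves the $2\times 2$ subproblem with the $VV^T$ construction from the first answer above instead of the rotation method; any symmetric positive definite $M$ mapping the top two components of $U^Tu$ to those of $U^Tv$ serves equally well here.

```python
import math

def dot(x, y): return sum(a * b for a, b in zip(x, y))
def norm(x): return math.sqrt(dot(x, x))
def matvec(M, x): return [dot(row, x) for row in M]
def transpose(M): return [list(c) for c in zip(*M)]
def matmul(A, B):
    Bt = transpose(B)
    return [[dot(r, c) for c in Bt] for r in A]

def gram_schmidt(vs):
    """Orthonormalise a list of linearly independent vectors."""
    basis = []
    for w in vs:
        for q in basis:
            c = dot(w, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        n = norm(w)
        basis.append([wi / n for wi in w])
    return basis

def spd_2x2(p, q):
    """2x2 symmetric positive definite M with M p = q (needs <p, q> > 0)."""
    p1 = [x / norm(p) for x in p]
    p2 = [-p1[1], p1[0]]                                # completes an ON basis
    a = dot(q, p1)
    V = transpose([[x / math.sqrt(a) for x in q], p2])  # columns: q/sqrt(a), p2
    return [[dot(V[i], V[j]) / norm(p) for j in range(2)] for i in range(2)]

# Hypothetical example in R^3 with <u, v> = 8 > 0
u = [1.0, 2.0, 2.0]
v = [2.0, 2.0, 1.0]

B = gram_schmidt([u, v, [1.0, 0.0, 0.0]])     # rows: orthonormal basis of R^3
Ut, U = B, transpose(B)

uu, vv = matvec(Ut, u), matvec(Ut, v)         # uu = ||u|| e1; vv in span{e1, e2}
M = spd_2x2(uu[:2], vv[:2])                   # 2x2 SPD block with M uu = vv

A = [[M[0][0], M[0][1], 0.0],                 # A = blockdiag(M, I)
     [M[1][0], M[1][1], 0.0],
     [0.0, 0.0, 1.0]]
Q = matmul(U, matmul(A, Ut))                  # symmetric positive definite

Qu = matvec(Q, u)
print(Qu)                                     # ≈ v
```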