"Square root" of Beta(a,b) distribution

A partial answer is that $f_{a,b}$ exists for every positive integer $b$.

To see this, first recall that for every positive $s$ and $a$ the distribution Gamma$(s,a)$ has density proportional to $z^{s-1}e^{-az}$ on $z\ge0$ and that the sum of independent Gamma$(s,a)$ and Gamma$(t,a)$ is Gamma$(s+t,a)$.
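If one wants to see the second fact spelled out, it is the usual convolution/Beta-function computation: for independent $U\sim$ Gamma$(s,a)$ and $V\sim$ Gamma$(t,a)$, the density of $U+V$ at $z>0$ is

$$\int_0^z \frac{a^s x^{s-1}e^{-ax}}{\Gamma(s)}\cdot\frac{a^t (z-x)^{t-1}e^{-a(z-x)}}{\Gamma(t)}\,dx=\frac{a^{s+t}e^{-az}}{\Gamma(s)\Gamma(t)}\,z^{s+t-1}B(s,t)=\frac{a^{s+t}}{\Gamma(s+t)}\,z^{s+t-1}e^{-az},$$

which is the Gamma$(s+t,a)$ density (substitute $x=zv$ in the middle integral and use $B(s,t)=\Gamma(s)\Gamma(t)/\Gamma(s+t)$).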

If $b=1$, choose $X=e^{-Z}$ and $Y=e^{-T}$ where $Z$ and $T$ are i.i.d. Gamma$(1/2,a)$. Then $X$ and $Y$ are i.i.d. and $XY=e^{-(Z+T)}$ where $Z+T$ is Gamma$(1,a)$, that is, exponential with parameter $a$, hence $XY$ is Beta$(a,1)$ and you are done if $b=1$.
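For completeness, here is the one-line computation behind this step: if $W$ is exponential with parameter $a$, then

$$P\big(e^{-W}\le x\big)=P\big(W\ge-\log x\big)=e^{a\log x}=x^{a},\qquad 0<x\le1,$$

which is the Beta$(a,1)$ cumulative distribution function (density $a x^{a-1}$ on $(0,1)$).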

From there, recall that the product of two independent Beta$(a,c)$ and Beta$(a+c,b-c)$ random variables is Beta$(a,b)$ for every $c$ in $(0,b)$. Assume that $b$ is a positive integer and choose $X=e^{-Z_1-Z_2-\cdots-Z_b}$ and $Y=e^{-T_1-T_2-\cdots-T_b}$ where the $2b$ variables $Z_1,\dots,Z_b,T_1,\dots,T_b$ are independent and, for each $k$, $Z_k$ and $T_k$ are both Gamma$(1/2,a+k-1)$. Then, by the $b=1$ case, each $e^{-(Z_k+T_k)}$ is Beta$(a+k-1,1)$, so $XY$ is the product of independent Beta$(a,1)$, Beta$(a+1,1)$, ..., Beta$(a+b-1,1)$ random variables; iterating the product identity above with $c=1$ shows that this product is Beta$(a,b)$, and you are done for every positive integer $b$.
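If it helps, here is a quick Monte Carlo sanity check of the construction (a Python sketch only; the values of $a$, $b$, the seed and the sample size are arbitrary choices and not part of the argument):

```python
# Monte Carlo check of the integer-b construction above (illustration only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b, n = 2.5, 3, 200_000          # b must be a positive integer here

# Z_k, T_k ~ Gamma(1/2, a+k-1) in the rate parametrization used above;
# numpy's gamma takes shape and scale, with scale = 1/rate.
Z = np.column_stack([rng.gamma(0.5, 1.0 / (a + k), size=n) for k in range(b)])
T = np.column_stack([rng.gamma(0.5, 1.0 / (a + k), size=n) for k in range(b)])

X = np.exp(-Z.sum(axis=1))         # X and Y are i.i.d. by construction
Y = np.exp(-T.sum(axis=1))

# The product XY should be indistinguishable from a Beta(a, b) sample.
print(stats.kstest(X * Y, stats.beta(a, b).cdf))
```

The Kolmogorov–Smirnov test should not reject the Beta$(a,b)$ hypothesis, up to Monte Carlo error.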


If you're only interested in the existence of a solution, Jeff's suggestion of looking at the logarithms seems to be the correct approach. Let $Z$ have the Beta$(a,b)$ distribution. Suppose that there is a solution to the "square root" problem in this case, and let $X$ be a random variable with this distribution. Then, if $\phi(u) = E[e^{i u \log(Z)}] = E[Z^{i u}]$ and $\psi(u) = E[X^{i u}]$ are the characteristic functions (or Fourier transforms) of $\log(Z)$ and $\log(X)$ respectively, it must be the case that $\psi(u)^2 = \phi(u)$ for all $u\in\mathbb{R}$.
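Indeed, writing $Y$ for an independent copy of $X$, so that $XY$ has the Beta$(a,b)$ distribution,

$$\phi(u)=E\big[(XY)^{iu}\big]=E\big[X^{iu}\big]\,E\big[Y^{iu}\big]=\psi(u)^2.$$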

Because we know a lot about Beta distributions, we can actually compute explicitly that $\phi(u) = \frac{\Gamma(a+b)}{\Gamma(a)} \frac{\Gamma(a+i u)}{\Gamma(a+b+i u)}$. Then we can prove the existence of a solution by defining $\psi(u) = \sqrt{\phi(u)}$ and then letting $X$ be a random variable such that $\log(X)$ has characteristic function $\psi(u)$.
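For the record, the computation of $\phi$ is the standard Mellin transform of the Beta density:

$$\phi(u)=E\big[Z^{iu}\big]=\frac{1}{B(a,b)}\int_0^1 z^{a+iu-1}(1-z)^{b-1}\,dz=\frac{B(a+iu,b)}{B(a,b)}=\frac{\Gamma(a+b)}{\Gamma(a)}\,\frac{\Gamma(a+iu)}{\Gamma(a+b+iu)}.$$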

There are a couple of issues that need to be smoothed over. First of all, $\phi(u)$ is a complex-valued function, so we need to be careful with the square root. However, since $\phi(u)\neq 0$ for all $u\in\mathbb{R}$, we can take the square root in a continuous manner (normalized so that $\psi(0)=1$). The bigger issue is that we need to be able to prove that $\psi(u)$ is the characteristic function of some probability distribution. For this we need some Fourier inversion theorems.

Theorem 14 in A Modern Approach to Probability Theory by Fristedt and Gray gives sufficient conditions for a function to be the characteristic function of a real-valued random variable. There are several conditions to check. First, that $\psi(0)=1$ and $\psi$ is continuous. These are obviously satisfied. Next, that $\psi(u)$ is "positive definite", that is, $\sum_{k=1}^n \sum_{j=1}^n \psi(u_k-u_j)z_j\bar{z}_k \geq 0$ for every complex $n$-tuple $(z_1,\ldots, z_n)$ and real $n$-tuple $(u_1,\ldots, u_n)$. I don't know how to check whether this is true. The final condition is that $\int| \psi(u)|\, du < \infty$. I'm not an expert in the $\Gamma$ function, but the standard asymptotics $|\Gamma(a+iu)/\Gamma(a+b+iu)|\sim |u|^{-b}$ (which a quick Mathematica plot confirms) give $|\phi(u)|\sim C\,|u|^{-b}$, hence $|\psi(u)|=|\phi(u)|^{1/2}$ decays roughly like $|u|^{-b/2}$ as $|u|\rightarrow\infty$, so that $\int |\psi(u)|\, du < \infty$ if $b>2$.
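For anyone who wants to reproduce the decay check without Mathematica, here is a rough numerical estimate of the tail exponent (a Python sketch; the values of $a$, $b$ and the grid of $u$'s are arbitrary):

```python
# Estimate the decay exponent of |psi(u)| = |phi(u)|^{1/2} (illustration only).
import numpy as np
from scipy.special import loggamma   # accepts complex arguments

def log_abs_phi(u, a, b):
    # log|phi(u)| with phi(u) = Gamma(a+b) Gamma(a+iu) / (Gamma(a) Gamma(a+b+iu))
    return (loggamma(a + b) - loggamma(a)
            + loggamma(a + 1j * u) - loggamma(a + b + 1j * u)).real

a, b = 2.5, 3.0
u = np.array([1e2, 1e3, 1e4, 1e5])
log_abs_psi = 0.5 * log_abs_phi(u, a, b)

# If |psi(u)| ~ |u|^{-p}, these slopes should be close to p (here, close to b/2).
print(-np.diff(log_abs_psi) / np.diff(np.log(u)))
```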

I still can't think of how to check the "positive definite" condition, but if one could check it, the problem would be solved for $b>2$.


Seeing this late, let me give an answer of which everybody is now surely aware. The answer is yes for all $a, b > 0$. The reason is that $\log\operatorname{Beta}(a,b)$ is infinitely divisible additively, so that $\operatorname{Beta}(a,b)$ is infinitely divisible multiplicatively. To see the infinite divisibility of the $\log$, compute the Mellin transform with the Gamma function (as suggested above) and apply the Malmsten formula for the latter, which is in this case a Lévy–Khintchine formula.
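To spell this out a little (only a sketch of the computation, with $\phi(u)=E[Z^{iu}]$ as in the previous answer): Malmsten's formula

$$\log\Gamma(z)=\int_0^\infty\Big[(z-1)e^{-t}-\frac{e^{-t}-e^{-zt}}{1-e^{-t}}\Big]\frac{dt}{t},\qquad \operatorname{Re} z>0,$$

applied to $\log\phi(u)=\log\Gamma(a+iu)-\log\Gamma(a)-\log\Gamma(a+b+iu)+\log\Gamma(a+b)$ gives

$$\log\phi(u)=\int_0^\infty\big(e^{-iut}-1\big)\,\frac{e^{-at}-e^{-(a+b)t}}{t\,(1-e^{-t})}\,dt.$$

The measure $\nu(dt)=\frac{e^{-at}-e^{-(a+b)t}}{t(1-e^{-t})}\,dt$ on $(0,\infty)$ is nonnegative and satisfies $\int_0^\infty\min(1,t)\,\nu(dt)<\infty$, so this is the Lévy–Khintchine representation of an infinitely divisible law: $-\log Z$ is infinitely divisible (a driftless subordinator-type law with Lévy measure $\nu$), hence $\phi^{t}$ is a characteristic function for every $t>0$, and $t=1/2$ settles the positive-definiteness question raised in the previous answer.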