Density of $X-Y$ where $X,Y$ are independent random variables with common PDF $f(x) = e^{-x}$?

If I understood you correctly, both $X$ and $Y$ follow an exponential distribution with $\lambda=1$, and you want the distribution of their difference $Z=X-Y$. Its CDF is $$P(Z\le z)=P(X-Y\le z)=:P(z),$$ which for $z\le 0$ equals $$P(z)=\int^\infty_{0}\int^{\infty}_{x-z}e^{-x}e^{-y}\,dy\,dx,$$ since the region of interest is $y\ge x-z$ (and $x-z\ge 0$ for every $x\ge 0$ when $z\le 0$). The density is the derivative of the CDF, $$p(z)=\frac{d}{dz}P(z).$$ Using the Leibniz rule, this is $$\frac{d}{dz}\int^\infty_{0}\int^\infty_{x-z}e^{-x}e^{-y} \, dy \, dx = \int^\infty_0 \frac{d}{dz}\int^\infty_{x-z}e^{-x}e^{-y}\,dy\,dx = \int^\infty_{0} e^{-x}e^{-(x-z)} \, dx=\frac{e^z}{2}.$$ Repeating the computation for $z\ge 0$, where the region of interest is $x\le y+z$ and $$P(z)=\int^\infty_0 \int^{y+z}_0 e^{-x}e^{-y} \, dx \, dy=1-\frac{e^{-z}}{2},$$ we arrive at $$p(z)=\frac{e^{-|z|}}{2}.$$

Note that this is known as the Laplace distribution.
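
As a quick sanity check (not part of the derivation above), here is a small Monte Carlo sketch, assuming `numpy` is available: it samples $X,Y\sim\operatorname{Exp}(1)$ independently and compares a histogram of $Z=X-Y$ against the claimed Laplace density $p(z)=\tfrac12 e^{-|z|}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.exponential(scale=1.0, size=n)  # X ~ Exp(1)
y = rng.exponential(scale=1.0, size=n)  # Y ~ Exp(1)
z = x - y

# Empirical density of Z versus the Laplace density exp(-|z|)/2.
hist, edges = np.histogram(z, bins=200, range=(-5, 5), density=True)
centers = (edges[:-1] + edges[1:]) / 2
laplace_pdf = 0.5 * np.exp(-np.abs(centers))
print("max abs deviation:", np.max(np.abs(hist - laplace_pdf)))  # should be small
```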


For $u>0$ we have \begin{align} f_{X-Y}(u) & = \frac d {du} \Pr(X-Y\le u) \\[10pt] & = \frac d {du} \operatorname{E}(\Pr(X-Y \le u \mid Y)) \\[10pt] & = \frac d {du} \operatorname{E}(\Pr(X \le u+Y\mid Y)) \\[10pt] & = \frac d {du} \operatorname{E}(1-e^{-(u+Y)}) \\[10pt] & = \frac d {du} \int_0^\infty (1 - e^{-(u+y)} ) e^{-y} \, dy \\[10pt] & = \frac d {du} \int_0^\infty (e^{-y} - e^{-u} e^{-2y}) \, dy \\[10pt] & = \frac d {du} \left( 1 - \frac 1 2 e^{-u} \right) \\[10pt] & = \frac 1 2 e^{-u}. \end{align} A similar computation for $u<0$ gives $\dfrac 1 2 e^u,$ so altogether you get $\dfrac 1 2 e^{-|u|}.$

But a simpler way to deal with $u<0$ is to note that the distribution of $X-Y$ is plainly symmetric about $0$ (because $X-Y$ has the same distribution as $Y-X$), so if you get $\dfrac 1 2 e^{-u}$ when $u>0,$ you have to get $\dfrac 1 2 e^u$ when $u<0.$
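
If you want to double-check the $u>0$ computation mechanically, here is a short symbolic sketch, assuming `sympy` is installed; it evaluates $\int_0^\infty(1-e^{-(u+y)})e^{-y}\,dy$ and differentiates in $u$.

```python
import sympy as sp

u, y = sp.symbols('u y', positive=True)
# E(Pr(X <= u + Y | Y)) for u > 0, i.e. the CDF of X - Y at u.
cdf = sp.integrate((1 - sp.exp(-(u + y))) * sp.exp(-y), (y, 0, sp.oo))
print(sp.simplify(cdf))              # expected: 1 - exp(-u)/2
print(sp.simplify(sp.diff(cdf, u)))  # expected: exp(-u)/2
```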


The transformation is $(X,Y)\rightarrow (Y_1,Y_2)$.

$Y_1=X+Y, Y_2=\dfrac{X-Y}{X+Y}$.

Let $y_1=x+y,\ y_2=\dfrac{x-y}{x+y}$, i.e., $x=\dfrac{y_1(1+y_2)}{2},\ y=\dfrac{y_1(1-y_2)}{2}$. Now $x>0,y>0$, hence $y_1>0$ and $-1<y_2<1$.

$J=\begin{bmatrix}\dfrac{1+y_2}{2}&\dfrac{y_1}{2}\\\dfrac{1-y_2}{2}&\dfrac{-y_1}{2}\end{bmatrix}$. Here, $\det(J)=\dfrac{-y_1}{2}$
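
An optional symbolic check of this determinant, assuming `sympy` is installed:

```python
import sympy as sp

y1, y2 = sp.symbols('y1 y2')
x = y1 * (1 + y2) / 2
y = y1 * (1 - y2) / 2
# Jacobian of (x, y) with respect to (y1, y2).
J = sp.Matrix([[sp.diff(x, y1), sp.diff(x, y2)],
               [sp.diff(y, y1), sp.diff(y, y2)]])
print(sp.simplify(J.det()))  # expected: -y1/2
```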

Now $\begin{align}f_{(Y_1,Y_2)}(y_1,y_2)&=|\det(J)|\,f_{(X,Y)}(x,y)=\dfrac{y_1e^{-y_1}}{2}I(y_1>0,\,-1<y_2<1)\\&=y_1e^{-y_1}I(y_1>0)\cdot\dfrac{1}{2}I(-1<y_2<1).\end{align}$

Here $I(\cdot)$ is the indicator function. Since the joint density factors, $Y_1\sim\operatorname{Gamma}(2,1)$ and $Y_2\sim\operatorname{Uniform}(-1,1)$ are independent.
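
A quick Monte Carlo sketch of this factorisation, assuming `numpy` and `scipy` are available: $Y_1=X+Y$ should look Gamma$(2,1)$, $Y_2=(X-Y)/(X+Y)$ should look Uniform$(-1,1)$, and the sample correlation should be near zero.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200_000
x = rng.exponential(size=n)
y = rng.exponential(size=n)
y1 = x + y
y2 = (x - y) / (x + y)

print(stats.kstest(y1, 'gamma', args=(2,)).pvalue)       # Gamma(2, 1): p-value should not be tiny
print(stats.kstest(y2, 'uniform', args=(-1, 2)).pvalue)  # Uniform(-1, 1): p-value should not be tiny
print(np.corrcoef(y1, y2)[0, 1])                         # close to 0
```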


But I doubt you can recover the pdf of $X-Y$ easily from this. One way to proceed that is analogous to what you want is to take $Y_1=X-Y,\ Y_2=\dfrac{X+Y}{X-Y}$.


The reason Rohatgi's *Probability and Statistics* uses this technique is the independence of $X+Y$ and $\dfrac{X-Y}{X+Y}$. That will not carry over here, and the calculation eventually becomes very messy.
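
To see concretely why this choice is less pleasant, here is a symbolic sketch (assuming `sympy`) of the joint density under $Y_1=X-Y$, $Y_2=(X+Y)/(X-Y)$: it contains a factor $e^{-y_1y_2}$, so it does not split into a function of $y_1$ times a function of $y_2$, and the support is not a product set either.

```python
import sympy as sp

y1, y2 = sp.symbols('y1 y2')
# Inverse of the map: x - y = y1 and x + y = y1*y2.
x = y1 * (y2 + 1) / 2
y = y1 * (y2 - 1) / 2
J = sp.Matrix([[sp.diff(x, y1), sp.diff(x, y2)],
               [sp.diff(y, y1), sp.diff(y, y2)]])
joint = sp.simplify(sp.Abs(J.det()) * sp.exp(-(x + y)))
print(joint)  # expected: exp(-y1*y2)*Abs(y1)/2, on the image of {x > 0, y > 0}
```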