Sum of dependent random variables

This particular problem can be solved entirely in Mathematica, without resorting to paper and pencil.

The joint density of $X$ and $Y$ is given by the product of the marginal density of $X$ (which is $\frac{1}{2}$) and the conditional density of $Y \mid X$ (which is $\frac{1}{2|X|}$):

f[x_, y_] := Piecewise[{{(1/2)*(1/(2 Abs[x])), -1 <= x <= 1 && Abs[y] <= Abs[x] && x != 0},
   {∞, x == 0}}, 0]
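
As a sanity check, this density should integrate to 1 over its support (the x == 0 piece has measure zero, so it does not contribute; the symbolic integral may take a moment):

Integrate[f[x, y], {x, -1, 1}, {y, -Abs[x], Abs[x]}]
(* should return 1 *)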

Apply ProbabilityDistribution to the joint density:

d = ProbabilityDistribution[f[x, y], {x, -1, 1}, {y, -Abs[x], Abs[x]}]

Distribution of X and Y
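
To confirm that d carries the intended dependence structure, the first marginal should come back as the uniform density of X on (-1, 1):

PDF[MarginalDistribution[d, 1], x]
(* should simplify to 1/2 for -1 < x < 1 and 0 otherwise *)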

Now find the distribution of $Z=X+Y$:

dz = TransformedDistribution[x + y, {x, y} \[Distributed] d];
pdf = PDF[dz, z]

Density of sum of X and Y

Plot[pdf, {z, -2.5, 2.5}]

Plot of density function
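
As a further check, the density of $Z$ should integrate to 1 over its support $(-2, 2)$:

Integrate[pdf, {z, -2, 2}]
(* should return 1 *)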

The expression for the pdf can be simplified to

pdf = Piecewise[{{Log[2/Abs[z]]/4, -2 < z < 2}}, 0]

Short version of pdf
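
A quick numerical spot check confirms that the compact form agrees with the density derived above:

Table[{PDF[dz, zz], Log[2/Abs[zz]]/4} // N, {zz, {-3/2, -1/2, 1/4, 1}}]
(* the two entries of each pair should agree *)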


This becomes quite straightforward if we note that $Y$ can be written as the product of $X$ and an independent random variable $U$ that is uniform on $(-1, 1)$: conditional on $X = x \neq 0$, the product $xU$ is uniform on $(-|x|, |x|)$, which is exactly the conditional distribution of $Y$ given $X$ used above.

d = Block[{y = x u}, 
   TransformedDistribution[x + y, {x \[Distributed] UniformDistribution[{-1, 1}], 
     u \[Distributed] UniformDistribution[{-1, 1}]}]];

PDF[d, t] // InputForm
(* Piecewise[{{Log[2]/4, t == 0}, {Log[-2/t]/4, Inequality[-2, Less, t, Less, 0]}, 
  {Log[2/t]/4, Inequality[0, Less, t, Less, 2]}}, 0] *)
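
Up to the measure-zero point t == 0, this is the same density Log[2/Abs[t]]/4 obtained above; the logarithmic singularity is integrable and the density integrates to 1:

Integrate[PDF[d, t], {t, -2, 2}]
(* should return 1 *)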

You can construct any distribution from the PDFs of other distributions with ProbabilityDistribution, and Method -> "Normalize" will normalize it:

dist = ProbabilityDistribution[
   PDF[UniformDistribution[{-1, 1}], u] PDF[UniformDistribution[{-Abs[u], Abs[u]}], v],
   {u, -1, 1}, {v, -1, 1}, Method -> "Normalize"];

pdf = PDF@TransformedDistribution[u + v, {u, v} \[Distributed] dist]
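
PDF applied to a distribution with a single argument returns the density as a pure function, so pdf here can be evaluated or plotted directly, for example:

Plot[pdf[t], {t, -2.5, 2.5}]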