Rationalizing a denominator containing square roots and cube roots

Hint: for a shortcut in this particular case, let $a = 2+\sqrt{2}$ and then use the fact that

$$ \frac{1}{a+\sqrt[3]{2}} = \frac{a^2-a\sqrt[3]{2}+\sqrt[3]{4}}{a^3+2} $$

The denominator now contains only integers and terms in $\sqrt{2}\,$ after expansion, which is the case you know how to rationalize.
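
If you want a quick machine check of the hint, here is a small SymPy sketch (just a verification aid, assuming SymPy is available; it is not part of the argument itself):

```python
# Verify that multiplying by a^2 - a*cbrt(2) + cbrt(4) clears the cube root.
from sympy import sqrt, cbrt, expand

a = 2 + sqrt(2)
product = expand((a + cbrt(2)) * (a**2 - a*cbrt(2) + cbrt(4)))
print(product)            # 22 + 14*sqrt(2): only integers and sqrt(2) remain
print(expand(a**3 + 2))   # same value, confirming (a + cbrt(2))(a^2 - a*cbrt(2) + cbrt(4)) = a^3 + 2
```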


It's always possible.

You want to multiply top and bottom by some $M$ so that $\text{denominator}\times M$ has no radicals.

As you have figured out: if the denominator is $a + b\sqrt{c}$, you multiply by the conjugate to get $(a + b\sqrt{c})(a - b\sqrt{c}) = a^2 - b^2c$.

This will also work with $(\sqrt a + \sqrt b)(\sqrt a - \sqrt b) = a - b$.

So it's the same idea for $a + \sqrt[k]b$. The trick is to realize that
$$(a + \sqrt[k]b)\left(a^{k-1} - a^{k-2}\sqrt[k]b + a^{k-3}(\sqrt[k]b)^2 - \cdots \mp a(\sqrt[k]b)^{k-2} \pm (\sqrt[k]b)^{k-1}\right) = a^k \pm b,$$
where the sign of $b$ (and of the last term in the second factor) is $+$ when $k$ is odd and $-$ when $k$ is even.

Example: to deradicalize $5 + \sqrt[3]7$, multiply by $5^2 - 5\sqrt[3]7 + (\sqrt[3]7)^2$ to get $(5 + \sqrt[3]7)(5^2 - 5\sqrt[3]7 + (\sqrt[3]7)^2) = 5^3 - 5^2\sqrt[3]7 + 5(\sqrt[3]7)^2 + 5^2\sqrt[3]7 - 5(\sqrt[3]7)^2 + (\sqrt[3]7)^3 = 125 + 7 = 132$.
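
As a sanity check (a SymPy sketch of my own, not part of the answer), here is the $k=3$ case above, plus a $k=4$ case with numbers I chose to show how the sign of $b$ flips for even $k$:

```python
from sympy import cbrt, root, expand

c = cbrt(7)
print(expand((5 + c) * (25 - 5*c + c**2)))            # 132 = 5**3 + 7  (k odd: a^k + b)

d = root(7, 4)
print(expand((3 + d) * (27 - 9*d + 3*d**2 - d**3)))   # 74 = 3**4 - 7   (k even: a^k - b)
```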

So to deradicalize $2 + \sqrt 2 + \sqrt[3] 2$, just remove the radicals one at a time.

First let's get rid of the $\sqrt[3]2$ term. So we multiply top and bottom by $(2+\sqrt 2)^2 - (2 +\sqrt 2)\sqrt[3]2 + (\sqrt[3]2)^2$ to get $(2 + \sqrt 2 + \sqrt[3] 2)\left[(2+\sqrt 2)^2 - (2 +\sqrt 2)\sqrt[3]2 + (\sqrt[3]2)^2\right] = (2 + \sqrt 2)^3 + 2 = 8 + 12 \sqrt 2 + 12 + 2\sqrt 2 + 2 = 22 + 14\sqrt 2$. Then we multiply that by $22 - 14 \sqrt 2$ to get $(22 + 14\sqrt 2)(22 - 14\sqrt 2) = 22^2 - 2\cdot 14^2 = 92$.

So, as a worked example:

\begin{align} &\frac 1 {2 + \sqrt 2 + \sqrt[3] 2} \\&= \frac {(2 + \sqrt 2)^2 - (2+\sqrt2)\sqrt[3]2 + \sqrt[3]4}{(2+\sqrt 2)^3 + 2}\\&= \frac {(4 + 4\sqrt 2 + 2) -2\sqrt[3] 2 - \sqrt 2\sqrt[3]2 + \sqrt[3]4}{22 + 14\sqrt 2}\\&= \frac {\left[(4 + 4\sqrt 2 + 2) -2\sqrt[3] 2 - \sqrt 2\sqrt[3]2 + \sqrt[3]4\right](22 - 14\sqrt{2})}{22^2 - 2\cdot 14^2}\\&= \frac {\left[(6 + 4\sqrt 2) -2\sqrt[3] 2 - \sqrt 2\sqrt[3]2 + \sqrt[3]4\right](22 - 14\sqrt{2})}{92} \end{align}

Okay... admittedly that is a bear... but it is doable.
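
If you'd rather let a computer grind through it, here is a SymPy sketch (my verification of the two steps, assuming SymPy is installed) that confirms the numbers above:

```python
from sympy import sqrt, cbrt, expand

b = 2 + sqrt(2) + cbrt(2)
a = 2 + sqrt(2)

# Step 1: clear the cube root.
m1 = a**2 - a*cbrt(2) + cbrt(4)
print(expand(b * m1))             # 22 + 14*sqrt(2)

# Step 2: clear the square root with the conjugate.
m2 = 22 - 14*sqrt(2)
print(expand(b * m1 * m2))        # 92

# Exact check that m1*m2/92 really is 1/b:
print(expand(b * m1 * m2 / 92))   # 1
```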


There is a very general procedure for this kind of question.

You see, in essence, rationalizing a fraction means converting its denominator into a rational number, right? But it's really a deeper process: suppose a fraction of the form $\frac 1b$ can be rationalized, say written in the form $\frac cd$, where $d$ is rational.

Cross multiplying, we get $bc = d$, or that $b$ times something is rational. This amounts to the invertibility of the nonzero number $b$ in the field of real numbers, which is always true. So every fraction of this kind is indeed rationalizable. But the question then comes down to how to actually find this inverse.

One way of finding the inverse is to find the minimal polynomial with rational coefficients that $b$ satisfies. I'll explain why.

Suppose $b$ satisfies the polynomial equation $\sum_{i=0}^n a_ix^i = 0$. Then $\sum_{i=0}^n a_ib^i = 0$, so that $\sum_{i=1}^n a_ib^i = -a_0$, from which it follows that $b \left(\sum_{i=1}^n a_i b^{i-1}\right) = -a_0$.

Rewriting, $$ \frac{1}{b} = -\frac{\sum_{i=1}^n a_i b^{i-1}}{a_0} $$

which is the rationalized form. (Note that $a_0 \neq 0$: if it were zero we could divide the polynomial by $x$ and get a smaller one, contradicting minimality.)
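
In code the recipe is short. Here is a SymPy sketch of it (my illustration; `minimal_polynomial` and `Poly` are SymPy built-ins, while the helper name `rationalized_inverse` is just something I made up), tried on the easy surd $1+\sqrt 2$:

```python
from sympy import symbols, minimal_polynomial, sqrt, Poly, expand

x = symbols('x')

def rationalized_inverse(b):
    """Return 1/b as -(a_1 + a_2*b + ... + a_n*b**(n-1)) / a_0."""
    p = Poly(minimal_polynomial(b, x), x)
    coeffs = p.all_coeffs()[::-1]          # [a_0, a_1, ..., a_n]; a_0 != 0 by minimality
    numerator = sum(a_i * b**(i - 1) for i, a_i in enumerate(coeffs) if i >= 1)
    return -numerator / coeffs[0]

inv = rationalized_inverse(1 + sqrt(2))
print(inv)                          # -1 + sqrt(2), i.e. sqrt(2) - 1
print(expand(inv * (1 + sqrt(2))))  # 1
```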

So all you need to do is find a polynomial which the given surd, in our case $\sqrt 2 + \sqrt[3]2 + 2$, satisfies.

The minimal such polynomial is $x^6 - 12 x^5 + 54 x^4 - 116 x^3 + 132 x^2 - 120 x + 92$, which I found online. (Reassuringly, its constant term $92$ matches the rational denominator obtained in the answer above.) There's a better answer above on how to actually find such a polynomial, so I will skip that part, but at least this shows that fractions with "algebraic" denominators can be rationalized using the polynomial they satisfy.
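
For completeness, here is a SymPy sketch (my own check, assuming SymPy is available) that recomputes this sextic and turns it into the rationalized fraction via the formula above:

```python
from sympy import symbols, sqrt, cbrt, minimal_polynomial, expand

x = symbols('x')
b = 2 + sqrt(2) + cbrt(2)

print(minimal_polynomial(b, x))
# x**6 - 12*x**5 + 54*x**4 - 116*x**3 + 132*x**2 - 120*x + 92

# 1/b = -(a_1 + a_2*b + ... + a_6*b**5) / a_0 with a_0 = 92:
numerator = -(b**5 - 12*b**4 + 54*b**3 - 116*b**2 + 132*b - 120)
print(expand(b * numerator / 92))   # 1, so numerator/92 is the rationalized form of 1/b
```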