Why is the SVM margin equal to $\frac{2}{\|\mathbf{w}\|}$?

Let $\mathbf{x}_0$ be a point on the hyperplane $\mathbf{w} \cdot \mathbf{x} - b = -1$, i.e., $\mathbf{w} \cdot \mathbf{x}_0 - b = -1$. To measure the distance between the hyperplanes $\mathbf{w} \cdot \mathbf{x} - b = -1$ and $\mathbf{w} \cdot \mathbf{x} - b = 1$, it suffices to compute the perpendicular distance from $\mathbf{x}_0$ to the plane $\mathbf{w} \cdot \mathbf{x} - b = 1$, denoted $r$.

Note that $\frac{\mathbf{w}}{\|\mathbf{w}\|}$ is a unit normal vector of the hyperplane $\mathbf{w} \cdot \mathbf{x} - b = 1$. By the definition of $r$, the point $\mathbf{x}_0 + r\frac{\mathbf{w}}{\|\mathbf{w}\|}$ lies on the hyperplane $\mathbf{w} \cdot \mathbf{x} - b = 1$, so $$ \mathbf{w} \cdot \left(\mathbf{x}_0 + r\frac{\mathbf{w}}{\|\mathbf{w}\|}\right) - b = 1. $$

Expanding this equation and using $\mathbf{w} \cdot \mathbf{w} = \|\mathbf{w}\|^2$, we have \begin{align*} & \mathbf{w} \cdot \mathbf{x}_0 + r\frac{\mathbf{w} \cdot \mathbf{w}}{\|\mathbf{w}\|} - b = 1 \\ \implies & \mathbf{w} \cdot \mathbf{x}_0 + r\frac{\|\mathbf{w}\|^2}{\|\mathbf{w}\|} - b = 1 \\ \implies & \mathbf{w} \cdot \mathbf{x}_0 + r\|\mathbf{w}\| - b = 1 \\ \implies & \mathbf{w} \cdot \mathbf{x}_0 - b = 1 - r\|\mathbf{w}\| \\ \implies & -1 = 1 - r\|\mathbf{w}\| \\ \implies & r = \frac{2}{\|\mathbf{w}\|} \end{align*} where the second-to-last step substitutes $\mathbf{w} \cdot \mathbf{x}_0 - b = -1$.
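As a quick numerical check of this result, here is a minimal NumPy sketch (the values of $\mathbf{w}$ and $b$ are arbitrary, chosen only for illustration): starting from a point on $\mathbf{w} \cdot \mathbf{x} - b = -1$ and stepping a distance $r = \frac{2}{\|\mathbf{w}\|}$ along the unit normal should land exactly on $\mathbf{w} \cdot \mathbf{x} - b = 1$.

```python
import numpy as np

# Illustrative weight vector and bias (assumed values, not from the text).
w = np.array([3.0, 4.0])   # ||w|| = 5
b = 2.0

# Construct a point x0 on the hyperplane w.x - b = -1.
# Taking x0 parallel to w: w.(t*w) - b = -1  =>  t = (b - 1) / ||w||^2
x0 = ((b - 1.0) / np.dot(w, w)) * w
assert np.isclose(np.dot(w, x0) - b, -1.0)

# Step distance r along the unit normal and check we land on w.x - b = 1.
r = 2.0 / np.linalg.norm(w)
x1 = x0 + r * w / np.linalg.norm(w)
print(np.dot(w, x1) - b)   # 1.0
print(r)                   # 0.4, i.e. 2 / ||w||
```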


An alternative derivation:

Let $\mathbf{x}_+$ be a positive example on one gutter (margin boundary), so that $$\mathbf{w} \cdot \mathbf{x}_+ - b = 1$$

Let $\mathbf{x}_-$ be a negative example on the other gutter, so that $$\mathbf{w} \cdot \mathbf{x}_- - b = -1$$

The width of the margin is the projection of $\mathbf{x}_+ - \mathbf{x}_-$ onto the unit normal vector, that is, the dot product of $\mathbf{x}_+ - \mathbf{x}_-$ and $\frac{\mathbf{w}}{\|\mathbf{w}\|}$:

\begin{align*} \text{width} & = (\mathbf{x}_+ - \mathbf{x}_-) \cdot \frac{\mathbf{w}}{\|\mathbf{w}\|} \\ & = \frac{(\mathbf{w} \cdot \mathbf{x}_+) - (\mathbf{w} \cdot \mathbf{x}_-)}{\|\mathbf{w}\|} \\ & = \frac{(1 + b) - (-1 + b)}{\|\mathbf{w}\|} \\ & = \frac{2}{\|\mathbf{w}\|} \end{align*}
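The same result can be sanity-checked on a trained classifier. The sketch below uses scikit-learn's `SVC` on made-up separable data, with a large `C` to approximate a hard margin; note that scikit-learn parameterizes the decision function as $\mathbf{w} \cdot \mathbf{x} + b$ rather than $\mathbf{w} \cdot \mathbf{x} - b$, which does not affect $\|\mathbf{w}\|$.

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable blobs (arbitrary illustration values).
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=[2, 2], scale=0.3, size=(20, 2))
X_neg = rng.normal(loc=[-2, -2], scale=0.3, size=(20, 2))
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 20 + [-1] * 20)

# Large C approximates a hard-margin SVM.
clf = SVC(kernel="linear", C=1e6).fit(X, y)
w = clf.coef_[0]

# Margin predicted by the formula.
print(2.0 / np.linalg.norm(w))

# Margin measured directly: project all points onto the unit normal and
# take the gap between the closest points of the two classes (the gutters).
proj = X @ (w / np.linalg.norm(w))
print(proj[y == 1].min() - proj[y == -1].max())
```

The two printed values agree (up to optimizer tolerance), since for separable hard-margin data the closest points of each class lie exactly on the gutters.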

The above derivation follows MIT 6.034 (Artificial Intelligence).