Support Vector Machines (SVMs) are among the most widely used machine learning techniques for classification and regression. Before the rise of deep learning with neural networks around 2010, they were among the top-performing classifiers.

These are my notes covering the mathematical foundations of Support Vector Machines, taken while attending CSCI-UA 9473 - Foundations of Machine Learning at NYU Paris. Rather than relying on statistics and probability theory, SVMs rest on geometry: the idea of drawing lines to separate points.

Suppose there are a number of points $(\bold{x}_i, y_i)$ where $\bold{x}_i \in \R^d$ and $y_i \in \{+1, -1\}$:

$$
\begin{align}
\bold{x}_1 &= (x_{11}, x_{12}, \dots, x_{1d}) \\
\bold{x}_2 &= (x_{21}, x_{22}, \dots, x_{2d}) \\
&\ \ \vdots
\end{align}
$$
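To make the setup concrete, here is a minimal sketch (using NumPy; the cluster centers and sizes are arbitrary choices, not from the notes) of a toy dataset of points $\bold{x}_i \in \R^d$ with labels $y_i \in \{+1, -1\}$:

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 6, 2  # number of points and dimension (hypothetical values)

# Class +1 points clustered around (2, 2), class -1 around (-2, -2).
X_pos = rng.normal(loc=2.0, scale=0.5, size=(n // 2, d))
X_neg = rng.normal(loc=-2.0, scale=0.5, size=(n // 2, d))

X = np.vstack([X_pos, X_neg])                     # rows are the vectors x_i
y = np.array([+1] * (n // 2) + [-1] * (n // 2))   # labels y_i in {+1, -1}

print(X.shape)  # (6, 2)
```

Each row of `X` is one vector $\bold{x}_i = (x_{i1}, \dots, x_{id})$, matching the layout above.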