# SVM-Optimization

Support Vector Machines (SVMs) are a well-established technique for solving classification problems in machine learning. A mathematical description of Support Vector Classifiers (SVCs) is given in the SMO notebook.

Two methods are implemented to optimize SVCs for non-linear classification problems:

- Sequential Minimal Optimization (SMO)
- Primal Estimated sub-GrAdient SOlver for SVM (PEGASOS)

## SMO

SMO is a popular method for optimizing SVCs. It solves the quadratic programming (QP) problem (expressed in the dual form) by breaking it into the smallest possible subproblems, each involving just two Lagrange multipliers, which can be solved analytically. The implementation is based on John Platt's original paper.
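To make the two-multiplier idea concrete, here is a minimal numpy sketch of the *simplified* SMO variant (random choice of the second multiplier, no Platt working-set heuristics). It is an illustration with an RBF kernel, not the repository's implementation; all names and parameter defaults below are assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2.
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-gamma * sq)

def smo_simplified(X, y, C=1.0, tol=1e-3, max_passes=10, gamma=1.0, seed=0):
    """Simplified SMO: optimize two multipliers (alpha_i, alpha_j) at a time."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    alpha, b = np.zeros(n), 0.0
    rng = np.random.default_rng(seed)
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            # Prediction error on example i: E_i = f(x_i) - y_i.
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            # Only touch alpha_i if it violates the KKT conditions.
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = rng.integers(n - 1)          # pick a second index j != i
                if j >= i:
                    j += 1
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # Box constraints keep the pair on the constraint line.
                if y[i] != y[j]:
                    L, H = max(0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]  # second derivative along the line
                if eta >= 0:
                    continue
                # Analytic update of alpha_j, clipped to [L, H].
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Recompute the bias from whichever multiplier is unbound.
                b1 = b - Ei - y[i]*(alpha[i]-ai_old)*K[i, i] - y[j]*(alpha[j]-aj_old)*K[i, j]
                b2 = b - Ej - y[i]*(alpha[i]-ai_old)*K[i, j] - y[j]*(alpha[j]-aj_old)*K[j, j]
                b = b1 if 0 < alpha[i] < C else b2 if 0 < alpha[j] < C else (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b
```

Platt's full algorithm replaces the random choice of `j` with heuristics that pick the pair making the most progress, which is what the notebook's version follows.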

## PEGASOS

PEGASOS is an effective stochastic sub-gradient descent algorithm that works directly on the primal objective. The implementation is based on the kernelized PEGASOS algorithm from the original paper.
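The kernelized variant never forms a weight vector: it keeps a count `alpha[i]` of how often example `i` produced a margin violation. A minimal sketch under assumed names and defaults (again an RBF kernel, not the notebook's exact code):

```python
import numpy as np

def kernel_pegasos(X, y, lam=0.01, T=2000, gamma=1.0, seed=0):
    """Kernelized Pegasos: alpha[i] counts margin violations of example i."""
    n = len(y)
    sq = np.sum(X**2, 1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    alpha = np.zeros(n)
    rng = np.random.default_rng(seed)
    for t in range(1, T + 1):
        i = rng.integers(n)  # sample one training example per iteration
        # Current (scaled) decision value; step size is 1 / (lam * t).
        margin = y[i] / (lam * t) * np.sum(alpha * y * K[:, i])
        if margin < 1:       # hinge-loss sub-gradient is nonzero: record a hit
            alpha[i] += 1
    return alpha

def pegasos_predict(alpha, X_train, y_train, X_test, gamma=1.0):
    # Sign of the kernel expansion; the positive 1/(lam*T) scale is irrelevant.
    sq_tr, sq_te = np.sum(X_train**2, 1), np.sum(X_test**2, 1)
    K = np.exp(-gamma * (sq_tr[:, None] + sq_te[None, :] - 2 * X_train @ X_test.T))
    return np.sign((alpha * y_train) @ K)
```

Each iteration costs one kernel row, so the method scales with the number of iterations `T` rather than with the size of a QP.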

## Evaluation

The implementations are tested on the moons dataset from Scikit-learn. For both methods, we plot the decision boundary and a confusion matrix showing the classification results.
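The repository uses Scikit-learn's `make_moons` for the data and plots the results. As a self-contained illustration of the evaluation setup, here is a hypothetical numpy-only stand-in for the moons generator plus a 2×2 confusion-matrix helper (both names and the noise model are assumptions, not the repo's code):

```python
import numpy as np

def make_moons_np(n=200, noise=0.1, seed=0):
    # Rough stand-in for sklearn.datasets.make_moons: two interleaving
    # half-circles with Gaussian noise. Assumes n is even.
    rng = np.random.default_rng(seed)
    t = rng.uniform(0, np.pi, n // 2)
    upper = np.c_[np.cos(t), np.sin(t)]            # class +1 half-circle
    lower = np.c_[1 - np.cos(t), 0.5 - np.sin(t)]  # class -1 half-circle
    X = np.vstack([upper, lower]) + rng.normal(0, noise, (n, 2))
    y = np.r_[np.ones(n // 2), -np.ones(n // 2)]
    return X, y

def confusion_matrix_2x2(y_true, y_pred):
    # Rows: true class (-1, +1); columns: predicted class (-1, +1).
    cm = np.zeros((2, 2), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[int(t > 0), int(p > 0)] += 1
    return cm
```

Feeding the predictions of either optimizer into `confusion_matrix_2x2` reproduces the kind of summary the notebooks plot alongside the decision boundary.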