What is SVM?
Support Vector Machines (SVM) is a supervised learning algorithm that classifies both linearly separable and nonlinear data. It works by maximizing the margin between the classes' nearest training points (the support vectors) and, for nonlinear data, by using a nonlinear mapping to transform the original training data into a higher-dimensional space where a linear separator can be found. SVM was developed by Vapnik, Cortes, and colleagues in the 1990s, building on the statistical learning theory that Vapnik and Chervonenkis laid down in the 1960s. SVM has been applied successfully in many areas, including handwriting recognition, time-series prediction, speech recognition, database marketing, protein sequence classification, breast cancer diagnosis, and many more.
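To make the margin-maximization idea concrete before the Excel walkthrough, here is a minimal sketch of a linear SVM in plain Python. It is only an illustration, not the quadratic-programming solution this tutorial develops: it approximates the same objective with subgradient descent on the hinge loss, and the dataset, learning rate, and regularization constant are made-up values for demonstration.

```python
# A toy linear SVM trained by subgradient descent on the hinge loss.
# This only sketches the margin-maximization idea; the tutorial itself
# solves the SVM optimization step by step in Excel. The dataset and
# hyperparameters below are illustrative assumptions.

def train_linear_svm(points, labels, lam=0.01, eta=0.005, epochs=2000):
    """Find weights w and bias b of a separating line with (roughly) maximal margin."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            # Functional margin y * (w . x + b): points with margin < 1
            # contribute to the hinge loss and push the boundary away.
            if y * (w[0] * x1 + w[1] * x2 + b) < 1:
                w[0] += eta * (y * x1 - lam * w[0])
                w[1] += eta * (y * x2 - lam * w[1])
                b += eta * y
            else:
                # Only the L2 regularizer acts here: shrinking w widens the margin.
                w[0] -= eta * lam * w[0]
                w[1] -= eta * lam * w[1]
    return w, b

def predict(w, b, x):
    """Classify x by which side of the hyperplane w . x + b = 0 it falls on."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Two linearly separable clusters, labeled -1 and +1
points = [(1, 1), (2, 1), (1, 2), (5, 5), (6, 5), (5, 6)]
labels = [-1, -1, -1, 1, 1, 1]
w, b = train_linear_svm(points, labels)
print(predict(w, b, (1.5, 1.5)), predict(w, b, (5.5, 5.5)))  # expect -1 and 1
```

The training loop nudges the boundary only when a point sits inside the margin, which is why, after training, the solution is determined by the few points closest to the boundary: the support vectors.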
This tutorial gives a gentle introduction to SVM through simple, step-by-step numerical examples worked out in Microsoft Excel. You will learn how to train an SVM model, how to evaluate it, and how to use it to predict class membership. Complete numerical examples also show how to handle linearly non-separable cases using slack variables and the kernel trick. If you read this tutorial and work through its numerical examples to the end, you will at least be ready to move on to more advanced SVM books.
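The kernel-trick idea mentioned above can be previewed in a few lines: data that no straight line can separate in the original space may become linearly separable after a nonlinear mapping. The 1-D dataset and the feature map phi(x) = (x, x*x) below are illustrative assumptions, not the tutorial's own worked example.

```python
# How a nonlinear mapping can make non-separable data separable.
# Hypothetical 1-D example; the feature map phi(x) = (x, x*x) is illustrative.

def phi(x):
    """Map a 1-D point into 2-D so that a straight line can separate the classes."""
    return (x, x * x)

# In 1-D these classes interleave: the -1 points sit between the +1 points,
# so no single threshold on x separates them.
xs = [-3, -2, 2, 3, -1, 0, 1]
ys = [1, 1, 1, 1, -1, -1, -1]

# After mapping, the second coordinate x*x separates them:
# the +1 points have x*x >= 4, the -1 points have x*x <= 1,
# so the horizontal line x2 = 2.5 is a valid linear separator in the new space.
mapped = [phi(x) for x in xs]
separable = all((x2 >= 2.5) == (y == 1) for (x1, x2), y in zip(mapped, ys))
print(separable)  # True
```

The kernel trick, covered later in this tutorial, achieves the effect of such a mapping without ever computing the mapped coordinates explicitly.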
The topics of this tutorial are as follows:
Supervised Learning Illustrative Example
What is SVM? An intuitive introduction
Linearly Separable Case
SVM Training for Two Linearly Separable Cases
Evaluating SVM Training
Class Prediction with SVM
Linearly Non-Separable Case
SVM using Slack Variables
Numerical Example of SVM with Slack Variables
The Kernel Trick
Examples of Kernel Transformation
Numerical Example for SVM training with Kernel and Slack Variables
SVM for Multiple Classes
Winner Takes All
What is the strength of SVM?
What is the weakness or limitation of SVM?
When to use SVM?
Dual Problem of the SVM Formulation