SVM Tutorial

By Kardi Teknomo, PhD.

What is SVM?

Support Vector Machines (SVM) is a supervised learning algorithm that classifies both linearly separable and linearly non-separable data. It does so by finding the separating hyperplane that maximizes the margin between the classes, where the margin is determined by a small number of training points called support vectors, and by using a nonlinear mapping to transform the original training data into a higher-dimensional space when the classes cannot be separated linearly. SVM was originally developed by Vapnik and his colleagues in 1992 (the soft-margin version was later introduced by Cortes and Vapnik), building on the statistical learning theory that Vapnik and Chervonenkis developed in the 1960s. SVM has been applied successfully in many areas, including handwriting recognition, time-series prediction, speech recognition, database marketing, protein sequence analysis, breast cancer diagnosis, and many more.
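
To make the maximum-margin idea concrete, here is the standard hard-margin formulation in the usual textbook notation (a sketch only; the later pages of this tutorial work through the same problem numerically). Given training points $(\mathbf{x}_i, y_i)$ with labels $y_i \in \{-1, +1\}$, SVM looks for the separating hyperplane $\mathbf{w}^\top \mathbf{x} + b = 0$ whose margin $2/\lVert\mathbf{w}\rVert$ is as wide as possible:

\[
\min_{\mathbf{w},\, b} \ \tfrac{1}{2}\lVert\mathbf{w}\rVert^{2}
\qquad \text{subject to} \qquad
y_i\left(\mathbf{w}^\top \mathbf{x}_i + b\right) \ge 1, \qquad i = 1, \dots, n.
\]

A new point $\mathbf{x}$ is then classified by the sign of $f(\mathbf{x}) = \mathbf{w}^\top \mathbf{x} + b$, and the training points that meet the constraint with equality are the support vectors.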

This tutorial gives a very gentle introduction to SVM through simple, step-by-step numerical examples worked in Microsoft Excel. You will learn how to train an SVM model, how to evaluate it, and how to use it to predict the classification of new data. Complete numerical examples also show how to handle linearly non-separable cases using slack variables and the kernel trick. If you read the tutorial to the end and work through the numerical examples, you should at least be ready to move on to more advanced SVM books.
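
If you want to cross-check the Excel computations in code, the short Python sketch below trains a linear SVM and an RBF-kernel SVM on a made-up toy data set, evaluates them, and predicts the class of a new point. It assumes the scikit-learn library, which is not part of this tutorial; the data and parameter values are only illustrative.

# Minimal sketch (assumed toolkit: scikit-learn); the tutorial itself uses Excel.
from sklearn import svm

# Toy training data: two features per point, labels +1 / -1 (illustrative only)
X_train = [[1, 1], [2, 1], [1, 2],     # class +1
           [5, 5], [6, 5], [5, 6]]     # class -1
y_train = [1, 1, 1, -1, -1, -1]

# Linear SVM; a very large C approximates the hard-margin, linearly separable case
linear_clf = svm.SVC(kernel='linear', C=1000.0)
linear_clf.fit(X_train, y_train)

# RBF-kernel SVM; a smaller C allows slack, i.e. some misclassified training points
rbf_clf = svm.SVC(kernel='rbf', C=1.0, gamma='scale')
rbf_clf.fit(X_train, y_train)

print(linear_clf.score(X_train, y_train))   # training accuracy
print(linear_clf.support_vectors_)          # the support vectors found
print(rbf_clf.predict([[2, 2]]))            # predicted class of a new point

Here the parameter C plays the role of the slack-variable penalty discussed later in the tutorial: the larger C is, the less the model tolerates points on the wrong side of the margin.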

The topics of this tutorial are as follows:

Supervised Learning Illustrative Example
What is SVM? An intuitive introduction
Linearly Separable Case
SVM Training for Two Linearly Separable Cases
Evaluating SVM Training
Class Prediction with SVM
Linearly Non-Separable Case
SVM using Slack Variables
Numerical Example of SVM with Slack Variables
The Kernel Trick
Examples of Kernel Transformation
Numerical Example for SVM training with Kernel and Slack Variables
SVM for Multiple Classes
Winner takes all
Pair-wise classification
What is the strength of SVM?
What is the weakness or limitation of SVM?
When to use SVM?
Dual Problem of SVM Formula

Do you have a question regarding this SVM tutorial? Ask your question here!


 

 
© 2007 Kardi Teknomo. All Rights Reserved.