Support Vector Machine (SVM) in Machine Learning

In machine learning, a support vector machine (SVM) is a supervised learning model that uses associated learning algorithms to analyze data for classification and regression problems. Before moving forward, you should be familiar with logistic regression and linear regression; if not, feel free to learn about them first from my linked blog. If you are continuing to read, I assume you know those topics. Given a set of training examples, each marked as belonging to one of two groups, an SVM builds a model that assigns new instances to one group or the other, which makes it a non-probabilistic binary linear classifier. For instance, you are given a plot of two classes on a graph, as shown in the diagram below:
[Figure 1: a plot of two classes on a graph]

It is obvious that a separating line can be drawn there. You might arrive at a solution like the one in figure 2, where the line cleanly separates the two labeled classes. This is exactly what an SVM does.
[Figure 2: a line separating the two classes]

The goal of a support vector machine is to find the best separating boundary between the data. In two-dimensional space, you can think of this as the best-fit line that divides your dataset. Because an SVM operates in vector space, the separating line is really a separating hyperplane. The best separating hyperplane is the one with the widest margin between the support vectors. The hyperplane may also be referred to as a decision boundary. The easiest way to convey this is through pictures.
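As a minimal sketch of this idea (the six data points below are toy values invented for illustration, not from the article), a linear-kernel SVM recovers the maximum-margin line between two separable clusters:

```python
import numpy as np
from sklearn import svm

# Toy 2D data: two clearly separable clusters (illustrative values)
X = np.array([[1, 2], [2, 3], [2, 1], [6, 5], [7, 7], [8, 6]])
y = np.array([0, 0, 0, 1, 1, 1])

# A linear kernel finds the maximum-margin separating line
clf = svm.SVC(kernel='linear')
clf.fit(X, y)

# The hyperplane is w . x + b = 0; in 2D it is a line
w, b = clf.coef_[0], clf.intercept_[0]
print("weights:", w, "bias:", b)

# New points are classified by which side of the line they fall on
print("predictions:", clf.predict([[3, 2], [7, 6]]))
```

A point near the left cluster is assigned class 0 and one near the right cluster class 1, because the decision depends only on the sign of w . x + b.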

Hyperplanes and Support Vectors:

Hyperplanes are decision boundaries that help classify the data points. Data points falling on either side of the hyperplane can be attributed to different classes. The dimension of the hyperplane depends on the number of features: if the number of input features is 2, the hyperplane is just a line; if the number of input features is 3, the hyperplane becomes a two-dimensional plane. It becomes hard to visualize when the number of features exceeds 3.
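You can see this dependence on the feature count directly in a fitted linear SVM: the learned weight vector has one coefficient per feature (the tiny datasets below are invented purely to show the shapes):

```python
import numpy as np
from sklearn import svm

# 2 input features -> the hyperplane is a line w1*x1 + w2*x2 + b = 0
X2 = np.array([[0, 0], [1, 1], [4, 4], [5, 5]])
y2 = np.array([0, 0, 1, 1])
clf2 = svm.SVC(kernel='linear').fit(X2, y2)
print(clf2.coef_.shape)  # one weight per feature: (1, 2)

# 3 input features -> the hyperplane is a two-dimensional plane
X3 = np.array([[0, 0, 0], [1, 1, 1], [4, 4, 4], [5, 5, 5]])
y3 = np.array([0, 0, 1, 1])
clf3 = svm.SVC(kernel='linear').fit(X3, y3)
print(clf3.coef_.shape)  # (1, 3)
```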
[Figure 3: hyperplanes in two and three dimensions]

Support vectors are the data points that lie closest to the hyperplane and influence its position and orientation. Using these support vectors, we maximize the margin of the classifier. Deleting a support vector changes the position of the hyperplane. These are the points that help us build our SVM.
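A quick sketch of this property (again on toy clusters invented for illustration): scikit-learn exposes the support vectors of a fitted model, and removing a point that is not a support vector leaves the hyperplane essentially unchanged:

```python
import numpy as np
from sklearn import svm

# Toy clusters (illustrative values)
X = np.array([[1, 2], [2, 3], [2, 1], [6, 5], [7, 7], [8, 6]])
y = np.array([0, 0, 0, 1, 1, 1])
clf = svm.SVC(kernel='linear').fit(X, y)

# The points closest to the decision boundary are the support vectors
print("support vectors:\n", clf.support_vectors_)

# Drop a point that lies well inside its class region (not a support
# vector) and refit: the learned hyperplane stays the same
mask = np.ones(len(X), dtype=bool)
mask[0] = False  # removes [1, 2]
clf_wo = svm.SVC(kernel='linear').fit(X[mask], y[mask])
print("hyperplane unchanged:", np.allclose(clf.coef_, clf_wo.coef_, atol=1e-2))
```

Deleting one of the support vectors themselves, by contrast, would shift the boundary.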
[Figure 4: support vectors and the maximum-margin hyperplane]

Merits of the Support Vector Machine Algorithm:

  • High accuracy
  • Works well with small datasets
  • A kernel SVM applies a non-linear transformation that converts complicated, non-linearly separable data into linearly separable data.
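The kernel point deserves a small demonstration. XOR-style data (a standard toy example, not from the article) cannot be split by any straight line in 2D, but an RBF kernel implicitly maps the points into a higher-dimensional space where a separating hyperplane exists; the `gamma` and `C` values below are illustrative choices:

```python
import numpy as np
from sklearn import svm

# XOR pattern: opposite corners share a class, so no line separates them
X = np.array([[0, 0], [1, 1], [0, 1], [1, 0]])
y = np.array([0, 0, 1, 1])

# The RBF kernel makes the classes separable in the implicit feature space
clf = svm.SVC(kernel='rbf', gamma=2.0, C=10.0).fit(X, y)
print(clf.predict(X))  # recovers all four labels
```

A linear-kernel SVM on the same four points would necessarily misclassify at least one of them.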

Demerits of the Support Vector Machine Algorithm:

  • Does not perform well on larger datasets
  • In some cases, training time with SVMs can be high


To implement a support vector machine, we are going to choose a dataset that is freely available on the internet. We can implement SVM on this data as follows:
import numpy as np
from sklearn import preprocessing, model_selection, svm
import pandas as pd

# Load the dataset (file path omitted here)
df = pd.read_csv('')
# Replace missing values marked with '?' by a large outlier value
df.replace('?', -99999, inplace=True)
# The id column carries no predictive information
df.drop(['id'], axis=1, inplace=True)

X = np.array(df.drop(['class'], axis=1))
y = np.array(df['class'])

# Hold out 20% of the data for testing
X_train, X_test, y_train, y_test = model_selection.train_test_split(X, y, test_size=0.2)

clf = svm.SVC()
clf.fit(X_train, y_train)
confidence = clf.score(X_test, y_test)
print(confidence)

example_measures = np.array([[4, 2, 1, 1, 1, 2, 3, 2, 1]])
example_measures = example_measures.reshape(len(example_measures), -1)
prediction = clf.predict(example_measures)
print(prediction)

The output is the classifier's accuracy on the test set, followed by the predicted class for the example measurement.
