scikit-learn, linearsvc - how to get support vectors from the trained SVM?

This could help you.

from sklearn import svm

clf = svm.SVC(kernel='rbf', C=0.05)
clf.fit(traindata, y)
print(clf.support_vectors_)

This link can give you more information if needed: http://scikit-learn.org/stable/modules/svm.html


I am not sure if it helps, but I was searching for something similar, and the conclusion was that when:

clf = svm.LinearSVC()

Then this:

clf.decision_function(x)

is equal to this:

clf.coef_.dot(x) + clf.intercept_
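A quick sketch to check that equivalence on toy data (the data here is made up for illustration; `decision_function` expects a 2-D array, so the dot product is written with the coefficients transposed):

```python
import numpy as np
from sklearn import svm

# Toy, linearly separable data (illustrative only)
X = [[0, 0], [1, 1], [2, 2], [0, 1]]
y = [0, 1, 1, 0]

clf = svm.LinearSVC()
clf.fit(X, y)

x = np.array([[1.5, 0.5]])
# decision_function(x) is the signed distance w.x + b
manual = x.dot(clf.coef_.T) + clf.intercept_
print(np.allclose(clf.decision_function(x), manual))  # True
```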

Unfortunately there seems to be no direct way to do that. LinearSVC calls liblinear (see relevant code) but doesn't retrieve the support vectors, only the coefficients and the intercept.
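One workaround (an assumption on my part, not an official API) is to recover the support vectors yourself: with hinge loss, the support vectors are the training points on or inside the margin, i.e. those with `|decision_function| <= 1`, up to solver tolerance:

```python
import numpy as np
from sklearn.svm import LinearSVC

# Illustrative toy data
X = np.array([[0, 0], [1, 1], [2, 2], [0, 2], [2, 0], [1, 0]], dtype=float)
y = np.array([0, 1, 1, 1, 0, 0])

clf = LinearSVC(C=1.0, max_iter=10000)
clf.fit(X, y)

# Points on or inside the margin act as support vectors for hinge loss;
# a small epsilon absorbs numerical tolerance of the solver.
decision = clf.decision_function(X)
sv_idx = np.where(np.abs(decision) <= 1 + 1e-6)[0]
print(X[sv_idx])
```

Note that LinearSVC's default loss is squared hinge, so this margin criterion is only an approximation of what libsvm would report.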

One alternative would be to use SVC with the 'linear' kernel (based on libsvm instead of liblinear); the poly, rbf and sigmoid kernels also expose the support vectors:

from sklearn import svm

X = [[0, 0], [1, 1]]
y = [0, 1]

clf = svm.SVC(kernel='linear')
clf.fit(X, y)
print(clf.support_vectors_)

Output:

[[ 0.  0.]
 [ 1.  1.]]

liblinear scales better to a large number of samples, but otherwise the two are mostly equivalent.
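They are not identical, though: LinearSVC defaults to squared hinge loss and penalizes the intercept, so on the same data the two solvers learn similar but not equal coefficients. A small comparison (toy data, illustrative only):

```python
from sklearn.svm import SVC, LinearSVC

# Illustrative, linearly separable data
X = [[0, 0], [1, 1], [2, 0], [0, 2]]
y = [0, 1, 1, 0]

lin = LinearSVC(max_iter=10000).fit(X, y)
svc = SVC(kernel='linear').fit(X, y)

# Both learn a linear boundary w.x + b, but via different solvers
# (liblinear vs libsvm) and different loss defaults, so the
# coefficients are close but not identical in general.
print(lin.coef_)
print(svc.coef_)
```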