load fisheriris
% Use the first two features (sepal length and width); any two columns
% work, but two are needed for the 2-D 'showplot' boundary
data = [meas(:,1), meas(:,2)];
% Two-class problem: setosa vs. the rest
groups = ismember(species,'setosa');
% Randomly split the observations into training and hold-out test sets
[train, test] = crossvalind('HoldOut', groups);
cp = classperf(groups);
% Train on the training rows and classify the held-out rows
svmStruct = svmtrain(data(train,:), groups(train), 'showplot', true);
classes = svmclassify(svmStruct, data(test,:), 'showplot', true);
% Accumulate the hold-out results in the performance object
classperf(cp, classes, test);
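To see how well the classifier did on the held-out samples, the performance collected by classperf can be read back directly (a quick check, assuming the classperf object cp from the code above):

% Hold-out performance accumulated by classperf
cp.CorrectRate   % fraction of test samples classified correctly
cp.ErrorRate     % fraction misclassified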
SVMStruct = svmtrain(TRAINING, Y)
Rows of TRAINING correspond to observations; columns correspond to features. Y is a column vector that contains the known class labels for TRAINING. Y is a grouping variable, i.e., it can be a categorical, numeric, or logical vector; a cell vector of strings; or a character matrix with each row representing a class label (see help for groupingvariable). Each element of Y specifies the group the corresponding row of TRAINING belongs to. TRAINING and Y must have the same number of rows. SVMSTRUCT contains information about the trained classifier, including the support vectors, that is used by SVMCLASSIFY for classification. svmtrain treats NaNs, empty strings, or 'undefined' values as missing values and ignores the corresponding rows of TRAINING and Y.
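A minimal sketch of a typical call, using the iris data from above (the 'kernel_function' and 'showplot' options belong to this Bioinformatics Toolbox svmtrain; the RBF choice here is only an illustration):

load fisheriris
X = meas(:,1:2);                   % two features, so the boundary can be drawn
y = ismember(species,'setosa');    % logical grouping variable
% Swap the default linear kernel for a Gaussian (RBF) kernel
svmStruct = svmtrain(X, y, 'kernel_function', 'rbf', 'showplot', true);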
Group = svmclassify(SVMStruct, Sample)
>> help svmclassify
svmclassify  Classify data using a support vector machine.
GROUP = svmclassify(SVMSTRUCT, TEST) classifies each row in TEST using the support vector machine classifier structure SVMSTRUCT created with SVMTRAIN, and returns the predicted class label GROUP. TEST must have the same number of columns as the data used to train the classifier in SVMTRAIN. GROUP indicates the group to which each row of TEST is assigned.
GROUP = svmclassify(..., 'SHOWPLOT', true) plots the test data TEST on the figure created using the SHOWPLOT option in SVMTRAIN.
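For example, the prediction accuracy on the held-out rows can be checked by comparing the returned labels with the known ones (a sketch reusing the variables from the first example above):

% Classify the hold-out rows and compare with the true logical labels
predicted = svmclassify(svmStruct, data(test,:));
accuracy  = mean(predicted == groups(test))   % fraction classified correctly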
-----------------------------------------------------------------------------------------------
A classic example of multi-class classification with LIBSVM:
% Read the scaled iris data in LIBSVM format (labels y, features x)
[y, x] = libsvmread('iris.scale.txt');
% Train a multi-class SVM with a linear kernel (-t 0); LIBSVM handles the
% three classes automatically via one-against-one
m = svmtrain(y, x, '-t 0');
% Three hand-picked test samples, one from each class
test_y = [1; 2; 3];
test_x = [-0.555556  0.25       -0.864407  -0.916667;
           0.444444 -0.0833334   0.322034   0.166667;
          -0.277778 -0.333333    0.322034   0.583333];
% Predict; the third output holds decision values here ('-b 1' during both
% training and prediction would be needed for probability estimates)
[predict_label, accuracy, dec_values] = svmpredict(test_y, test_x, m);
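LIBSVM's built-in k-fold cross-validation (the '-v' option) is a convenient way to compare kernels or parameters before training the final model; the RBF parameters below are arbitrary starting values, not tuned ones:

% 5-fold cross-validation accuracy with an RBF kernel (-t 2)
cv_acc = svmtrain(y, x, '-t 2 -c 1 -g 0.25 -v 5');
% Retrain on all the data without -v to obtain a model usable by svmpredict
model = svmtrain(y, x, '-t 2 -c 1 -g 0.25');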
The data file 'iris.scale' is available from the LIBSVM website; it contains three classes.
The file iris.scale.txt is in the LIBSVM sparse format, one sample per line: a class label followed by index:value pairs for the four scaled iris features (indices 1-4). For example, the three test samples used above appear in the file as:

1 1:-0.555556 2:0.25 3:-0.864407 4:-0.916667
2 1:0.444444 2:-0.0833334 3:0.322034 4:0.166667
3 1:-0.277778 2:-0.333333 3:0.322034 4:0.583333

The full file contains all 150 iris samples, 50 per class.