SSVM2
Single-class SVM with L2-soft margin.
Synopsis:
model = ssvm2(X,options)
Description:
This function trains a single-class SVM with L2-soft margin.
The task is to find a hyperplane passing through the origin
such that all input vectors lie on one side of it and the
margin between the data and the hyperplane is maximized.
If the convex hull of the data contains the origin
(non-separable data), the L2-soft margin relaxation is applied.
The hyperplane is sought in the feature space induced by
the prescribed kernel.
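The underlying optimization problem can be written as follows (a standard formulation of the single-class SVM with L2-soft margin; the regularization constant C and the exact parameterization used by the solvers are assumptions, since they are not spelled out above):

```latex
\min_{\mathbf{w},\,\boldsymbol{\xi}} \;
  \frac{1}{2}\|\mathbf{w}\|^2 + \frac{C}{2}\sum_{i=1}^{n}\xi_i^2
\qquad \text{s.t.} \quad
  \langle \mathbf{w}, \Phi(\mathbf{x}_i) \rangle \ge 1 - \xi_i,
  \quad i = 1,\dots,n,
```

where Phi is the feature map induced by the prescribed kernel. No bias term appears because the hyperplane is constrained to pass through the origin (hence model.b = 0 in the output below).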
Input:
X [dim x num_data] Input vectors.
options [struct] Control parameters:
.ker [string] Kernel identifier.
.arg [1 x narg] Kernel argument(s).
.solver [string] Quadratic programming solver to use; one of
'mdm', 'kozinec' or 'npa' (default).
.tmax [1x1] Maximal number of iterations.
.tolabs [1x1] Absolute tolerance stopping condition.
.tolrel [1x1] Relative tolerance stopping condition.
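For instance, the solver and kernel can be selected explicitly (a usage sketch; the kernel identifier 'rbf' and the option values are illustrative, not defaults):

```matlab
% Train the single-class SVM with an RBF kernel and the MDM solver.
options = struct();
options.ker    = 'rbf';    % kernel identifier (illustrative choice)
options.arg    = 1;        % kernel argument(s), e.g. RBF width
options.solver = 'mdm';    % 'mdm', 'kozinec' or 'npa' (default)
options.tmax   = 1e4;      % maximal number of iterations
options.tolabs = 1e-6;     % absolute tolerance stopping condition
options.tolrel = 1e-3;     % relative tolerance stopping condition

model = ssvm2(X, options);
```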
Output:
model [struct] Hyperplane in the kernel feature space:
.Alpha [nsv x 1] Weights.
.b [1x1] Bias equal to 0.
.sv.X [dim x nsv] Selected vectors.
.options [struct] Copy of input options.
.exitflag [1x1] Indicates which stopping condition was used:
UB <= tolabs             -> exitflag = 1  Absolute tolerance reached.
(UB-LB)/(LB+1) <= tolrel -> exitflag = 2  Relative tolerance reached.
t >= tmax                -> exitflag = 0  Maximal number of iterations reached.
.UB [1x1] Upper bound on the optimal solution.
.LB [1x1] Lower bound on the optimal solution.
.t [1x1] Number of iterations.
.kercnt [1x1] Number of kernel evaluations.
.margin [1x1] Achieved margin.
.trnerr [1x1] Training error.
.History [2x(t+1)] UB and LB with respect to number of iterations.
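The History field makes it easy to inspect convergence of the chosen solver (a plotting sketch based on the fields documented above; the row order UB-then-LB is assumed from the field description):

```matlab
% Plot the upper and lower bounds on the optimum over the iterations.
figure;
plot(0:model.t, model.History(1,:), 'r-');  % assumed row 1: upper bound (UB)
hold on;
plot(0:model.t, model.History(2,:), 'b-');  % assumed row 2: lower bound (LB)
legend('upper bound', 'lower bound');
xlabel('iteration'); ylabel('objective bound');
```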
Example:
load('scales','X');
model = ssvm2(X);
figure; axis([-1 1 -1 1]); grid on;
ppatterns(X); ppatterns(model.sv.X,'ob',13);
pline(model);
See also:
SVMCLASS, SVM.
Modifications:
15-jun-2004, VF
23-Jan-2004, VF
22-Jan-2004, VF
19-Oct-2003, VF