Classification Space & Margin
Class -1
Class +1
Support Vector
Decision Function f(x)
f(x) = w·x + b
Classify: sign(f(x))
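The decision function above can be sketched directly in numpy. The weight vector `w` and bias `b` here are hypothetical placeholders; in the demo they are learned during training.

```python
import numpy as np

# Hypothetical weights and bias for illustration; in practice these
# are produced by training the SVM.
w = np.array([1.0, -2.0])
b = 0.5

def f(x):
    """Linear decision function f(x) = w·x + b."""
    return np.dot(w, x) + b

def classify(x):
    """Predicted class is the sign of f(x): -1 or +1."""
    return int(np.sign(f(x)))

print(classify(np.array([3.0, 1.0])))   # f = 3 - 2 + 0.5 = 1.5 → +1
print(classify(np.array([0.0, 2.0])))   # f = 0 - 4 + 0.5 = -3.5 → -1
```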
Hinge Loss Visualization
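The quantity plotted here is the hinge loss, max(0, 1 − y·f(x)). A minimal sketch of how each point contributes:

```python
import numpy as np

def hinge_loss(y, fx):
    """Hinge loss max(0, 1 - y*f(x)): zero once a point is on the
    correct side with margin >= 1, linear in the violation otherwise."""
    return np.maximum(0.0, 1.0 - y * fx)

# Correctly classified with margin >= 1: no loss.
print(hinge_loss(+1, 2.0))   # 0.0
# Correct sign but inside the margin: small loss.
print(hinge_loss(+1, 0.5))   # 0.5
# Misclassified: loss grows linearly with the violation.
print(hinge_loss(-1, 2.0))   # 3.0
```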
Training Progress
Add Data Points
Margin Width
--
2 / ||w||
0 Support Vectors
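The margin width shown above follows directly from the formula 2 / ||w||. A one-line check with a hypothetical weight vector:

```python
import numpy as np

# Hypothetical weight vector for illustration.
w = np.array([3.0, 4.0])               # ||w|| = 5
margin_width = 2.0 / np.linalg.norm(w)
print(margin_width)                    # 0.4
```

Maximizing this width is equivalent to minimizing ||w||, which is why the SVM objective penalizes large weights.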
Regularization (C)
Small C → wider margin, tolerates more misclassifications
Large C → narrower margin, tolerates fewer misclassifications
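C controls the trade-off in the soft-margin objective ½||w||² + C·Σ hinge losses. The sketch below evaluates that objective on toy data (the weights and points are illustrative, not a trained model) to show how C shifts the balance:

```python
import numpy as np

def svm_objective(w, b, X, y, C):
    """Soft-margin SVM objective: 0.5*||w||^2 + C * sum of hinge losses.
    Small C emphasizes a wide margin; large C emphasizes low training error."""
    margins = y * (X @ w + b)
    hinge = np.maximum(0.0, 1.0 - margins)
    return 0.5 * np.dot(w, w) + C * hinge.sum()

# Toy data: the third point sits inside the margin.
X = np.array([[2.0, 0.0], [-2.0, 0.0], [0.5, 0.0]])
y = np.array([+1, -1, +1])
w, b = np.array([1.0, 0.0]), 0.0

# Hinge losses are [0, 0, 0.5]; the margin term is 0.5.
print(svm_objective(w, b, X, y, C=0.1))   # 0.5 + 0.1*0.5 = 0.55
print(svm_objective(w, b, X, y, C=10.0))  # 0.5 + 10*0.5  = 5.5
```

With small C the objective is dominated by the margin term, so the optimizer prefers a wide margin even at the cost of violations; with large C the single violation dominates, pushing toward a narrower margin that classifies more training points correctly.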
Kernel Function
Model Statistics
Accuracy
--
Loss
--
Class -1
0
Class +1
0
Quick Examples
How it works: An SVM finds the hyperplane that maximizes the margin between the two classes. The support vectors are the training points closest to the boundary; they alone determine the margin. The kernel trick implicitly maps data into a higher-dimensional feature space, enabling nonlinear decision boundaries without computing that mapping explicitly.
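The kernel trick described above can be sketched with an RBF kernel. The decision function sums kernel evaluations over the support vectors only: f(x) = Σᵢ αᵢ·yᵢ·k(xᵢ, x) + b. The support vectors, multipliers αᵢ, and bias below are hypothetical values for illustration, not the result of an actual fit:

```python
import numpy as np

def rbf_kernel(x1, x2, gamma=1.0):
    """RBF kernel k(x1, x2) = exp(-gamma * ||x1 - x2||^2): an implicit
    inner product in a high-dimensional feature space."""
    diff = x1 - x2
    return np.exp(-gamma * np.dot(diff, diff))

def kernel_decision(x, support_vectors, alphas, labels, b, gamma=1.0):
    """Kernelized decision function f(x) = sum_i alpha_i*y_i*k(x_i, x) + b,
    summed over the support vectors only."""
    return sum(a * yi * rbf_kernel(sv, x, gamma)
               for a, yi, sv in zip(alphas, labels, support_vectors)) + b

# Hypothetical "trained" model: one support vector per class.
svs = [np.array([1.0, 1.0]), np.array([-1.0, -1.0])]
alphas = [0.8, 0.8]
labels = [+1, -1]
b = 0.0

# A query near the +1 support vector is pulled toward class +1.
x = np.array([0.9, 1.1])
print(int(np.sign(kernel_decision(x, svs, alphas, labels, b))))  # 1
```

Because the kernel replaces every inner product, the boundary in the original space can be curved even though the math is still a linear separator in the implicit feature space.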