K-Nearest Neighbors (K-NN)

Harvard EPS-210 | Interactive tutorial — Click to add data points and explore instance-based learning

Classification Space

Class 0
Class 1
Query Point

Nearest Neighbors Detail

d(x, x') = √(Σᵢ (xᵢ − x'ᵢ)²)
Euclidean distance
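The distance above can be sketched in a few lines; this is a minimal stand-alone version (the function name `euclidean` is ours, not part of the tutorial):

```python
import math

def euclidean(x, y):
    # d(x, x') = sqrt( sum_i (x_i - x'_i)^2 )
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

print(euclidean((0, 0), (3, 4)))  # → 5.0
```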

Decision Boundary vs K

Distance Distribution

Add Data Points

Click on the classification canvas to add points

K Parameter

3
Neighbors

Small K → complex, jagged boundary; large K → smoother boundary
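The effect of K can be seen on a toy data set where one noisy point sits right next to the query: with K = 1 the outlier decides the prediction, while a larger K smooths it away. This is an illustrative sketch (the helper `knn_predict` and the toy data are ours, not the tutorial's internals):

```python
from collections import Counter

def knn_predict(points, labels, query, k):
    # Sort training points by squared Euclidean distance to the query,
    # then majority-vote among the k closest labels.
    order = sorted(range(len(points)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(points[i], query)))
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Toy data: one class-1 outlier sits right next to the query at the origin.
pts = [(0.0, 0.1), (2.0, 2.0), (2.1, 2.0), (2.0, 2.1), (1.9, 1.9)]
lab = [1, 0, 0, 0, 0]
print(knn_predict(pts, lab, (0.0, 0.0), k=1))  # outlier wins → 1
print(knn_predict(pts, lab, (0.0, 0.0), k=5))  # majority smooths it out → 0
```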

Distance Weighting

weight = 1 (uniform)
All neighbors vote equally
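Uniform voting gives every neighbor weight 1; a common alternative is distance weighting, where each neighbor votes with weight 1/d so that closer points count more. A minimal sketch of that alternative, assuming neighbors arrive as (distance, label) pairs (the function `weighted_vote` is ours):

```python
def weighted_vote(neighbors, eps=1e-9):
    # neighbors: list of (distance, label) pairs for the k nearest points.
    # Each neighbor votes with weight 1/d, so closer neighbors count more
    # (eps guards against division by zero for an exact match).
    scores = {}
    for d, label in neighbors:
        scores[label] = scores.get(label, 0.0) + 1.0 / (d + eps)
    return max(scores, key=scores.get)

# One very close class-1 neighbor outvotes two distant class-0 neighbors.
print(weighted_vote([(0.1, 1), (2.0, 0), (2.5, 0)]))  # → 1
```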

Model Statistics

Accuracy
--
Total Points
0
Class 0
0
Class 1
0
Hover to probe a point
Query: --
Prediction: --
Votes (C0:C1): --

Quick Examples

How it works: K-NN classifies a point by finding its K nearest neighbors in the training data and taking a majority vote. It's a "lazy learner" — no training phase, all computation happens at prediction time.
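The "lazy learner" idea above can be made concrete: fitting only stores the data, and all distance computation happens when a prediction is requested. A minimal sketch (the `KNN` class and its toy data are ours, not the tutorial's code):

```python
import math
from collections import Counter

class KNN:
    """Lazy learner: fit() just stores the data; all work happens in predict()."""
    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        self.X, self.y = list(X), list(y)  # no model is built here
        return self

    def predict(self, query):
        # Compute the distance to every training point at prediction time,
        # then majority-vote among the k nearest labels.
        dists = sorted(
            (math.dist(x, query), label) for x, label in zip(self.X, self.y)
        )
        votes = Counter(label for _, label in dists[: self.k])
        return votes.most_common(1)[0][0]

clf = KNN(k=3).fit([(0, 0), (0, 1), (5, 5), (6, 5)], [0, 0, 1, 1])
print(clf.predict((0.5, 0.5)))  # → 0
print(clf.predict((5.5, 5.0)))  # → 1
```

The trade-off this makes explicit: prediction cost grows with the size of the training set, which is why K-NN is cheap to "train" but can be slow to query.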