Decision Tree Classifier

Harvard EPS-210 | Interactive tutorial — Explore recursive partitioning and tree-based classification

Classification Space

Interactive 2D scatter plot (legend: Class 0, Class 1, Query point)

Tree Structure

Split Quality (Information Gain)

Impurity Measures

Gini = 1 - Σpᵢ²
Entropy = -Σpᵢ log₂(pᵢ)
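The two impurity measures above can be computed directly from class proportions. A minimal sketch (function names are ours, not from the tutorial):

```python
from collections import Counter
from math import log2

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Shannon entropy: -sum(p_i * log2(p_i)) over class proportions."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

# A 50/50 node is maximally impure for two classes:
print(gini([0, 0, 1, 1]))     # 0.5
print(entropy([0, 0, 1, 1]))  # 1.0
print(gini([0, 0, 0, 0]))     # 0.0 (pure node)
```

Both measures are zero for a pure node and peak at a uniform class mix; Gini is cheaper to compute, entropy penalizes mixed nodes slightly more.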

Add Data Points

Tree Statistics

Depth: 0 · Nodes: 0 · Leaves: 0

Tree Parameters

Max Depth: 5
Min Samples Split: 2
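The two parameters above are standard pre-pruning controls: a cap on tree depth and a minimum node size below which no further split is attempted. A sketch of the same settings using scikit-learn (an assumption on our part; the tutorial's demo runs in the browser, and the synthetic dataset here is illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Illustrative 2D, two-class dataset standing in for the tutorial's points.
X, y = make_classification(n_samples=200, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)

# max_depth and min_samples_split mirror the panel's defaults (5 and 2).
clf = DecisionTreeClassifier(max_depth=5, min_samples_split=2,
                             criterion="gini", random_state=0)
clf.fit(X, y)
print(clf.get_depth(), clf.get_n_leaves())  # depth is capped at 5
```

Raising Min Samples Split or lowering Max Depth yields smaller trees that generalize better at the cost of training accuracy.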

Model Statistics

Accuracy: --
Total Points: 0
Class 0: 0
Class 1: 0
Hover over the classification space to trace a query's decision path (its position, predicted class, and node-by-node path appear here).

Quick Examples

How it works: A decision tree recursively partitions the feature space using axis-aligned splits. At each node, the algorithm evaluates candidate thresholds on each feature and keeps the split that maximizes information gain, i.e. the largest reduction in weighted child impurity (Gini or entropy). Splitting stops when a node is pure or a limit such as Max Depth or Min Samples Split is reached, and each leaf predicts its majority class.
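The recursion described above can be sketched end to end: find the best axis-aligned split by Gini gain, recurse on each side, and stop at pure or undersized nodes. All names here are ours, and the exhaustive threshold search is the simplest (not the fastest) formulation:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(X, y):
    """Return (gain, feature, threshold) maximizing impurity reduction, or None."""
    best, parent, n = None, gini(y), len(y)
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            left = [y[i] for i in range(n) if X[i][f] <= t]
            right = [y[i] for i in range(n) if X[i][f] > t]
            if not left or not right:
                continue  # split must separate the data
            gain = parent - (len(left) / n) * gini(left) - (len(right) / n) * gini(right)
            if best is None or gain > best[0]:
                best = (gain, f, t)
    return best

def grow(X, y, depth=0, max_depth=5, min_samples_split=2):
    """Recursively partition; a leaf is a class label, an internal node a tuple."""
    if depth >= max_depth or len(y) < min_samples_split or gini(y) == 0.0:
        return Counter(y).most_common(1)[0][0]
    split = best_split(X, y)
    if split is None or split[0] <= 0:
        return Counter(y).most_common(1)[0][0]
    _, f, t = split
    li = [i for i in range(len(y)) if X[i][f] <= t]
    ri = [i for i in range(len(y)) if X[i][f] > t]
    return (f, t,
            grow([X[i] for i in li], [y[i] for i in li], depth + 1, max_depth, min_samples_split),
            grow([X[i] for i in ri], [y[i] for i in ri], depth + 1, max_depth, min_samples_split))

def predict(node, x):
    """Follow the decision path from root to leaf for query point x."""
    while isinstance(node, tuple):
        f, t, left, right = node
        node = left if x[f] <= t else right
    return node

# Two clusters separable on the x-coordinate:
X = [(1, 1), (2, 1), (8, 2), (9, 3)]
y = [0, 0, 1, 1]
tree = grow(X, y)
print(predict(tree, (1.5, 0)))  # 0
print(predict(tree, (8.5, 5)))  # 1
```

The `predict` loop is exactly the decision path the hover trace visualizes: a sequence of threshold comparisons from the root down to a leaf.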