Classification Space
Class 0
Class 1
Query
Tree Structure
Split Quality (Information Gain)
Impurity Measures
Gini = 1 - Σpᵢ²
Entropy = -Σpᵢ log₂(pᵢ)
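The two impurity measures above can be sketched in a few lines of Python; this is not the visualization's own code, just a minimal stand-alone implementation of the same formulas over a list of class labels.

```python
import math
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

def entropy(labels):
    """Shannon entropy: -sum of p * log2(p) over class proportions."""
    n = len(labels)
    return -sum((count / n) * math.log2(count / n)
                for count in Counter(labels).values())
```

For a perfectly mixed two-class set such as `[0, 0, 1, 1]`, Gini is 0.5 and entropy is 1.0 bit; a pure set such as `[0, 0, 0, 0]` scores 0 under both.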
Add Data Points
Tree Statistics
Depth: 0
Nodes: 0
Leaves: 0
Tree Parameters
Model Statistics
Accuracy: --
Total Points: 0
Class 0: 0
Class 1: 0
Hover to trace decision path
Position: --
Prediction: --
Decision path will appear here
Quick Examples
How it works: Decision trees recursively partition the feature space using axis-aligned splits. At each node, the chosen split is the one that maximizes information gain, i.e., the reduction from the parent's impurity to the weighted average impurity of the two children.
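The split search described above can be sketched as an exhaustive scan over axes and candidate thresholds. This is a hedged illustration, not the demo's actual implementation: `best_split` and its entropy-based gain are assumed names, and the scan tries each observed feature value as a threshold.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(parent, left, right):
    """Parent impurity minus weighted child impurities."""
    n = len(parent)
    return (entropy(parent)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))

def best_split(points, labels):
    """Scan every axis-aligned threshold; return (axis, threshold, gain)."""
    best = (None, None, 0.0)
    for axis in range(len(points[0])):
        for threshold in sorted({p[axis] for p in points}):
            left = [y for p, y in zip(points, labels) if p[axis] <= threshold]
            right = [y for p, y in zip(points, labels) if p[axis] > threshold]
            if not left or not right:
                continue  # degenerate split, skip
            gain = info_gain(labels, left, right)
            if gain > best[2]:
                best = (axis, threshold, gain)
    return best
```

Growing a full tree just applies `best_split` recursively to each child until a node is pure or a depth limit (one of the tree parameters above) is reached.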