Gaussian Mixture Models

Harvard EPS-210 | Interactive tutorial — Explore probabilistic clustering with the EM algorithm

Mixture Model

Canvas legend: 1σ ellipse · 2σ ellipse · mean
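The σ ellipses drawn on the canvas are level sets of each component's Gaussian: the axis directions are the eigenvectors of the 2×2 covariance matrix, and the semi-axis lengths are n·√λ for the n-sigma ellipse. A minimal NumPy sketch (the helper name `sigma_ellipse` is illustrative, not part of the tutorial):

```python
import numpy as np

def sigma_ellipse(Sigma, n_std=1.0, n_points=100):
    """Points on the n_std-sigma ellipse of a 2x2 covariance matrix, centered at the origin.
    Axis directions come from Sigma's eigenvectors; semi-axis lengths are n_std * sqrt(eigenvalues)."""
    vals, vecs = np.linalg.eigh(Sigma)            # eigh: symmetric matrix, ascending eigenvalues
    t = np.linspace(0.0, 2.0 * np.pi, n_points)
    circle = np.stack([np.cos(t), np.sin(t)])     # unit circle, shape (2, n_points)
    # Scale the circle by the semi-axis lengths, then rotate into the eigenbasis
    return (vecs @ (n_std * np.sqrt(vals)[:, None] * circle)).T   # shape (n_points, 2)
```

Adding the component mean μ to each returned point places the ellipse on the canvas; n_std=1 and n_std=2 give the two ellipses in the legend.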

Soft Assignments (Responsibilities)

γₖ(xᵢ) = πₖ N(xᵢ|μₖ,Σₖ) / Σⱼ πⱼ N(xᵢ|μⱼ,Σⱼ)
γₖ(xᵢ) is the posterior probability that point xᵢ was generated by component k
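The formula above can be evaluated directly: weight each component's density by its mixing proportion, then normalize across components. A sketch using NumPy and SciPy (the helper `responsibilities` is illustrative, not part of the tutorial):

```python
import numpy as np
from scipy.stats import multivariate_normal

def responsibilities(X, pis, mus, Sigmas):
    """E-step: gamma[i, k] = posterior probability that point i came from component k."""
    K = len(pis)
    # Numerator: pi_k * N(x_i | mu_k, Sigma_k) for every point and component
    weighted = np.column_stack([
        pis[k] * multivariate_normal.pdf(X, mean=mus[k], cov=Sigmas[k])
        for k in range(K)
    ])
    # Denominator: sum over components j, so each row sums to 1
    return weighted / weighted.sum(axis=1, keepdims=True)
```

Each row of the result is a probability distribution over components, which is exactly what the soft-assignment panel visualizes.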

Log-Likelihood

Probability Density

Data Points

Click on the mixture canvas to add points

Number of Components (K)

K = 3 Gaussians

EM Algorithm

Iteration: 0 · Status: Ready

Covariance Type
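The tutorial's covariance options aren't listed here; for comparison, scikit-learn's `GaussianMixture` exposes four `covariance_type` settings that span the usual constraints, which a short fit loop can contrast:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
               rng.normal(5.0, 1.0, (100, 2))])

# 'full':      each component has its own unconstrained covariance matrix
# 'tied':      all components share a single covariance matrix
# 'diag':      each component has a diagonal covariance (axis-aligned ellipses)
# 'spherical': each component has one variance (circular contours)
for cov_type in ["full", "tied", "diag", "spherical"]:
    gmm = GaussianMixture(n_components=2, covariance_type=cov_type,
                          random_state=0).fit(X)
    print(cov_type, round(gmm.bic(X), 1))
```

Tighter constraints mean fewer parameters, which trades flexibility of the ellipse shapes against the risk of overfitting; BIC (below) quantifies that trade-off.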

Model Statistics

Points: 0
Log-Lik: --
BIC: --
AIC: --
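BIC and AIC follow from the fitted log-likelihood and the number of free parameters. Assuming full covariance matrices, a d-dimensional, K-component GMM has (K−1) mixing weights, K·d mean entries, and K·d(d+1)/2 covariance entries. A hypothetical helper (not the tutorial's own code):

```python
import numpy as np

def gmm_information_criteria(log_lik, n, K, d):
    """BIC and AIC for a K-component, d-dimensional GMM with full covariances.
    Free parameters p: (K-1) mixing weights + K*d means + K*d*(d+1)/2 covariance entries."""
    p = (K - 1) + K * d + K * d * (d + 1) // 2
    bic = p * np.log(n) - 2.0 * log_lik   # penalizes parameters more as n grows
    aic = 2.0 * p - 2.0 * log_lik         # fixed penalty of 2 per parameter
    return bic, aic
```

Lower values are better for both criteria, so sweeping K and picking the minimum BIC is a common way to choose the number of components.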

Quick Examples

How it works: a GMM models the data as a weighted mixture of K Gaussian components. The EM algorithm alternates between two steps: the E-step computes each point's soft assignments (responsibilities) under the current parameters, and the M-step re-estimates the means, covariances, and mixing weights from those responsibilities. Unlike K-means, which assigns each point to exactly one cluster, a GMM yields probabilistic cluster memberships.
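The E-step/M-step alternation described above can be sketched end to end in NumPy/SciPy. This is a minimal illustration with full covariances and a crude initialization (evenly spaced data points as starting means), not the tutorial's actual implementation:

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=100):
    """Minimal EM for a Gaussian mixture with full covariances. A sketch, not production code."""
    n, d = X.shape
    pis = np.full(K, 1.0 / K)
    mus = X[np.linspace(0, n - 1, K).astype(int)].copy()   # crude init; k-means++ is better
    Sigmas = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])
    for _ in range(n_iter):
        # E-step: responsibilities gamma[i, k]
        dens = np.column_stack([
            pis[k] * multivariate_normal.pdf(X, mus[k], Sigmas[k]) for k in range(K)
        ])
        gamma = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, covariances from the soft counts
        Nk = gamma.sum(axis=0)
        pis = Nk / n
        mus = (gamma.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mus[k]
            Sigmas[k] = (gamma[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(d)
    # Final log-likelihood under the last parameter estimates
    dens = np.column_stack([
        pis[k] * multivariate_normal.pdf(X, mus[k], Sigmas[k]) for k in range(K)
    ])
    return pis, mus, Sigmas, np.log(dens.sum(axis=1)).sum()
```

The small 1e-6 ridge on each covariance is a standard guard against a component collapsing onto a single point, which would otherwise drive the likelihood to infinity.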