Machine Learning #76 Gaussian Mixture Model

A Gaussian mixture model is a probabilistic model that assumes all the data points are generated from a mixture of a finite number of Gaussian distributions with unknown parameters. One can think of mixture models as generalizing k-means clustering to incorporate information about the covariance structure of the data as well as the centers of the latent Gaussians.
The GaussianMixture object implements the expectation-maximization (EM) algorithm for fitting mixture-of-Gaussian models. It can also draw confidence ellipsoids for multivariate models, and compute the Bayesian Information Criterion (BIC) to assess the number of clusters in the data. A method is provided that learns a Gaussian mixture model from training data. Given test data, it can assign each sample to the Gaussian it most probably belongs to using the GaussianMixture.predict method.
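The fit-then-predict workflow described above can be sketched as follows. This is a minimal example assuming scikit-learn is installed; the two-blob dataset is synthetic and chosen only for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic training data: two well-separated Gaussian blobs
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(100, 2)),
    rng.normal(loc=5.0, scale=1.0, size=(100, 2)),
])

# Fit a 2-component mixture with full covariance matrices via EM
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(X)

# Assign each test sample to its most probable Gaussian component
X_test = np.array([[0.0, 0.0], [5.0, 5.0]])
labels = gmm.predict(X_test)
print(labels)  # the two test points fall in different components
```

The covariance_type parameter ("full", "tied", "diag", "spherical") controls how much covariance structure each component is allowed to learn, which is exactly what distinguishes a GMM from plain k-means.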

Pros of Gaussian Mixture Model:
– It is the fastest algorithm for learning mixture models.
– As this algorithm maximizes only the likelihood, it will not bias the means towards zero, or bias the cluster sizes to have specific structures that might or might not apply.
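Since EM maximizes the likelihood, the number of components has to be chosen separately; the BIC mentioned above is the usual tool for this. The sketch below, again assuming scikit-learn and a synthetic three-blob dataset, fits mixtures with 1 to 5 components and picks the component count with the lowest BIC.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic data: three well-separated Gaussian blobs
X = np.vstack([
    rng.normal(-4.0, 1.0, size=(150, 2)),
    rng.normal(0.0, 1.0, size=(150, 2)),
    rng.normal(4.0, 1.0, size=(150, 2)),
])

# Fit a GMM for each candidate number of components and record its BIC
bics = []
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, random_state=0).fit(X)
    bics.append(gm.bic(X))

# Lower BIC is better; it trades off fit quality against model complexity
best_k = int(np.argmin(bics)) + 1
print(best_k)
```

On well-separated data like this, the BIC minimum lands at the true number of clusters; on overlapping real data the curve is flatter and the choice is a judgment call.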
