Gaussian Distribution
\begin{align*} \mathcal N (x; \mu, \sigma) = \frac 1 {\sqrt {2\pi \sigma^2}} \exp \left(-\frac {(x - \mu)^2} {2\sigma^2} \right) \end{align*}
- \(\mu \in \mathbb R\) is the mean
- \(\sigma \in (0, \infty)\) is the standard deviation.
Alternatively, one can use the inverse of the variance, called the precision: \(\beta = \frac 1 {\sigma^2} \in (0, \infty)\).
The Gaussian is a useful and good default choice in many settings because:
- Central Limit Theorem: the sum of many independent random variables is approximately normally distributed, so a complex system with many parts can often be modeled as Gaussian noise.
- Of all probability distributions with the same variance, the normal distribution has the highest uncertainty (see Entropy), so it encodes the least amount of prior knowledge into the model.
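The Central Limit Theorem point can be illustrated with a minimal NumPy sketch (the choice of Uniform(0, 1) summands and the sample sizes here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sum n independent Uniform(0, 1) variables, repeated over many trials.
n, trials = 30, 100_000
sums = rng.uniform(0.0, 1.0, size=(trials, n)).sum(axis=1)

# By the CLT, the sums are approximately N(n/2, n/12), since a
# Uniform(0, 1) variable has mean 1/2 and variance 1/12.
print(sums.mean())  # close to n/2 = 15
print(sums.var())   # close to n/12 = 2.5
```

Plotting a histogram of `sums` against the \(\mathcal N(15, 2.5)\) density makes the agreement visible even for modest `n`.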
1. Multivariate Normal Distribution
- \(\mathcal N(x; \mu, \Sigma)\)
- \(\Sigma\) is the covariance matrix, a symmetric positive definite matrix
- \(\beta = \Sigma^{-1}\) is precision matrix
- An isotropic Gaussian has \(\Sigma = \sigma^2 I\)
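For completeness, the density of the \(k\)-dimensional multivariate normal, in the standard form consistent with the univariate formula above:

\begin{align*} \mathcal N(x; \mu, \Sigma) = \frac 1 {\sqrt{(2\pi)^k \det \Sigma}} \exp \left(-\frac 1 2 (x - \mu)^\top \Sigma^{-1} (x - \mu) \right) \end{align*}

The quadratic form \((x - \mu)^\top \Sigma^{-1} (x - \mu)\) plays the role of \((x - \mu)^2 / \sigma^2\), with the precision matrix \(\beta = \Sigma^{-1}\) replacing the scalar precision.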
2. Gaussian Mixture
A Gaussian mixture model is a universal approximator of densities: any smooth density can be approximated to any desired non-zero error by a Gaussian mixture with enough components.
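A minimal NumPy sketch of a one-dimensional mixture, with hypothetical weights, means, and standard deviations chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-component mixture: weights must sum to 1.
weights = np.array([0.3, 0.5, 0.2])
means = np.array([-2.0, 0.0, 3.0])
stds = np.array([0.5, 1.0, 0.8])

def sample_mixture(size):
    """Ancestral sampling: pick a component, then draw from its Gaussian."""
    comp = rng.choice(len(weights), p=weights, size=size)
    return rng.normal(means[comp], stds[comp])

def mixture_pdf(x):
    """The mixture density is the weighted sum of component densities."""
    x = np.asarray(x, dtype=float)[..., None]  # broadcast over components
    comp_pdf = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    return (weights * comp_pdf).sum(axis=-1)
```

Fitting the parameters to data is typically done with expectation-maximization rather than by hand; the sketch above only shows the density and sampling structure.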
3. References
- Deep Learning by Goodfellow, Bengio, Courville