Hun Learning
In Search Of The Truth Projected Onto A Finite Dimension

Variational Inference and Bayesian Gaussian Mixture Model

When your computer cannot handle the burden of MCMC, you might as well allow for some bias and do some heavy math yourself

Kang Gyeonghun
2020-08-25
Probabilistic Machine Learning

  • KL divergence
  • Latent Variable
  • Posterior Approximation
  • Bayesian GMM
  • Clustering
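The tradeoff in the subtitle — accepting some bias in exchange for tractability — is the core idea of variational inference. As a minimal sketch (not the post's actual derivation), here is coordinate-ascent variational inference (CAVI) for a toy 1-D Gaussian mixture with unit observation variance and a N(0, σ²) prior on the cluster means; all parameter values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated 1-D Gaussian clusters
x = np.concatenate([rng.normal(-3, 1, 100), rng.normal(3, 1, 100)])
n, K, sigma2 = len(x), 2, 10.0  # sigma2: prior variance of the cluster means

# Mean-field factors: q(mu_k) = N(m_k, s2_k), q(c_i) = Categorical(phi_i)
m = np.array([-1.0, 1.0])       # deterministic init to break symmetry
s2 = np.ones(K)
phi = np.full((n, K), 1.0 / K)

for _ in range(50):
    # Update assignments: log phi_ik ∝ E[mu_k] x_i - E[mu_k^2] / 2
    logits = np.outer(x, m) - 0.5 * (s2 + m**2)   # shape (n, K)
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    phi = np.exp(logits)
    phi /= phi.sum(axis=1, keepdims=True)
    # Update cluster means: precision-weighted average of assigned points
    s2 = 1.0 / (1.0 / sigma2 + phi.sum(axis=0))
    m = s2 * (phi * x[:, None]).sum(axis=0)

print(np.sort(m))  # variational posterior means, near the true centers -3 and 3
```

Each update is the exact coordinate-wise optimum of the ELBO given the other factor, so the bound increases monotonically; the bias mentioned in the subtitle enters through the mean-field factorization itself, which ignores posterior dependence between the means and the assignments.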
I study statistics, machine learning, data science, or whatever concerns making inference about an infinite dimension from a limited sample in a finite dimension. This blog is an archive of my journey of study.


© 2021 Kang Gyeonghun. Generated with Hugo and Mainroad theme.