01 Vector Derivatives

When you get tired of wading through mires of summation symbols, vector notation comes to the rescue

Kang Gyeonghun
2020-03-29
Vector, Matrix Derivatives

A lecture on vector and matrix derivatives is available on the Hun Learning (훈러닝) channel!
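To see the payoff concretely, here is a small example of my own (not taken from the lecture slides): the gradient of a linear form and of a quadratic form, written once with summation symbols and once in vector notation, with the gradient taken as a column vector.

$$
\frac{\partial}{\partial x_k}\, a^\top x
  \;=\; \frac{\partial}{\partial x_k} \sum_i a_i x_i
  \;=\; a_k
\qquad\Longleftrightarrow\qquad
\nabla_x \left( a^\top x \right) = a
$$

$$
\frac{\partial}{\partial x_k}\, x^\top A x
  \;=\; \frac{\partial}{\partial x_k} \sum_i \sum_j x_i A_{ij} x_j
  \;=\; \sum_j A_{kj} x_j + \sum_i A_{ik} x_i
\qquad\Longleftrightarrow\qquad
\nabla_x \left( x^\top A x \right) = \left( A + A^\top \right) x
$$

One line of vector notation replaces a double sum and an index chase, and for symmetric $A$ the quadratic-form gradient collapses further to $2Ax$.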


  • vector derivatives
Kang Gyeonghun
I study statistics, machine learning, data science, or whatever concerns making inference about an infinite dimension from a limited sample in a finite dimension. This blog is an archive of my journey of study.