
Matrix Derivatives

Commonly Used in Machine Learning

There are two layout conventions for writing matrix derivatives: numerator layout and denominator layout.

Here we always use the denominator layout.
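
For example, for $y = Ax$ with $A \in \mathbb{R}^{m \times n}$ and $x \in \mathbb{R}^{n}$, numerator layout arranges $\frac{\partial y}{\partial x}$ as the $m \times n$ matrix whose $(i, j)$ entry is $\frac{\partial y_i}{\partial x_j}$, while denominator layout arranges it as the $n \times m$ transpose, with $(i, j)$ entry $\frac{\partial y_j}{\partial x_i}$. The two conventions differ only by a transpose, so each formula below can be converted to numerator layout by transposing the result.
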

  1. vector-by-vector (both groups are checked numerically in the sketch after this list)
$$\frac{\partial Ax}{\partial x} = A^T$$
$$\frac{\partial x^T A}{\partial x} = A$$
  2. scalar-by-vector
$$\frac{\partial x^T A x}{\partial x} = (A + A^T)x$$
$$\frac{\partial x^T x}{\partial x} = 2x$$
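
As a sanity check, here is a small NumPy sketch (not from the original post) that verifies all four identities by central finite differences; the sizes and the helper names `num_grad` and `num_jac` are arbitrary choices for illustration.

```python
import numpy as np

# Random test data; sizes are arbitrary.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)
eps = 1e-6

def num_grad(f, x):
    """Finite-difference gradient of a scalar function of x.
    In denominator layout the gradient has the same shape as x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

def num_jac(f, x):
    """Finite-difference derivative of a vector function of x in
    denominator layout: entry (i, j) is d f(x)_j / d x_i."""
    J = np.zeros((x.size, f(x).size))
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        J[i, :] = (f(x + d) - f(x - d)) / (2 * eps)
    return J

# Vector-by-vector: d(Ax)/dx = A^T and d(x^T A)/dx = A
assert np.allclose(num_jac(lambda v: A @ v, x), A.T, atol=1e-5)
assert np.allclose(num_jac(lambda v: v @ A, x), A, atol=1e-5)

# Scalar-by-vector: d(x^T A x)/dx = (A + A^T) x and d(x^T x)/dx = 2x
assert np.allclose(num_grad(lambda v: v @ A @ v, x), (A + A.T) @ x, atol=1e-5)
assert np.allclose(num_grad(lambda v: v @ v, x), 2 * x, atol=1e-5)

print("all four identities verified")
```

Central differences are used only because they are simple and accurate enough for a check; an autodiff library would serve the same purpose.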

These two groups of identities come up constantly in derivative calculations in machine learning.
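
For instance (an added illustration, not part of the original post), the gradient of the least-squares objective follows directly from these rules. Expanding

$$L(w) = \|Xw - y\|^2 = w^T X^T X w - 2 (X^T y)^T w + y^T y,$$

then applying $\frac{\partial x^T A x}{\partial x} = (A + A^T)x$ with the symmetric matrix $A = X^T X$ and $\frac{\partial x^T A}{\partial x} = A$ with the single column $A = X^T y$ gives

$$\frac{\partial L}{\partial w} = 2 X^T X w - 2 X^T y = 2 X^T (Xw - y),$$

the familiar gradient behind the normal equations.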
