Deep Learning Batch Normalization

Why do we need batch normalization in neural networks?

It helps the neural network converge more quickly.

It puts different features on the same scale, removing the influence of scale differences between them.

It helps prevent exploding and vanishing gradients.
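The points above can be made concrete with the batch-norm transform itself: normalize each feature over the batch to zero mean and unit variance, then apply a learnable scale and shift. Below is a minimal NumPy sketch (the names `gamma`, `beta`, and `eps` follow the standard formulation; this is an illustration, not a production implementation, which would also track running statistics for inference):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch normalization over a (batch, features) array.

    Each feature column is normalized using statistics computed
    across the batch dimension, then scaled and shifted.
    """
    mean = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                       # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * x_hat + beta               # learnable scale and shift

# Two features on very different scales
x = np.array([[1.0, 100.0],
              [2.0, 200.0],
              [3.0, 300.0]])
y = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
```

After the transform, both columns have roughly zero mean and unit variance, so the downstream layer sees inputs on a common scale regardless of the original magnitudes.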
