Batch normalization

Mau Rua
1 min read · Aug 13, 2020


Batch normalization (also known as batch norm) is a technique for improving the speed, performance, and stability of artificial neural networks.

It is used to normalize each layer's inputs by re-centering and re-scaling them over the current mini-batch.

Gradient descent typically converges faster after normalization, since the layer inputs stay in a well-conditioned range throughout training.

x̂ = (x − μ) / √(σ² + ε)

y = γ · x̂ + β

where μ and σ² are the mean and variance of x over the mini-batch, ε is a small constant for numerical stability, and γ and β are learned scale and shift parameters.
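As a concrete illustration, here is a minimal NumPy sketch of the batch norm forward pass, assuming inputs of shape (batch, features); the function name batch_norm and the toy data are illustrative only, and at inference time frameworks typically replace the mini-batch statistics with running averages collected during training.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Re-center and re-scale a mini-batch x of shape (batch, features)."""
    mu = x.mean(axis=0)                     # per-feature mini-batch mean
    var = x.var(axis=0)                     # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalize to zero mean, unit variance
    return gamma * x_hat + beta             # learned scale (γ) and shift (β)

# Example: a batch of 4 samples with 3 features, deliberately off-center
x = np.random.randn(4, 3) * 10 + 5
y = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(y.mean(axis=0))  # ≈ 0 for each feature
print(y.std(axis=0))   # ≈ 1 for each feature
```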
