Batch Normalization for NN

Dropout

Deep neural networks are prone to overfitting. One method to combat this is dropout: randomly omitting nodes (setting their activations to zero) during training, which prevents the network from relying too heavily on any single unit.
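The idea above can be sketched in a few lines. This is a minimal NumPy sketch of the common "inverted dropout" variant; the function name `dropout` and the drop probability `p_drop` are illustrative choices, not from the original notes.

```python
import numpy as np

def dropout(activations, p_drop=0.5, training=True):
    """Inverted dropout: zero each unit with probability p_drop during training."""
    if not training:
        return activations  # no-op at inference time
    # Boolean mask: True where the unit survives
    mask = np.random.rand(*activations.shape) >= p_drop
    # Scale survivors by 1/(1 - p_drop) so the expected activation
    # matches the inference-time (no-dropout) behavior
    return activations * mask / (1.0 - p_drop)
```

Because the scaling happens at training time, inference needs no special handling: the layer is simply left untouched.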

Batch Normalization

Batch normalization normalizes the inputs to a layer over each mini-batch: activations are shifted and scaled to have zero mean and unit variance, and then a learned scale (gamma) and shift (beta) are applied. This stabilizes the distribution of layer inputs during training and typically allows higher learning rates.
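The normalize-then-scale-and-shift step can be sketched as a forward pass over a 2-D activation matrix. This is a minimal sketch, assuming `x` has shape `(batch, features)`; the function name and the `eps` constant are illustrative.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then scale and shift.

    x: (batch, features) activations; gamma, beta: (features,) learned params.
    Returns the output plus the batch statistics (useful for running averages).
    """
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance per feature
    return gamma * x_hat + beta, mean, var
```

With `gamma = 1` and `beta = 0` the output is simply the standardized activations; the learned parameters let the network undo the normalization when that is beneficial.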

During Training

During training, the mean and variance used for normalization are computed from the current mini-batch. Running (exponential moving) averages of these batch statistics are accumulated alongside training so they can be used later at inference time.
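The running-average update can be sketched as below. This is an assumption-laden sketch: the `momentum` value and the function name are illustrative, not from the notes.

```python
def update_running_stats(running_mean, running_var,
                         batch_mean, batch_var, momentum=0.9):
    """Exponential moving average of batch statistics, updated once per step."""
    running_mean = momentum * running_mean + (1.0 - momentum) * batch_mean
    running_var = momentum * running_var + (1.0 - momentum) * batch_var
    return running_mean, running_var
```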

While Running (Inference)

At inference time, the accumulated running mean and variance are used in place of per-batch statistics, so each input produces a deterministic output that does not depend on the other examples in the batch.
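The inference-time step can be sketched as follows, assuming running statistics were accumulated during training; the function name and `eps` are illustrative choices.

```python
import numpy as np

def batch_norm_inference(x, gamma, beta, running_mean, running_var, eps=1e-5):
    """Normalize with stored running statistics instead of batch statistics."""
    x_hat = (x - running_mean) / np.sqrt(running_var + eps)
    return gamma * x_hat + beta
```

Because the statistics are fixed, the layer behaves like a per-feature affine transform at inference time.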