in Deep Learning
How does L1/L2 regularization affect a neural network?

1 Answer


Overfitting occurs in more complex neural network models (many layers, many neurons). The effective complexity of the network can be reduced using L1 and L2 regularization, as well as dropout and data augmentation. L1 regularization forces some weight parameters to become exactly zero. L2 regularization, also called weight decay, pushes the weight parameters towards zero but never exactly to zero.

Smaller weight parameters make some neurons negligible, so the neural network becomes less complex and overfits less.
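The two penalties above can be sketched as extra terms added to the training loss. This is a minimal illustration in plain NumPy; the function names and the choice of `lam` are hypothetical, not from any particular framework:

```python
import numpy as np

def l1_penalty(weights, lam):
    # L1: lam * sum(|w|) -- its gradient has constant magnitude,
    # so it can push weights exactly to zero (sparsity)
    return lam * np.sum(np.abs(weights))

def l2_penalty(weights, lam):
    # L2 (weight decay): lam * sum(w^2) -- its gradient shrinks with w,
    # so weights approach zero but never reach it exactly
    return lam * np.sum(weights ** 2)

def regularized_loss(data_loss, weights, lam, kind="l2"):
    # Total objective = data-fitting loss + regularization penalty
    penalty = l1_penalty(weights, lam) if kind == "l1" else l2_penalty(weights, lam)
    return data_loss + penalty
```

In most deep learning frameworks this is built in, e.g. a `weight_decay` argument on the optimizer (L2) or per-layer regularizer objects, rather than written by hand.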

Regularisation has the following benefits:

Reduces the variance of the model on unseen data.

Makes it feasible to fit much more complicated models without overfitting.

Reduces the magnitude of weights and biases.

L1 learns sparse models, i.e. many weights turn out to be exactly 0.
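The sparsity difference can be seen in a tiny single-weight sketch. This is a hypothetical illustration, not a full training loop: the L2 update folds the penalty's gradient into the step, while the L1 update uses the standard soft-thresholding (proximal) step, which is how exact zeros arise in practice:

```python
import numpy as np

def train_step_l2(w, grad, lr, lam):
    # L2 adds 2*lam*w to the gradient: multiplicative shrinkage,
    # the weight approaches zero but never lands on it exactly
    return w - lr * (grad + 2.0 * lam * w)

def train_step_l1(w, grad, lr, lam):
    # L1 via soft-thresholding: any weight with |w| <= lr*lam
    # after the gradient step snaps to exactly zero
    w = w - lr * grad
    return np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
```

Starting from a small weight with zero data gradient, one L1 step sets it to exactly 0, while the L2 step only shrinks it, matching the L1-sparsity claim above.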
