keras/docs/sources/regularizers.md
2015-05-10 16:35:19 +03:00


## Usage of regularizers

Regularizers allow you to apply penalties on network parameters during optimization. These penalties are added to the loss function that the network optimizes.

The keyword arguments used for passing penalties to the parameters of a layer depend on the layer.

In the `Dense` layer, it is simply `W_regularizer` for the main weights matrix and `b_regularizer` for the bias.

```python
from keras.layers.core import Dense
from keras.regularizers import l2

model.add(Dense(64, 64, W_regularizer=l2(0.01)))
```

## Available penalties

- `l1(l=0.01)`: L1 regularization penalty, also known as LASSO
- `l2(l=0.01)`: L2 regularization penalty, also known as weight decay, or Ridge
- `l1l2(l1=0.01, l2=0.01)`: L1-L2 regularization penalty, also known as ElasticNet
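Each of these penalties is just a term added to the training loss, computed from the layer's weights. As a rough sketch of what the three penalties compute (plain NumPy for illustration, not the actual Keras internals; the factory-function shape mirrors the API above but is an assumption about implementation):

```python
import numpy as np

def l1(l=0.01):
    # L1 / LASSO: l * sum(|w|) -- pushes weights toward exact zeros
    return lambda w: l * np.abs(w).sum()

def l2(l=0.01):
    # L2 / weight decay / Ridge: l * sum(w^2) -- shrinks weights smoothly
    return lambda w: l * (w ** 2).sum()

def l1l2(l1=0.01, l2=0.01):
    # ElasticNet: a weighted combination of both terms
    return lambda w: l1 * np.abs(w).sum() + l2 * (w ** 2).sum()

w = np.array([0.5, -1.0, 2.0])
print(l2(0.01)(w))  # 0.01 * (0.25 + 1.0 + 4.0) = 0.0525
```

During training, the chosen penalty is evaluated on the regularized parameter and added to the objective, so gradient descent trades off fitting the data against keeping the weights small.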