BatchNormalization
```python
keras.layers.normalization.BatchNormalization(axis=-1, momentum=0.99, epsilon=0.001, center=True, scale=True, beta_initializer='zeros', gamma_initializer='ones', moving_mean_initializer='zeros', moving_variance_initializer='ones', beta_regularizer=None, gamma_regularizer=None, beta_constraint=None, gamma_constraint=None)
```
Batch normalization layer (Ioffe and Szegedy, 2015).
Normalizes the activations of the previous layer at each batch, i.e. applies a transformation that keeps the mean activation close to 0 and the activation standard deviation close to 1.
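As a rough illustration of that transformation, the sketch below reproduces the training-time computation in plain NumPy. The batch shape `(32, 4)`, the sample data, and the printed check are illustrative assumptions; `beta`, `gamma`, and `epsilon` use the layer's default values.

```python
import numpy as np

# Minimal sketch of the per-batch transform (training-time behaviour),
# assuming a 2D batch of activations x with shape (batch, features).
x = np.random.randn(32, 4) * 5.0 + 3.0       # activations with arbitrary mean/std
beta, gamma, epsilon = 0.0, 1.0, 1e-3        # the layer's default values

mean = x.mean(axis=0)                        # per-feature batch mean
var = x.var(axis=0)                          # per-feature batch variance
x_hat = (x - mean) / np.sqrt(var + epsilon)  # normalized: mean ~0, std ~1
y = gamma * x_hat + beta                     # scale (gamma) and shift (beta)

print(y.mean(axis=0), y.std(axis=0))         # ~0 and ~1 per feature
```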
Arguments
- axis: Integer, the axis that should be normalized (typically the features axis). For instance, after a `Conv2D` layer with `data_format="channels_first"`, set `axis=1` in `BatchNormalization` (see the sketch after this list).
- momentum: Momentum for the moving average.
- epsilon: Small float added to variance to avoid dividing by zero.
- center: If True, add offset of `beta` to normalized tensor. If False, `beta` is ignored.
- scale: If True, multiply by `gamma`. If False, `gamma` is not used. When the next layer is linear (also e.g. `nn.relu`), this can be disabled since the scaling will be done by the next layer.
- beta_initializer: Initializer for the beta weight.
- gamma_initializer: Initializer for the gamma weight.
- moving_mean_initializer: Initializer for the moving mean.
- moving_variance_initializer: Initializer for the moving variance.
- beta_regularizer: Optional regularizer for the beta weight.
- gamma_regularizer: Optional regularizer for the gamma weight.
- beta_constraint: Optional constraint for the beta weight.
- gamma_constraint: Optional constraint for the gamma weight.
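As referenced in the axis entry above, the following is a minimal sketch of setting `axis=1` after a `Conv2D` layer that uses `data_format="channels_first"`; the filter count, kernel size, and the `(3, 32, 32)` input shape are arbitrary assumptions.

```python
from keras.models import Sequential
from keras.layers import Conv2D, BatchNormalization

# With data_format="channels_first" the features (channels) axis is axis 1,
# so BatchNormalization is given axis=1.
model = Sequential()
model.add(Conv2D(16, (3, 3), data_format='channels_first',
                 input_shape=(3, 32, 32)))
model.add(BatchNormalization(axis=1))
```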
Input shape
Arbitrary. Use the keyword argument `input_shape` (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape
Same shape as input.
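For example, a minimal sketch of using the layer as the first layer of a Sequential model; the 20-dimensional `input_shape` is an arbitrary assumption. The printed output shape matches the input shape, with `None` standing in for the samples axis.

```python
from keras.models import Sequential
from keras.layers import BatchNormalization

# As the first layer, input_shape omits the samples axis.
model = Sequential()
model.add(BatchNormalization(input_shape=(20,)))

print(model.output_shape)  # (None, 20) -- same shape as the input
```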
References
- [Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift](https://arxiv.org/abs/1502.03167)