## Usage of initializations
Initializations define the probability distribution used to set the initial random weights of Keras layers.
The keyword argument used for passing initializations to layers depends on the layer. Usually it is simply `init`:

```python
model.add(Dense(64, init='uniform'))
```
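To make the mechanism concrete, here is a minimal pure-Python sketch of how a Dense-like layer could dispatch on an `init` string to draw its weight matrix. `ToyDense` and the `INITS` table are hypothetical illustrations, not Keras's actual implementation, and the sampling bounds are placeholder values.

```python
import random

# Hypothetical name-to-sampler table; the 0.05 scale is a placeholder,
# not the scale Keras itself uses.
INITS = {
    'uniform': lambda fan_in, fan_out: random.uniform(-0.05, 0.05),
    'normal': lambda fan_in, fan_out: random.gauss(0.0, 0.05),
}

class ToyDense:
    """Minimal Dense-like layer: builds a (fan_in x fan_out) weight matrix
    by looking up the named distribution and sampling each entry."""
    def __init__(self, fan_in, fan_out, init='uniform'):
        sample = INITS[init]  # raises KeyError for unknown init names
        self.W = [[sample(fan_in, fan_out) for _ in range(fan_out)]
                  for _ in range(fan_in)]

layer = ToyDense(3, 64, init='uniform')
```

The layer only needs the name at construction time, which is why the string form shown above is sufficient in model code.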
## Available initializations
- uniform
- lecun_uniform: Uniform initialization scaled by the square root of the number of inputs (LeCun 98).
- normal
- identity: Use with square 2D layers (`shape[0] == shape[1]`).
- orthogonal: Use with square 2D layers (`shape[0] == shape[1]`).
- zero
- glorot_normal: Gaussian initialization scaled by fan_in + fan_out (Glorot 2010)
- glorot_uniform
- he_normal: Gaussian initialization scaled by fan_in (He et al., 2014)
- he_uniform
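The scaled schemes above differ only in which fan terms appear under the square root. As a sketch, assuming the standard formulas from the cited papers (LeCun uniform limit sqrt(3/fan_in); Glorot normal std sqrt(2/(fan_in+fan_out)) and uniform limit sqrt(6/(fan_in+fan_out)); He normal std sqrt(2/fan_in) and uniform limit sqrt(6/fan_in)), the scale each scheme would use can be computed as:

```python
import math

def init_scale(name, fan_in, fan_out):
    """Return the std (for *_normal) or the uniform sampling limit
    (for *_uniform) of the named scheme, per the cited papers."""
    if name == 'lecun_uniform':
        return math.sqrt(3.0 / fan_in)
    if name == 'glorot_normal':
        return math.sqrt(2.0 / (fan_in + fan_out))
    if name == 'glorot_uniform':
        return math.sqrt(6.0 / (fan_in + fan_out))
    if name == 'he_normal':
        return math.sqrt(2.0 / fan_in)
    if name == 'he_uniform':
        return math.sqrt(6.0 / fan_in)
    raise ValueError('unknown scheme: %s' % name)

# Wider layers get smaller weights, keeping activation variance
# roughly constant across layers.
print(init_scale('he_normal', 512, 256))
```

The practical difference: Glorot scaling averages fan_in and fan_out (suited to symmetric activations like tanh), while He scaling uses fan_in alone to compensate for ReLU zeroing half the activations.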