# Activations

## Usage of activations

Activations can be used either through an `Activation` layer, or through the `activation` argument supported by all forward layers:

```python
from keras.models import Sequential
from keras.layers.core import Activation, Dense

model = Sequential()
model.add(Dense(64, 64, init='uniform'))
model.add(Activation('tanh'))
```

is equivalent to:

```python
model.add(Dense(64, 64, init='uniform', activation='tanh'))
```

You can also pass an element-wise Theano function as an activation:

```python
import theano

def tanh(x):
    return theano.tensor.tanh(x)

# pass the function via the activation argument...
model.add(Dense(64, 64, init='uniform', activation=tanh))
# ...or use it in a standalone Activation layer
model.add(Activation(tanh))
```

## Available activations

- softmax: Should only be applied to 2D layers (expected shape: `(nb_samples, nb_dims)`); see the sketch after this list.
- time_distributed_softmax: Softmax applied to every sample at every timestep of a layer of shape `(nb_samples, nb_timesteps, nb_dims)`.
- softplus
- relu
- tanh
- sigmoid
- hard_sigmoid
- linear
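
For instance, a softmax output layer for 10-way classification might look like this (a minimal sketch reusing the `Sequential` model from above; the layer sizes are arbitrary):

```python
# maps a 2D input of shape (nb_samples, 64) to 10 class probabilities
model.add(Dense(64, 10, init='uniform'))
model.add(Activation('softmax'))
```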

## On Advanced Activations

Activations that are more complex than a simple Theano function (e.g. learnable or configurable activations) are available as Advanced Activation layers, and can be found in the module `keras.layers.advanced_activations`. These include `PReLU` and `LeakyReLU`.
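
For example, `LeakyReLU` is added as its own layer rather than passed as an `activation` argument (a minimal sketch; the `alpha` parameter and its default value are assumptions about the 0.x API and may differ across versions):

```python
from keras.layers.advanced_activations import LeakyReLU

model.add(Dense(64, 64, init='uniform'))
# leaky rectifier with a configurable negative slope (alpha is assumed here)
model.add(LeakyReLU(alpha=0.3))
```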