First draft of the documentation for callbacks

Tristan Deleu 2015-05-28 18:30:15 +02:00
parent 0613ef9049
commit e75ff62dcb
4 changed files with 42 additions and 2 deletions

@@ -25,6 +25,7 @@ pages:
- [regularizers.md, Regularizers]
- [constraints.md, Constraints]
- [initializations.md, Initializations]
- [callbacks.md, Callbacks]
- [datasets.md, Datasets]

39 docs/sources/callbacks.md Normal file

@@ -0,0 +1,39 @@
## Base class
```python
keras.callbacks.Callback()
```
- __Properties__:
- __params__: dict. Training parameters (e.g. verbosity, batch size, number of epochs...).
- __model__: `keras.models.Model`. Reference to the model being trained.
- __Methods__:
- __on_train_begin__(): Method called at the beginning of training.
- __on_train_end__(): Method called at the end of training.
- __on_epoch_begin__(epoch): Method called at the beginning of epoch `epoch`.
- __on_epoch_end__(epoch, val_loss, val_acc): Method called at the end of epoch `epoch`, with validation loss `val_loss` and accuracy `val_acc` (if applicable).
- __on_batch_begin__(batch): Method called at the beginning of batch `batch`.
- __on_batch_end__(batch, indices, loss, accuracy): Method called at the end of batch `batch`, with the indices `indices` of the samples in the batch, loss `loss`, and accuracy `accuracy` (if applicable).
### Example
```python
from keras.models import Sequential
from keras.layers.core import Dense, Activation
from keras.callbacks import History
model = Sequential()
model.add(Dense(784, 10, init='uniform'))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
history = History()
# X_train and Y_train are assumed to be training data prepared beforehand.
model.fit(X_train, Y_train, batch_size=128, nb_epoch=20, verbose=0, callbacks=[history])
print(history.losses)
# outputs
'''
[0.66047596406559383, 0.3547245744908703, ..., 0.25953155204159617, 0.25901699725311789]
'''
```
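For reference, here is a minimal sketch of a user-defined callback built on the methods listed above. The class name and attribute are hypothetical; `accuracy` is given a default value since it is only passed when applicable.
```python
from keras.callbacks import Callback

class BatchLossHistory(Callback):
    """Hypothetical callback that records the training loss of every batch."""
    def on_train_begin(self):
        self.batch_losses = []

    def on_batch_end(self, batch, indices, loss, accuracy=None):
        # `indices` are the indices of the samples in this batch; `accuracy`
        # is only provided when fit() is called with show_accuracy=True.
        self.batch_losses.append(loss)
```
Passing an instance through the `callbacks` argument of `fit` (as in the example above) makes `batch_losses` available after training.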

@@ -13,7 +13,7 @@ model = keras.models.Sequential()
- __loss__: str (name of objective function) or objective function. See [objectives](objectives.md).
- __class_mode__: one of "categorical", "binary". This is only used for computing classification accuracy or using the predict_classes method.
- __theano_mode__: A `theano.compile.mode.Mode` ([reference](http://deeplearning.net/software/theano/library/compile/mode.html)) instance specifying compilation options.
- __fit__(X, y, batch_size=128, nb_epoch=100, verbose=1, validation_split=0., validation_data=None, shuffle=True, show_accuracy=False): Train a model for a fixed number of epochs.
- __fit__(X, y, batch_size=128, nb_epoch=100, verbose=1, validation_split=0., validation_data=None, shuffle=True, show_accuracy=False, callbacks=[]): Train a model for a fixed number of epochs.
- __Return__: a history dictionary with a record of training loss values at successive epochs, as well as validation loss values (if applicable), accuracy (if applicable), etc.
- __Arguments__:
- __X__: data.
@@ -25,6 +25,7 @@ model = keras.models.Sequential()
- __validation_data__: tuple (X, y) to be used as held-out validation data. Will override validation_split.
- __shuffle__: boolean. Whether to shuffle the samples at each epoch.
- __show_accuracy__: boolean. Whether to display class accuracy in the logs to stdout at each epoch.
- __callbacks__: list of `keras.callbacks.Callback` instances to apply during training. See [callbacks](callbacks.md).
- __evaluate__(X, y, batch_size=128, show_accuracy=False, verbose=1): Show performance of the model over some validation data.
- __Return__: The loss score over the data.
- __Arguments__: Same meaning as in the fit method above. `verbose` is used as a binary flag (progress bar or nothing).
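As a minimal usage sketch for the new `callbacks` argument (assuming `model` is a compiled Sequential model and `X_train`, `Y_train`, `X_val`, `Y_val` are data arrays prepared elsewhere):
```python
from keras.callbacks import History

# Only the callbacks wiring is shown here; data and model are assumed
# to be prepared as in the Sequential example above.
history = History()

hist = model.fit(X_train, Y_train,
                 batch_size=128, nb_epoch=10, verbose=1,
                 validation_data=(X_val, Y_val),
                 show_accuracy=True,
                 callbacks=[history])

# `hist` is the history dictionary returned by fit(); `history.losses`
# holds the losses recorded by the History callback, as printed in the
# callbacks.md example.
```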

@@ -115,7 +115,6 @@ class DrawActivations(Callback):
self.imgs.set_title('Epoch #%d - Batch #%d' % (self.epoch, batch))
def on_train_end(self):
# anim = animation.ArtistAnimation(self.fig, self.imgs, interval=10, blit=False, repeat_delay=1000)
anim = SubplotTimedAnimation(self.fig, self.imgs, grid=(1,5), interval=10, blit=False, repeat_delay=1000)
# anim.save('test_gif.gif', fps=15, writer='imagemagick')
plt.show()