Keras has two models: Sequential, a linear stack of layers, and Graph, a directed acyclic graph of layers.

## Using the Sequential model

from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import SGD

model = Sequential()
model.add(Dense(2, init='uniform', input_dim=64))
model.add(Activation('softmax'))

model.compile(optimizer='sgd', loss='mse')

'''
Train the model for 3 epochs, in batches of 16 samples,
on data stored in the Numpy array X_train,
and labels stored in the Numpy array y_train:
'''
model.fit(X_train, y_train, nb_epoch=3, batch_size=16, verbose=1)
'''
What you will see with verbose=1 (a per-batch progress bar):
Train on 37800 samples, validate on 4200 samples
Epoch 0
37800/37800 [==============================] - 7s - loss: 0.0385
Epoch 1
37800/37800 [==============================] - 8s - loss: 0.0140
Epoch 2
10960/37800 [=======>......................] - ETA: 4s - loss: 0.0109
'''

model.fit(X_train, y_train, nb_epoch=3, batch_size=16, verbose=2)
'''
What you will see with verbose=2 (one line per epoch):
Train on 37800 samples, validate on 4200 samples
Epoch 0
loss: 0.0190
Epoch 1
loss: 0.0146
Epoch 2
loss: 0.0049
'''

'''
Demonstration of the show_accuracy argument
'''
model.fit(X_train, y_train, nb_epoch=3, batch_size=16, verbose=2, show_accuracy=True)
'''
Train on 37800 samples, validate on 4200 samples
Epoch 0
loss: 0.0190 - acc.: 0.8750
Epoch 1
loss: 0.0146 - acc.: 0.8750
Epoch 2
loss: 0.0049 - acc.: 1.0000
'''

'''
Demonstration of the validation_split argument
'''
model.fit(X_train, y_train, nb_epoch=3, batch_size=16,
          validation_split=0.1, show_accuracy=True, verbose=1)
'''
Train on 37800 samples, validate on 4200 samples
Epoch 0
37800/37800 [==============================] - 7s - loss: 0.0385 - acc.: 0.7258 - val. loss: 0.0160 - val. acc.: 0.9136
Epoch 1
37800/37800 [==============================] - 8s - loss: 0.0140 - acc.: 0.9265 - val. loss: 0.0109 - val. acc.: 0.9383
Epoch 2
10960/37800 [=======>......................] - ETA: 4s - loss: 0.0109 - acc.: 0.9420
'''
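
Once the model is trained, it can be scored on held-out data and used for inference. The snippet below is a minimal sketch: X_test and y_test are assumed to be held-out Numpy arrays shaped like the training data, evaluate returns the test loss (plus accuracy when show_accuracy=True), and predict returns the raw class probabilities.

'''
Evaluating and predicting with the trained model
(X_test and y_test are assumed held-out test arrays)
'''
score = model.evaluate(X_test, y_test, batch_size=16, show_accuracy=True, verbose=0)
print(score)  # test loss and test accuracy

proba = model.predict(X_test, batch_size=16)            # per-class probabilities
classes = model.predict_classes(X_test, batch_size=16)  # predicted class indices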

## Using the Graph model

# graph model with one input and two outputs
from keras.models import Graph

graph = Graph()
graph.add_input(name='input', input_shape=(32,))
graph.add_node(Dense(16), name='dense1', input='input')
graph.add_node(Dense(4), name='dense2', input='input')
graph.add_node(Dense(4), name='dense3', input='dense1')
graph.add_output(name='output1', input='dense2')
graph.add_output(name='output2', input='dense3')

graph.compile(optimizer='rmsprop', loss={'output1':'mse', 'output2':'mse'})
history = graph.fit({'input':X_train, 'output1':y_train, 'output2':y2_train}, nb_epoch=10)
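
For a multi-output graph, predictions come back as a dictionary keyed by output name. The line below is a sketch assuming a held-out array X_test with the same shape as X_train.

predictions = graph.predict({'input':X_test})  # {'output1':..., 'output2':...}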

# graph model with two inputs and one output
graph = Graph()
graph.add_input(name='input1', input_shape=(32,))
graph.add_input(name='input2', input_shape=(32,))
graph.add_node(Dense(16), name='dense1', input='input1')
graph.add_node(Dense(4), name='dense2', input='input2')
graph.add_node(Dense(4), name='dense3', input='dense1')
graph.add_output(name='output', inputs=['dense2', 'dense3'], merge_mode='sum')
graph.compile(optimizer='rmsprop', loss={'output':'mse'})

history = graph.fit({'input1':X_train, 'input2':X2_train, 'output':y_train}, nb_epoch=10)
predictions = graph.predict({'input1':X_test, 'input2':X2_test}) # {'output':...}
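
The graph can also be scored on held-out data, and the History object returned by fit keeps the per-epoch losses. This is a sketch assuming Graph.evaluate accepts the same data dictionary as fit, and that X_test, X2_test, and y_test are held-out arrays.

score = graph.evaluate({'input1':X_test, 'input2':X2_test, 'output':y_test}, batch_size=16)
print(score)                    # test loss
print(history.history['loss'])  # per-epoch training loss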


## Model API documentation

{{autogenerated}}