
Usage of objectives

An objective function (or loss function, or optimization score function) is one of the two parameters required to compile a model:

model.compile(loss='mean_squared_error', optimizer='sgd')

You can either pass the name of an existing objective, or pass a Theano symbolic function that returns a scalar for each data-point and takes the following two arguments:

  • y_true: True labels. Theano tensor.
  • y_pred: Predictions. Theano tensor of the same shape as y_true.

The actual optimized objective is the mean of the output array across all datapoints.
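For instance, a custom objective mirroring the built-in mean squared error could look like the following sketch. The function name custom_mse is illustrative, and model is assumed to be an already-defined Keras model:

import theano.tensor as T

def custom_mse(y_true, y_pred):
    # Return one score per data-point; Keras takes the mean over the batch.
    return T.sqr(y_pred - y_true).mean(axis=-1)

model.compile(loss=custom_mse, optimizer='sgd')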

For a few examples of such functions, check out the objectives source.

Available objectives

  • mean_squared_error / mse
  • mean_absolute_error / mae
  • mean_absolute_percentage_error / mape
  • mean_squared_logarithmic_error / msle
  • squared_hinge
  • hinge
  • binary_crossentropy: Also known as logloss.
  • categorical_crossentropy: Also known as multiclass logloss. Note: using this objective requires that your labels are binary (one-hot) arrays of shape (nb_samples, nb_classes); see the conversion example below.
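
If your labels are integer class indices, they can be converted to such one-hot arrays with keras.utils.np_utils.to_categorical. A minimal sketch, assuming y_train holds integer class labels and nb_classes is the number of classes:

from keras.utils import np_utils

# Convert integer class indices (e.g. [0, 2, 1, ...]) into
# binary arrays of shape (nb_samples, nb_classes).
Y_train = np_utils.to_categorical(y_train, nb_classes)

model.compile(loss='categorical_crossentropy', optimizer='sgd')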