## Usage of objectives
An objective function (or loss function, or optimization score function) is one of the two parameters required to compile a model:
```python
model.compile(loss='mean_squared_error', optimizer='sgd')
```
You can either pass the name of an existing objective, or pass a Theano symbolic function that returns a scalar for each data-point and takes the following two arguments:
- __y_true__: True labels. Theano tensor.
- __y_pred__: Predictions. Theano tensor of the same shape as y_true.
The actual optimized objective is the mean of the output array across all datapoints.
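To illustrate the two-step reduction (a scalar per data-point, then a mean over the batch), here is a plain-NumPy sketch of mean squared error. NumPy stands in for Theano here purely for illustration; a real custom objective must be written with Theano symbolic ops so Keras can differentiate it.

```python
import numpy as np

def mse_per_sample(y_true, y_pred):
    # One scalar per data-point: average the squared error
    # over the last (feature) axis.
    return ((y_pred - y_true) ** 2).mean(axis=-1)

y_true = np.array([[0.0, 1.0], [1.0, 0.0]])
y_pred = np.array([[0.5, 0.5], [1.0, 0.0]])

per_sample = mse_per_sample(y_true, y_pred)  # one value per data-point
loss = per_sample.mean()                     # the scalar actually optimized
```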
For a few examples of such functions, check out the [objectives source](https://github.com/fchollet/keras/blob/master/keras/objectives.py).
## Available objectives
- __mean_squared_error__ / __mse__
- __mean_absolute_error__ / __mae__
- __mean_absolute_percentage_error__ / __mape__
- __mean_squared_logarithmic_error__ / __msle__
- __squared_hinge__
- __hinge__
- __binary_crossentropy__: Also known as logloss.
- __categorical_crossentropy__: Also known as multiclass logloss. __Note__: using this objective requires that your labels be binary arrays of shape `(nb_samples, nb_classes)`.
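If your labels are integer class indices, they need to be converted to the binary (one-hot) arrays described above before training with categorical_crossentropy. The sketch below does the conversion in plain NumPy; Keras also ships a helper for this (`keras.utils.np_utils.to_categorical`).

```python
import numpy as np

# Integer class labels for 3 samples, 3 classes.
labels = np.array([0, 2, 1])
nb_classes = 3

# Build the (nb_samples, nb_classes) binary array:
# row i has a 1 in column labels[i], zeros elsewhere.
one_hot = np.zeros((len(labels), nb_classes))
one_hot[np.arange(len(labels)), labels] = 1.0
```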