keras/examples/README.md


addition_rnn.py An implementation of sequence-to-sequence learning for performing addition.

antirectifier.py Demonstrates how to write custom layers for Keras.

babi_memnn.py Trains a memory network on the bAbI dataset.

babi_rnn.py Trains two recurrent neural networks, one on a story and one on a question; the resulting merged vector is then queried to answer a range of bAbI tasks.

cifar10_cnn.py Trains a simple deep CNN on the CIFAR10 small images dataset.

conv_filter_visualization.py Visualizes the filters of VGG16 via gradient ascent in input space.

deep_dream.py Deep Dreaming in Keras.

image_ocr.py Uses a convolutional stack followed by a recurrent stack and a CTC logloss function to perform optical character recognition.

imdb_bidirectional_lstm.py Trains a bidirectional LSTM on the IMDB sentiment classification task.

imdb_cnn.py Demonstrates the use of Convolution1D for text classification.

imdb_cnn_lstm.py Trains a recurrent convolutional network on the IMDB sentiment classification task.

imdb_fasttext.py Demonstrates the use of fastText for text classification.

imdb_lstm.py Trains an LSTM on the IMDB sentiment classification task.

lstm_benchmark.py Compares LSTM implementations on the IMDB sentiment classification task.

lstm_text_generation.py Example script to generate text from Nietzsche's writings.

mnist_cnn.py Trains a simple convnet on the MNIST dataset.

mnist_hierarchical_rnn.py Uses a hierarchical RNN (HRNN) to classify MNIST digits.

mnist_irnn.py Reproduces the pixel-by-pixel sequential MNIST IRNN experiment from "A Simple Way to Initialize Recurrent Networks of Rectified Linear Units" by Quoc V. Le, Navdeep Jaitly, and Geoffrey E. Hinton.

mnist_mlp.py Trains a simple deep NN on the MNIST dataset.

mnist_net2net.py Implements the Net2Net experiment on MNIST from "Net2Net: Accelerating Learning via Knowledge Transfer".

mnist_siamese_graph.py Trains a Siamese MLP on pairs of digits from the MNIST dataset.

mnist_sklearn_wrapper.py Demonstrates how to use the scikit-learn wrapper for Keras models.
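The wrapper exposes a Keras model through the scikit-learn estimator API, so it can be dropped into tools like GridSearchCV. A minimal sketch (the architecture and hyperparameters here are illustrative, not those used in the script):

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier

def make_model(hidden_units=32):
    # build_fn must return a freshly compiled model each time it is called
    model = Sequential()
    model.add(Dense(hidden_units, activation='relu', input_shape=(784,)))
    model.add(Dense(10, activation='softmax'))
    model.compile(optimizer='rmsprop', loss='categorical_crossentropy')
    return model

# The wrapped model supports fit / predict / get_params like any
# scikit-learn estimator; extra keyword arguments are passed on to fit.
clf = KerasClassifier(build_fn=make_model, epochs=1, batch_size=128, verbose=0)
```

From here, `clf` can be passed directly to `sklearn.model_selection.GridSearchCV` to tune `hidden_units`, `epochs`, or `batch_size`.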

mnist_swwae.py Trains a stacked what-where autoencoder built on residual blocks on the MNIST dataset.

mnist_transfer_cnn.py A toy transfer learning example on the MNIST dataset.

neural_doodle.py Neural doodle with Keras.

neural_style_transfer.py Neural style transfer with Keras.

pretrained_word_embeddings.py Loads pre-trained word embeddings (GloVe embeddings) into a frozen Keras Embedding layer, and uses it to train a text classification model on the 20 Newsgroups dataset.
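Freezing an Embedding layer amounts to initializing it from a pre-built matrix and setting trainable=False so its weights are not updated during training. A minimal sketch (the vocabulary size, dimensions, and random matrix are illustrative; the script builds the matrix from GloVe vectors):

```python
import numpy as np
from keras.layers import Embedding

# Placeholder for a (vocab_size, embedding_dim) matrix; in the real script
# each row would be the GloVe vector for the corresponding word index.
embedding_matrix = np.random.rand(1000, 100)

# weights seeds the layer; trainable=False keeps the vectors frozen.
embedding_layer = Embedding(1000, 100,
                            weights=[embedding_matrix],
                            input_length=50,
                            trainable=False)
```

The frozen layer is then used as the first layer of the classification model, mapping integer word indices to fixed dense vectors.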

reuters_mlp.py Trains and evaluates a simple MLP on the Reuters newswire topic classification task.

stateful_lstm.py Example script showing how to use stateful RNNs to model long sequences efficiently.
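A stateful RNN carries its hidden state across batches instead of resetting it after each one, which lets a long sequence be fed in fixed-size chunks. A minimal sketch (the layer sizes and batch shape are illustrative, not those used in the script):

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

# stateful=True requires a fixed batch size, supplied via batch_input_shape:
# (batch_size, timesteps, features)
model = Sequential()
model.add(LSTM(32, batch_input_shape=(16, 8, 1), stateful=True))
model.add(Dense(1))
model.compile(optimizer='rmsprop', loss='mse')

# The hidden state persists across successive training calls (train with
# shuffle=False so chunks stay in order); clear it between independent
# sequences with reset_states().
model.reset_states()
```

Because the state is carried over, sample i of one batch is treated as the continuation of sample i of the previous batch.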

variational_autoencoder.py This script demonstrates how to build a variational autoencoder with Keras.

variational_autoencoder_deconv.py This script demonstrates how to build a variational autoencoder with Keras and deconvolution layers.