keras/examples
addition_rnn.py Spellcheck source files (#2907) 2016-06-06 13:29:25 -07:00
antirectifier.py Normalize layer imports in examples 2016-05-11 18:45:37 -07:00
babi_memnn.py Add missing Softmax activation memnn. (#3706) 2016-09-06 11:33:11 -07:00
babi_rnn.py Update download path for babi dataset 2016-08-29 13:03:36 -07:00
cifar10_cnn.py Make examples agnostic to image_dim_ordering 2016-09-06 15:53:56 -07:00
conv_filter_visualization.py Add keras.applications, refactor 2 convnet scripts 2016-08-27 20:27:49 -07:00
conv_lstm.py Style fixes 2016-11-05 13:45:50 -07:00
deep_dream.py Applied imagenet mean pixel on BGR instead of RGB. (#4027) 2016-10-12 16:59:56 -07:00
image_ocr.py Python 3 support of image_ocr.py (#4049) 2016-10-13 13:53:35 -07:00
imdb_bidirectional_lstm.py Bidirectional Wrapper (#3495) 2016-08-17 16:27:06 -07:00
imdb_cnn_lstm.py Remove unused imports. (#4083) 2016-10-16 21:58:35 -07:00
imdb_cnn.py Update imdb_cnn.py to use GlobalMaxPooling1D (#4164) 2016-10-24 09:25:08 -07:00
imdb_fasttext.py Remove unused import statement (#4053) 2016-10-14 09:16:56 -07:00
imdb_lstm.py Example touch-up 2016-08-27 20:28:03 -07:00
lstm_benchmark.py Style fixes 2016-05-05 11:17:25 -07:00
lstm_text_generation.py Update lstm_text_generation.py (#3516) 2016-08-18 14:03:26 -07:00
mnist_acgan.py ACGAN : Remove lines with no effect (#4503) 2016-11-29 13:22:34 -08:00
mnist_cnn.py Make examples agnostic to image_dim_ordering 2016-09-06 15:53:56 -07:00
mnist_hierarchical_rnn.py Style fixes in example. 2016-08-14 15:41:04 -07:00
mnist_irnn.py Fixed typo (#2770) 2016-05-21 10:12:45 -07:00
mnist_mlp.py revert unnecessary changes in example script 2016-08-03 11:11:35 -07:00
mnist_net2net.py add example mnist_net2net.py (#3503) 2016-08-18 13:07:16 -07:00
mnist_siamese_graph.py fixed shape typo (#2679) 2016-05-09 22:17:12 -07:00
mnist_sklearn_wrapper.py Normalize layer imports in examples 2016-05-11 18:45:37 -07:00
mnist_swwae.py Added stacked what where autoencoder. (#3616) 2016-09-07 11:05:41 -07:00
mnist_transfer_cnn.py Make examples agnostic to image_dim_ordering 2016-09-06 15:53:56 -07:00
neural_doodle.py Applied imagenet mean pixel on BGR instead of RGB. (#4027) 2016-10-12 16:59:56 -07:00
neural_style_transfer.py Applied imagenet mean pixel on BGR instead of RGB. (#4027) 2016-10-12 16:59:56 -07:00
pretrained_word_embeddings.py Word embdedding example updated (#3417) 2016-08-08 10:59:31 -07:00
README.md Add conv_lstm to examples/README 2016-11-05 15:30:33 -07:00
reuters_mlp.py Normalize layer imports in examples 2016-05-11 18:45:37 -07:00
stateful_lstm.py Remove extraneous batch_input_shape (#4393) 2016-11-16 18:59:03 -08:00
variational_autoencoder_deconv.py fixed variational autoencoder visualization for Gaussian latent space (#4423) 2016-11-23 14:08:19 -08:00
variational_autoencoder.py fixed variational autoencoder visualization for Gaussian latent space (#4423) 2016-11-23 14:08:19 -08:00

Keras examples directory

addition_rnn.py Implementation of sequence to sequence learning for performing addition of two numbers (as strings).
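
Roughly, the encoder/decoder pattern this script relies on looks like the sketch below (layer and vocabulary sizes are illustrative, not the script's exact configuration):

```python
# Hypothetical sketch of a sequence-to-sequence model for string addition:
# an LSTM encodes the input string, RepeatVector feeds that encoding to a
# decoder LSTM once per output character.
from keras.models import Sequential
from keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

num_chars, input_len, output_len = 12, 7, 4   # e.g. digits 0-9, '+', and space

model = Sequential()
model.add(LSTM(128, input_shape=(input_len, num_chars)))   # encode the input string
model.add(RepeatVector(output_len))                        # one copy of the encoding per output step
model.add(LSTM(128, return_sequences=True))                # decode step by step
model.add(TimeDistributed(Dense(num_chars, activation='softmax')))  # per-step character probabilities
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
```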

antirectifier.py Demonstrates how to write custom layers for Keras.
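
As a rough illustration of the custom-layer pattern, a minimal antirectifier layer might look like the sketch below (Keras 2-style import and method names; older releases use slightly different ones):

```python
# Hypothetical sketch of a custom Keras layer: center and L2-normalize the
# input, then concatenate its positive and negative parts.
from keras import backend as K
from keras.layers import Layer

class Antirectifier(Layer):
    def compute_output_shape(self, input_shape):
        # Concatenation doubles the feature dimension.
        return (input_shape[0], input_shape[1] * 2)

    def call(self, inputs):
        x = inputs - K.mean(inputs, axis=1, keepdims=True)  # center each sample
        x = K.l2_normalize(x, axis=1)                       # L2-normalize features
        pos = K.relu(x)                                     # positive part
        neg = K.relu(-x)                                    # negative part
        return K.concatenate([pos, neg], axis=1)
```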

babi_memnn.py Trains a memory network on the bAbI dataset for reading comprehension.

babi_rnn.py Trains a two-branch recurrent network on the bAbI dataset for reading comprehension.

cifar10_cnn.py Trains a simple deep CNN on the CIFAR10 small images dataset.

conv_filter_visualization.py Visualization of the filters of VGG16, via gradient ascent in input space.

conv_lstm.py Demonstrates the use of a convolutional LSTM network.

deep_dream.py Deep Dreams in Keras.

image_ocr.py Trains a convolutional stack followed by a recurrent stack and a CTC logloss function to perform optical character recognition (OCR).

imdb_bidirectional_lstm.py Trains a Bidirectional LSTM on the IMDB sentiment classification task.
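
A minimal sketch of the Bidirectional wrapper pattern, with illustrative sizes rather than the script's actual configuration:

```python
# Hypothetical sketch: wrap an LSTM in Bidirectional so the sequence is read
# forwards and backwards, then classify the sentiment from the merged output.
from keras.models import Sequential
from keras.layers import Embedding, Bidirectional, LSTM, Dense

model = Sequential()
model.add(Embedding(20000, 128))            # 20k-word vocabulary, 128-dim embeddings
model.add(Bidirectional(LSTM(64)))          # forward and backward LSTM, outputs merged
model.add(Dense(1, activation='sigmoid'))   # binary sentiment prediction
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```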

imdb_cnn.py Demonstrates the use of Convolution1D for text classification.
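
The core idea, sketched with illustrative sizes (Conv1D is the newer name for Convolution1D):

```python
# Hypothetical sketch: 1D convolution over word embeddings, global max pooling,
# then a single sigmoid unit for binary sentiment classification.
from keras.models import Sequential
from keras.layers import Embedding, Conv1D, GlobalMaxPooling1D, Dense

model = Sequential()
model.add(Embedding(20000, 50, input_length=400))  # word indices -> 50-dim vectors
model.add(Conv1D(250, 3, activation='relu'))       # 250 filters over 3-word windows
model.add(GlobalMaxPooling1D())                    # keep the strongest response per filter
model.add(Dense(1, activation='sigmoid'))          # binary sentiment output
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```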

imdb_cnn_lstm.py Trains a convolutional stack followed by a recurrent stack network on the IMDB sentiment classification task.

imdb_fasttext.py Trains a FastText model on the IMDB sentiment classification task.
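
A FastText-style model is essentially an embedding lookup followed by averaging; a minimal sketch with illustrative sizes (without the n-gram feature augmentation the full example adds):

```python
# Hypothetical sketch: average the embeddings of all tokens in a document,
# then classify with a single dense layer.
from keras.models import Sequential
from keras.layers import Embedding, GlobalAveragePooling1D, Dense

model = Sequential()
model.add(Embedding(20000, 50, input_length=400))  # token indices -> 50-dim embeddings
model.add(GlobalAveragePooling1D())                # average embeddings over the document
model.add(Dense(1, activation='sigmoid'))          # binary sentiment output
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```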

imdb_lstm.py Trains an LSTM on the IMDB sentiment classification task.

lstm_benchmark.py Compares different LSTM implementations on the IMDB sentiment classification task.

lstm_text_generation.py Generates text from Nietzsche's writings.
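
mnist_acgan.py Implementation of an AC-GAN (Auxiliary Classifier GAN) on the MNIST dataset.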

mnist_cnn.py Trains a simple convnet on the MNIST dataset.

mnist_hierarchical_rnn.py Trains a Hierarchical RNN (HRNN) to classify MNIST digits.
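
Schematically, one LSTM encodes each row of pixels and a second LSTM encodes the sequence of row encodings; a minimal sketch with illustrative sizes:

```python
# Hypothetical sketch of a hierarchical RNN over 28x28 images.
from keras.models import Model
from keras.layers import Input, LSTM, TimeDistributed, Dense

x = Input(shape=(28, 28, 1))                   # 28 rows x 28 columns x 1 channel
rows = TimeDistributed(LSTM(64))(x)            # encode each row of pixels -> (28, 64)
image = LSTM(64)(rows)                         # encode the sequence of row encodings
out = Dense(10, activation='softmax')(image)   # digit class probabilities
model = Model(x, out)
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
```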

mnist_irnn.py Reproduction of the IRNN experiment with pixel-by-pixel sequential MNIST in "A Simple Way to Initialize Recurrent Networks of Rectified Linear Units" by Le et al.

mnist_mlp.py Trains a simple deep multi-layer perceptron on the MNIST dataset.

mnist_net2net.py Reproduction of the Net2Net experiment with MNIST in "Net2Net: Accelerating Learning via Knowledge Transfer".

mnist_siamese_graph.py Trains a Siamese multi-layer perceptron on pairs of digits from the MNIST dataset.
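
The key pattern is weight sharing: the same base network is applied to both inputs and a distance is computed between the two encodings. A minimal sketch with illustrative sizes (the full example also trains with a contrastive loss):

```python
# Hypothetical sketch of a Siamese setup: one shared base network, two inputs,
# and a Lambda layer computing the Euclidean distance between the encodings.
from keras import backend as K
from keras.models import Sequential, Model
from keras.layers import Dense, Input, Lambda

base = Sequential([Dense(128, activation='relu', input_shape=(784,)),
                   Dense(128, activation='relu')])

input_a = Input(shape=(784,))
input_b = Input(shape=(784,))
encoded_a = base(input_a)   # both branches reuse the same weights
encoded_b = base(input_b)

def euclidean_distance(tensors):
    a, b = tensors
    return K.sqrt(K.maximum(K.sum(K.square(a - b), axis=1, keepdims=True), K.epsilon()))

distance = Lambda(euclidean_distance,
                  output_shape=lambda shapes: (shapes[0][0], 1))([encoded_a, encoded_b])
model = Model([input_a, input_b], distance)
```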

mnist_sklearn_wrapper.py Demonstrates how to use the sklearn wrapper.
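
A minimal sketch of the wrapper pattern (model architecture, sizes, and parameter names are illustrative):

```python
# Hypothetical sketch: wrap a model-building function in KerasClassifier so it
# can be used with scikit-learn utilities such as GridSearchCV.
# Note: keras.wrappers.scikit_learn is the Keras 1.x/2.x location; newer setups
# use the separate SciKeras package instead.
from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

def make_model(hidden_units=64):
    model = Sequential()
    model.add(Dense(hidden_units, activation='relu', input_shape=(784,)))
    model.add(Dense(10, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam',
                  metrics=['accuracy'])
    return model

clf = KerasClassifier(build_fn=make_model)
grid = GridSearchCV(clf, param_grid={'hidden_units': [32, 64, 128]})
# grid.fit(x_train, y_train)   # x_train/y_train: flattened MNIST digits and labels
```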

mnist_swwae.py Trains a Stacked What-Where AutoEncoder built on residual blocks on the MNIST dataset.

mnist_transfer_cnn.py Transfer learning toy example.

neural_doodle.py Neural doodle: generates an image from a rough doodle using semantic style transfer.

neural_style_transfer.py Neural style transfer: renders the content of one image in the style of another.

pretrained_word_embeddings.py Loads pre-trained GloVe word embeddings into a frozen Keras Embedding layer, and uses that layer to train a text classification model on the 20 Newsgroups dataset.
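
The essential step is initializing an Embedding layer from a precomputed matrix and freezing it; a minimal sketch (the GloVe-parsing code that fills embedding_matrix is omitted, and the sizes are illustrative):

```python
# Hypothetical sketch: load a precomputed embedding matrix into a frozen
# Embedding layer.
import numpy as np
from keras.layers import Embedding

num_words, embedding_dim, max_len = 20000, 100, 1000
embedding_matrix = np.zeros((num_words, embedding_dim))  # placeholder; fill from the GloVe file

embedding_layer = Embedding(num_words,
                            embedding_dim,
                            weights=[embedding_matrix],  # initialize with GloVe vectors
                            input_length=max_len,
                            trainable=False)             # keep the embeddings frozen
```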

reuters_mlp.py Trains and evaluates a simple MLP on the Reuters newswire topic classification task.

stateful_lstm.py Demonstrates how to use stateful RNNs to model long sequences efficiently.
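
A minimal sketch of the stateful pattern: the batch size is fixed with batch_input_shape, successive batches are kept contiguous (shuffle=False), and the carried-over state is reset explicitly between epochs (sizes are illustrative):

```python
# Hypothetical sketch of stacked stateful LSTMs: the LSTM state carries over
# between batches until reset_states() is called.
from keras.models import Sequential
from keras.layers import LSTM, Dense

batch_size, timesteps, features = 25, 1, 1
model = Sequential()
model.add(LSTM(50, batch_input_shape=(batch_size, timesteps, features),
               stateful=True, return_sequences=True))
model.add(LSTM(50, stateful=True))
model.add(Dense(1))
model.compile(loss='mse', optimizer='rmsprop')

# Typical training loop: one pass over the data per epoch, without shuffling,
# then reset the carried-over state.
# for epoch in range(num_epochs):
#     model.fit(x, y, batch_size=batch_size, shuffle=False)  # one epoch per call
#     model.reset_states()
```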

variational_autoencoder.py Demonstrates how to build a variational autoencoder.
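
The heart of the example is the reparameterization trick, where a Lambda layer samples the latent code from the encoder's mean and log-variance. A minimal sketch of the encoder/sampling/decoder wiring, with illustrative sizes (the VAE loss, reconstruction plus KL divergence, is omitted):

```python
# Hypothetical sketch of the VAE reparameterization trick.
from keras import backend as K
from keras.models import Model
from keras.layers import Input, Dense, Lambda

original_dim, intermediate_dim, latent_dim = 784, 256, 2

x = Input(shape=(original_dim,))
h = Dense(intermediate_dim, activation='relu')(x)
z_mean = Dense(latent_dim)(h)       # mean of q(z|x)
z_log_var = Dense(latent_dim)(h)    # log-variance of q(z|x)

def sampling(args):
    z_mean, z_log_var = args
    epsilon = K.random_normal(shape=K.shape(z_mean))   # standard normal noise
    return z_mean + K.exp(0.5 * z_log_var) * epsilon   # reparameterized sample

z = Lambda(sampling, output_shape=(latent_dim,))([z_mean, z_log_var])
decoder_h = Dense(intermediate_dim, activation='relu')(z)
x_decoded = Dense(original_dim, activation='sigmoid')(decoder_h)
vae = Model(x, x_decoded)   # training additionally needs the reconstruction + KL loss
```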

variational_autoencoder_deconv.py Demonstrates how to build a variational autoencoder with Keras using deconvolution layers.