keras/examples/imdb_cnn_lstm.py

from __future__ import absolute_import
from __future__ import print_function
import numpy as np
np.random.seed(1337) # for reproducibility
from keras.preprocessing import sequence
from keras.optimizers import SGD, RMSprop, Adagrad
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.layers.embeddings import Embedding
from keras.layers.recurrent import LSTM, GRU, SimpleRNN
from keras.layers.convolutional import Convolution1D, MaxPooling1D
from keras.datasets import imdb
'''
Train a recurrent convolutional network on the IMDB sentiment classification task.
GPU command:
    THEANO_FLAGS=mode=FAST_RUN,device=gpu,floatX=float32 python imdb_cnn_lstm.py
Gets to 0.8498 test accuracy after 2 epochs. 41s/epoch on K520 GPU.
'''
# Embedding
max_features = 20000
maxlen = 100
embedding_size = 128
# Convolution
filter_length = 3
nb_filter = 64
pool_length = 2
# LSTM
lstm_output_size = 70
# Training
batch_size = 30
nb_epoch = 2
'''
Note:
batch_size is highly sensitive.
Only 2 epochs are needed as the dataset is very small.
'''
print("Loading data...")
(X_train, y_train), (X_test, y_test) = imdb.load_data(nb_words=max_features, test_split=0.2)
print(len(X_train), 'train sequences')
print(len(X_test), 'test sequences')
print("Pad sequences (samples x time)")
X_train = sequence.pad_sequences(X_train, maxlen=maxlen)
X_test = sequence.pad_sequences(X_test, maxlen=maxlen)
print('X_train shape:', X_train.shape)
print('X_test shape:', X_test.shape)
print('Build model...')
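# Architecture: Embedding -> Dropout -> Conv1D -> MaxPooling1D -> LSTM -> Dense(sigmoid).
# The 1D convolution extracts local n-gram features; the LSTM models their order.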
model = Sequential()
model.add(Embedding(max_features, embedding_size, input_length=maxlen))
model.add(Dropout(0.25))
model.add(Convolution1D(nb_filter=nb_filter,
                        filter_length=filter_length,
                        border_mode="valid",
                        activation="relu",
                        subsample_length=1))
model.add(MaxPooling1D(pool_length=pool_length))
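# The LSTM reads the pooled (downsampled) sequence of convolutional feature vectors.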
model.add(LSTM(lstm_output_size))
model.add(Dense(1))
model.add(Activation('sigmoid'))
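# Binary cross-entropy with the Adam optimizer; class_mode="binary" is the old-Keras flag for two-class targets.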
model.compile(loss='binary_crossentropy',
              optimizer='adam',
              class_mode="binary")
print("Train...")
model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=nb_epoch,
          validation_data=(X_test, y_test), show_accuracy=True)
score, acc = model.evaluate(X_test, y_test, batch_size=batch_size,
                            show_accuracy=True)
print('Test score:', score)
print('Test accuracy:', acc)