Example | Description |
--- | --- |
addition_rnn | Implementation of sequence-to-sequence learning for performing addition of two numbers given as strings (sketch below). |
babi_memnn | Trains a memory network on the bAbI dataset for reading comprehension. |
babi_rnn | Trains a two-branch recurrent network on the bAbI dataset for reading comprehension. |
cifar10_cnn | Trains a simple deep CNN on the CIFAR10 small images dataset. |
cifar10_densenet | Trains a DenseNet-40-12 on the CIFAR10 small images dataset. |
conv_lstm | Demonstrates the use of a convolutional LSTM network. |
deep_dream | Deep Dreams in Keras. |
eager_dcgan | Generating digits with generative adversarial networks and eager execution. |
eager_image_captioning | Generating image captions with Keras and eager execution. |
eager_pix2pix | Image-to-image translation with Pix2Pix, using eager execution. |
eager_styletransfer | Neural style transfer with eager execution. |
fine_tuning | Fine-tuning of an image classification model (sketch below). |
imdb_bidirectional_lstm | Trains a Bidirectional LSTM on the IMDB sentiment classification task (sketch below). |
imdb_cnn | Demonstrates the use of Convolution1D for text classification (sketch below). |
imdb_cnn_lstm | Trains a convolutional stack followed by a recurrent stack on the IMDB sentiment classification task. |
imdb_fasttext | Trains a FastText model on the IMDB sentiment classification task. |
imdb_lstm | Trains an LSTM on the IMDB sentiment classification task. |
lstm_text_generation | Generates text from Nietzsche’s writings. |
lstm_seq2seq | Demonstrates how to implement a basic character-level sequence-to-sequence model. |
mnist_acgan | Implementation of an AC-GAN (Auxiliary Classifier GAN) on the MNIST dataset. |
mnist_antirectifier | Demonstrates how to write custom layers for Keras. |
mnist_cnn | Trains a simple convnet on the MNIST dataset (sketch below). |
mnist_cnn_embeddings | Demonstrates how to visualize embeddings in TensorBoard. |
mnist_irnn | Reproduction of the IRNN experiment with pixel-by-pixel sequential MNIST in “A Simple Way to Initialize Recurrent Networks of Rectified Linear Units” by Le et al. |
mnist_mlp | Trains a simple deep multi-layer perceptron on the MNIST dataset (sketch below). |
mnist_hierarchical_rnn | Trains a Hierarchical RNN (HRNN) to classify MNIST digits. |
mnist_tfrecord | MNIST dataset with TFRecords, the standard TensorFlow data format. |
mnist_transfer_cnn | Transfer learning toy example. |
neural_style_transfer | Neural style transfer (generating an image with the same “content” as a base image, but with the “style” of a different picture). |
nmt_attention | Neural machine translation with an attention mechanism. |
quora_siamese_lstm | Classifying duplicate questions from Quora using a Siamese recurrent architecture. |
reuters_mlp | Trains and evaluates a simple MLP on the Reuters newswire topic classification task. |
stateful_lstm | Demonstrates how to use stateful RNNs to model long sequences efficiently (sketch below). |
text_explanation_lime | How to use lime to explain text data. |
variational_autoencoder | Demonstrates how to build a variational autoencoder. |
variational_autoencoder_deconv | Demonstrates how to build a variational autoencoder with Keras using deconvolution layers. |
tfprob_vae | A variational autoencoder using TensorFlow Probability on Kuzushiji-MNIST. |
vq_vae | Discrete Representation Learning with VQ-VAE and TensorFlow Probability. |
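
Minimal sketches for the entries marked above follow. They are illustrative approximations, not the scripts themselves; vocabulary sizes, layer widths, and training settings are assumptions.

addition_rnn builds an encoder-decoder: an LSTM reads the input string, the encoding is repeated once per output step, and a time-distributed softmax emits one character per step.

```r
library(keras)

n_chars <- 12   # assumed vocabulary: digits 0-9, "+", and space
maxlen  <- 7    # assumed length of input strings such as "123+456"
digits  <- 3    # answers have at most digits + 1 characters

model <- keras_model_sequential() %>%
  # encoder: compress the input sequence into a single vector
  layer_lstm(units = 128, input_shape = c(maxlen, n_chars)) %>%
  # repeat that vector once per output time step
  layer_repeat_vector(digits + 1) %>%
  # decoder: emit a character distribution at each step
  layer_lstm(units = 128, return_sequences = TRUE) %>%
  time_distributed(layer_dense(units = n_chars, activation = "softmax"))

model %>% compile(loss = "categorical_crossentropy",
                  optimizer = "adam", metrics = "accuracy")
```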
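fine_tuning follows the standard pattern of freezing a pretrained convolutional base and training a fresh head on top. A sketch assuming a VGG16 base and a binary task (the base choice, input size, and head widths are assumptions):

```r
library(keras)

# pretrained convolutional base, without its original classifier
base <- application_vgg16(weights = "imagenet", include_top = FALSE,
                          input_shape = c(150, 150, 3))
freeze_weights(base)  # keep the pretrained filters fixed at first

model <- keras_model_sequential() %>%
  base %>%
  layer_flatten() %>%
  layer_dense(units = 256, activation = "relu") %>%
  layer_dense(units = 1, activation = "sigmoid")

model %>% compile(optimizer = "rmsprop",
                  loss = "binary_crossentropy", metrics = "accuracy")
```

Once the head converges, the top blocks of the base can be unfrozen with unfreeze_weights() and trained further at a low learning rate.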
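The model in imdb_bidirectional_lstm wraps an LSTM in bidirectional() so each review is read in both directions; sizes below are assumptions.

```r
library(keras)

max_features <- 20000  # assumed vocabulary size
maxlen <- 100          # reviews padded/truncated to this many tokens

model <- keras_model_sequential() %>%
  layer_embedding(input_dim = max_features, output_dim = 128,
                  input_length = maxlen) %>%
  # read the sequence forwards and backwards, concatenating the outputs
  bidirectional(layer_lstm(units = 64)) %>%
  layer_dropout(rate = 0.5) %>%
  layer_dense(units = 1, activation = "sigmoid")

model %>% compile(optimizer = "adam", loss = "binary_crossentropy",
                  metrics = "accuracy")
```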
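imdb_cnn classifies text with a 1D convolution over the embedded token sequence followed by global max pooling; a sketch with assumed sizes:

```r
library(keras)

max_features <- 5000
maxlen <- 400

model <- keras_model_sequential() %>%
  layer_embedding(input_dim = max_features, output_dim = 50,
                  input_length = maxlen) %>%
  # slide filters along the token dimension
  layer_conv_1d(filters = 250, kernel_size = 3, activation = "relu") %>%
  # keep each filter's strongest response anywhere in the review
  layer_global_max_pooling_1d() %>%
  layer_dense(units = 250, activation = "relu") %>%
  layer_dense(units = 1, activation = "sigmoid")

model %>% compile(optimizer = "adam", loss = "binary_crossentropy",
                  metrics = "accuracy")
```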
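A small stack of 2D convolutions of the kind mnist_cnn trains (layer sizes are typical values, not necessarily the script's exact ones):

```r
library(keras)

model <- keras_model_sequential() %>%
  layer_conv_2d(filters = 32, kernel_size = c(3, 3), activation = "relu",
                input_shape = c(28, 28, 1)) %>%
  layer_conv_2d(filters = 64, kernel_size = c(3, 3), activation = "relu") %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_dropout(rate = 0.25) %>%
  layer_flatten() %>%
  layer_dense(units = 128, activation = "relu") %>%
  layer_dropout(rate = 0.5) %>%
  layer_dense(units = 10, activation = "softmax")

model %>% compile(loss = "categorical_crossentropy",
                  optimizer = "rmsprop", metrics = "accuracy")
```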
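mnist_mlp is the simplest of the MNIST models: flatten the images and stack dense layers with dropout. A runnable sketch (widths and epoch count are assumptions):

```r
library(keras)

mnist <- dataset_mnist()
x_train <- array_reshape(mnist$train$x, c(nrow(mnist$train$x), 784)) / 255
y_train <- to_categorical(mnist$train$y, 10)

model <- keras_model_sequential() %>%
  layer_dense(units = 512, activation = "relu", input_shape = c(784)) %>%
  layer_dropout(rate = 0.2) %>%
  layer_dense(units = 512, activation = "relu") %>%
  layer_dropout(rate = 0.2) %>%
  layer_dense(units = 10, activation = "softmax")

model %>% compile(loss = "categorical_crossentropy",
                  optimizer = "rmsprop", metrics = "accuracy")
model %>% fit(x_train, y_train, epochs = 5, batch_size = 128,
              validation_split = 0.2)
```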
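stateful_lstm relies on stateful = TRUE, which carries the LSTM cell state across consecutive batches so a long series can be fed in fixed-size chunks; the layer then needs a fixed batch_input_shape, and training loops reset the state manually between passes (shapes below are assumptions):

```r
library(keras)

batch_size <- 32
tsteps <- 1      # time steps per chunk
features <- 1

model <- keras_model_sequential() %>%
  # stateful = TRUE: batch i + 1 continues where batch i left off
  layer_lstm(units = 50, stateful = TRUE,
             batch_input_shape = c(batch_size, tsteps, features)) %>%
  layer_dense(units = 1)

model %>% compile(loss = "mse", optimizer = "rmsprop")

# one pass per epoch, unshuffled so batch order preserves the sequence;
# x and y stand for suitably shaped training arrays
# for (i in 1:25) {
#   model %>% fit(x, y, batch_size = batch_size, epochs = 1, shuffle = FALSE)
#   model %>% reset_states()
# }
```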