Updated: 2021-07-15 17:17:25
Cover
Title Page
Deep Learning with Theano
Credits
About the Author
Acknowledgments
About the Reviewers
www.PacktPub.com
eBooks, discount offers, and more
Customer Feedback
Preface
What this book covers
What you need for this book
Who this book is for
Conventions
Reader feedback
Customer support
Chapter 1. Theano Basics
The need for tensors
Installing and loading Theano
Tensors
Graphs and symbolic computing
Operations on tensors
Memory and variables
Functions and automatic differentiation
Loops in symbolic computing
Configuration, profiling, and debugging
Summary
Chapter 2. Classifying Handwritten Digits with a Feedforward Network
The MNIST dataset
Structure of a training program
Classification loss function
Single-layer linear model
Cost function and errors
Backpropagation and stochastic gradient descent
Multiple layer model
Convolutions and max layers
Training
Dropout
Inference
Optimization and other update rules
Related articles
Chapter 3. Encoding Words into Vectors
Encoding and embedding
Dataset
Continuous Bag of Words model
Training the model
Visualizing the learned embeddings
Evaluating embeddings – analogical reasoning
Evaluating embeddings – quantitative analysis
Application of word embeddings
Weight tying
Further reading
Chapter 4. Generating Text with a Recurrent Neural Net
The need for RNNs
A dataset for natural language
Simple recurrent network
Metrics for natural language performance
Training loss comparison
Example of predictions
Applications of RNN
Chapter 5. Analyzing Sentiment with a Bidirectional LSTM
Installing and configuring Keras
Preprocessing text data
Designing the architecture for the model
Compiling and training the model
Evaluating the model
Saving and loading the model
Running the example
Chapter 6. Locating with Spatial Transformer Networks
MNIST CNN model with Lasagne
A localization network
Unsupervised learning with co-localization
Region-based localization networks
Chapter 7. Classifying Images with Residual Networks
Natural image datasets
Residual connections
Stochastic depth
Dense connections
Multi-GPU
Data augmentation
Chapter 8. Translating and Explaining with Encoding-Decoding Networks
Sequence-to-sequence networks for natural language processing
Seq2seq for translation
Seq2seq for chatbots
Improving efficiency of sequence-to-sequence network
Deconvolutions for images
Multimodal deep learning