Deep Learning with Python, Second Edition, Video Edition

English | MP4 | AVC 1280×720 | AAC 44KHz 2ch | 96 Lessons (15h 2m) | 1.92 GB

Unlock the groundbreaking advances of deep learning with this extensively revised new edition of the bestselling original. Learn directly from the creator of Keras and master practical Python deep learning techniques that are easy to apply in the real world.

In Deep Learning with Python, Second Edition you will learn:

  • Deep learning from first principles
  • Image classification and image segmentation
  • Timeseries forecasting
  • Text classification and machine translation
  • Text generation, neural style transfer, and image generation

Deep Learning with Python has taught thousands of readers how to put the full capabilities of deep learning into action. This extensively revised second edition introduces deep learning using Python and Keras, and is loaded with insights for both novice and experienced ML practitioners. You’ll learn practical techniques that are easy to apply in the real world, and important theory for perfecting neural networks.

Recent innovations in deep learning unlock exciting new software capabilities like automated language translation, image recognition, and more. Deep learning is quickly becoming essential knowledge for every software developer, and modern tools like Keras and TensorFlow put it within your reach—even if you have no background in mathematics or data science. This book shows you how to get started.
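
As a taste of the workflow the lessons walk through, here is a minimal, self-contained Keras sketch (not code from the video itself; the dataset choice, layer sizes, and training settings are illustrative): it builds a small dense classifier for the MNIST digits, configures it in the compile step, trains it, and evaluates it.

    # Minimal sketch of the build / compile / fit / evaluate workflow in Keras.
    # Illustrative only: layer sizes and hyperparameters are arbitrary choices.
    from tensorflow import keras
    from tensorflow.keras import layers

    # Load the MNIST digits bundled with Keras; flatten images and scale pixels to [0, 1].
    (train_images, train_labels), (test_images, test_labels) = keras.datasets.mnist.load_data()
    train_images = train_images.reshape((60000, 28 * 28)).astype("float32") / 255
    test_images = test_images.reshape((10000, 28 * 28)).astype("float32") / 255

    # A small densely connected classifier.
    model = keras.Sequential([
        layers.Dense(512, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])

    # The "compile" step configures the optimizer, loss, and metrics.
    model.compile(optimizer="rmsprop",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Train, then measure accuracy on held-out test data.
    model.fit(train_images, train_labels, epochs=5, batch_size=128)
    test_loss, test_acc = model.evaluate(test_images, test_labels)
    print(f"Test accuracy: {test_acc:.3f}")

Even this toy example touches most of the core steps the course covers in depth: data preparation, model definition, the compile step, training loops, and evaluation.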

Deep Learning with Python, Second Edition introduces the field of deep learning using Python and the powerful Keras library. In this revised and expanded new edition, Keras creator François Chollet offers insights for both novice and experienced machine learning practitioners. As you move through this book, you’ll build your understanding through intuitive explanations, crisp illustrations, and clear examples. You’ll quickly pick up the skills you need to start developing deep-learning applications.

“Chollet is a master of pedagogy and explains complex concepts with minimal fuss, cutting through the math with practical Python code. He is also an experienced ML researcher, and his insights on various model architectures and training tips are a joy to read.”
Martin Görner, Google

Table of Contents

1 What is deep learning?
2 Learning rules and representations from data
3 Understanding how deep learning works, in three figures
4 Before deep learning – A brief history of machine learning
5 Back to neural networks
6 Why deep learning? Why now?
7 Algorithms
8 The mathematical building blocks of neural networks
9 Data representations for neural networks
10 Real-world examples of data tensors
11 The gears of neural networks – Tensor operations
12 Tensor reshaping
13 The engine of neural networks – Gradient-based optimization
14 Derivative of a tensor operation – The gradient
15 Chaining derivatives – The Backpropagation algorithm
16 Looking back at our first example
17 Introduction to Keras and TensorFlow
18 Setting up a deep learning workspace
19 First steps with TensorFlow
20 Anatomy of a neural network – Understanding core Keras APIs
21 The “compile” step – Configuring the learning process
22 Getting started with neural networks – Classification and regression
23 Building your model
24 Classifying newswires – A multiclass classification example
25 Predicting house prices – A regression example
26 Fundamentals of machine learning
27 The nature of generalization in deep learning
28 Evaluating machine learning models
29 Improving model fit
30 Improving generalization
31 Regularizing your model
32 The universal workflow of machine learning
33 Collect a dataset
34 Develop a model
35 Beat a baseline
36 Deploy the model
37 Monitor your model in the wild
38 Working with Keras – A deep dive
39 Subclassing the Model class
40 Using built-in training and evaluation loops
41 Writing your own training and evaluation loops
42 Make it fast with tf.function
43 Introduction to deep learning for computer vision
44 The convolution operation
45 Training a convnet from scratch on a small dataset
46 Data preprocessing
47 Leveraging a pretrained model
48 Feature extraction with a pretrained model
49 Advanced deep learning for computer vision
50 Modern convnet architecture patterns
51 Residual connections
52 Depthwise separable convolutions
53 Interpreting what convnets learn
54 Visualizing convnet filters
55 Visualizing heatmaps of class activation
56 Deep learning for timeseries
57 Preparing the data
58 Let’s try a basic machine learning model
59 Understanding recurrent neural networks
60 A recurrent layer in Keras
61 Advanced use of recurrent neural networks
62 Using bidirectional RNNs
63 Deep learning for text
64 Preparing text data
65 Vocabulary indexing
66 Two approaches for representing groups of words – Sets and sequences
67 Processing words as a sequence – The sequence model approach, Part 1
68 Processing words as a sequence – The sequence model approach, Part 2
69 The Transformer architecture
70 The Transformer encoder
71 Beyond text classification – Sequence-to-sequence learning
72 Sequence-to-sequence learning with Transformer
73 Generative deep learning
74 How do you generate sequence data?
75 A text-generation callback with variable-temperature sampling
76 DeepDream
77 Neural style transfer
78 Generating images with variational autoencoders
79 Implementing a VAE with Keras
80 A bag of tricks
81 Best practices for the real world
82 Hyperparameter optimization
83 Scaling up model training
84 Multi-GPU training
85 TPU training
86 Conclusions
87 Key enabling technologies
88 Key network architectures
89 The limitations of deep learning
90 Local generalization vs. extreme generalization
91 The purpose of intelligence
92 Setting the course toward greater generality in AI
93 Implementing intelligence – The missing ingredients
94 The missing half of the picture
95 Blending together deep learning and program synthesis
96 Lifelong learning and modular subroutine reuse
