Natural Language Processing in Action Video Edition


English | MP4 | AVC 1280×720 | AAC 44 kHz 2ch | 17h 26m | 4.12 GB
eLearning | Skill level: All Levels

“Learn both the theory and practical skills needed to go beyond merely understanding the inner workings of NLP, and start creating your own algorithms or models.”
Dr. Arwen Griffioen, Zendesk

Natural Language Processing in Action is your guide to creating machines that understand human language using the power of Python with its ecosystem of packages dedicated to NLP and AI.

Recent advances in deep learning empower applications to understand text and speech with extreme accuracy. The result? Chatbots that can imitate real people, meaningful resume-to-job matches, superb predictive search, and automatically generated document summaries—all at a low cost. New techniques, along with accessible tools like Keras and TensorFlow, make professional-quality NLP easier than ever before.

Natural Language Processing in Action is your guide to building machines that can read and interpret human language. In it, you’ll use readily available Python packages to capture the meaning in text and react accordingly. The book expands traditional NLP approaches to include neural networks, modern deep learning algorithms, and generative techniques as you tackle real-world problems like extracting dates and names, composing text, and answering free-form questions.

Inside:

  • Some sentences in this book were written by NLP! Can you guess which ones?
  • Working with Keras, TensorFlow, gensim, and scikit-learn
  • Rule-based and data-based NLP
  • Scalable pipelines

This course requires a basic understanding of deep learning and intermediate Python skills.

Hobson Lane, Cole Howard, and Hannes Max Hapke are experienced NLP engineers who use these techniques in production.

“Provides a great overview of current NLP tools in Python. I’ll definitely be keeping this book on hand for my own NLP work. Highly recommended!”
Tony Mullen, Northeastern University–Seattle

“An intuitive guide to get you started with NLP. The book is full of programming examples that help you learn in a very pragmatic way.”
Tommaso Teofili, Adobe Systems

Table of Contents

1 Part 1. Wordy machines
2 Natural language vs. programming language
3 The magic
4 The math
5 Practical applications
6 Language through a computer’s “eyes”
7 A simple chatbot
8 Another way
9 A brief overflight of hyperspace
10 Word order and grammar
11 A chatbot natural language pipeline
12 Processing in depth
13 Natural language IQ
14 Challenges (a preview of stemming)
15 Building your vocabulary with a tokenizer Part 1
16 Building your vocabulary with a tokenizer Part 2
17 Dot product
18 A token improvement
19 Extending your vocabulary with n-grams Part 1
20 Extending your vocabulary with n-grams Part 2
21 Normalizing your vocabulary Part 1
22 Normalizing your vocabulary Part 2
23 Normalizing your vocabulary Part 3
24 Sentiment
25 VADER—A rule-based sentiment analyzer
26 Math with words (TF-IDF vectors)
27 Bag of words
28 Vectorizing
29 Vector spaces
30 Zipf’s Law
31 Topic modeling
32 Relevance ranking
33 Okapi BM25
34 From word counts to topic scores
35 TF-IDF vectors and lemmatization
36 Thought experiment
37 An algorithm for scoring topics
38 An LDA classifier
39 Latent semantic analysis
40 Your thought experiment made real
41 Singular value decomposition
42 U—left singular vectors
43 SVD matrix orientation
44 Principal component analysis
45 Stop horsing around and get back to NLP
46 Using truncated SVD for SMS message semantic analysis
47 Latent Dirichlet allocation (LDiA)
48 LDiA topic model for SMS messages
49 Distance and similarity
50 Steering with feedback
51 Topic vector power
52 Semantic search
53 Part 2. Deeper learning (neural networks)
54 Neural networks, the ingredient list
55 Detour through bias Part 1
56 Detour through bias Part 2
57 Detour through bias Part 3
58 Let’s go skiing—the error surface
59 Keras – Neural networks in Python
60 Semantic queries and analogies
61 Word vectors
62 Vector-oriented reasoning
63 How to compute Word2vec representations Part 1
64 How to compute Word2vec representations Part 2
65 How to use the gensim.word2vec module
66 How to generate your own word vector representations
67 fastText
68 Visualizing word relationships
69 Unnatural words
70 Learning meaning
71 Toolkit
72 Convolutional neural nets
73 Padding
74 Narrow windows indeed
75 Implementation in Keras – prepping the data
76 Convolutional neural network architecture
77 The cherry on the sundae
78 Using the model in a pipeline
79 Loopy (recurrent) neural networks (RNNs)
80 Remembering with recurrent networks
81 Backpropagation through time
82 Recap
83 Putting things together
84 Hyperparameters
85 Predicting
86 LSTM Part 1
87 LSTM Part 2
88 Backpropagation through time
89 Back to the dirty data
90 My turn to chat
91 My turn to speak more clearly
92 Learned how to say, but not yet what
93 Encoder-decoder architecture
94 Decoding thought
95 Look familiar
96 Assembling a sequence-to-sequence pipeline
97 Sequence encoder
98 Training the sequence-to-sequence network
99 Building a chatbot using sequence-to-sequence networks
100 Enhancements
101 In the real world
102 Part 3. Getting real (real-world NLP challenges)
103 Named entities and relations
104 A knowledge base
105 Regular patterns
106 Information worth extracting
107 Extracting dates
108 Extracting relationships (relations)
109 Relation normalization and extraction
110 Why won’t split('.!') work
111 Language skill
112 Modern approaches Part 1
113 Modern approaches Part 2
114 Pattern-matching approach
115 A pattern-matching chatbot with AIML Part 1
116 A pattern-matching chatbot with AIML Part 2
117 Grounding
118 Retrieval (search)
119 Example retrieval-based chatbot
120 Generative models
121 Four-wheel drive
122 Design process
123 Trickery
124 Too much of a good thing (data)
125 Optimizing NLP algorithms
126 Advanced indexing
127 Advanced indexing with Annoy
128 Why use approximate indexes at all
129 Constant RAM algorithms
130 Parallelizing your NLP computations
131 Reducing the memory footprint during model training
132 Anaconda3
133 Mac
134 Working with strings
135 Regular expressions
136 Vectors
137 Distances Part 2
138 Data selection and avoiding bias
139 Knowing is half the battle
140 Holding your model back
141 Imbalanced training sets
142 Performance metrics
143 High-dimensional vectors are different
144 High-dimensional thinking
145 High-dimensional indexing