Week 1: Word Sequences

Week 2: Working with Sentiment

Week 3: RNNs and LSTMs

Week 4: Generating Text

You now have a grounding in natural language processing with TensorFlow and Keras. You started from first principles: basic tokenization and padding of text to produce data structures that a neural network can consume. You then learned about embeddings, which map words to vectors so that words with similar meanings point in similar directions, giving you a mathematical model of meaning that can be fed into a deep neural network for classification. From there you moved on to sequence models, which deepen your understanding of sentiment in text by looking not just at words in isolation but at how their meanings change when they qualify one another. You wrapped up by putting everything together to build a poetry generator. This is just the beginning of using TensorFlow for natural language processing.
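As a quick recap of the first step, here is a minimal sketch of tokenization and padding with Keras. The sentences, `num_words`, and `maxlen` values are illustrative assumptions, not taken from the course materials.

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Illustrative toy corpus.
sentences = [
    "I love my dog",
    "I love my cat",
    "Do you think my dog is amazing?",
]

# Reserve an out-of-vocabulary token so words unseen during fitting
# map to a known index instead of being silently dropped.
tokenizer = Tokenizer(num_words=100, oov_token="<OOV>")
tokenizer.fit_on_texts(sentences)

# Convert each sentence to a list of word indices.
sequences = tokenizer.texts_to_sequences(sentences)

# Pad every sequence to the same length so they can be batched
# into a single tensor for a neural network.
padded = pad_sequences(sequences, maxlen=10, padding="post")
print(tokenizer.word_index)
print(padded)
```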
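The embedding and sequence-model ideas combine naturally in a sentiment classifier. The sketch below stacks an `Embedding` layer (word index to dense vector) under a bidirectional LSTM; the vocabulary size, embedding dimension, and sequence length are assumed values for illustration.

```python
import tensorflow as tf

vocab_size, embedding_dim, max_length = 10000, 16, 120  # illustrative assumptions

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(max_length,)),
    # Maps each word index to a trainable 16-dimensional vector;
    # words with similar meanings end up with similar vectors.
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    # Reads the sequence in both directions, so a word's meaning can
    # be qualified by its neighbors rather than judged in isolation.
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(24, activation="relu"),
    # Single sigmoid unit: probability that the text is positive.
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
model.summary()
```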
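Finally, text generation works by framing the problem as next-word prediction. This is a minimal sketch in the spirit of the poetry generator: train on n-gram prefixes of a tiny corpus, then repeatedly predict the most likely next word and feed it back in. The corpus lines and model sizes are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Illustrative toy corpus.
corpus = ["in the town of athy one jeremy lanigan",
          "battered away til he hadnt a pound"]

tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)
total_words = len(tokenizer.word_index) + 1

# Build training examples: every prefix of each line predicts its next word.
sequences = []
for line in corpus:
    tokens = tokenizer.texts_to_sequences([line])[0]
    for i in range(2, len(tokens) + 1):
        sequences.append(tokens[:i])

max_len = max(len(s) for s in sequences)
sequences = pad_sequences(sequences, maxlen=max_len, padding="pre")
xs, ys = sequences[:, :-1], sequences[:, -1]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(max_len - 1,)),
    tf.keras.layers.Embedding(total_words, 8),
    tf.keras.layers.LSTM(32),
    # Softmax over the whole vocabulary: a probability for each next word.
    tf.keras.layers.Dense(total_words, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(xs, ys, epochs=200, verbose=0)

# Generate by appending the predicted word to the seed, one step at a time.
seed = "in the town"
for _ in range(5):
    token_list = tokenizer.texts_to_sequences([seed])[0]
    token_list = pad_sequences([token_list], maxlen=max_len - 1, padding="pre")
    predicted = int(np.argmax(model.predict(token_list, verbose=0)))
    for word, index in tokenizer.word_index.items():
        if index == predicted:
            seed += " " + word
            break
print(seed)
```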

Google Colaboratory

The Unreasonable Effectiveness of Recurrent Neural Networks