Natural Language Processing in TensorFlow, TensorFlow in Practice Specialization, Coursera

Summary:

The knowledge points of this course (Natural Language Processing in TensorFlow) are organized in the order the instructor designed them, week by week. I also include some Colab notebooks provided by Laurence Moroney.

What you see below is what you can expect to get from this course.

Week 1: Sentiment in Text

In the first week, Moroney talked about:

Word-based encoding
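
In practice, word-based encoding means mapping each word to an integer index. Below is a minimal sketch using the Keras Tokenizer; the sentences and parameter values are illustrative, not taken from the course notebooks.

```python
# A minimal sketch of word-based encoding with the Keras Tokenizer.
# The sentences and parameter values here are illustrative, not from the course.
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

sentences = [
    "I love my dog",
    "I love my cat",
    "Do you think my dog is amazing?",
]

# num_words caps the vocabulary size; oov_token stands in for unseen words.
tokenizer = Tokenizer(num_words=100, oov_token="<OOV>")
tokenizer.fit_on_texts(sentences)
print(tokenizer.word_index)  # e.g. {'<OOV>': 1, 'my': 2, 'i': 3, ...}

# Encode each sentence as a list of word indices, then pad to a uniform length.
sequences = tokenizer.texts_to_sequences(sentences)
padded = pad_sequences(sequences, maxlen=8, padding="post")
print(padded)
```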

Week 2: Word Embeddings

Today in the world of applied NLP, word embeddings have proved to be one of the most powerful and useful ideas to help teams get excellent performance.

Moroney mentioned a library called TensorFlow Datasets, or TFDS for short, which contains many datasets across lots of different categories.

  • The dataset used in Week 2 is the IMDB Reviews dataset.
  • You'll learn how to train a classification neural network from scratch with TensorFlow, NumPy, and Pandas.
  • Laurence also talked about techniques for fine-tuning the model.
  • You'll learn how to plot the results and understand the performance of your network.
  • The most interesting thing I learned this week is how to pull out the embedding layer and use projector.tensorflow.org to visualize your word embeddings (see the sketch after this list).
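
Putting the week together, here is a hedged sketch of the full pipeline: load IMDB Reviews from TFDS, tokenize and pad, train a small embedding classifier, then pull out the embedding weights for the projector. The hyperparameters (vocab_size, embedding_dim, max_length) are illustrative choices, not the course's exact values.

```python
# A sketch of the Week 2 pipeline: load IMDB Reviews from TFDS, train a
# small embedding classifier, then extract the embedding weights to visualize
# at projector.tensorflow.org. Hyperparameters are illustrative assumptions.
import numpy as np
import tensorflow as tf
import tensorflow_datasets as tfds
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

vocab_size, embedding_dim, max_length = 10000, 16, 120

train_data = tfds.load("imdb_reviews", split="train", as_supervised=True)
sentences, labels = [], []
for text, label in tfds.as_numpy(train_data):
    sentences.append(text.decode("utf8"))
    labels.append(label)

tokenizer = Tokenizer(num_words=vocab_size, oov_token="<OOV>")
tokenizer.fit_on_texts(sentences)
padded = pad_sequences(tokenizer.texts_to_sequences(sentences),
                       maxlen=max_length, truncating="post")

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(24, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
model.fit(padded, np.array(labels), epochs=5)

# The trained embedding weights have shape (vocab_size, embedding_dim); writing
# them to vecs.tsv / meta.tsv lets you upload them to projector.tensorflow.org.
embedding_weights = model.layers[0].get_weights()[0]
print(embedding_weights.shape)  # (10000, 16)
```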

Week 3: Sequence Models

Sentiment can also be determined by the sequence in which words appear. For example, 'not fun' is, of course, the opposite of 'fun'; that's why sequence models are very important in NLP.

This week, you'll learn:

  • How to implement an LSTM (Long Short-Term Memory) RNN model.
  • How to make it unidirectional or bidirectional.
  • How to measure its performance.
  • How to use a 1D convolutional layer in an NLP project.
  • The notebooks below are provided to help students understand sequence models better (a composite sketch follows the list).

IMDB Subwords 8K with Single Layer LSTM

IMDB Subwords 8K with Multi-Layer LSTM

IMDB Subwords 8K with 1D Convolutional Layer

Sarcasm with Bidirectional LSTM

Sarcasm with 1D Convolutional Layer

IMDB Reviews with GRU (and optional LSTM and Conv1D)
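
The notebooks above vary one architectural choice at a time. As a rough composite, not a reproduction of any single notebook, the sketch below shows a bidirectional LSTM classifier on inputs tokenized and padded as in Week 2, with comments marking the multi-layer LSTM and Conv1D variants.

```python
# A composite sketch of this week's sequence models, assuming the
# tokenized/padded inputs from Week 2. Layer sizes are illustrative.
import tensorflow as tf

vocab_size, embedding_dim, max_length = 10000, 64, 120

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_length,)),
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    # Multi-layer variant: stacked LSTMs need return_sequences=True on all
    # but the last layer, e.g.
    # tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    # Convolutional variant: replace the LSTM line(s) with
    # tf.keras.layers.Conv1D(128, 5, activation="relu"),
    # tf.keras.layers.GlobalMaxPooling1D(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
model.summary()
```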

  • This week's exercise helps students explore overfitting in NLP and find its causes (a plotting sketch for spotting it follows).
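
For spotting overfitting, the standard move is to plot training versus validation curves: training accuracy that keeps climbing while validation accuracy stalls or falls is the telltale sign. A small sketch, assuming `history` is the object returned by model.fit with validation data:

```python
# A small sketch for spotting overfitting, assuming `history` is the object
# returned by model.fit(..., validation_data=..., epochs=...) with
# metrics=["accuracy"] set at compile time.
import matplotlib.pyplot as plt

def plot_graphs(history, metric):
    plt.plot(history.history[metric])
    plt.plot(history.history["val_" + metric])
    plt.xlabel("Epochs")
    plt.ylabel(metric)
    plt.legend([metric, "val_" + metric])
    plt.show()

plot_graphs(history, "accuracy")  # training climbs while validation stalls -> overfitting
plot_graphs(history, "loss")      # a rising validation loss is the clearest symptom
```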