Modern NLP using Deep Learning

Published 2022-05-29
Platform Udemy
Rating 4.00
Number of Reviews 1
Number of Students 417
Price Free
Subjects

Neural machine translation (NMT), Text summarization, Question Answering, Chatbot

You will learn the latest state-of-the-art deep-learning approaches to natural language processing (NLP).


You will

  1. Get state-of-the-art knowledge regarding

    1. NMT

    2. Text summarization

    3. QA

    4. Chatbot

  2. Validate your knowledge by answering the short and very easy 3-question quiz that follows each lecture

  3. Be able to complete the course in about 2 hours.


Syllabus

  1. Neural machine translation (NMT)

    1. Seq2seq
      A family of machine-learning approaches that map an input sequence to an output sequence, used for natural language processing.

    2. Attention
      A technique that mimics cognitive attention by weighting the relevance of each part of the input when producing each output (a minimal sketch appears after this syllabus).

    3. NMT
      An approach to machine translation that uses an artificial neural network to predict the likelihood of a sequence of words, typically modelling entire sentences in a single integrated model.

    4. Teacher-forcing
      A technique for training recurrent neural networks (RNNs) in which the ground-truth token from the previous time step, rather than the model's own prediction, is fed as the next decoder input (sketched after this syllabus).

    5. BLEU
      An algorithm for evaluating the quality of text which has been machine-translated from one natural language to another (a simplified computation is sketched after this syllabus).

    6. Beam search
      A heuristic search algorithm that explores a graph by expanding the most promising nodes in a limited set (a toy implementation is sketched after this syllabus).

  2. Text summarization

    1. Transformer
      A deep learning model that adopts the mechanism of self-attention, differentially weighting the significance of each part of the input data.

  3. Question Answering

    1. GPT-3
      An autoregressive language model that uses deep learning to produce human-like text.

    2. BERT
      A transformer-based machine learning technique for natural language processing (NLP) pre-training developed by Google.

  4. Chatbot

    1. LSH (locality-sensitive hashing)
      An algorithmic technique that hashes similar input items into the same "buckets" with high probability (sketched after this syllabus).

    2. RevNet
      A variant of ResNets where each layer's activations can be reconstructed exactly from the next layer's (a reversible block is sketched after this syllabus).

    3. Reformer
      A Transformer variant that introduces two techniques, LSH attention and reversible residual layers, to improve the efficiency of Transformers.
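
Code sketches

The course does not publish its code; the following minimal Python sketches (all names and details are my own assumptions, not course materials) illustrate several of the syllabus items above.

Attention (item 1.2). A minimal scaled dot-product attention in NumPy: each query is compared against all keys, and the resulting weights form a weighted average of the values.

  import numpy as np

  def softmax(x, axis=-1):
      x = x - x.max(axis=axis, keepdims=True)   # subtract max for numerical stability
      e = np.exp(x)
      return e / e.sum(axis=axis, keepdims=True)

  def scaled_dot_product_attention(Q, K, V):
      """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v) -> (n_q, d_v)."""
      d = Q.shape[-1]
      scores = Q @ K.T / np.sqrt(d)          # similarity of every query to every key
      weights = softmax(scores, axis=-1)     # attention weights sum to 1 per query
      return weights @ V                     # weighted average of the values

  # Toy usage: 2 queries attending over 3 key/value pairs of dimension 4.
  rng = np.random.default_rng(0)
  Q, K, V = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
  print(scaled_dot_product_attention(Q, K, V).shape)   # (2, 4)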
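
Teacher forcing (item 1.4). A PyTorch-style sketch of a decoding loop; the decoder call signature logits, hidden = decoder(prev_tokens, hidden) is an assumption, not the course's code. With probability teacher_forcing_ratio the ground-truth token, rather than the model's own prediction, is fed back at each step.

  import torch

  def decode_with_teacher_forcing(decoder, hidden, targets, teacher_forcing_ratio=0.5):
      # targets: (batch, seq_len) of token ids; targets[:, 0] is the start token.
      prev = targets[:, 0]
      step_logits = []
      for t in range(1, targets.size(1)):
          logits, hidden = decoder(prev, hidden)     # assumed decoder signature
          step_logits.append(logits)
          if torch.rand(1).item() < teacher_forcing_ratio:
              prev = targets[:, t]                   # teacher forcing: feed the ground truth
          else:
              prev = logits.argmax(dim=-1)           # free running: feed own prediction
      return torch.stack(step_logits, dim=1), hidden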
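
BLEU (item 1.5). A simplified single-reference, sentence-level BLEU with uniform 1-4-gram weights and a brevity penalty; production implementations (e.g. sacreBLEU) add smoothing and corpus-level statistics.

  import math
  from collections import Counter

  def ngrams(tokens, n):
      return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

  def bleu(candidate, reference, max_n=4):
      precisions = []
      for n in range(1, max_n + 1):
          cand, ref = Counter(ngrams(candidate, n)), Counter(ngrams(reference, n))
          overlap = sum(min(c, ref[g]) for g, c in cand.items())     # clipped n-gram counts
          precisions.append(max(overlap, 1e-9) / max(sum(cand.values()), 1))
      geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
      # Brevity penalty: punish candidates shorter than the reference.
      bp = 1.0 if len(candidate) > len(reference) else math.exp(1 - len(reference) / len(candidate))
      return bp * geo_mean

  print(bleu("the cat sat on the mat".split(), "the cat is on the mat".split()))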
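
Beam search (item 1.6). A toy beam search over a step-wise scoring function; next_token_log_probs is a hypothetical stand-in for a real decoder that returns (token, log-probability) pairs for the next step.

  import math

  def beam_search(next_token_log_probs, start_token, end_token, beam_width=3, max_len=20):
      beams = [([start_token], 0.0)]                    # (sequence, total log-probability)
      finished = []
      for _ in range(max_len):
          candidates = []
          for seq, score in beams:
              if seq[-1] == end_token:
                  finished.append((seq, score))         # this hypothesis is complete
                  continue
              for token, logp in next_token_log_probs(seq):
                  candidates.append((seq + [token], score + logp))
          if not candidates:
              break
          # Keep only the most promising partial sequences.
          beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
      finished.extend(beams)
      return max(finished, key=lambda c: c[1])

  # Hypothetical model: from any prefix, "b" is the most likely next token.
  def toy_model(prefix):
      return [("a", math.log(0.3)), ("b", math.log(0.5)), ("<eos>", math.log(0.2))]

  print(beam_search(toy_model, "<sos>", "<eos>", beam_width=2, max_len=5))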
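
LSH (item 4.1). A random-hyperplane locality-sensitive hash for cosine similarity: nearby vectors land in the same bucket with high probability. Reformer's LSH attention uses a related angular hashing scheme.

  import numpy as np

  rng = np.random.default_rng(0)
  dim, n_bits = 64, 16
  hyperplanes = rng.normal(size=(n_bits, dim))       # one random hyperplane per hash bit

  def lsh_bucket(v):
      # The bucket id is the sign pattern of v against the random hyperplanes.
      return ((hyperplanes @ v) > 0).astype(int).tobytes()

  x = rng.normal(size=dim)
  near = x + 0.05 * rng.normal(size=dim)             # small perturbation of x
  far = rng.normal(size=dim)                         # unrelated vector

  print(lsh_bucket(x) == lsh_bucket(near))           # likely True: same bucket
  print(lsh_bucket(x) == lsh_bucket(far))            # likely False: different bucket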
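
RevNet and Reformer (items 4.2 and 4.3). A numeric sketch of a reversible residual block: the block's inputs are reconstructed exactly from its outputs, so intermediate activations need not be stored for backpropagation. The residual functions F and G are arbitrary stand-ins for small learned sub-networks.

  import numpy as np

  def F(x):                       # stand-in for a learned residual sub-network
      return np.tanh(x)

  def G(x):                       # stand-in for a second residual sub-network
      return np.sin(x)

  def reversible_forward(x1, x2):
      y1 = x1 + F(x2)
      y2 = x2 + G(y1)
      return y1, y2

  def reversible_inverse(y1, y2):
      x2 = y2 - G(y1)             # undo the second residual step
      x1 = y1 - F(x2)             # then the first
      return x1, x2

  x1, x2 = np.random.default_rng(0).normal(size=(2, 5))
  y1, y2 = reversible_forward(x1, x2)
  r1, r2 = reversible_inverse(y1, y2)
  print(np.allclose(x1, r1) and np.allclose(x2, r2))   # True: exact reconstruction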

Resources

Go to Udemy