Introduction to Transformers for NLP with Python
BERT, GPT, Deep Learning, Machine Learning, & NLP with Hugging Face, Attention in Python, TensorFlow, PyTorch, & Keras
Description
Interested in the field of Natural Language Processing (NLP)? Then this course is for you!
Ever since Transformers arrived on the scene, deep learning hasn't been the same.
Machine learning models can now generate text that is essentially indistinguishable from text created by humans.
We've reached new state-of-the-art performance in many NLP tasks, such as machine translation, question answering, entailment, named entity recognition, and more.
In this course, you will learn very practical skills for applying transformers, and if you want, the detailed theory behind how transformers and attention work.
There are several reasons why this course is different from any other. First, it covers the basic natural language processing techniques, so you will understand what natural language processing is. Second, it covers GPT-2, NER, and BERT, which are very popular in natural language processing. Finally, you will work through lots of practice projects, each with a detailed, step-by-step notebook that you can read whenever you have free time.
The course is split into 4 major parts:
Basic natural language processing
Fundamental Transformers
Text generation with GPT-2
Token classification
PART 1: Basic natural language processing
In this section, you will learn the fundamentals of natural language processing. It is really important to understand basic natural language processing before learning transformers. In this section we will cover the following (a short code sketch follows the list):
What is natural language processing (NLP)?
What is stemming and lemmatization?
What is chunking?
What is a bag of words?
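To give you a taste of these concepts in code, here is a minimal sketch of stemming versus lemmatization using NLTK (one common choice of library; the course's own notebooks may use different tools):

```python
# A minimal sketch, assuming NLTK is installed (pip install nltk).
import nltk
from nltk.stem import PorterStemmer, WordNetLemmatizer

nltk.download("wordnet", quiet=True)   # WordNet data for the lemmatizer
nltk.download("omw-1.4", quiet=True)

stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()

for word in ["running", "studies", "feet"]:
    print(word, "| stem:", stemmer.stem(word),
          "| lemma:", lemmatizer.lemmatize(word))
# Stemming chops suffixes heuristically ("studies" -> "studi"),
# while lemmatization maps to a dictionary form ("feet" -> "foot").
```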
We will also build three small projects in this section:
Gender identification
Sentiment analyzer
Topic modelling
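As a preview of the bag-of-words idea behind the sentiment analyzer, here is a minimal sketch with scikit-learn (toy data for illustration only; the course projects use real datasets):

```python
# A minimal bag-of-words sentiment sketch, assuming scikit-learn.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

texts = ["I loved this movie", "great acting and story",
         "terrible plot", "I hated every minute"]
labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

vectorizer = CountVectorizer()       # turns text into word-count vectors
X = vectorizer.fit_transform(texts)

clf = LogisticRegression().fit(X, labels)
print(clf.predict(vectorizer.transform(["what a great movie"])))
# expected: [1] (positive)
```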
PART 2: Fundamental Transformers
In this section, you will learn how transformers really work. We will also introduce the Hugging Face Transformers library and GPT-2 so you can appreciate how powerful transformers are.
In this section, we will implement two projects:
IMDB project
Q&A project implementation
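Both projects build on the Hugging Face pipeline API. Here is a minimal sketch of the two tasks, assuming `transformers` is installed (the models used below are the library's defaults, not necessarily the ones chosen in the course):

```python
from transformers import pipeline

# Sentiment classification, as in the IMDB project
classifier = pipeline("sentiment-analysis")
print(classifier("This film was an absolute masterpiece."))
# e.g. [{'label': 'POSITIVE', 'score': 0.999...}]

# Extractive question answering, as in the Q&A project
qa = pipeline("question-answering")
result = qa(question="What are transformers used for?",
            context="Transformers are neural networks used for NLP tasks "
                    "such as translation and question answering.")
print(result["answer"])
```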
PART 3: Text generation with GPT-2
In this project, we will generate text with GPT-2. It is a chance to practice and reinforce what we have learned so far, and it demonstrates how quickly a transformer can generate text.
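For a flavor of how little code this takes, here is a minimal GPT-2 generation sketch with the Hugging Face pipeline API (a sketch of the idea, not the course's exact notebook code):

```python
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
outputs = generator(
    "Ever since Transformers arrived on the scene,",
    max_length=40,           # total tokens, prompt included
    num_return_sequences=2,  # sample two continuations
    do_sample=True,          # sampling gives varied text
)
for out in outputs:
    print(out["generated_text"])
```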
PART 4: Token classification
In this section, we will learn how to classify text at the token level using a transformer. We will also learn about NER, a popular token classification task. The main project in this section is a Q&A project that is more advanced than the previous Q&A project.
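As a preview, here is a minimal NER sketch with a Hugging Face token-classification pipeline (the default model here is the library's choice; the course project goes considerably further):

```python
from transformers import pipeline

# aggregation_strategy="simple" merges word-piece tokens back into
# whole entities, so we get readable spans instead of sub-tokens.
ner = pipeline("token-classification", aggregation_strategy="simple")
for entity in ner("Hugging Face was founded in New York City."):
    print(entity["word"], "->", entity["entity_group"])
# e.g. Hugging Face -> ORG, New York City -> LOC
```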
What You Will Learn!
- Chunking
- Bag of Words
- Hugging Face Transformers
- POS tagging
- TF-IDF
- GPT-2
- Token Classification
- BERT
- Stemming
- Lemmatization
- NER
- Preprocessing data
- Attention
- Fine-tuning
Who Should Attend!
- Anyone interested in Deep Learning, Machine Learning and Artificial Intelligence
- Anyone passionate about Artificial Intelligence
- Anyone interested in Natural Language Processing
- Data Scientists who want to take their AI Skills to the next level