Hyopil Shin (Dept. of Linguistics, Seoul National University)
hpshin@snu.ac.kr
http://knlp.snu.ac.kr/
T.A.: Seojin (Seemdog@snu.ac.kr)
| Week | Date | Topics | Related Materials and Resources | PyTorch |
| --- | --- | --- | --- | --- |
| 1 | 3/4-3/9 | Introduction to Natural Language Processing; Language Modeling I: Statistical Language Modeling: N-Grams | Natural Language Processing is Fun!; Language Modeling with N-Grams | PyTorch: (see the bigram sketch below) |
| 2 | 3/11-3/16 | Language Modeling I: Statistical Language Modeling: Entropy and Maximum Entropy Models | Entropy is a Measure of Uncertainty | |
| 3 | 3/18-3/23 | Text Classification | Text Classification | |
| 4 | 3/25-3/30 | Vector Semantics; Language Modeling II: Static Word Embedding | Vector Semantics and Embeddings | PyTorch: Linear Regression with PyTorch; Logistic Regression with PyTorch |
| 5 | 4/1-4/6 | Language Modeling II: Static Word Embedding | Vector Semantics and Embeddings | PyTorch: (see the embedding sketch below) |
| 6 | 4/8-4/13 | Sequence-to-Sequence Model: Encoder-Decoder | | PyTorch: |
| 7 | 4/15-4/20 | Attention Model: Neural Machine Translation by Jointly Learning to Align and Translate | Attention: Illustrated Attention | PyTorch: |
| 8 | 4/22-4/27 | Transformer; Self-Attention: Attention Is All You Need | | PyTorch: |
| 9 | 4/22-4/27 | Language Modeling III: Dynamic Word Embedding: BERT (Bidirectional Encoder Representations from Transformers) | BERT Fine-Tuning; BERT Fine-Tuning Tutorial with PyTorch; BERT Word Embeddings | |
| 10 | 4/29-5/4 | Pre-trained Models and Transfer Learning | XLM-R: Unsupervised Cross-lingual Representation Learning at Scale; XLNet: Generalized Autoregressive Pretraining for Language Understanding; MASS: Masked Sequence to Sequence Pre-training for Language Generation; BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension; GLM: All NLP Tasks Are Generation Tasks: A General Pretraining Framework; SpanBERT: Improving Pre-training by Representing and Predicting Spans | |
| 11 | 5/6-5/11 | Transformers by Huggingface | | |
| 12 | 5/13-5/18 | Transformers by Huggingface: Quick Tour; Summary of Tasks: Sequence Classification, Extractive Question Answering, Language Modeling, Text Generation, Named Entity Recognition, Summarization, and Translation (see the pipeline sketch below) | Various Korean text processing with Huggingface Transformers and Korean pre-trained models | |
| 13 | 5/20-5/25 | Language Modeling IV: Large Language Models (LLMs) | | |
| 14 | 5/27-6/1 | Large Language Models for Korean | DaG (David and Goliath Large Language Model) | |
| 15 | 6/3-6/8 | Final Test and Project Presentations | | |
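As a concrete starting point for the week 1 topic, here is a minimal sketch of a count-based bigram language model with add-one (Laplace) smoothing. The toy corpus, variable names, and probe words are illustrative assumptions, not course materials.

```python
# A minimal count-based bigram language model with add-one (Laplace)
# smoothing. The toy corpus is a hypothetical example, not course data.
from collections import Counter

corpus = [
    "<s> i love natural language processing </s>".split(),
    "<s> i love deep learning </s>".split(),
]

unigram_counts = Counter(w for sent in corpus for w in sent)
bigram_counts = Counter(p for sent in corpus for p in zip(sent, sent[1:]))
V = len(unigram_counts)  # vocabulary size used by the smoothing term

def bigram_prob(prev: str, word: str) -> float:
    """P(word | prev) with add-one smoothing: (c(prev,word)+1)/(c(prev)+V)."""
    return (bigram_counts[(prev, word)] + 1) / (unigram_counts[prev] + V)

print(bigram_prob("i", "love"))     # seen bigram: relatively high
print(bigram_prob("love", "cats"))  # unseen bigram: small but nonzero
```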
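For the static word embedding unit (weeks 4-5), the sketch below shows how PyTorch's nn.Embedding stores one trainable vector per vocabulary item and how similarity between two word vectors can be computed. The vocabulary, embedding size, and word pair are illustrative assumptions.

```python
# A minimal sketch of static word embeddings via PyTorch's nn.Embedding.
# Vocabulary, sizes, and the similarity probe are illustrative assumptions.
import torch
import torch.nn as nn

vocab = {"<pad>": 0, "i": 1, "love": 2, "nlp": 3}
emb = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

# Look up a batch of one three-token sentence.
ids = torch.tensor([[vocab["i"], vocab["love"], vocab["nlp"]]])
vectors = emb(ids)
print(vectors.shape)  # torch.Size([1, 3, 8]): (batch, sequence, dim)

# Cosine similarity between two (still untrained, random) word vectors.
sim = torch.cosine_similarity(emb.weight[vocab["love"]],
                              emb.weight[vocab["nlp"]], dim=0)
print(sim.item())
```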
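For the week 12 Huggingface quick tour, the sketch below exercises the pipeline API on two of the listed tasks, sequence classification and text generation. The input sentences are assumptions; the sentiment checkpoint is the library's default, and gpt2 is used here only as a small, widely available generator.

```python
# A minimal "quick tour"-style sketch of the Huggingface pipeline API.
# Checkpoints are downloaded on first run.
from transformers import pipeline

# Sequence classification (sentiment analysis) with the default checkpoint.
classifier = pipeline("sentiment-analysis")
print(classifier("Natural Language Processing is fun!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# Text generation with gpt2 (an assumed, widely available choice).
generator = pipeline("text-generation", model="gpt2")
out = generator("Language models are", max_new_tokens=20)
print(out[0]["generated_text"])
```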