M3239.001100: 자연어처리의 응용: 트랜스포머기반 방법론들
(Applications of Natural Language Processing: Transformers-based Methodologies)

108.535A: 컴퓨터언어학연구 II: 트랜스포머기반 방법론들

(Studies on Computational Linguistics II: Transformers-based Methodologies)


Hyopil Shin (Graduate School of Data Science and Dept. of Linguistics, Seoul National University)

hpshin@snu.ac.kr
https://sites.google.com/snu.ac.kr/gsds-nlp/home
http://knlp.snu.ac.kr/

Tue/Thu 3:30 to 4:45, Building 942, Room 302

T.A.: 이상아 (visualjan@snu.ac.kr)

        (Transformer architecture and Huggingface logo images; photo by Arseny Togulev on Unsplash)

Course Description

This course centers on the Transformer, which has become the game changer in natural language processing, and surveys its major application areas. Starting from the theoretical foundations of the Transformer, we examine the architectures provided in Huggingface's Transformers library and study the most important models in depth. Building on this, we cover Transformer-based applications such as Sentence-BERT, question answering, search, chatbots, multimodal models, and text classification/summarization. Students choose from the topics offered in the course, study and present the related papers and materials, and ultimately either implement a system based on them or write a paper suitable for presentation at a conference. Prerequisites: students should have taken Text and Natural Language Big Data Analytics / Studies on Computational Linguistics I, or be familiar with the equivalent material. Python and PyTorch are required. This course is cross-listed between Applications of Natural Language Processing (Graduate School of Data Science) and Studies on Computational Linguistics II (Dept. of Linguistics).

Updates

  • Lectures are held online via Zoom; the Zoom meeting link will be announced through ETL at the beginning of the semester.
  • Lecture materials and Jupyter notebooks are posted on ETL.

Useful Sites

  • Lectures



Textbook and Sites



  • Huggingface Site
  • Huggingface Transformers



Syllabus


Week | Date | Topics | Related Materials and Resources | Repositories
Week 1 (3/2 & 3/4)
Introduction to Class
Encoder-Decoder Review

Attention Model
Transformer-based Encoder-Decoder Models

Attention: Illustrated Attention


PyTorch:
  • pytorch-seq2seq
    • Sequence to Sequence Learning with Neural Networks
    • Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
    • Neural Machine Translation by Jointly Learning to Align and Translate
    • Packed Padded Sequences, Masking, Inference and BLEU
    • Convolutional Sequence to Sequence Learning
    • Attention is All You Need
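The attention mechanism that runs through all of this week's readings reduces to one formula, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch of scaled dot-product attention, with toy shapes chosen only for illustration:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    as in 'Attention is All You Need'."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k): query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V, weights                     # weighted sum of the value vectors

# Toy example: 2 queries attend over 3 key/value pairs, d_k = 4.
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)       # (2, 4): one context vector per query
print(w.sum(axis=-1))  # each row of attention weights sums to 1
```

Multi-head attention simply runs several of these in parallel on learned projections of Q, K, and V and concatenates the results.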
Week 2 (3/9 & 3/11): Introduction to Transformer

BERT
 (Bidirectional Encoder Representations from Transformers)


BERT Fine Tuning
BERT Fine-Tuning Tutorial with PyTorch

BERT Word Embeddings

Transformers Explained Visually (Part 1): Overview of Functionality

Transformers Explained Visually (Part 2): How it works, step-by-step

Transformers Explained Visually (Part 3): Multi-head Attention, deep dive

Master Positional Encoding: Part I

Rethinking Attention with Performers

From Transformers to Performers: Approximating Attention

Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity

Google Switch Transformers: Scaling to Trillion Parameter Models with Constant Computational Costs
PyTorch:
The Annotated Transformer
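Since the Transformer has no recurrence, it needs positional information injected into its inputs; the sinusoidal scheme discussed in "Master Positional Encoding: Part I" can be sketched in a few lines of NumPy (dimensions here are illustrative):

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """Sinusoidal positional encoding from 'Attention is All You Need':
    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))"""
    pos = np.arange(max_len)[:, None]          # (max_len, 1) token positions
    i = np.arange(0, d_model, 2)[None, :]      # even embedding dimensions
    angle = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angle)                # even dims get sine
    pe[:, 1::2] = np.cos(angle)                # odd dims get cosine
    return pe

pe = positional_encoding(50, 16)
print(pe.shape)    # (50, 16): one encoding vector per position
print(pe[0, :4])   # position 0 alternates sin(0)=0 and cos(0)=1
```

Each position gets a unique pattern of wavelengths, so the model can attend by relative offset; the encoding is simply added to the token embeddings.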
Week 3 (3/16 & 3/18): Introduction to Huggingface Transformers
  • Summary of Tasks : Sequence Classification, Extractive Question Answering, Language Modeling, Text Generation, Named Entity Recognition, Summarization, and Translation

Some Models for Long Sequences

Transformers by Huggingface and Full Documentation
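Every task in the list above starts from tokenization, and BERT-family models in the Transformers library use WordPiece subwords. A toy greedy longest-match-first tokenizer showing the idea (the vocabulary here is made up; real WordPiece vocabularies are learned from data):

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Greedy longest-match-first subword tokenization, the scheme behind
    BERT's WordPiece tokenizer. Non-initial pieces carry a '##' prefix."""
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub       # mark continuation pieces
            if sub in vocab:
                piece = sub            # longest matching piece found
                break
            end -= 1
        if piece is None:
            return [unk]               # no piece matches: unknown token
        tokens.append(piece)
        start = end
    return tokens

vocab = {"trans", "##form", "##er", "##s", "play", "##ing"}
print(wordpiece_tokenize("transformers", vocab))  # ['trans', '##form', '##er', '##s']
print(wordpiece_tokenize("playing", vocab))       # ['play', '##ing']
```

In practice you would call `AutoTokenizer.from_pretrained(...)` from the Transformers library rather than implement this yourself; the sketch only shows why rare words never become out-of-vocabulary.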
Week 4 (3/23 & 3/25): Introduction to Huggingface Transformers
Huggingface Transformers Notebooks

Fine Tuning BERT for Text Classification with FARM
Week 5 (3/30 & 4/1): Sentence Embedding with Transformers


Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks
GitHub - adsieg/text_similarity: Text Similarity
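Sentence-BERT turns a variable-length sentence into one fixed-size vector, typically by mean-pooling BERT's token outputs over the attention mask, and then compares sentences by cosine similarity. A NumPy sketch of those two steps (the random arrays stand in for BERT's last hidden states; a real run would use the sentence-transformers library):

```python
import numpy as np

def mean_pool(token_embeddings, mask):
    """SBERT-style mean pooling: average only the token vectors that the
    attention mask marks as real (non-padding) tokens."""
    mask = mask[:, :, None]                         # (batch, seq, 1) for broadcasting
    return (token_embeddings * mask).sum(1) / mask.sum(1)

def cosine_sim(a, b):
    """Cosine similarity between two sentence vectors, in [-1, 1]."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy batch: 2 "sentences", 4 token slots, 3-dim token vectors.
rng = np.random.default_rng(1)
tokens = rng.standard_normal((2, 4, 3))
mask = np.array([[1, 1, 1, 0],    # sentence 1 has 3 real tokens, 1 pad
                 [1, 1, 0, 0]],   # sentence 2 has 2 real tokens, 2 pads
                dtype=float)
emb = mean_pool(tokens, mask)
print(emb.shape)                              # (2, 3): one vector per sentence
print(round(cosine_sim(emb[0], emb[1]), 3))   # similarity score
```

This is why SBERT makes semantic search cheap: each sentence is encoded once, and comparison is just a dot product.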
Week 6 (4/6 & 4/8): Sentence Embedding with Transformers
Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation

LaBSE:Language-Agnostic BERT Sentence Embeddings by Google AI

Billion-scale Semantic Similarity Search with FAISS+SBERT

How to Build Semantic Search with Transformers and FAISS
Facebook Faiss : Library for efficient similarity search and clustering of dense vectors.
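The FAISS readings above boil down to one operation: given a matrix of corpus embeddings, return the ids with the highest inner product against a query. A brute-force NumPy stand-in for `faiss.IndexFlatIP` with L2-normalized vectors (so inner product equals cosine similarity); FAISS does the same search with optimized, scalable index structures:

```python
import numpy as np

def build_index(embeddings):
    """L2-normalize so inner product == cosine similarity
    (plays the role of index.add(xb) on a faiss.IndexFlatIP)."""
    return embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)

def search(index, query, k=2):
    """Return the top-k (score, id) pairs for one query
    (plays the role of index.search(xq, k) in FAISS)."""
    q = query / np.linalg.norm(query)
    scores = index @ q                 # inner product against every corpus vector
    top = np.argsort(-scores)[:k]      # highest scores first
    return [(float(scores[i]), int(i)) for i in top]

# Toy corpus: 100 "sentence embeddings" of dimension 8.
rng = np.random.default_rng(2)
corpus = rng.standard_normal((100, 8))
index = build_index(corpus)
hits = search(index, corpus[42], k=3)
print(hits[0])   # the query's own vector ranks first, score ~1.0
```

Replacing this O(n·d) scan with FAISS's quantized or clustered indexes is what makes the "billion-scale" search in the readings feasible.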
Week 7 (4/13 & 4/15): Search with Transformers
txtai
tldrstory
Week 8 (4/20 & 4/22): Search with Transformers
Introducing txtai, an AI-Powered Search Engine on Transformers
Building a Faster and Accurate Search Engine on Custom Dataset with Transformers

Deep Learning for Semantic Text Matching
txtai
tldrstory
Week 9 (4/27 & 4/29): Text Classification/Generation with Transformers
Siamese and Dual BERT for Multi Text Classification

GPT2 for Text Classification using Huggingface Transformers

Week 10 (5/4 & 5/6): Text Classification/Generation with Transformers
Build a Bidirectional Text Generation Using PyTorch

Text Generation in Any Language with GPT-2 

Week 11 (5/11 & 5/13): Summarization with Transformers
TLDR!! Summarize Articles and Content With NLP

PEGASUS: Google's State of the Art Abstractive Summarization Model

Fine Tuning a T5 Transformer for Any Summarization Task
PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization by Zhang et al.

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension by Lewis et al.

Language Models are Unsupervised Multitask Learners by Radford et al.

Discourse-Aware Neural Extractive Text Summarization  
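The neural models above (PEGASUS, BART, T5) are abstractive; the classical extractive baseline they are measured against just scores sentences and keeps the best ones. A toy frequency-based extractive summarizer showing that baseline idea (scoring scheme and example text are illustrative only):

```python
from collections import Counter
import re

def extractive_summary(text, n=1):
    """Toy extractive summarizer: score each sentence by the average
    document-level frequency of its words, keep the top-n sentences,
    and return them in their original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))       # corpus word counts
    def score(sent):
        toks = re.findall(r"[a-z']+", sent.lower())
        return sum(freq[t] for t in toks) / max(len(toks), 1)  # mean token frequency
    ranked = sorted(range(len(sentences)),
                    key=lambda i: score(sentences[i]), reverse=True)[:n]
    return " ".join(sentences[i] for i in sorted(ranked))

doc = ("Transformers changed NLP. Transformers use attention to model long-range "
       "dependencies. The weather was nice that day. Attention lets transformers "
       "scale to long documents.")
print(extractive_summary(doc, n=2))   # keeps on-topic sentences, drops the weather
```

Abstractive models like PEGASUS instead generate new sentences, which is why their pre-training objective (gap-sentence generation) mimics summarization itself.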
Week 12 (5/18 & 5/20): Multimodal Transformers

TAPAS
Transformers with Tabular Data: How to Incorporate Tabular Data with Huggingface Transformers

Google Unveils TAPAS, a BERT-based Neural Network for Querying Tables Using Natural Language

Google TAPAS is a BERT-based Model to Query Tabular Data Using Natural Language
Multimodal Transformers | Transformers with Tabular Data


TAPAS: Weakly Supervised Table Parsing via Pre-training by Herzig et al.

Week 13 (5/25 & 5/27): QA with Transformers
BERT-based Cross-Lingual Question Answering with DeepPavlov

How to Finetune mT5 to Create a Question Generator (for 100 Languages)

Build an Open-Domain Question-Answering System With BERT in 3 Lines of Code

Sentence2MCQ using BERT Word Sense Disambiguation and T5 Transformer
Haystack: Neural Question Answering at Scale
Week 14 (6/1 & 6/3): Chatbot with Transformers
Chatbots Were the Next Big Thing: What Happened? - The Startup

Chatbots are Cool! A Framework Using Python

Let's Build an Intelligent Chatbot

Make Your Own Rick Sanchez (bot) with Transformers and DialoGPT Fine-Tuning

Blenderbot- Part 1: The Data

Blenderbot - part 2: The Transformer
Recipes for Building an Open-domain Chatbot by Roller et al.
Week 15 (6/8 & 6/10): Final Presentations