108.413A: Studies in Computational Linguistics

Hyopil Shin (Dept. of Linguistics, Seoul National University)

hpshin@snu.ac.kr, http://knlp.snu.ac.kr
Tue  2:00 to 5:00 in building 1 room 103

T.A.: TBA

(Image credits: http://www.theverge.com/2016/3/11/11208078/lee-se-dol-go-google-kasparov-jennings-ai, http://jalammar.github.io/illustrated-bert/)

Course Description

Recently, deep learning approaches have achieved very high performance on many computational linguistics and Natural Language Processing (NLP) tasks. This course provides a from-scratch introduction to neural network models and the deep learning methodologies applied to NLP. On the model side, we will start from basic notions of neural networks such as linear/logistic regression, perceptrons, backpropagation, and parameter optimization. We will then cover actual neural network models, including feedforward, convolutional, recurrent, and Long Short-Term Memory (LSTM) networks. Alongside the models, various word/sentence/contextual embeddings and attention mechanisms will also be treated in depth. The first part of the class focuses on the basics of neural network models, and the second part covers actual implementations of NLP tasks such as sentiment analysis (movie/text classification) and text generation. We will take advantage of modules from Python 3.x and PyTorch. Through lectures and programming assignments, students will learn the implementation tricks necessary to make neural networks work on practical problems.

Updates

  • Please set up Python, PyTorch, and Colab for class!
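If the setup works, a few lines in Colab (or a local Python shell) should run without errors. This is a minimal sanity check, assuming PyTorch is already installed:

```python
# Quick environment check for the course setup.
import sys
import torch

print(sys.version_info.major)   # should print 3
print(torch.__version__)        # any recent PyTorch version is fine

# Tensors are the basic data structure we will use all semester:
x = torch.tensor([1.0, 2.0, 3.0])
print(x.mean().item())          # 2.0
```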

Useful Sites


  • Lectures

Textbook and Sites

                                                                                     

Deep Learning from Scratch (밑바닥부터 시작하는 딥러닝), by Koki Saito, Hanbit Media. Deep Learning from Scratch source code


DL wizard


Deep Learning Tutorials based on PyTorch

Syllabus


Date | Topics | Related Materials and Resources | Assignments
1 9/1-9/7

Introduction to Computational Linguistics / Natural Language Processing


Preliminaries:


PyTorch:
Install Python 3.x and  PyTorch

Machine Learning / Deep Learning for Everyone (Prof. Sung Kim, HKUST)
  • The concept of linear regression: video, lecture slides
  • Minimizing the linear regression cost function: video, lecture slides
  • Linear regression with multiple inputs (features): video, lecture slides
  • Logistic regression classification: lecture slides - introducing the hypothesis function: video - introducing the cost function: video

How to Implement Simple Linear Regression From Scratch with Python

Logistic Regression for Machine Learning
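The two "from scratch" tutorials above can be condensed into a few lines of plain Python: a closed-form least-squares fit for a single feature, plus the logistic (sigmoid) function that turns a linear score into a probability. This is an illustrative sketch, not the tutorials' exact code:

```python
# Simple linear regression and the logistic function, standard library only.
import math

def fit_linear(xs, ys):
    """Closed-form least-squares fit of y = w*x + b for one feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - w * mx
    return w, b

def sigmoid(z):
    """Logistic function: maps a linear score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

w, b = fit_linear([1, 2, 3, 4], [2, 4, 6, 8])  # perfectly linear toy data
print(w, b)            # 2.0 0.0
print(sigmoid(0.0))    # 0.5
```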

2 9/8-9/14 Introduction to a Neural Network PyTorch:
Getting a Grasp of Deep Learning Concepts
3 9/15-9/21 Introduction to a Neural Network PyTorch:
 
4 9/22-9/28

Introduction to a Neural Network

  • Parameter Optimization
  • Weight Decay
  • Batch Normalization
  • DropOut


Hyper-parameter Tuning Techniques in Deep Learning

PyTorch:
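The regularization topics listed for this week map directly onto PyTorch building blocks: dropout and batch normalization are layers, while weight decay is an optimizer argument. A minimal sketch (layer sizes are arbitrary):

```python
# Regularization techniques as PyTorch components.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 32),
    nn.BatchNorm1d(32),   # batch normalization over the 32 features
    nn.ReLU(),
    nn.Dropout(p=0.5),    # dropout, active only in training mode
    nn.Linear(32, 2),
)
# weight_decay adds an L2 penalty on the parameters during each update.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

model.train()                      # dropout/batchnorm use training behavior
out = model(torch.randn(8, 10))    # a batch of 8 ten-dimensional examples
print(out.shape)                   # torch.Size([8, 2])
model.eval()                       # switches dropout off for evaluation
```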


5 9/29-10/5

Introduction to a Neural Network

  • Parameter Optimization
  • Weight Decay
  • Batch Normalization
  • DropOut


Hyper-parameter Tuning Techniques in Deep Learning

PyTorch:

 

6 10/6-10/12


Convolutional Neural Network


Understanding Convolutional Neural Networks for NLP



PyTorch:
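As a taste of CNNs for NLP, the core operation is a 1-D convolution over a sequence of word embeddings followed by max-over-time pooling. A sketch with arbitrary toy dimensions:

```python
# 1-D convolution over a toy batch of "sentences" of word embeddings.
import torch
import torch.nn as nn

batch, seq_len, emb_dim = 2, 7, 16
x = torch.randn(batch, seq_len, emb_dim)   # (batch, tokens, embedding)
x = x.transpose(1, 2)                      # Conv1d expects (batch, channels, length)

conv = nn.Conv1d(in_channels=emb_dim, out_channels=8, kernel_size=3)
features = torch.relu(conv(x))             # (2, 8, 5): 5 = 7 - 3 + 1 windows
pooled = features.max(dim=2).values        # max-over-time pooling -> (2, 8)
print(pooled.shape)                        # torch.Size([2, 8])
```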


7 10/13-10/19

 

Recurrent Neural Network
A Friendly Introduction to Recurrent Neural Networks






Long Short-Term Memory Neural Network and Gated Recurrent Unit


PyTorch:
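A minimal sketch contrasting the two recurrent cells covered this week: nn.LSTM keeps both a hidden and a cell state, while nn.GRU keeps only a hidden state (sizes are arbitrary):

```python
# Running an LSTM and a GRU over the same toy sequence batch.
import torch
import torch.nn as nn

x = torch.randn(4, 10, 16)   # (batch=4, time steps=10, features=16)

lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)
out, (h_n, c_n) = lstm(x)    # LSTM returns hidden AND cell state
print(out.shape, h_n.shape)  # torch.Size([4, 10, 32]) torch.Size([1, 4, 32])

gru = nn.GRU(input_size=16, hidden_size=32, batch_first=True)
out, h_n = gru(x)            # GRU has no separate cell state
print(out.shape)             # torch.Size([4, 10, 32])
```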



8 10/20-10/26 Encoder-Decoder: Encoder-Decoder Long Short-Term Memory Networks
A Gentle Introduction to LSTM Autoencoders
Step-by-step Understanding LSTM Autoencoder layers

Attention Model: Neural Machine Translation by Jointly Learning to Align and Translate
Self Attention: Attention Is All You Need

PyTorch:


9 10/27-11/2 Attention Model: Neural Machine Translation by Jointly Learning to Align and Translate
Self Attention: Attention Is All You Need

Attention in RNNs
Seq2Seq Pay Attention to Self Attention: Part 1
Seq2Seq Pay Attention to Self Attention: Part 2
Attention: Illustrated Attention
Attention and Memory in Deep Learning and NLP
PyTorch:
Translation with Sequence to Sequence Network and Attention
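The core of both papers above is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A plain-tensor sketch (shapes are arbitrary):

```python
# Scaled dot-product attention written out with basic tensor operations.
import math
import torch

def scaled_dot_product_attention(q, k, v):
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # query-key similarities
    weights = torch.softmax(scores, dim=-1)            # each row sums to 1
    return weights @ v, weights                        # weighted sum of values

q = torch.randn(1, 5, 8)   # (batch, query positions, d_k)
k = torch.randn(1, 6, 8)   # (batch, key positions, d_k)
v = torch.randn(1, 6, 8)
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape, w.shape)  # torch.Size([1, 5, 8]) torch.Size([1, 5, 6])
```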
 
10 11/3-11/9 Attention Model: Neural Machine Translation by Jointly Learning to Align and Translate
Self Attention: Attention Is All You Need

Attention in RNNs
Seq2Seq Pay Attention to Self Attention: Part 1
Seq2Seq Pay Attention to Self Attention: Part 2
Attention: Illustrated Attention
Attention and Memory in Deep Learning and NLP
PyTorch:
Translation with Sequence to Sequence Network and Attention

 









11 11/10-11/16

Embeddings (word embeddings)

Sebastian Ruder's On Word Embeddings, Parts 1, 2, 3, 4; A Hands-on Intuitive Approach to Deep Learning Methods for Text Data - Word2Vec, GloVe, and FastText

The Current Best of Universal Word Embeddings and Sentence Embeddings

PyTorch:

Word Embeddings: Encoding Lexical Semantics


Word Embedding
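In PyTorch, a word embedding is just a trainable lookup table (nn.Embedding) from word indices to dense vectors. A minimal sketch with a hypothetical three-word vocabulary:

```python
# nn.Embedding as a lookup table from token indices to dense vectors.
import torch
import torch.nn as nn

vocab = {"the": 0, "cat": 1, "sat": 2}   # hypothetical toy vocabulary
emb = nn.Embedding(num_embeddings=len(vocab), embedding_dim=4)

ids = torch.tensor([vocab["the"], vocab["cat"], vocab["sat"]])
vectors = emb(ids)           # one 4-dimensional vector per token
print(vectors.shape)         # torch.Size([3, 4])

# The table is an ordinary parameter matrix, trained by backpropagation:
print(emb.weight.shape)      # torch.Size([3, 4])
```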







12 11/17-11/23
Embeddings (contextual embeddings)

PyTorch:
The Annotated Transformer


13 12/1-12/7
NLP Task 1: Sentiment Analysis
PyTorch Sentiment Analysis

PyTorch:
A Comprehensive Introduction to Torchtext (Practical Torchtext Part 1)

Language Modeling Tutorial in Torchtext (Practical Torchtext Part 2)


14 12/8-12/14
NLP Task 2: Chatbot
Chatbot Tutorial

NLP Task 3: Generating and Classifying Names with a Character-Level RNN

 

 
15 12/15-12/21 Final Project Presentation