Deep Learning neural network models have been successfully applied to natural language processing and are now radically changing how we interact with machines (translation, search engines, Siri, Alexa, GPT and Bing Chat, to name a few). These models infer a continuous representation for words and sentences and generalize to new tasks with much less training data. The seminar will introduce the main deep learning models used in natural language processing, allowing attendees to gain a hands-on understanding of them and implement them in Keras.

This course is a 20-hour introduction to the main deep learning models used in text processing, covering the latest developments, including Transformers and pre-trained (multilingual) language models such as GPT-4, T5 and BERT, and their use with fine-tuning and prompting, as well as instruction learning and human feedback. It combines theoretical classes and practical hands-on sessions. Attendees will be able to understand and implement the models in Keras.

The course is part of the NLP master's program hosted by the Ixa NLP research group at the HiTZ research center of the University of the Basque Country (UPV/EHU).

Student profile

The course is addressed to professionals, researchers and students who want to understand and apply deep learning techniques to text. The practical part requires basic programming experience, a university-level course in computer science and experience in Python. Basic math skills (algebra or pre-calculus) are also needed.

Contents

Introduction to machine learning and NLP with Keras

Machine learning, Deep learning
Natural Language Processing
A sample NLP task with ML
. Sentiment analysis
. Features
. Logistic Regression
LABORATORY: Sentiment analysis with logistic regression
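As an illustration of the kind of model built in this lab, here is a minimal Keras sketch of logistic regression over bag-of-words features; the data, vocabulary size and hyperparameters are placeholder assumptions, not the actual lab materials.

```python
# Minimal sketch: logistic regression for sentiment analysis in Keras.
# The random data and the vocabulary size are placeholders, not the lab's dataset.
import numpy as np
from tensorflow import keras

vocab_size = 5000                                    # assumed bag-of-words vocabulary
x_train = np.random.rand(100, vocab_size)            # placeholder feature matrix
y_train = np.random.randint(0, 2, size=(100,))       # placeholder 0/1 sentiment labels

# Logistic regression = a single dense unit with a sigmoid activation.
model = keras.Sequential([
    keras.layers.Input(shape=(vocab_size,)),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=32)
```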

Multilayer Perceptron and Word Embeddings

Multiple layers ~ Deep: MLP
Backpropagation and gradients
Learning rate
More regularization
Hyperparameters
Representation learning
Word embeddings
LABORATORY: Sentiment analysis with Multilayer Perceptron
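For orientation, a minimal Keras sketch of a multilayer perceptron on top of learned word embeddings follows; the layer sizes and hyperparameters are illustrative assumptions.

```python
# Minimal sketch: MLP over word embeddings for sentiment analysis (sizes are assumptions).
from tensorflow import keras

vocab_size, seq_len, emb_dim = 10000, 100, 128        # assumed hyperparameters

model = keras.Sequential([
    keras.layers.Input(shape=(seq_len,), dtype="int32"),  # sequences of word indices
    keras.layers.Embedding(vocab_size, emb_dim),          # learned word embeddings
    keras.layers.GlobalAveragePooling1D(),                # average the word vectors
    keras.layers.Dense(64, activation="relu"),            # hidden layer
    keras.layers.Dropout(0.5),                            # regularization
    keras.layers.Dense(1, activation="sigmoid"),          # sentiment probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```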

Recurrent Neural Networks, Seq2seq, Neural Machine Translation

From words to sequences: RNNs
. Language Models (sentence encoders)
. Language Generation (sentence decoders)
. Sequence to sequence models and Neural Machine Translation (I)
Better RNNs: LSTM
LABORATORY: Sentiment analysis with LSTMs
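A minimal Keras sketch of an LSTM sentence encoder for sentiment classification, with assumed sizes:

```python
# Minimal sketch: LSTM sentiment classifier (vocabulary and layer sizes are assumptions).
from tensorflow import keras

vocab_size, seq_len, emb_dim = 10000, 100, 128

model = keras.Sequential([
    keras.layers.Input(shape=(seq_len,), dtype="int32"),
    keras.layers.Embedding(vocab_size, emb_dim, mask_zero=True),  # index 0 = padding
    keras.layers.LSTM(64),                                        # sentence encoder
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```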

Attention, Better Machine Translation and Natural Language Inference

Re-thinking seq2seq:
. Attention, memory, Transformers
. State of the art NMT
Natural Language Inference with siamese networks
LABORATORY: Attention Model for NLI
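One possible shape for this lab's model, sketched with the Keras functional API: a shared ("siamese") encoder for premise and hypothesis plus dot-product attention. The architecture details and sizes are assumptions, not the exact lab solution.

```python
# Minimal sketch: siamese BiLSTM encoders with dot-product attention for NLI
# (entailment / neutral / contradiction). Sizes are assumptions; padding masks omitted.
from tensorflow import keras

vocab_size, seq_len, emb_dim = 10000, 50, 128

premise = keras.layers.Input(shape=(seq_len,), dtype="int32")
hypothesis = keras.layers.Input(shape=(seq_len,), dtype="int32")

# Shared embedding and BiLSTM encoder applied to both sentences (siamese network).
embed = keras.layers.Embedding(vocab_size, emb_dim)
encoder = keras.layers.Bidirectional(keras.layers.LSTM(64, return_sequences=True))
p_enc = encoder(embed(premise))
h_enc = encoder(embed(hypothesis))

# Dot-product attention: each hypothesis position attends over the premise.
attended = keras.layers.Attention()([h_enc, p_enc])

merged = keras.layers.Concatenate()([h_enc, attended])
pooled = keras.layers.GlobalAveragePooling1D()(merged)
output = keras.layers.Dense(3, activation="softmax")(pooled)   # 3 NLI labels

model = keras.Model(inputs=[premise, hypothesis], outputs=output)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```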

Pre-trained Transformers, BERT, GPT

Pre-trained language models
. Multilingual transfer learning
. Fine-tuning, Prompting, Instructions, Human Feedback
Deep learning frameworks
Last words
LABORATORY: Pre-trained transformers for sentiment analysis and NLI
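For reference, a hedged sketch of fine-tuning a pre-trained multilingual transformer for sentiment classification with the Hugging Face `transformers` library and Keras; the checkpoint, toy data and hyperparameters are assumptions, not necessarily what the lab uses.

```python
# Minimal sketch: fine-tune a pre-trained transformer for sentiment classification.
# Checkpoint, data and hyperparameters are illustrative assumptions.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

checkpoint = "bert-base-multilingual-cased"            # example public checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = TFAutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

texts = ["Great course!", "The lecture was boring."]   # placeholder training data
labels = tf.constant([1, 0])
batch = dict(tokenizer(texts, padding=True, truncation=True, return_tensors="tf"))

# Transformers models ship a task-specific loss, so compiling with just an optimizer works.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5))
model.fit(batch, labels, epochs=2)
```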

Instructors

Eneko Agirre

Professor, member of IXA
Director of HiTZ
ACL fellow

Oier Lopez de la Calle

Assistant Professor, member of IXA and HiTZ

Ander Barrena

Assistant Professor at IXA and HiTZ

Practical details

General information

Part of the Language Analysis and Processing master's program.
  • The classes will be broadcast live online. The practical labs will also be held online, in two groups with one lecturer in each.
  • 5 theoretical sessions with interleaved hands-on labs (20 hours).
  • Scheduled from July 10th to 14th 2023, 15:00-19:00 CET.
  • Teaching language: English.
  • Capacity: 60 attendees (first come, first served).
  • Cost: 270€ + 4€ insurance = 274€
    (If you are a UPV/EHU member or have already registered for another course, it is 270€).

Registration

Registration is open from now until the 26th of June 2023 (or until the course is full).
  • Please register by email to amaia.lorenzo@ehu.eus (subject "Registration to DL4NLP", please CC gorka.azcune@ehu.eus).
  • After you receive the payment instructions, you will have three days to complete the payment.
  • Please use the same email address for any enquiries you might have.
  • The university provides official certificates for an additional fee. Please apply AFTER completing the course.
  • The university can provide invoices addressed to universities or companies. More details are provided after registration.

Prerequisites
Basic programming experience, a university-level course in computer science and experience in Python. Basic math skills (algebra or pre-calculus) are also needed.

Previous editions

Online class of July 2020 (left), with a handful of the 70 participants. To the right, the screen the lecturers had to talk to :-)

Class of January 2020

Class of July 2019