Deep learning neural network models have been successfully applied to natural language processing, and are now radically changing how we interact with machines (Siri, Amazon Alexa, Google Home, Skype translator, Google Translate, and the Google search engine). These models can infer a continuous representation for words and sentences, instead of relying on the hand-engineered features used in other machine learning approaches. The seminar will introduce the main deep learning models used in natural language processing, allowing attendees to gain a hands-on understanding of them and to implement them in Keras.

This course is a 20-hour introduction to the main deep learning models used in text processing. It combines theoretical classes with practical hands-on sessions. Attendees will be able to understand the models and implement them in Keras.

Student profile

Addressed to professionals, researchers and students who want to understand and apply deep learning techniques to text. The practical part requires basic programming experience, a university-level course in computer science, and experience in Python. Basic math skills (algebra or pre-calculus) are also needed.


Introduction to machine learning and NLP with Keras

Machine learning, Deep learning
Natural Language Processing
A sample NLP task with ML
. Sentiment analysis
. Features
. Logistic Regression
LABORATORY: Sentiment analysis with logistic regression
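The lab above builds this classifier in Keras; as a minimal pure-NumPy sketch of the underlying model (the tiny vocabulary and the weight values are invented for illustration), logistic regression scores a bag-of-words feature vector with a sigmoid:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Bag-of-words counts for a review, over a toy, hypothetical vocabulary:
# ["great", "boring", "plot", "acting"]
x = np.array([1.0, 0.0, 1.0, 1.0])

# Hypothetical learned weights: positive words push the score towards 1.
w = np.array([2.0, -2.5, 0.1, 0.3])
b = -0.5

p_positive = sigmoid(w @ x + b)  # probability the review is positive
print(p_positive > 0.5)          # classify as positive if p > 0.5
```

In the Keras lab the weights are of course learned from labelled reviews rather than set by hand.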

Multilayer Perceptron and Word Embeddings

Multiple layers ~ Deep: MLP
Backpropagation and gradients
Learning rate
More regularization
Representation learning
Word embeddings
LABORATORY: Sentiment analysis with Multilayer Perceptron
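The two ideas in this session, word embeddings and a hidden layer, can be sketched in pure NumPy (all shapes and the random weights are illustrative; the lab trains the equivalent Keras layers):

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, emb_dim, hidden_dim = 10, 4, 3
E = rng.normal(size=(vocab_size, emb_dim))       # word embedding table
W1 = rng.normal(size=(emb_dim, hidden_dim))      # hidden layer weights
b1 = np.zeros(hidden_dim)

token_ids = np.array([2, 5, 7])                  # a three-word "sentence"
sentence_vec = E[token_ids].mean(axis=0)         # average the word embeddings
hidden = np.maximum(0.0, sentence_vec @ W1 + b1) # ReLU hidden layer

print(hidden.shape)  # (3,)
```

Each row of `E` is a learned continuous representation of one word, which is exactly what replaces the hand-engineered features mentioned in the introduction.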

Recurrent Neural Networks, Seq2seq, Neural Machine Translation

From words to sequences: RNNs
. Language Models (sentence encoders)
. Language Generation (sentence decoders)
. Sequence to sequence models and Neural Machine Translation (I)
Better RNNs: LSTM
LABORATORY: Sentiment analysis with LSTMs
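The core recurrence can be sketched with a vanilla RNN in pure NumPy (shapes and weights are illustrative; the LSTM used in the lab adds gates on top of this same unrolled loop):

```python
import numpy as np

rng = np.random.default_rng(1)
input_dim, hidden_dim, seq_len = 4, 3, 5

Wx = rng.normal(size=(input_dim, hidden_dim))   # input-to-hidden weights
Wh = rng.normal(size=(hidden_dim, hidden_dim))  # hidden-to-hidden weights
b = np.zeros(hidden_dim)

xs = rng.normal(size=(seq_len, input_dim))      # a sequence of word vectors
h = np.zeros(hidden_dim)                        # initial hidden state
for x in xs:                                    # unroll over time
    h = np.tanh(x @ Wx + h @ Wh + b)            # same weights at every step

print(h.shape)  # (3,)
```

The final state `h` summarizes the whole sequence, which is why it can serve both as a sentence encoder and, step by step, as a language-model decoder.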

Attention, Better Machine Translation and Natural Language Inference

Re-thinking seq2seq:
. Attention and memory
. State-of-the-art NMT with Transformers
Natural Language Inference with siamese networks
LABORATORY: Attention Model for NLI
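The attention mechanism of this session can be sketched as scaled dot-product attention in pure NumPy (the vectors are random placeholders for encoder and decoder states; the lab builds the trained Keras version):

```python
import numpy as np

def softmax(z):
    z = z - z.max()  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

rng = np.random.default_rng(2)
seq_len, dim = 4, 3

keys = rng.normal(size=(seq_len, dim))  # encoder states, one per source word
values = keys                           # in basic attention, values = keys
query = rng.normal(size=dim)            # current decoder state

weights = softmax(keys @ query / np.sqrt(dim))  # scaled dot-product scores
context = weights @ values                      # weighted sum of encoder states

print(np.isclose(weights.sum(), 1.0))  # the weights form a distribution
```

Instead of compressing the source sentence into one fixed vector, the decoder recomputes `context` at every step, attending to different source words each time.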

Convolutional neural networks

Convolutional Neural Networks
Deep learning frameworks
Last words
LABORATORY: Convolutional Neural Networks
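For text, a convolution slides a filter over windows of consecutive word vectors; a minimal pure-NumPy sketch with one filter and illustrative shapes (the lab stacks trained Keras convolutional layers):

```python
import numpy as np

rng = np.random.default_rng(3)
seq_len, emb_dim, window = 6, 4, 3

X = rng.normal(size=(seq_len, emb_dim))  # sentence as word vectors
F = rng.normal(size=(window, emb_dim))   # one convolutional filter

# Slide the filter over every window of 3 consecutive words.
features = np.array([np.sum(X[i:i + window] * F)
                     for i in range(seq_len - window + 1)])
pooled = features.max()                  # max-over-time pooling

print(features.shape)  # (4,)
```

Max-over-time pooling keeps the strongest filter response wherever it occurs in the sentence, giving a fixed-size feature regardless of sentence length.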


Person 1

Eneko Agirre

Full professor, member of IXA

Person 2

Oier Lopez de la Calle

Postdoc researcher at IXA

Person 3

Olatz Perez de Vinaspre

Postdoc researcher at IXA

Invited talk (Friday 14:30)

One Perceptron to Rule Them All: Language and Vision

Deep neural networks have boosted the convergence of multimedia data analytics into a unified framework shared by practitioners in natural language and vision. Image captioning, visual question answering and multimodal translation are some of the first applications of a new and exciting field that exploits the generalization properties of deep neural representations. This talk will provide an overview of how vision and language problems are addressed with deep neural networks, and of the exciting challenges the research community is addressing today.
Person 4

Xavier Giro i Nieto

Associate Professor at UPC

Practical details

General information

Bring your own laptop (required for the practical sessions).
Part of the Language Analysis and Processing master program.
5 theoretical sessions with interleaved labs (20 hours), plus an invited talk.
Scheduled from July 3rd to 5th 2019, 9:00-13:00 14:30-18:30 (Friday ends 16:00).

Where: "Ada Lovelace", Computer science faculty, San Sebastian.
Accommodation information (in Basque and Spanish).
Lunch on your own in one of the cafeterias on campus.
Teaching language: English.
Capacity: 60 attendees (selected according to CV).
Cost: 184 euros (180 for UPV/EHU members).


Pre-registration: open until the 15th of May 2019 (now closed!).
Please register using this url (fully booked!).
Email for any enquiry you might have.

Basic programming experience, a university-level course in computer science and experience in Python. Basic math skills (algebra or pre-calculus) are also needed.
Bring your own laptop (no need to install anything).

Previous editions