Neural Networks and NLP

ECS7001P · Queen Mary University of London

Deep learning for NLP: neural network architectures, RNNs, LSTMs, attention mechanisms, transformers, and sequence-to-sequence models.

RNN · LSTM · Attention · Transformers · BERT · NMT · NLG

Overview

This QMUL module covers the intersection of deep learning and natural language processing, progressing from basic neural networks to attention mechanisms and transformers, with hands-on labs implementing the key architectures.

Content & Resources

Lab 1: Neural Network Basics

Feed-forward networks, backpropagation
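A feed-forward network with backpropagation can be sketched in a few lines of numpy. This is an illustrative example (not the lab's actual code): a 2-layer sigmoid MLP trained on XOR with manually derived gradients of a mean-squared-error loss.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

# one hidden layer of 8 units (sizes chosen arbitrarily for the demo)
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
lr = 1.0

losses = []
for step in range(2000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(((out - y) ** 2).mean())
    # backward pass: chain rule through the sigmoids
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient-descent updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(losses[0], losses[-1])  # loss should decrease during training
```

The same forward/backward structure generalizes to deeper networks; frameworks such as PyTorch automate the backward pass via autograd.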

Lab 2: Word Embeddings

Word2Vec, GloVe implementations
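Before the predictive methods the lab implements, it helps to see the count-based view that GloVe builds on: words appearing in similar contexts get similar vectors. This sketch (an illustration, not Word2Vec or GloVe themselves) factorizes a small co-occurrence matrix with SVD.

```python
import numpy as np

corpus = ["the cat sat on the mat".split(),
          "the dog sat on the rug".split()]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# symmetric co-occurrence counts within a +/-2 word window
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i in range(len(sent)):
        for j in range(max(0, i - 2), min(len(sent), i + 3)):
            if i != j:
                C[idx[sent[i]], idx[sent[j]]] += 1

# truncated SVD of the log-smoothed counts yields dense embeddings
U, S, _ = np.linalg.svd(np.log1p(C))
emb = U[:, :4] * S[:4]  # 4-dimensional word vectors

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# "cat" and "dog" share contexts, so their vectors are nearly identical
print(cosine(emb[idx["cat"]], emb[idx["dog"]]))
```

Word2Vec instead learns the vectors by predicting context words with a shallow network; GloVe fits vectors directly to the co-occurrence statistics.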

Lab 3: RNNs and LSTMs

Sequence modeling for text
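The core idea of an RNN is that a hidden state is carried across time steps, so each output depends on the whole prefix of the sequence. A minimal Elman RNN forward pass (illustrative sketch with randomly initialized weights, not the lab's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 3, 5
Wx = rng.normal(scale=0.5, size=(d_in, d_h))  # input-to-hidden weights
Wh = rng.normal(scale=0.5, size=(d_h, d_h))   # hidden-to-hidden (recurrence)
b = np.zeros(d_h)

def rnn_forward(xs):
    """Run the RNN over a sequence; return the hidden state at each step."""
    h = np.zeros(d_h)
    states = []
    for x in xs:
        h = np.tanh(x @ Wx + h @ Wh + b)  # new state mixes input and old state
        states.append(h)
    return np.stack(states)

seq = rng.normal(size=(4, d_in))  # a 4-step input sequence
H = rnn_forward(seq)
print(H.shape)  # (4, 5): one hidden state per time step
```

An LSTM replaces the single `tanh` update with gated cell-state updates (input, forget, and output gates), which mitigates vanishing gradients over long sequences.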

Lab 4: Attention & Transformers

Self-attention, multi-head attention
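Self-attention lets every position attend to every other position via scaled dot-product scores. A minimal single-head sketch in numpy (random projection matrices here stand in for learned parameters):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (T, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # (T, T) pairwise similarities
    A = softmax(scores, axis=-1)             # attention weights; rows sum to 1
    return A @ V, A                          # weighted mix of value vectors

rng = np.random.default_rng(0)
T, d, d_k = 4, 8, 8
X = rng.normal(size=(T, d))
out, A = self_attention(X,
                        rng.normal(size=(d, d_k)),
                        rng.normal(size=(d, d_k)),
                        rng.normal(size=(d, d_k)))
print(out.shape, A.sum(axis=-1))  # (4, 8); each attention row sums to 1
```

Multi-head attention runs several such heads in parallel, each with its own Q/K/V projections, and concatenates their outputs before a final linear projection.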