I am a final-year PhD student at the Universitat Politècnica de Catalunya (UPC) in Barcelona and a member of the Language and Speech Technologies and Applications (TALP) research group and the Machine Translation group. My research focuses on approaches to neural machine translation that leverage unsupervised and continual learning methods. I am advised by Dr. Marta Ruiz Costa-jussà.
Apart from research, I teach classes and supervise MSc students. I am also a co-organizer of the Lifelong Learning for Machine Translation and the Similar Language Translation shared tasks (both in 2020 and 2021) at the Conference on Machine Translation (WMT), co-located with EMNLP.
PhD - Neural Machine Translation (in progress)
Universitat Politècnica de Catalunya
MSc in Engineering - Computer Science
Warsaw University of Technology
MA - Management
University of Warsaw
BSc in Engineering - Computer Science
Warsaw University of Technology
Nominated for the Best Paper Award (top 6.97% of accepted regular papers).
Ranked 1st for Czech→Polish and 2nd for Spanish→Portuguese translation systems.
Lifelong learning aims to enable information systems to learn from a continuous stream of data over time. This scenario is very challenging, because the general limitations of machine learning methods also apply to neural network-based models: contemporary neural networks learn in isolation and cannot effectively acquire new information without forgetting previously learned knowledge, a problem known as catastrophic forgetting.
[Read the full post]
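To make the forgetting problem concrete, here is a minimal, illustrative sketch (not code from the post): a small PyTorch classifier is trained on a synthetic "task A", then trained further on a shifted "task B", and its task-A accuracy is measured again. The tasks, data, and hyperparameters are all assumptions chosen purely for illustration.

```python
# Minimal sketch of catastrophic forgetting on synthetic data (illustrative only;
# not code from the post). A small classifier is trained on "task A", then trained
# further on a shifted "task B", and its task-A accuracy is measured again.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(shift):
    # Toy binary task: is the feature sum above the task's mean sum?
    x = torch.randn(2000, 20) + shift
    y = (x.sum(dim=1) > shift * 20).long()
    return x, y

def train(model, x, y, epochs=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))

xa, ya = make_task(shift=0.0)   # "task A"
xb, yb = make_task(shift=3.0)   # "task B": same rule, shifted input distribution

train(model, xa, ya)
print("task A accuracy after training on A:", accuracy(model, xa, ya))

train(model, xb, yb)            # continue training on task B data only
print("task A accuracy after training on B:", accuracy(model, xa, ya))
```

The drop in task-A accuracy after training on task B is the kind of degradation that continual-learning techniques (e.g. regularization, replay, or parameter isolation) aim to mitigate.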
The majority of current NMT systems are still trained on large bilingual corpora, which are available only for a handful of domains and high-resource language pairs. This is mainly because creating parallel corpora requires substantial resources (e.g. data, expertise, time, and money). Research on unsupervised MT therefore focuses on eliminating the dependency on labeled data, which is especially beneficial for low-resource languages.
[Read the full post]
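As a simplified illustration of how the dependency on parallel data can be relaxed, the sketch below shows back-translation, one of the core building blocks of unsupervised MT: monolingual target-language sentences are translated "backwards" into the source language to create synthetic parallel pairs. The toy word-for-word dictionary stands in for a trained reverse model and, like the example sentences, is purely an assumption for illustration.

```python
# Minimal sketch of back-translation, a core building block of unsupervised MT
# (illustrative only; the word-for-word dictionary below stands in for a trained
# Spanish->English reverse model, and the example sentences are made up).

# Monolingual target-language (Spanish) sentences: no parallel data required.
monolingual_es = ["el gato come pescado", "el gato come"]

# Toy Spanish->English lookup used in place of a reverse NMT model.
ES_TO_EN = {"el": "the", "gato": "cat", "come": "eats", "pescado": "fish"}

def reverse_translate(sentence):
    # Word-for-word back-translation; a real system would decode with a trained model.
    return " ".join(ES_TO_EN.get(word, word) for word in sentence.split())

# Pair each synthetic English source sentence with the original Spanish target.
synthetic_parallel = [(reverse_translate(es), es) for es in monolingual_es]

for en, es in synthetic_parallel:
    print(f"synthetic source: {en!r}  ->  target: {es!r}")
```

The forward (English→Spanish) model is then trained on these pseudo-parallel pairs; in unsupervised MT the two translation directions are typically improved jointly, bootstrapping each other through iterative back-translation.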
➡️ Introduction to NLP (lecturer)
➡️ Embeddings & Text Classification (lab instructor)
➡️ Sequence & Language Modeling (lecturer)
➡️ Machine Translation (lecturer)
➡️ Machine Translation (lab instructor)
➡️ Introduction to NLP (lecturer)
➡️ Embeddings & Text Classification (lab instructor)
➡️ Sequence & Language Modeling (lecturer)
➡️ Machine Translation (lab instructor)
➡️ Introduction to NLP (lecturer)
➡️ Machine Translation (lab assistant instructor)