Title: Meta-Learning for Low Resource Sentence Compression
Date: 15/09/2022
Time: 10:00
Venue: Videoconference
Abstract:
The sentence compression task is fundamental to text summarization. Unfortunately, for many specific domains there is not enough labeled data to train deep learning models for this problem. This work presents an approach that circumvents this limitation using the meta-learning algorithm MAML. Unlike other works that rely on large amounts of training data, our approach models the problem as a few-shot learning task: a meta-model is trained on tasks with few examples and then fine-tuned for a never-before-seen compression task. Our experiments show that fine-tuning a meta-learned model yields consistently better results on the chosen evaluation metrics than training conventional models.
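The train-on-many-tasks, then fine-tune-on-a-new-task loop described above can be sketched in miniature. The toy setup below is purely illustrative (first-order MAML on scalar linear-regression tasks, with hypothetical learning rates); the thesis applies MAML to neural sentence compression models, not to this regression problem.

```python
import numpy as np

# First-order MAML sketch on toy few-shot regression tasks.
# Illustrative assumption: each "task" is y = a*x with a task-specific
# slope a, and the model is a single scalar weight w.

rng = np.random.default_rng(0)

def make_task():
    """Create a task; calling it samples a few (x, y) examples."""
    a = rng.uniform(-2.0, 2.0)
    def sample(k=5):
        x = rng.uniform(-1.0, 1.0, size=k)
        return x, a * x
    return sample

def mse(w, x, y):
    return float(np.mean((w * x - y) ** 2))

def grad(w, x, y):
    # d/dw of mean((w*x - y)^2)
    return float(np.mean(2.0 * (w * x - y) * x))

w = 1.5                        # meta-parameter (scalar for simplicity)
inner_lr, outer_lr = 0.1, 0.01  # hypothetical learning rates

# Meta-training: many tasks, each with only a few examples.
for _ in range(1000):
    sample = make_task()
    xs, ys = sample()                       # support set: inner adaptation
    w_task = w - inner_lr * grad(w, xs, ys)
    xq, yq = sample()                       # query set: outer update
    # First-order MAML: outer gradient taken at the adapted parameters.
    w -= outer_lr * grad(w_task, xq, yq)

# Fine-tuning the meta-model on a never-before-seen task (few examples).
new_task = make_task()
xs, ys = new_task(5)
w_adapted = w - inner_lr * grad(w, xs, ys)
xq, yq = new_task(50)
loss_before = mse(w, xq, yq)       # meta-model, no adaptation
loss_after = mse(w_adapted, xq, yq)  # after one few-shot gradient step
```

The point of the sketch is the two nested loops: the inner step adapts to one task from a handful of examples, while the outer step moves the meta-parameters so that such adaptation works well across tasks; fine-tuning on a new task then reuses only the inner step.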
Examination committee: