In-depth exploration of the syntactic capabilities of autoencoding language models for downstream applications 

    Pérez-Mayos, Laura (Date of defense: 2022-06-15)

    Pretrained Transformer-based language models have quickly replaced traditional approaches to modeling NLP tasks, pushing the state of the art to new levels, and will certainly continue to be very influential in the years ...