Recent advances in natural language processing via large pre-trained language models: A survey

B Min, H Ross, E Sulem, APB Veyseh… - ACM Computing …, 2023 - dl.acm.org
Large, pre-trained language models (PLMs) such as BERT and GPT have drastically
changed the Natural Language Processing (NLP) field. For numerous NLP tasks …

BabyBERTa: Learning more grammar with small-scale child-directed language

PA Huebner, E Sulem, F Cynthia… - Proceedings of the 25th …, 2021 - aclanthology.org
Transformer-based language models have taken the NLP world by storm. However, their
potential for addressing important questions in language acquisition research has been …

Controlled Evaluation of Syntactic Knowledge in Multilingual Language Models

D Kryvosheieva, R Levy - arXiv preprint arXiv:2411.07474, 2024 - arxiv.org
Language models (LMs) are capable of acquiring elements of human-like syntactic
knowledge. Targeted syntactic evaluation tests have been employed to measure how well …

How Well Can BERT Learn the Grammar of an Agglutinative and Flexible-Order Language? The Case of Basque.

G Urbizu, M Zulaika, X Saralegi… - Proceedings of the 2024 …, 2024 - aclanthology.org
This work investigates the acquisition of formal linguistic competence by neural language
models, hypothesizing that languages with complex grammar, such as Basque, present …

Simplification of German Narrative Documents with Longformer mBART

T Schomacker - 2025 - reposit.haw-hamburg.de
Transformer models have become the most prominent method for solving a multitude of
natural language processing (NLP) tasks since their introduction in 2017. Natural Language …

Methods for the inspection of learning machines: with a focus on the language domain

R Schwarzenberg - 2022 - Dissertation, Berlin, Technische …