BabyLMs for isiXhosa: Data-Efficient Language Modelling in a Low-Resource Context

Matzopoulos, Alexis; Hendriks, Charl; Mahomed, Hishaam; Meyer, Francois (2025). BabyLMs for isiXhosa: Data-Efficient Language Modelling in a Low-Resource Context. In Proceedings of the First Workshop on Language Models for Low-Resource Languages.


Abstract

The BabyLM challenge called on participants to develop sample-efficient language models. Submissions were pretrained on a fixed English corpus, limited to the number of words children are exposed to during development (<100M). The challenge produced new architectures for data-efficient language modelling that outperformed models trained on trillions of words. This is promising for low-resource languages, where available corpora are limited to much less than 100M words. In this paper, we explore the potential of BabyLMs for low-resource languages, using the isiXhosa language as a case study. We pretrain two BabyLM architectures, ELC-BERT and MLSM, on an isiXhosa corpus. They outperform a vanilla pretrained model on POS tagging and NER, achieving notable gains (+3.2 F1) for the latter. In some instances, the BabyLMs even outperform XLM-R. Our findings show that data-efficient models are viable for low-resource languages, but they also highlight the continued importance of, and lack of, high-quality pretraining data. Finally, we visually analyse how BabyLM architectures encode isiXhosa.

Item Type: Conference paper
Subjects: Computing methodologies > Artificial intelligence > Natural language processing
Date Deposited: 28 Jul 2025 12:35
Last Modified: 28 Jul 2025 12:35
URI: https://pubs.cs.uct.ac.za/id/eprint/1734
