Emily Chang: Bridging Geometric and Information-theoretic Compression in Language Models
10 November 2023
Room D1.14 - East Campus USI-SUPSI
For a language model (LM) to faithfully model human language, it must compress vast, potentially infinite information into a relatively low-dimensional space. On this topic, I will present recent work with Corentin Kervadec and Marco Baroni, to appear at EMNLP. We propose analyzing compression in (pre-trained) LMs from two points of view: geometric and information-theoretic. We demonstrate that the two views are highly correlated, such that the intrinsic geometric dimension of linguistic data predicts its coding length under the LM. We then show that, in turn, high compression of a linguistic dataset predicts rapid adaptation to that dataset, confirming that the ability to compress linguistic information is an important part of successful LM performance. As a practical byproduct of our analysis, we evaluate a battery of intrinsic dimension estimators for the first time on linguistic data, showing that only some of them capture the relationship between information-theoretic compression, geometric compression, and ease of adaptation.
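To make the two views concrete, the sketch below computes both quantities on a small text sample: the information-theoretic side as the total code length in bits (the summed negative log-probability the LM assigns to the tokens), and the geometric side as an intrinsic dimension estimate of the token representations. This is a minimal illustration, not the authors' implementation; GPT-2 as the LM and the TwoNN estimator (Facco et al., 2017) as the intrinsic dimension estimator are assumptions made for the example.

```python
# Minimal sketch of the two views of compression on a text sample.
# Assumptions (illustrative, not from the talk): GPT-2 as the pre-trained LM,
# TwoNN (Facco et al., 2017) as the intrinsic dimension estimator.

import math

import numpy as np
import torch
from sklearn.neighbors import NearestNeighbors
from transformers import AutoModelForCausalLM, AutoTokenizer


def coding_length_bits(model, tokenizer, text: str) -> float:
    """Information-theoretic view: total code length of `text` in bits,
    i.e. the summed negative log2-probability of its tokens under the LM."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc, labels=enc["input_ids"])
    # out.loss is the mean cross-entropy (nats) over the seq_len - 1
    # predicted positions; multiply back and convert to bits.
    n_predicted = enc["input_ids"].size(1) - 1
    return out.loss.item() * n_predicted / math.log(2)


def twonn_intrinsic_dimension(points: np.ndarray) -> float:
    """Geometric view: TwoNN estimate of the intrinsic dimension of a
    point cloud (here, token representations), via the MLE
    d = N / sum_i log(r2_i / r1_i), trimming the top 10% of ratios."""
    nn = NearestNeighbors(n_neighbors=3).fit(points)
    dists, _ = nn.kneighbors(points)        # column 0 is the point itself
    mu = dists[:, 2] / dists[:, 1]          # 2nd- to 1st-neighbor distance ratio
    mu = np.sort(mu)[: int(0.9 * len(mu))]  # discard tail for robustness
    return len(mu) / np.sum(np.log(mu))


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    model.eval()

    text = "Compression is a core capability of language models. " * 20
    print(f"coding length: {coding_length_bits(model, tokenizer, text):.1f} bits")

    # Last-layer hidden states form the point cloud for the geometric view.
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc, output_hidden_states=True).hidden_states[-1][0]
    dim = twonn_intrinsic_dimension(hidden.numpy())
    print(f"intrinsic dimension (TwoNN): {dim:.1f}")
```

Under the correlation the talk describes, a dataset whose representations occupy a lower-dimensional manifold should also receive a shorter code length from the LM; running the sketch on contrasting datasets is one way to probe that relationship informally.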

The speaker

Emily is a second-year PhD student in computational linguistics, supervised by Marco Baroni at the Universitat Pompeu Fabra (Barcelona, Spain). Her current interests lie in data compression as a signature of learnability and compositionality in humans and language models. In the past, she worked on emergent communication in humans and machines at MIT CSAIL and the École Normale Supérieure, respectively.