Publications [#382580] of Rong Ge

Papers Published

  1. Wu, C; Lee, H; Ge, R, Connecting Pre-trained Language Models and Downstream Tasks via Properties of Representations, Advances in Neural Information Processing Systems, vol. 36 (January, 2023)

    Abstract:
    Recently, researchers have found that representations learned by large-scale pre-trained language models are useful in a variety of downstream tasks. However, there is little theoretical understanding of how pre-training performance relates to downstream task performance. In this paper, we analyze how this performance transfer depends on the properties of the downstream task and the structure of the representations. We consider a log-linear model in which a word is predicted from its context through a network with softmax as its last layer. We show that even if the downstream task is highly structured and depends on a simple function of the hidden representation, there are still cases where a low pre-training loss cannot guarantee good performance on the downstream task. On the other hand, we propose and empirically validate the existence of an “anchor vector” in the representation space, and show that this assumption, together with properties of the downstream task, guarantees performance transfer.
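
    To make the log-linear setup concrete, here is a minimal sketch (not the paper's actual construction): a stand-in encoder maps a context to a hidden representation, a softmax over word embeddings gives P(word | context), and the pre-training loss is the cross-entropy of the observed next word. The names (encode, Phi) and all dimensions are illustrative assumptions, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        VOCAB, DIM = 1000, 64   # hypothetical vocabulary size and hidden dimension

        def softmax(z):
            z = z - z.max()      # stabilize before exponentiating
            e = np.exp(z)
            return e / e.sum()

        # Stand-in context encoder: a fixed random nonlinear map playing the
        # role of the network that produces the hidden representation.
        W_enc = rng.normal(size=(DIM, DIM)) / np.sqrt(DIM)
        def encode(context_vec):
            return np.tanh(W_enc @ context_vec)

        # Log-linear softmax head: with word embeddings Phi,
        # P(w | context) is proportional to exp(Phi[w] . encode(context)).
        Phi = rng.normal(size=(VOCAB, DIM)) / np.sqrt(DIM)
        def next_word_probs(context_vec):
            return softmax(Phi @ encode(context_vec))

        # Pre-training objective: negative log-likelihood (cross-entropy)
        # of the observed next word given its context.
        def pretrain_loss(context_vec, word_id):
            return -np.log(next_word_probs(context_vec)[word_id])

        ctx = rng.normal(size=DIM)              # a toy context vector
        print(pretrain_loss(ctx, word_id=42))   # lower means a better pre-training fit

    A low value of this loss over a corpus is the “pre-training performance” the abstract refers to; the paper's question is when that implies good performance for a downstream predictor built on the hidden representation encode(context).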

 
