LAFT: Cross-Lingual Transfer for Text Generation by Language-Agnostic Finetuning
Xianze Wu, Zaixiang Zheng, Hao Zhou, Yong Yu
Oral Session 3 - Thursday 07/21 14:20 EST
Abstract:
Multilingual language pretraining makes it possible to transfer task knowledge learned from a high-resource source language to other languages, which particularly benefits low-resource languages with little or no task-annotated data. However, the language and task knowledge encoded in multilingual neural representations is strongly entangled, so the learned task knowledge becomes spuriously correlated with the source language, limiting cross-lingual transferability. In this paper, we present language-agnostic finetuning (LAFT), a novel method that facilitates zero-resource cross-lingual transfer for text generation. LAFT first performs language-agnostic task acquisition to isolate task learning from the source language, and then language specification to improve generation in the target language. Experiments on two text generation tasks demonstrate that the proposed approach achieves better and more parameter-efficient transferability.
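The abstract does not spell out how the two stages are realized. The sketch below is a minimal, illustrative reading of the two-stage idea, assuming a shared task backbone plus small per-language adapter modules (the class `GeneratorWithAdapters`, the adapter decomposition, and all hyperparameters are hypothetical, not the authors' implementation): stage 1 trains only the shared parameters on source-language task data, and stage 2 updates only the target-language adapter, which is where the parameter efficiency would come from.

```python
import torch
import torch.nn as nn

class GeneratorWithAdapters(nn.Module):
    """Toy generation model: a shared task backbone plus small per-language
    adapters (a hypothetical decomposition used only for illustration)."""

    def __init__(self, d_model=64, vocab_size=1000, languages=("en", "de")):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.backbone = nn.Sequential(
            nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, d_model)
        )
        # One lightweight adapter per language (the parameter-efficient part).
        self.adapters = nn.ModuleDict(
            {lang: nn.Linear(d_model, d_model) for lang in languages}
        )
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, token_ids, lang):
        h = self.backbone(self.embed(token_ids))
        h = h + self.adapters[lang](h)  # language-specific residual adapter
        return self.lm_head(h)


def set_trainable(module, flag):
    for p in module.parameters():
        p.requires_grad_(flag)


model = GeneratorWithAdapters()

# Stage 1: language-agnostic task acquisition.
# Train the shared backbone (and head) on source-language task data while the
# language adapters stay frozen, so task knowledge is not tied to the source.
set_trainable(model, True)
for adapter in model.adapters.values():
    set_trainable(adapter, False)
stage1_params = [p for p in model.parameters() if p.requires_grad]
optim_stage1 = torch.optim.Adam(stage1_params, lr=1e-4)

# Stage 2: language specification.
# Freeze everything except the target-language adapter and tune only that,
# keeping the transfer parameter-efficient.
set_trainable(model, False)
set_trainable(model.adapters["de"], True)
stage2_params = [p for p in model.parameters() if p.requires_grad]
optim_stage2 = torch.optim.Adam(stage2_params, lr=1e-4)
```

In this reading, the freeze/unfreeze split is what separates task learning from language learning; whether LAFT uses adapters, prefixes, or another parameterization is not stated in the abstract.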