WAD-X: Improving Zero-shot Cross-lingual Transfer via Adapter-based Word Alignment

Ahtamjan Ahmat, Yating Yang, Bo Ma, Rui Dong, Kaiwen Lu, Lei Wang

ACM Transactions on Asian and Low-Resource Language Information Processing (2023)

Abstract
Multilingual pre-trained language models (mPLMs) have achieved remarkable performance on zero-shot cross-lingual transfer learning. However, most mPLMs only implicitly encourage cross-lingual alignment during pre-training, making it hard to capture accurate word alignment across languages. In this paper, we propose Word-align ADapters for Cross-lingual transfer (WAD-X) to explicitly align the word representations of mPLMs via language-specific subspaces. Taking an mPLM as the backbone model, WAD-X constructs a subspace for each source-target language pair via adapters. The adapters use statistical alignment as prior knowledge to guide word-level alignment in the corresponding bilingual semantic subspace. We evaluate our model across a set of target languages on three zero-shot cross-lingual transfer tasks: part-of-speech tagging (POS), dependency parsing (DP), and sentiment analysis (SA). Experimental results demonstrate that our proposed model improves zero-shot cross-lingual transfer on all three benchmarks, with improvements of 2.19, 2.50, and 1.61 points on the POS, DP, and SA tasks, respectively, over strong baselines.
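To make the described mechanism concrete, the PyTorch sketch below illustrates the general idea rather than the authors' released code: it assumes a standard bottleneck adapter placed on top of a frozen mPLM and a mean-squared-distance objective that pulls together the adapter outputs of word pairs supplied by a statistical aligner. All names (BottleneckAdapter, word_alignment_loss, aligned_pairs) and hyperparameters here are hypothetical, and the paper's exact adapter placement and alignment objective may differ.

import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    # Standard bottleneck adapter: down-project, non-linearity, up-project,
    # with a residual connection so the frozen mPLM representation is preserved.
    def __init__(self, hidden_size: int, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))

def word_alignment_loss(src_hidden: torch.Tensor,
                        tgt_hidden: torch.Tensor,
                        aligned_pairs: torch.Tensor) -> torch.Tensor:
    # src_hidden: (src_len, hidden) adapter outputs for the source sentence.
    # tgt_hidden: (tgt_len, hidden) adapter outputs for the target sentence.
    # aligned_pairs: (num_pairs, 2) LongTensor of (src_index, tgt_index)
    # links produced by a statistical word aligner (the prior knowledge).
    src_vecs = src_hidden[aligned_pairs[:, 0]]
    tgt_vecs = tgt_hidden[aligned_pairs[:, 1]]
    # Pull statistically aligned words together in the bilingual subspace.
    return ((src_vecs - tgt_vecs) ** 2).sum(dim=-1).mean()

# Toy usage: one adapter would be trained per source-target language pair,
# applied to the (frozen) mPLM hidden states of a parallel sentence pair.
adapter_pair = BottleneckAdapter(hidden_size=768)
src_states = adapter_pair(torch.randn(12, 768))   # toy source hidden states
tgt_states = adapter_pair(torch.randn(15, 768))   # toy target hidden states
pairs = torch.tensor([[0, 0], [3, 2], [7, 9]])    # toy alignment links
loss = word_alignment_loss(src_states, tgt_states, pairs)

In WAD-X terms, one such adapter and its alignment objective would be instantiated for each source-target language pair, so every pair gets its own bilingual semantic subspace while the backbone mPLM stays shared and frozen.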
Keywords
Cross-lingual transfer, adapter, low-resource languages, word alignment