Standard

The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer. / Efimov, Pavel; Boytsov, Leonid; Arslanova, Elena et al.
Advances in Information Retrieval: 45th European Conference on Information Retrieval. ed. / Jaap Kamps; Lorraine Goeuriot. Springer Cham, 2023. p. 51-67 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13982).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Harvard

Efimov, P, Boytsov, L, Arslanova, E & Braslavski, P 2023, The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer. in J Kamps & L Goeuriot (eds), Advances in Information Retrieval: 45th European Conference on Information Retrieval. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13982, Springer Cham, pp. 51-67. https://doi.org/10.1007/978-3-031-28241-6_4

APA

Efimov, P., Boytsov, L., Arslanova, E., & Braslavski, P. (2023). The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer. In J. Kamps, & L. Goeuriot (Eds.), Advances in Information Retrieval: 45th European Conference on Information Retrieval (pp. 51-67). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13982). Springer Cham. https://doi.org/10.1007/978-3-031-28241-6_4

Vancouver

Efimov P, Boytsov L, Arslanova E, Braslavski P. The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer. In Kamps J, Goeuriot L, editors, Advances in Information Retrieval: 45th European Conference on Information Retrieval. Springer Cham. 2023. p. 51-67. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). doi: 10.1007/978-3-031-28241-6_4

Author

Efimov, Pavel ; Boytsov, Leonid ; Arslanova, Elena et al. / The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer. Advances in Information Retrieval: 45th European Conference on Information Retrieval. editor / Jaap Kamps ; Lorraine Goeuriot. Springer Cham, 2023. pp. 51-67 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).

BibTeX

@inproceedings{b6bc3a3a78f845778907c9f61ab067f5,
title = "The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer: book chapter",
abstract = "Large multilingual language models such as mBERT or XLM-R enable zero-shot cross-lingual transfer in various IR and NLP tasks. Cao et al. [8] proposed a data- and compute-efficient method for cross-lingual adjustment of mBERT that uses a small parallel corpus to make embeddings of related words across languages similar to each other. They showed it to be effective in NLI for five European languages. In contrast we experiment with a topologically diverse set of languages (Spanish, Russian, Vietnamese, and Hindi) and extend their original implementations to new tasks (XSR, NER, and QA) and an additional training regime (continual learning). Our study reproduced gains in NLI for four languages, showed improved NER, XSR, and cross-lingual QA results in three languages (though some cross-lingual QA gains were not statistically significant), while mono-lingual QA performance never improved and sometimes degraded. Analysis of distances between contextualized embeddings of related and unrelated words (across languages) showed that fine-tuning leads to “forgetting” some of the cross-lingual alignment information. Based on this observation, we further improved NLI performance using continual learning. Our software is publicly available https://github.com/pefimov/cross-lingual-adjustment.",
author = "Pavel Efimov and Leonid Boytsov and Elena Arslanova and Pavel Braslavski",
note = "This research was supported in part through computational resources of HPC facilities at HSE University [27]. PE is grateful to Yandex Cloud for their grant toward computing resources of Yandex DataSphere. PB acknowledges support by the Russian Science Foundation, grant No 20-11-20166.",
year = "2023",
month = mar,
day = "16",
doi = "10.1007/978-3-031-28241-6_4",
language = "English",
isbn = "978-3-031-28237-9",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Cham",
pages = "51--67",
editor = "Jaap Kamps and Lorraine Goeuriot",
booktitle = "Advances in Information Retrieval: 45th European Conference on Information Retrieval",
address = "United Kingdom",

}
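
The abstract above outlines the core technique: fine-tuning mBERT on a small parallel corpus so that contextual embeddings of aligned words in two languages move closer together. As a rough illustration only (not the authors' implementation, which is available in the repository linked in the abstract), a minimal PyTorch/Transformers sketch of such an alignment objective might look as follows; the word-level alignments are assumed to be supplied externally as subword index pairs.

# Hypothetical sketch of the cross-lingual adjustment objective described in the
# abstract: pull contextual embeddings of aligned words in a parallel sentence
# pair closer together, with an L2 anchor to the pretrained weights so the model
# does not drift arbitrarily. Not the paper's code; see the linked repository.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")
pretrained = {n: p.detach().clone() for n, p in model.named_parameters()}

def adjustment_loss(src_sent, tgt_sent, aligned_pairs, reg_weight=1.0):
    # aligned_pairs: list of (src_subword_index, tgt_subword_index) tuples,
    # assumed to come from an external word aligner (an assumption here).
    src = tokenizer(src_sent, return_tensors="pt")
    tgt = tokenizer(tgt_sent, return_tensors="pt")
    src_emb = model(**src).last_hidden_state[0]  # (src_len, hidden_dim)
    tgt_emb = model(**tgt).last_hidden_state[0]  # (tgt_len, hidden_dim)

    # Squared L2 distance between embeddings of aligned (related) words.
    align = sum((src_emb[i] - tgt_emb[j]).pow(2).sum() for i, j in aligned_pairs)

    # Anchor the parameters to their pretrained values.
    reg = sum((p - pretrained[n]).pow(2).sum() for n, p in model.named_parameters())
    return align / max(len(aligned_pairs), 1) + reg_weight * reg

A training loop would sum this loss over batches of parallel sentences and backpropagate through the model; the anchoring term reflects the general idea of keeping the adjusted model close to its pretrained weights, in the spirit of the data- and compute-efficient adjustment the abstract describes.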

RIS

TY - GEN

T1 - The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer

AU - Efimov, Pavel

AU - Boytsov, Leonid

AU - Arslanova, Elena

AU - Braslavski, Pavel

N1 - This research was supported in part through computational resources of HPC facilities at HSE University [27]. PE is grateful to Yandex Cloud for their grant toward computing resources of Yandex DataSphere. PB acknowledges support by the Russian Science Foundation, grant No 20-11-20166.

PY - 2023/3/16

Y1 - 2023/3/16

N2 - Large multilingual language models such as mBERT or XLM-R enable zero-shot cross-lingual transfer in various IR and NLP tasks. Cao et al. [8] proposed a data- and compute-efficient method for cross-lingual adjustment of mBERT that uses a small parallel corpus to make embeddings of related words across languages similar to each other. They showed it to be effective in NLI for five European languages. In contrast, we experiment with a typologically diverse set of languages (Spanish, Russian, Vietnamese, and Hindi) and extend their original implementations to new tasks (XSR, NER, and QA) and an additional training regime (continual learning). Our study reproduced gains in NLI for four languages, showed improved NER, XSR, and cross-lingual QA results in three languages (though some cross-lingual QA gains were not statistically significant), while mono-lingual QA performance never improved and sometimes degraded. Analysis of distances between contextualized embeddings of related and unrelated words (across languages) showed that fine-tuning leads to “forgetting” some of the cross-lingual alignment information. Based on this observation, we further improved NLI performance using continual learning. Our software is publicly available at https://github.com/pefimov/cross-lingual-adjustment.

AB - Large multilingual language models such as mBERT or XLM-R enable zero-shot cross-lingual transfer in various IR and NLP tasks. Cao et al. [8] proposed a data- and compute-efficient method for cross-lingual adjustment of mBERT that uses a small parallel corpus to make embeddings of related words across languages similar to each other. They showed it to be effective in NLI for five European languages. In contrast, we experiment with a typologically diverse set of languages (Spanish, Russian, Vietnamese, and Hindi) and extend their original implementations to new tasks (XSR, NER, and QA) and an additional training regime (continual learning). Our study reproduced gains in NLI for four languages, showed improved NER, XSR, and cross-lingual QA results in three languages (though some cross-lingual QA gains were not statistically significant), while mono-lingual QA performance never improved and sometimes degraded. Analysis of distances between contextualized embeddings of related and unrelated words (across languages) showed that fine-tuning leads to “forgetting” some of the cross-lingual alignment information. Based on this observation, we further improved NLI performance using continual learning. Our software is publicly available at https://github.com/pefimov/cross-lingual-adjustment.

UR - http://www.scopus.com/inward/record.url?partnerID=8YFLogxK&scp=85151051828

UR - https://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=tsmetrics&SrcApp=tsm_test&DestApp=WOS_CPL&DestLinkType=FullRecord&KeyUT=000995495200004

U2 - 10.1007/978-3-031-28241-6_4

DO - 10.1007/978-3-031-28241-6_4

M3 - Conference contribution

SN - 978-3-031-28237-9

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 51

EP - 67

BT - Advances in Information Retrieval: 45th European Conference on Information Retrieval

A2 - Kamps, Jaap

A2 - Goeuriot, Lorraine

PB - Springer Cham

ER -
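
The abstract also mentions an analysis of distances between contextualized embeddings of related and unrelated words across languages, which showed that task fine-tuning partially "forgets" the cross-lingual alignment. A hedged sketch of such a diagnostic (helper names and inputs are illustrative, not taken from the paper's code) could compare aligned-pair distances against a shuffled, unrelated-pair baseline before and after fine-tuning:

# Hypothetical diagnostic in the spirit of the abstract's analysis: measure the
# average distance between contextual embeddings of aligned ("related") word
# pairs and of shuffled ("unrelated") pairs across languages.
import random
import torch

def mean_pair_distance(model, tokenizer, sentence_pairs, alignments):
    # sentence_pairs: list of (source_sentence, target_sentence) strings;
    # alignments: for each pair, a list of (src_subword_idx, tgt_subword_idx).
    distances = []
    model.eval()
    with torch.no_grad():
        for (src, tgt), pairs in zip(sentence_pairs, alignments):
            src_emb = model(**tokenizer(src, return_tensors="pt")).last_hidden_state[0]
            tgt_emb = model(**tokenizer(tgt, return_tensors="pt")).last_hidden_state[0]
            distances += [torch.dist(src_emb[i], tgt_emb[j]).item() for i, j in pairs]
    return sum(distances) / max(len(distances), 1)

def shuffled_alignments(alignments):
    # Replace each target index with a random one from the same sentence to
    # approximate "unrelated" word pairs as a baseline.
    return [[(i, random.choice(pairs)[1]) for i, _ in pairs] for pairs in alignments]

If, after fine-tuning on a downstream task, the aligned-pair distance drifts toward the shuffled-pair baseline, some of the cross-lingual alignment has been lost; this is the observation that, per the abstract, motivates the continual-learning regime.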

ID: 37140299