The Effectiveness of Intermediate-Task Training for Code-Switched Natural Language Understanding
Journal
MRL 2021 - 1st Workshop on Multilingual Representation Learning, Proceedings of the Conference
Date Issued
2021-01-01
Author(s)
Prasad, Archiki
Rehan, Mohammad Ali
Pathak, Shreya
Jyothi, Preethi
Abstract
While recent benchmarks have spurred considerable work on improving the generalization of pretrained multilingual language models on multilingual tasks, techniques for improving performance on code-switched natural language understanding tasks remain far less explored. In this work, we propose bilingual intermediate pretraining as a reliable technique for deriving large and consistent performance gains using code-switched text on three NLP tasks: Natural Language Inference (NLI), Question Answering (QA), and Sentiment Analysis (SA). We show consistent performance gains on four code-switched language pairs (Hindi-English, Spanish-English, Tamil-English, and Malayalam-English) for SA, and on Hindi-English for NLI and QA. We also present a code-switched masked language modeling (MLM) pretraining technique that consistently benefits SA compared to standard MLM pretraining using real code-switched text.
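The abstract describes intermediate MLM pretraining on code-switched text before task-specific fine-tuning. Below is a minimal sketch of that kind of pipeline using the HuggingFace Transformers library; the encoder name, corpus path, and hyperparameters are illustrative assumptions and are not taken from the paper.

```python
# Sketch: intermediate MLM pretraining on a code-switched corpus, whose
# checkpoint can later be fine-tuned on SA/NLI/QA. All names/paths below
# (model, corpus file, output dir) are hypothetical placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "bert-base-multilingual-cased"  # assumed multilingual encoder
tokenizer = AutoTokenizer.from_pretrained(model_name)
mlm_model = AutoModelForMaskedLM.from_pretrained(model_name)

# Code-switched (e.g. Hindi-English) sentences, one per line (hypothetical file).
raw = load_dataset("text", data_files={"train": "cs_hien_corpus.txt"})
tokenized = raw["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

# Standard 15% token masking for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=mlm_model,
    args=TrainingArguments(
        output_dir="mlm_intermediate",
        num_train_epochs=1,
        per_device_train_batch_size=16,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()

# Save the intermediate checkpoint; a downstream script would load it
# (e.g. via AutoModelForSequenceClassification) for task fine-tuning.
mlm_model.save_pretrained("mlm_intermediate")
tokenizer.save_pretrained("mlm_intermediate")
```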