ArabGlossBERT: Fine-Tuning BERT on Context-Gloss Pairs for WSD

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Using pre-trained transformer models such as BERT has proven to be effective in many NLP tasks. This paper presents our work on fine-tuning BERT models for Arabic Word Sense Disambiguation (WSD). We treated the WSD task as a sentence-pair binary classification task. First, we constructed a dataset of labeled Arabic context-gloss pairs (~167k pairs), which we extracted from the Arabic Ontology and the large lexicographic database available at Birzeit University. Each pair was labeled as True or False, and the target word in each context was identified and annotated. Second, we used this dataset to fine-tune three pre-trained Arabic BERT models. Third, we experimented with different supervised signals used to emphasize target words in context. Our experiments achieved promising results (accuracy of 84%), even though we used a large set of senses in the experiment.
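The sentence-pair formulation described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's actual pipeline: the pair-construction logic, the sense inventory, and the quotation-mark marker used to emphasize the target word are all assumptions for readability, and the example sentence is in English rather than Arabic.

```python
def mark_target(context: str, target: str, marker: str = '"') -> str:
    """Emphasize the target word in the context by surrounding its first
    occurrence with markers -- one possible supervised signal of the kind
    the paper experiments with (the exact markers are an assumption)."""
    return context.replace(target, f"{marker}{target}{marker}", 1)


def build_pairs(context: str, target: str, senses: dict, correct_sense_id: str):
    """Return (marked_context, gloss, label) triples: the label is True
    only for the gloss of the sense actually used in the context, so each
    context-gloss pair becomes a binary classification example."""
    marked = mark_target(context, target)
    return [
        (marked, gloss, sense_id == correct_sense_id)
        for sense_id, gloss in senses.items()
    ]


# Hypothetical sense inventory for one ambiguous word:
senses = {
    "bank.1": "a financial institution that accepts deposits",
    "bank.2": "sloping land beside a body of water",
}
pairs = build_pairs("She sat on the bank of the river.", "bank", senses, "bank.2")
for context, gloss, label in pairs:
    print(label, "|", context, "|", gloss)
```

Each resulting pair would then be fed to a BERT model as a sentence pair (context as the first segment, gloss as the second) for True/False classification.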

Original language: English
Title of host publication: International Conference Recent Advances in Natural Language Processing, RANLP 2021
Subtitle of host publication: Deep Learning for Natural Language Processing Methods and Applications - Proceedings
Editors: Galia Angelova, Maria Kunilovskaya, Ruslan Mitkov, Ivelina Nikolova-Koleva
Publisher: Incoma Ltd
Pages: 35-43
Number of pages: 9
ISBN (Electronic): 9789544520724
DOIs
Publication status: Published - 2021
Externally published: Yes
Event: International Conference on Recent Advances in Natural Language Processing: Deep Learning for Natural Language Processing Methods and Applications, RANLP 2021 - Virtual, Online
Duration: 1 Sept 2021 - 3 Sept 2021

Publication series

Name: International Conference Recent Advances in Natural Language Processing, RANLP
ISSN (Print): 1313-8502

Conference

Conference: International Conference on Recent Advances in Natural Language Processing: Deep Learning for Natural Language Processing Methods and Applications, RANLP 2021
City: Virtual, Online
Period: 1/09/21 - 3/09/21

