Abstract: Ambiguity in word meanings is a long-standing challenge in natural language processing, and word sense disambiguation (WSD) addresses this challenge. Prior neural language models relied on recurrent neural networks and long short-term memory (LSTM) architectures. These models process words sequentially, are slower, and are not truly bidirectional, so they cannot fully capture and represent the contextual meanings of words; hence, they fall short in contextual semantic representation for WSD. Recently, Bidirectional Encoder Representations from Transformers (BERT) has emerged as an attention-based transformer model that is deeply bidirectional. Its attention mechanism processes and weighs the relevance of the entire context at once in both directions, so it is well suited to leverage word meanings in distributed representations for WSD. We used BERT to obtain contextual word embeddings for the context and the sense glosses of ambiguous Marathi words. For this purpose, we used 282 moderately ambiguous Marathi words covering 1004 senses distributed over 5282 Marathi sentences harvested by linguists from online Marathi websites. We computed the semantic similarity between each context-gloss embedding pair using the Minkowski distance family and the cosine similarity measure, and assigned the most plausible sense to the given ambiguous Marathi word. Our empirical evaluation shows that the cosine similarity measure outperforms the Minkowski distances, yielding an average disambiguation accuracy of 75.26% for the given Marathi sentences.
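The context-gloss matching procedure the abstract outlines can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' exact pipeline: it assumes the off-the-shelf multilingual checkpoint bert-base-multilingual-cased (which covers Marathi) and mean-pooling of the last hidden layer, since the abstract does not specify the model variant or pooling strategy; the helpers embed, minkowski, and disambiguate are hypothetical names.

```python
# Minimal sketch of BERT-based context-gloss WSD. Assumptions: the
# "bert-base-multilingual-cased" checkpoint and mean-pooled last-layer
# embeddings; the paper's exact model and pooling are not stated here.
import torch
import torch.nn.functional as F
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertModel.from_pretrained("bert-base-multilingual-cased")
model.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pool the final hidden states into one contextual vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)             # (768,)

def minkowski(x: torch.Tensor, y: torch.Tensor, p: float = 2.0) -> torch.Tensor:
    """Minkowski distance: p=1 is Manhattan, p=2 is Euclidean."""
    return torch.sum(torch.abs(x - y) ** p) ** (1.0 / p)

def disambiguate(context: str, sense_glosses: dict) -> str:
    """Return the sense whose gloss embedding is most cosine-similar
    to the embedding of the sentence containing the ambiguous word."""
    ctx = embed(context)
    scores = {sense: F.cosine_similarity(ctx, embed(gloss), dim=0).item()
              for sense, gloss in sense_glosses.items()}
    return max(scores, key=scores.get)
```

With a Minkowski variant, the selection would take the smallest distance rather than the largest similarity; the abstract reports that cosine similarity performed best.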
Sandip S. Patil, R.P. Bhavsar, B.V. Pawar, K.B.C. North Maharashtra University, India
Keywords: BERT, Distributional Semantics, Neural Language Modeling, Transfer Learning, Word Sense Disambiguation
Published By: ICTACT
Published In: ICTACT Journal on Soft Computing (Volume: 13, Issue: 2, Pages: 2842-2849)
Date of Publication: January 2023
Page Views: 150
Full Text Views: 1