Multi-sense Embeddings Using Synonym Sets and Hypernym Information from WordNet
Author | |
Keywords | |
Abstract | Word embedding approaches have improved the efficiency of many natural language processing (NLP) tasks. Traditional word embeddings, though robust for many NLP applications, do not handle the polysemy of words. Tasks involving semantic similarity between concepts require an understanding of relations such as hypernymy and synonym sets to produce effective word embeddings. The output of any expert system is affected by how its text is represented. Systems that account for the senses, context, and definitions of concepts when deriving vector representations overcome the drawbacks of single-vector representations. This paper presents a novel approach for handling polysemy by generating multi-sense embeddings using the synonym sets and hypernym information of words. Embeddings of a word are derived by capturing information about the word at different levels, from sense to context and definition. The proposed sense embeddings achieve strong results on word similarity tasks; evaluated on nine benchmark datasets, the approach outperforms several state-of-the-art systems. |
Year of Publication | 2020 |
Journal | International Journal of Interactive Multimedia and Artificial Intelligence |
Volume | 6 |
Issue | Regular Issue |
Number | 4 |
Number of Pages | 68-79 |
Date Published | 12/2020 |
ISSN Number | 1989-1660 |
URL | |
DOI | |
Attachment | ijimai_6_4_7.pdf (670.78 KB) |
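
The abstract above describes deriving sense-level embeddings from WordNet synonym sets, hypernym information, and definitions. The sketch below is a minimal, hypothetical illustration of that general idea, not the authors' implementation: for each WordNet sense of a word it averages word vectors over the synset's lemmas, its direct hypernyms' lemmas, and its gloss. The word_vec lookup, the embedding dimensionality, and the simple averaging scheme are assumptions made for illustration only.

import numpy as np
from nltk.corpus import wordnet as wn  # requires nltk and its 'wordnet' corpus data

DIM = 50  # assumed embedding dimensionality for this sketch

# Placeholder for a pre-trained word-vector lookup (e.g., word2vec/GloVe);
# random vectors stand in here so the sketch runs without external files.
_rng = np.random.default_rng(0)
_cache = {}

def word_vec(token):
    if token not in _cache:
        _cache[token] = _rng.standard_normal(DIM)
    return _cache[token]

def sense_embeddings(word):
    """Return {synset_name: vector} with one embedding per WordNet sense of `word`."""
    senses = {}
    for syn in wn.synsets(word):
        # Synonym set: lemmas of the sense itself.
        tokens = [lemma.name().replace("_", " ") for lemma in syn.lemmas()]
        # Hypernym information: lemmas of the direct hypernym synsets.
        for hyper in syn.hypernyms():
            tokens += [lemma.name().replace("_", " ") for lemma in hyper.lemmas()]
        # Definition (gloss) of the sense.
        tokens += syn.definition().split()
        vectors = [word_vec(tok.lower()) for tok in tokens]
        senses[syn.name()] = np.mean(vectors, axis=0)
    return senses

if __name__ == "__main__":
    # Each sense of "bank" (river bank, financial institution, ...) gets its own vector.
    for name, vec in sense_embeddings("bank").items():
        print(name, vec[:3])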