Graph-based Dynamic Word Embeddings

Yuyin Lu, Xin Cheng, Ziran Liang, Yanghui Rao

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 4280-4288. https://doi.org/10.24963/ijcai.2022/594

As time goes by, language evolves and word semantics change. Unfortunately, traditional word embedding methods neglect this evolution and assume that word representations are static. Although contextualized word embedding models can capture the diverse representations of polysemous words, they ignore temporal information as well. To tackle these challenges, we propose a graph-based dynamic word embedding (GDWE) model, which focuses on continually capturing the semantic drift of words. We introduce word-level knowledge graphs (WKGs) to store short-term and long-term knowledge. WKGs provide rich structural information as a supplement to lexical information, which helps enhance word embedding quality and capture semantic drift quickly. Theoretical analysis and extensive experiments validate the effectiveness of GDWE on dynamic word embedding learning.
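
To make the short-term vs. long-term WKG idea concrete, below is a minimal, purely illustrative Python sketch. The class and method names (WordKnowledgeGraphs, observe_slice) and the exponential-decay merging scheme are our own assumptions for exposition, not the paper's actual GDWE update rule; the point is only that a short-term graph holds the current time slice while a long-term graph accumulates (and slowly forgets) past structure.

    from collections import defaultdict

    class WordKnowledgeGraphs:
        """Illustrative store for short-term and long-term word-level
        knowledge graphs (WKGs). Edges carry word-to-word weights.
        Hypothetical sketch; not the authors' implementation."""

        def __init__(self, decay: float = 0.9):
            # adjacency maps: word -> {neighbor: weight}
            self.short_term = defaultdict(dict)  # current time slice only
            self.long_term = defaultdict(dict)   # accumulated across slices
            self.decay = decay                   # how fast old knowledge fades (assumed)

        def observe_slice(self, edges):
            """Ingest one time slice of (word, neighbor, weight) edges as the
            short-term WKG, then fold it into the long-term WKG."""
            self.short_term.clear()
            for w, v, weight in edges:
                self.short_term[w][v] = self.short_term[w].get(v, 0.0) + weight
            # decay old long-term weights, then absorb the new slice
            for w in self.long_term:
                for v in self.long_term[w]:
                    self.long_term[w][v] *= self.decay
            for w, nbrs in self.short_term.items():
                for v, weight in nbrs.items():
                    self.long_term[w][v] = self.long_term[w].get(v, 0.0) + weight

        def neighbors(self, word, horizon="long"):
            """Structural context for a word, usable as a supplement to
            lexical co-occurrence when updating its embedding."""
            graph = self.long_term if horizon == "long" else self.short_term
            return dict(graph[word])

    # Usage: later slices shift a word's long-term neighborhood,
    # mimicking how structural context can track semantic drift.
    wkg = WordKnowledgeGraphs()
    wkg.observe_slice([("apple", "fruit", 1.0), ("apple", "pie", 0.5)])
    wkg.observe_slice([("apple", "iphone", 2.0)])
    print(wkg.neighbors("apple"))  # decayed old sense plus the new one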
Keywords:
Natural Language Processing: Embeddings
Machine Learning: Online Learning
Machine Learning: Time-series; Data Streams