Detailed Information


Parallel data-local training for optimizing Word2Vec embeddings for word and graph embeddings

Authors
Moon, G.E.; Newman-Griffis, D.; Kim, J.; Sukumaran-Rajam, A.; Fosler-Lussier, E.; Sadayappan, P.
Issue Date
Nov-2019
Publisher
Institute of Electrical and Electronics Engineers Inc.
Keywords
Graph Embedding; Learning Latent Representations; Node2Vec; Parallel Machine Learning; Parallel Word2Vec; Unsupervised Learning; Word Embedding
Citation
Proceedings of MLHPC 2019: 5th Workshop on Machine Learning in HPC Environments - Held in conjunction with SC 2019: The International Conference for High Performance Computing, Networking, Storage and Analysis, pp 44 - 55
Pages
12
Journal Title
Proceedings of MLHPC 2019: 5th Workshop on Machine Learning in HPC Environments - Held in conjunction with SC 2019: The International Conference for High Performance Computing, Networking, Storage and Analysis
Start Page
44
End Page
55
URI
https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/63710
DOI
10.1109/MLHPC49564.2019.00010
Abstract
The Word2Vec model is a neural network-based unsupervised word embedding technique widely used in applications such as natural language processing, bioinformatics, and graph mining. Because Word2Vec repeatedly performs Stochastic Gradient Descent (SGD) to minimize its objective function, it is very compute-intensive. However, existing methods for parallelizing Word2Vec are not sufficiently optimized for data locality to achieve high performance. In this paper, we develop a parallel data-locality-enhanced Word2Vec algorithm based on Skip-gram, with a novel negative sampling method that decouples the loss calculation for positive and negative samples; this allows us to efficiently reformulate the negative-sample computation over a sentence as matrix-matrix operations. Experimental results demonstrate that our parallel implementations on multi-core CPUs and GPUs achieve significant performance improvements over existing state-of-the-art parallel Word2Vec implementations while maintaining evaluation quality. We also show the utility of our Word2Vec implementation within the Node2Vec algorithm, which accelerates embedding learning for large graphs. © 2019 IEEE.
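The core idea summarized in the abstract, decoupling the positive- and negative-sample loss terms of Skip-gram with negative sampling so that the negative-sample work over a sentence becomes matrix-matrix products, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation or kernel; the function name, the shared-negatives assumption, and all parameters are illustrative.

```python
import numpy as np

def sigmoid(x):
    """Numerically safe logistic function."""
    return 1.0 / (1.0 + np.exp(-np.clip(x, -30.0, 30.0)))

def sgns_sentence_step(W_in, W_out, centers, contexts, negatives, lr=0.025):
    """One SGD step of Skip-gram with negative sampling over one sentence.

    Positive pairs are handled with per-pair dot products, while a single
    set of negative samples is shared across all pairs in the sentence, so
    the negative-sample scores and gradients become dense matrix-matrix
    products (GEMMs) -- a sketch of the data-locality idea, not the paper's
    exact algorithm.
    """
    h = W_in[centers]                 # (n, d) center-word vectors
    pos = W_out[contexts]             # (n, d) positive context vectors
    # Positive-sample term: elementwise dot products per (center, context) pair.
    s_pos = sigmoid(np.sum(h * pos, axis=1))   # (n,)
    g_pos = (s_pos - 1.0)[:, None]             # (n, 1) gradient coefficient
    # Negative-sample term, batched over the sentence as a GEMM:
    neg = W_out[negatives]            # (k, d) shared negative vectors
    s_neg = sigmoid(h @ neg.T)        # (n, k) all center-negative scores at once
    # Gradients (another GEMM folds the negative contributions back into h).
    grad_h = g_pos * pos + s_neg @ neg         # (n, d)
    grad_pos = g_pos * h                       # (n, d)
    grad_neg = s_neg.T @ h                     # (k, d)
    # Unbuffered updates; np.add.at handles repeated word indices correctly.
    np.add.at(W_in, centers, -lr * grad_h)
    np.add.at(W_out, contexts, -lr * grad_pos)
    np.add.at(W_out, negatives, -lr * grad_neg)
```

Expressing the negative-sample scores as `h @ neg.T` replaces many small dot products with one dense GEMM, which is the kind of reformulation that improves data locality and lets BLAS-style kernels (or GPU tensor cores) do the heavy lifting.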
Files in This Item
There are no files associated with this item.
Appears in
Collections
College of Software > School of Computer Science and Engineering > 1. Journal Articles



Related Researcher

Kim, Jinsung
College of Software (School of Software)
