Analysis of Sub-Routines in NVIDIA cuBLAS Library for a series of Matrix-Matrix Multiplications in Transformer
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, D. | - |
dc.contributor.author | Kim, I. | - |
dc.contributor.author | Kim, J. | - |
dc.date.accessioned | 2023-03-08T05:09:57Z | - |
dc.date.available | 2023-03-08T05:09:57Z | - |
dc.date.issued | 2022-10 | - |
dc.identifier.issn | 2162-1233 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/61186 | - |
dc.description.abstract | General matrix-matrix multiplication (GEMM) is a key operation used in a variety of areas such as computational science, data science, and machine learning. In transformers, which are foundation models, Multi-Head Attention (MHA) performs a series of matrix-matrix multiplications. To perform MHA on GPUs, we need to exploit highly optimized sub-routines for GEMM provided by the hardware vendor. On NVIDIA GPUs, the cuBLAS library is provided to support the basic linear algebra subprograms (BLAS). In this paper, we examine and analyze several sub-routines for handling the series of matrix-matrix multiplications used in the transformer model on NVIDIA GPUs. © 2022 IEEE. | - |
dc.format.extent | 3 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | IEEE Computer Society | - |
dc.title | Analysis of Sub-Routines in NVIDIA cuBLAS Library for a series of Matrix-Matrix Multiplications in Transformer | - |
dc.type | Article | - |
dc.identifier.doi | 10.1109/ICTC55196.2022.9952498 | - |
dc.identifier.bibliographicCitation | International Conference on ICT Convergence, v.2022-October, pp 618 - 620 | - |
dc.description.isOpenAccess | N | - |
dc.identifier.scopusid | 2-s2.0-85143257085 | - |
dc.citation.endPage | 620 | - |
dc.citation.startPage | 618 | - |
dc.citation.title | International Conference on ICT Convergence | - |
dc.citation.volume | 2022-October | - |
dc.type.docType | Conference Paper | - |
dc.publisher.location | United States | - |
dc.subject.keywordAuthor | cuBLAS | - |
dc.subject.keywordAuthor | General Matrix-Matrix Multiplication | - |
dc.subject.keywordAuthor | GEMM | - |
dc.subject.keywordAuthor | Multi-Head Attention | - |
dc.subject.keywordAuthor | MHA | - |
dc.subject.keywordAuthor | Transformer | - |
dc.description.journalRegisteredClass | scopus | - |
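As background for the abstract above: each attention head in MHA computes Q·Kᵀ and then (softmax of the scores)·V, so a forward pass issues many independent, identically shaped matrix products. This is exactly the workload that batched GEMM sub-routines in cuBLAS (e.g. `cublasSgemmStridedBatched`) target. A minimal NumPy sketch of that series of multiplications (all shapes and names are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Illustrative MHA shapes (assumptions, not from the paper)
batch, heads, seq_len, d_head = 2, 4, 8, 16

rng = np.random.default_rng(0)
Q = rng.standard_normal((batch, heads, seq_len, d_head))
K = rng.standard_normal((batch, heads, seq_len, d_head))
V = rng.standard_normal((batch, heads, seq_len, d_head))

# First batched GEMM of MHA: scores = Q @ K^T, one (seq_len x seq_len)
# product per (batch, head) pair -- batch*heads independent GEMMs with
# identical shapes, the pattern strided-batched GEMM routines handle.
scores = Q @ K.transpose(0, 1, 3, 2) / np.sqrt(d_head)

# Row-wise softmax over the last axis
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

# Second batched GEMM of MHA: output = weights @ V
out = weights @ V
print(out.shape)  # (2, 4, 8, 16)
```

On a GPU, the two `@` products above would each map to a single strided-batched GEMM call rather than `batch*heads` separate kernel launches, which is the kind of sub-routine choice the paper analyzes.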