
Correlation Between Attention Heads of BERT

Full metadata record
DC Field: Value
dc.contributor.author: Yang, Seungmo
dc.contributor.author: Kang, Mincheal
dc.contributor.author: Seo, Jiwon
dc.contributor.author: Kim, Younghoon
dc.date.accessioned: 2023-05-03T09:35:09Z
dc.date.available: 2023-05-03T09:35:09Z
dc.date.issued: 2022-04
dc.identifier.issn: 0000-0000
dc.identifier.uri: https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/112579
dc.description.abstract: Recently, as deep learning has achieved tremendous success across a variety of application domains, natural language processing based on deep learning has also become widespread in research. The performance of typical models such as Transformer, BERT, and GPT is excellent, approaching human performance. However, due to the complicated structure of operations such as self-attention, the role of internal outputs between layers and the relationships between latent vectors have seldom been studied compared to CNNs. In this work, we calculate the correlation between the outputs of the multiple self-attention heads in each layer of a pre-trained BERT model and investigate whether redundantly trained heads exist; that is, we test whether the output latent vectors of one attention head can be linearly transformed into those of another head. Our experiments show that there are heads with high correlation, which implies that examining the correlation between heads may help optimize the structure of BERT.
dc.format.extent: 3
dc.language: English
dc.language.iso: ENG
dc.publisher: IEEE
dc.title: Correlation Between Attention Heads of BERT
dc.type: Article
dc.publisher.location: United States
dc.identifier.doi: 10.1109/ICEIC54506.2022.9748643
dc.identifier.scopusid: 2-s2.0-85128836688
dc.identifier.wosid: 000942023400099
dc.identifier.bibliographicCitation: 2022 International Conference on Electronics, Information, and Communication (ICEIC), pp. 1-3
dc.citation.title: 2022 International Conference on Electronics, Information, and Communication (ICEIC)
dc.citation.startPage: 1
dc.citation.endPage: 3
dc.type.docType: Proceedings Paper
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Engineering
dc.relation.journalResearchArea: Telecommunications
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory: Telecommunications
dc.subject.keywordPlus: Computer vision
dc.subject.keywordPlus: Deep learning
dc.subject.keywordPlus: Applications domains
dc.subject.keywordPlus: BERT
dc.subject.keywordPlus: Correlation
dc.subject.keywordPlus: Human performance
dc.subject.keywordPlus: Latent vectors
dc.subject.keywordPlus: Performance
dc.subject.keywordPlus: Self-attention head
dc.subject.keywordPlus: Structure of operations
dc.subject.keywordPlus: Natural language processing systems
dc.subject.keywordAuthor: BERT
dc.subject.keywordAuthor: self-attention head
dc.subject.keywordAuthor: correlation
dc.identifier.url: https://ieeexplore.ieee.org/document/9748643
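The abstract describes testing whether the output latent vectors of one attention head can be linearly transformed into those of another. The paper's exact procedure is not reproduced in this record; below is a minimal illustrative sketch of such a linear-transform test using least squares on synthetic head outputs (all names, the synthetic data, and the use of R² as the correlation measure are assumptions, not the authors' method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "head outputs": n token positions, d-dimensional latent vectors.
n, d = 512, 64
head_a = rng.normal(size=(n, d))

# Head B is a hidden linear transform of head A plus small noise,
# mimicking a redundantly trained attention head.
W_true = rng.normal(size=(d, d))
head_b = head_a @ W_true + 0.01 * rng.normal(size=(n, d))

# An unrelated head for comparison.
head_c = rng.normal(size=(n, d))

def linear_fit_r2(X, Y):
    """Fit Y ~ X @ W by least squares and return the R^2 of the fit.

    A high R^2 indicates Y is close to a linear transform of X,
    i.e. the two heads carry largely redundant information.
    """
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ W
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((Y - Y.mean(axis=0)) ** 2)
    return 1.0 - ss_res / ss_tot

print(f"R^2 (A -> B, redundant pair): {linear_fit_r2(head_a, head_b):.3f}")
print(f"R^2 (A -> C, unrelated pair): {linear_fit_r2(head_a, head_c):.3f}")
```

In a real setting, `head_a` and `head_b` would be the per-token output vectors of two attention heads collected from a pre-trained BERT model over a corpus; a near-perfect linear fit between two heads suggests one of them could be pruned or merged when optimizing the model's structure.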
Appears in Collections: COLLEGE OF COMPUTING > DEPARTMENT OF ARTIFICIAL INTELLIGENCE > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Kim, Young hoon
ERICA College of Computing (DEPARTMENT OF ARTIFICIAL INTELLIGENCE)
