Correlation Between Attention Heads of BERT
- Authors
- Yang, Seungmo; Kang, Mincheal; Seo, Jiwon; Kim, Younghoon
- Issue Date
- Apr-2022
- Publisher
- IEEE
- Keywords
- BERT; self-attention head; correlation
- Citation
- 2022 International Conference on Electronics, Information, and Communication (ICEIC), pp 1 - 3
- Pages
- 3
- Indexed
- SCIE; SCOPUS
- Journal Title
- 2022 International Conference on Electronics, Information, and Communication (ICEIC)
- Start Page
- 1
- End Page
- 3
- URI
- https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/112579
- DOI
- 10.1109/ICEIC54506.2022.9748643
- ISSN
- 0000-0000
- Abstract
- Recently, as deep learning has achieved tremendous success across a variety of application domains, natural language processing based on deep learning has also become a very active area of research. Typical models such as the Transformer, BERT, and GPT achieve excellent, near-human performance. However, because of their complicated structure of operations such as self-attention, the roles of internal outputs between layers and the relationships between latent vectors have seldom been studied compared to CNNs. In this work, we calculate the correlation between the outputs of the multiple self-attention heads in each layer of a pre-trained BERT model and investigate whether some heads are trained redundantly; that is, we test whether the output latent vectors of one attention head can be linearly transformed into those of another head. Our experiments show that there are pairs of heads with high correlation, which implies that examining the correlation between heads may help optimize the structure of BERT.
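The abstract describes extracting the per-head outputs of a pre-trained BERT model and testing whether one head's latent vectors can be linearly transformed into another's. The sketch below is one minimal way to reproduce that kind of analysis with HuggingFace Transformers; it is an assumption-laden illustration, not the authors' code. It assumes per-head outputs can be taken from the `BertSelfAttention` context vectors via a forward hook, uses the R^2 of a least-squares linear fit as a stand-in for the paper's correlation measure, and picks an arbitrary layer and a handful of example sentences.

```python
# Hedged sketch: measure how well one BERT attention head's outputs can be
# linearly mapped onto another's within a single encoder layer.
# Assumptions: per-head outputs = slices of the BertSelfAttention context
# vectors; "correlation" = R^2 of a least-squares linear fit; the layer index
# and the sentences below are illustrative only.
import torch
from transformers import BertModel, BertTokenizer

model = BertModel.from_pretrained("bert-base-uncased")
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model.eval()

captured = {}

def hook(module, inputs, outputs):
    # BertSelfAttention returns a tuple; outputs[0] holds the concatenated
    # per-head context vectors of shape (batch, seq_len, hidden_size).
    captured["context"] = outputs[0].detach()

layer_idx = 0  # which encoder layer to inspect (arbitrary choice)
model.encoder.layer[layer_idx].attention.self.register_forward_hook(hook)

# Pool tokens from several sentences so the number of token vectors exceeds
# the per-head dimension; the paper would presumably use a much larger corpus.
sentences = [
    "Attention heads in BERT may learn redundant representations.",
    "Deep learning has become widespread in natural language processing.",
    "Self-attention lets every token attend to every other token.",
    "Pre-trained language models are often larger than they need to be.",
    "Pruning redundant heads can reduce the cost of inference.",
    "The quick brown fox jumps over the lazy dog.",
    "Model compression is an active area of research.",
    "Correlated heads may be candidates for merging or removal.",
]
contexts = []
for text in sentences:
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        model(**enc)
    contexts.append(captured["context"].squeeze(0))   # (seq_len, hidden_size)
context = torch.cat(contexts, dim=0)                  # (total_tokens, hidden_size)

num_heads = model.config.num_attention_heads
head_dim = model.config.hidden_size // num_heads
# Split the concatenated context vectors back into per-head outputs.
heads = context.view(-1, num_heads, head_dim).permute(1, 0, 2)  # (heads, tokens, dim)

def linear_fit_r2(x, y):
    """R^2 of the least-squares linear map from head outputs x to head outputs y."""
    w = torch.linalg.lstsq(x, y).solution          # (dim, dim)
    residual = y - x @ w
    ss_res = residual.pow(2).sum()
    ss_tot = (y - y.mean(dim=0)).pow(2).sum()
    return 1.0 - (ss_res / ss_tot).item()

# Pairwise linear-fit scores between heads within the chosen layer;
# pairs with high R^2 are candidates for redundancy.
for a in range(num_heads):
    for b in range(a + 1, num_heads):
        r2 = linear_fit_r2(heads[a], heads[b])
        print(f"layer {layer_idx}, heads ({a}, {b}): R^2 = {r2:.3f}")
```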
- Appears in Collections
- COLLEGE OF COMPUTING > DEPARTMENT OF ARTIFICIAL INTELLIGENCE > 1. Journal Articles
