Detailed Information


Focus on the Core: Efficient Attention via Pruned Token Compression for Document Classification

Authors
Yun, Jungmin; Kim, Mihyeon; Kim, Youngbin
Issue Date
2023
Publisher
Association for Computational Linguistics (ACL)
Citation
Findings of the Association for Computational Linguistics: EMNLP 2023, pp 13617 - 13628
Pages
12
Journal Title
Findings of the Association for Computational Linguistics: EMNLP 2023
Start Page
13617
End Page
13628
URI
https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/72809
Abstract
Transformer-based models have achieved dominant performance in numerous NLP tasks. Despite their remarkable successes, pre-trained transformers such as BERT suffer from a computationally expensive self-attention mechanism that interacts with all tokens, including the ones unfavorable to classification performance. To overcome these challenges, we propose integrating two strategies: token pruning and token combining. Token pruning eliminates less important tokens in the attention mechanism's key and value as they pass through the layers. Additionally, we adopt fuzzy logic to handle uncertainty and alleviate potential mispruning risks arising from an imbalanced distribution of each token's importance. Token combining, on the other hand, condenses input sequences into smaller sizes in order to further compress the model. By integrating these two approaches, we not only improve the model's performance but also reduce its computational demands. Experiments with various datasets demonstrate superior performance compared to baseline models, especially with the best improvement over the existing BERT model, achieving +5%p in accuracy and +5.6%p in F1 score. Additionally, memory cost is reduced to 0.61x, and a speedup of 1.64x is achieved. © 2023 Association for Computational Linguistics.
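The abstract's core idea, pruning unimportant tokens from the attention mechanism's key and value matrices, can be illustrated with a minimal sketch. This is a hypothetical NumPy illustration of attention-score-based token pruning, not the paper's exact method (which additionally uses fuzzy logic for importance scoring and a separate token-combining step); the function name `prune_attention` and the importance heuristic (total attention mass received per key token) are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def prune_attention(q, k, v, keep_ratio=0.5):
    """Single-head attention that keeps only the most important
    key/value tokens, scored by the attention mass they receive.

    Hypothetical sketch: the paper's method also applies fuzzy logic
    to the importance scores and combines surviving tokens.
    """
    d = q.shape[-1]
    scores = softmax(q @ k.T / np.sqrt(d))   # (n_query, n_key) attention weights
    importance = scores.sum(axis=0)          # total mass each key token receives
    n_keep = max(1, int(len(importance) * keep_ratio))
    keep = np.sort(np.argsort(importance)[-n_keep:])  # indices of retained tokens
    k_pruned, v_pruned = k[keep], v[keep]
    # Recompute attention over the smaller key/value set: cost drops
    # from O(n_q * n_k) to O(n_q * n_keep) per layer.
    out = softmax(q @ k_pruned.T / np.sqrt(d)) @ v_pruned
    return out, keep
```

Because subsequent layers attend over the pruned key/value set, the quadratic attention cost shrinks layer by layer, which is the source of the reported memory and speed gains.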
Appears in
Collections
Graduate School of Advanced Imaging Sciences, Multimedia and Film > Department of Imaging Science and Arts > 1. Journal Articles


Related Researcher

Kim, Young Bin
Graduate School of Advanced Imaging Sciences (Department of Imaging Science)
