Meta-Transformer: A Meta-Learning Framework for Scalable Automatic Modulation Classification
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Jang, Jungik | - |
dc.contributor.author | Pyo, Jisung | - |
dc.contributor.author | Yoon, Young-Il | - |
dc.contributor.author | Choi, Jaehyuk | - |
dc.date.accessioned | 2024-02-15T15:30:18Z | - |
dc.date.available | 2024-02-15T15:30:18Z | - |
dc.date.issued | 2024-01 | - |
dc.identifier.issn | 2169-3536 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/90430 | - |
dc.description.abstract | Recent advances in deep learning (DL) have led many contemporary automatic modulation classification (AMC) techniques to use deep networks in classifying the modulation type of incoming signals at the receiver. However, current DL-based methods face scalability challenges, particularly when encountering unseen modulations or input signals from environments not present during model training, making them less suitable for real-world applications like software-defined radio devices. In this paper, we introduce a scalable AMC scheme that provides flexibility for new modulations and adaptability to input signals with diverse configurations. We propose the Meta-Transformer, a meta-learning framework based on few-shot learning (FSL) to acquire general knowledge and a learning method for AMC tasks. This approach empowers the model to identify new unseen modulations using only a very small number of samples, eliminating the need for complete model retraining. Furthermore, we enhance the scalability of the classifier by leveraging main-sub transformer-based encoders, enabling efficient processing of input signals with diverse setups. Extensive evaluations demonstrate that the proposed AMC method outperforms existing techniques across all signal-to-noise ratios (SNRs) on RadioML2018.01A. The source code and pre-trained models are released at https://github.com/cheeseBG/meta-transformer-amc. | - |
dc.format.extent | 10 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | - |
dc.title | Meta-Transformer: A Meta-Learning Framework for Scalable Automatic Modulation Classification | - |
dc.type | Article | - |
dc.identifier.wosid | 001150390800001 | - |
dc.identifier.doi | 10.1109/ACCESS.2024.3352634 | - |
dc.identifier.bibliographicCitation | IEEE ACCESS, v.12, pp 9267 - 9276 | - |
dc.description.isOpenAccess | Y | - |
dc.identifier.scopusid | 2-s2.0-85182950979 | - |
dc.citation.endPage | 9276 | - |
dc.citation.startPage | 9267 | - |
dc.citation.title | IEEE ACCESS | - |
dc.citation.volume | 12 | - |
dc.type.docType | Article | - |
dc.publisher.location | United States | - |
dc.subject.keywordAuthor | Automatic modulation classification | - |
dc.subject.keywordAuthor | few-shot learning | - |
dc.subject.keywordAuthor | meta-learning | - |
dc.subject.keywordAuthor | transformer | - |
dc.subject.keywordAuthor | unseen dataset | - |
dc.subject.keywordPlus | CNN | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Telecommunications | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.relation.journalWebOfScienceCategory | Telecommunications | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
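The abstract describes classifying previously unseen modulations from only a few labeled samples via few-shot learning. As an illustration of how such an episode can work, below is a minimal sketch of prototypical-network-style few-shot classification, one common FSL formulation; the paper's actual method (Meta-Transformer with main-sub transformer encoders) may differ, and the `embed` function here is a hypothetical placeholder for a learned signal encoder.

```python
import numpy as np

def embed(iq_batch: np.ndarray) -> np.ndarray:
    """Placeholder embedding: flatten I/Q frames into vectors.
    (The paper uses transformer-based encoders; this stand-in only
    illustrates the episode structure, not the actual model.)"""
    return iq_batch.reshape(iq_batch.shape[0], -1)

def prototypes(support: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Mean embedding per class over the K 'shots' in the support set."""
    emb = embed(support)
    classes = np.unique(labels)
    return np.stack([emb[labels == c].mean(axis=0) for c in classes])

def classify(query: np.ndarray, protos: np.ndarray) -> np.ndarray:
    """Assign each query signal to the nearest class prototype
    (squared Euclidean distance in embedding space)."""
    q = embed(query)
    dists = ((q[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)

# Toy 2-way 3-shot episode: two synthetic "modulations" as
# well-separated Gaussian I/Q frames of shape (2 channels, 8 samples).
rng = np.random.default_rng(0)
support = np.concatenate([rng.normal(0.0, 0.1, (3, 2, 8)),
                          rng.normal(5.0, 0.1, (3, 2, 8))])
labels = np.array([0, 0, 0, 1, 1, 1])
protos = prototypes(support, labels)
query = np.stack([rng.normal(0.0, 0.1, (2, 8)),
                  rng.normal(5.0, 0.1, (2, 8))])
pred = classify(query, protos)
```

Because only the support-set averaging and a nearest-prototype lookup are needed at inference time, adding a new modulation class requires a handful of labeled samples rather than retraining the whole model, which is the scalability property the abstract emphasizes.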