Transformer Architecture and Attention Mechanisms in Genome Data Analysis: A Comprehensive Review
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Choi, Sanghyuk Roy | - |
dc.contributor.author | Lee, Minhyeok | - |
dc.date.accessioned | 2024-01-09T04:33:34Z | - |
dc.date.available | 2024-01-09T04:33:34Z | - |
dc.date.issued | 2023-07 | - |
dc.identifier.issn | 2079-7737 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/69879 | - |
dc.description.abstract | The emergence and rapid development of deep learning, specifically transformer-based architectures and attention mechanisms, have had transformative implications across several domains, including bioinformatics and genome data analysis. Because genome sequences are analogous to language texts, techniques that have proven successful in natural language processing can be applied to genomic data. This review provides a comprehensive analysis of the most recent advancements in the application of transformer architectures and attention mechanisms to genome and transcriptome data. The focus of this review is the critical evaluation of these techniques, discussing their advantages and limitations in the context of genome data analysis. Given the swift pace of development in deep learning methodologies, it is vital to continually assess the current standing and future direction of the research. This review therefore aims to serve as a timely resource for both seasoned researchers and newcomers, offering a panoramic view of recent advancements and elucidating state-of-the-art applications in the field. Furthermore, it highlights potential areas of future investigation by critically evaluating studies from 2019 to 2023, thereby acting as a stepping-stone for further research endeavors. | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Multidisciplinary Digital Publishing Institute (MDPI) | - |
dc.title | Transformer Architecture and Attention Mechanisms in Genome Data Analysis: A Comprehensive Review | - |
dc.type | Article | - |
dc.identifier.doi | 10.3390/biology12071033 | - |
dc.identifier.bibliographicCitation | Biology, v.12, no.7 | - |
dc.description.isOpenAccess | Y | - |
dc.identifier.wosid | 001034881600001 | - |
dc.identifier.scopusid | 2-s2.0-85166245762 | - |
dc.citation.number | 7 | - |
dc.citation.title | Biology | - |
dc.citation.volume | 12 | - |
dc.type.docType | Review | - |
dc.publisher.location | Switzerland | - |
dc.subject.keywordAuthor | attention mechanism | - |
dc.subject.keywordAuthor | bioinformatics | - |
dc.subject.keywordAuthor | deep learning | - |
dc.subject.keywordAuthor | genome data | - |
dc.subject.keywordAuthor | genomics | - |
dc.subject.keywordAuthor | natural language processing | - |
dc.subject.keywordAuthor | sequence analysis | - |
dc.subject.keywordAuthor | transcriptome data | - |
dc.subject.keywordAuthor | transformer model | - |
dc.subject.keywordPlus | MIRNA-DISEASE ASSOCIATIONS | - |
dc.subject.keywordPlus | NEURAL-NETWORK | - |
dc.subject.keywordPlus | DEEP | - |
dc.subject.keywordPlus | PREDICTION | - |
dc.subject.keywordPlus | MODEL | - |
dc.subject.keywordPlus | FUSION | - |
dc.relation.journalResearchArea | Life Sciences & Biomedicine - Other Topics | - |
dc.relation.journalWebOfScienceCategory | Biology | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |