Non-monotonic Activation Function for Deep Learning (딥러닝을 위한 비단조 활성화 함수)
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 정재진 | - |
dc.date.accessioned | 2024-07-12T06:30:19Z | - |
dc.date.available | 2024-07-12T06:30:19Z | - |
dc.date.issued | 2024-06 | - |
dc.identifier.issn | 2671-4744 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/kumoh/handle/2020.sw.kumoh/28763 | - |
dc.description.abstract | The activation function significantly affects the performance of a neural network. Among the many candidates, the Rectified Linear Unit (ReLU) is widely used in deep learning applications owing to its simplicity and performance. This study proposes a new nonlinear activation function derived from the logarithmic and hyperbolic tangent functions. It exhibits three distinct characteristics: 1) for inputs greater than 0, the output equals the input; 2) for inputs near 0, the output is nonlinear; and 3) as the input approaches negative infinity, the output approaches zero. Simulation results show that the proposed activation function surpasses ReLU, Mish, and the Power Function Linear Unit in classification accuracy. In particular, when applied to CIFAR-10 classification with the VGG19 network, it improves accuracy by approximately 1%. | - |
dc.format.extent | 7 | - |
dc.language | Korean | - |
dc.language.iso | KOR | - |
dc.publisher | 국방기술품질원 | - |
dc.title | 딥러닝을 위한 비단조 활성화 함수 | - |
dc.title.alternative | Non-monotonic activation function for deep learning | - |
dc.type | Article | - |
dc.publisher.location | Republic of Korea | - |
dc.identifier.doi | 10.23199/jdqs.2024.6.1.010 | - |
dc.identifier.bibliographicCitation | 국방품질연구논집(JDQS), v.6, no.1, pp. 103-109 | - |
dc.citation.title | 국방품질연구논집(JDQS) | - |
dc.citation.volume | 6 | - |
dc.citation.number | 1 | - |
dc.citation.startPage | 103 | - |
dc.citation.endPage | 109 | - |
dc.identifier.kciid | ART003092490 | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | kciCandi | - |
dc.subject.keywordAuthor | Convolutional Neural Network(CNN) | - |
dc.subject.keywordAuthor | Deep learning | - |
dc.subject.keywordAuthor | Activation function | - |
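The abstract does not give the proposed function's formula, but its three stated properties can be checked numerically. As a stand-in, the sketch below uses Mish (one of the paper's comparison baselines, which is likewise built from logarithmic and hyperbolic tangent functions and satisfies the same three properties); this is NOT the paper's proposed function, only an illustration of the behavior the abstract describes.

```python
import math

def mish(x: float) -> float:
    # Mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))
    # Used here only as a stand-in; the paper's own function is not
    # specified in the abstract.
    return x * math.tanh(math.log1p(math.exp(x)))

# Property 1: for large positive inputs, the output is approximately the input.
assert abs(mish(10.0) - 10.0) < 1e-3

# Property 2: near zero the function is nonlinear, and non-monotonic below 0
# (it dips to negative values, unlike ReLU, which is identically 0 there).
assert mish(-0.5) < 0.0

# Property 3: the output approaches zero as the input goes to negative infinity.
assert abs(mish(-20.0)) < 1e-6
```

The negative-side dip checked in Property 2 is what makes such functions non-monotonic, which the title identifies as the key trait of the proposed activation.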