Detailed Information


Non-monotonic activation function for deep learning (딥러닝을 위한 비단조 활성화 함수)

Other Titles
Non-monotonic activation function for deep learning
Authors
Jeong, Jae Jin (정재진)
Issue Date
Jun-2024
Publisher
국방기술품질원 (Defense Agency for Technology and Quality)
Keywords
Convolutional Neural Network (CNN); Deep learning; Activation function
Citation
국방품질연구논집 (JDQS), v.6, no.1, pp. 103-109
Pages
7
Journal Title
국방품질연구논집 (JDQS)
Volume
6
Number
1
Start Page
103
End Page
109
URI
https://scholarworks.bwise.kr/kumoh/handle/2020.sw.kumoh/28763
DOI
10.23199/jdqs.2024.6.1.010
ISSN
2671-4744
Abstract
The activation function significantly affects the performance of neural networks. Among the numerous functions, the Rectified Linear Unit (ReLU) is widely used in many deep learning applications owing to its simplicity and performance. This study proposes a new nonlinear activation function derived from logarithmic and hyperbolic tangent functions. It exhibits the following distinct characteristics: 1) if the input is greater than 0, the output equals the input; 2) if the input is near 0, the output exhibits nonlinear characteristics; and 3) as the input approaches negative infinity, the output approaches zero. Simulation results show that the proposed activation function surpasses ReLU, Mish, and the Power Function Linear Unit in terms of classification accuracy. In particular, when applied to CIFAR-10 classification with the VGG19 network, it increases accuracy by approximately 1%.
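
Note: this record does not reproduce the closed form of the proposed activation, so the Python sketch below is only an illustrative stand-in. It assumes a hypothetical function, log_tanh_act, built from a logarithm and a hyperbolic tangent so that it reproduces the three behaviours listed in the abstract (identity for positive inputs, a smooth nonlinear bend near zero, and an output that vanishes as the input tends to negative infinity); the published definition may differ.

    import torch
    import torch.nn.functional as F

    def log_tanh_act(x: torch.Tensor) -> torch.Tensor:
        # Assumed form, NOT the published one: f(x) = x * tanh(log(1 + exp(2x))).
        # log(1 + exp(2x)) is the logarithmic part (a scaled softplus); tanh maps
        # it into (0, 1) and acts as a gate on x, so that:
        #   x >> 0    -> gate ~ 1  -> f(x) ~ x        (property 1)
        #   x near 0  -> smooth, non-monotonic bend   (property 2)
        #   x -> -inf -> gate -> 0 -> f(x) -> 0       (property 3)
        return x * torch.tanh(F.softplus(2.0 * x))

    if __name__ == "__main__":
        xs = torch.tensor([-30.0, -1.0, 0.0, 1.0, 30.0])
        print(log_tanh_act(xs))  # roughly [-0.0, -0.13, 0.0, 0.97, 30.0]

To mirror the reported CIFAR-10 / VGG19 experiment, such a function would typically be substituted for every ReLU layer of the network; the roughly 1% accuracy gain cited in the abstract refers to the authors' own function, not to this stand-in.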
Files in This Item
There are no files associated with this item.
Appears in Collections
School of Electronic Engineering > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Jeong, Jae Jin
College of Engineering (School of Electronic Engineering)
