SinLU: Sinu-Sigmoidal Linear Unit
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Paul, A. | - |
dc.contributor.author | Bandyopadhyay, R. | - |
dc.contributor.author | Yoon, Jin Hee | - |
dc.contributor.author | Geem, Zong Woo | - |
dc.contributor.author | Sarkar, R. | - |
dc.date.accessioned | 2022-02-27T01:40:12Z | - |
dc.date.available | 2022-02-27T01:40:12Z | - |
dc.date.created | 2022-02-03 | - |
dc.date.issued | 2022-02 | - |
dc.identifier.issn | 2227-7390 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/83563 | - |
dc.description.abstract | Non-linear activation functions are integral parts of deep neural architectures. Given the large and complex datasets used with neural networks, computational complexity and approximation capability can differ significantly depending on the activation function used. Parameterizing an activation function with learnable parameters generally improves performance. Herein, a novel activation function called the Sinu-sigmoidal Linear Unit (or SinLU) is proposed. SinLU is formulated as SinLU(x) = (x + a sin bx) · σ(x), where σ(x) is the sigmoid function. The proposed function incorporates a sine wave, enabling new functionalities over traditional linear unit activations. Two trainable parameters control the participation of the sinusoidal component and help to achieve an easily trainable, fast-converging function. The performance of the proposed SinLU is compared against widely used activation functions, such as ReLU, GELU and SiLU. We show the robustness of the proposed activation function by conducting experiments in a wide array of domains, using multiple types of neural network-based models on standard datasets. The use of a sine wave with trainable parameters results in better performance of SinLU than commonly used activation functions. © 2022 by the authors. Licensee MDPI, Basel, Switzerland. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | MDPI | - |
dc.relation.isPartOf | Mathematics | - |
dc.title | SinLU: Sinu-Sigmoidal Linear Unit | - |
dc.type | Article | - |
dc.type.rims | ART | - |
dc.description.journalClass | 1 | - |
dc.identifier.wosid | 000755335500001 | - |
dc.identifier.doi | 10.3390/math10030337 | - |
dc.identifier.bibliographicCitation | Mathematics, v.10, no.3 | - |
dc.description.isOpenAccess | N | - |
dc.identifier.scopusid | 2-s2.0-85123554761 | - |
dc.citation.title | Mathematics | - |
dc.citation.volume | 10 | - |
dc.citation.number | 3 | - |
dc.contributor.affiliatedAuthor | Geem, Zong Woo | - |
dc.type.docType | Article | - |
dc.subject.keywordAuthor | Activation function | - |
dc.subject.keywordAuthor | CNN | - |
dc.subject.keywordAuthor | Deep learning | - |
dc.subject.keywordAuthor | Sigmoid function | - |
dc.subject.keywordAuthor | Sinusoidal curve | - |
dc.subject.keywordAuthor | Trainable parameter | - |
dc.relation.journalResearchArea | Mathematics | - |
dc.relation.journalWebOfScienceCategory | Mathematics | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
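The abstract's formula, SinLU(x) = (x + a sin bx) · σ(x), can be sketched as a scalar function. Note this is a minimal illustrative sketch, not the authors' implementation: in the paper, a and b are trainable parameters learned during optimization, whereas here they are fixed keyword arguments.

```python
import math

def sigmoid(x: float) -> float:
    """Numerically stable logistic sigmoid σ(x) = 1 / (1 + e^(-x))."""
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)

def sinlu(x: float, a: float = 1.0, b: float = 1.0) -> float:
    """SinLU(x) = (x + a * sin(b * x)) * sigmoid(x).

    `a` scales the amplitude and `b` the frequency of the sinusoidal
    component; in the paper both are trainable, here they default to 1.
    """
    return (x + a * math.sin(b * x)) * sigmoid(x)
```

Like SiLU, the function is near zero for large negative inputs (the sigmoid gate suppresses them) and approaches x + a sin(bx) for large positive inputs, so the sine term adds a bounded oscillation on top of a SiLU-like backbone.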