Generative adversarial networks via a composite annealing of noise and diffusion
- Authors
- Nakamura, Kensuke; Korman, Simon; Hong, Byung-Woo
- Issue Date
- Feb-2024
- Publisher
- Elsevier Ltd
- Keywords
- Coarse-to-fine training; Generative adversarial networks; Noise injection; Optimization; Scale-space
- Citation
- Pattern Recognition, v.146
- Journal Title
- Pattern Recognition
- Volume
- 146
- URI
- https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/68581
- DOI
- 10.1016/j.patcog.2023.110034
- ISSN
- 0031-3203
- 1873-5142
- Abstract
- A generative adversarial network (GAN) is a framework for generating synthetic data from a set of real examples, but GAN training is unstable. To stabilize GANs, noise injection has been used to enlarge the overlap between the real and fake distributions, at the cost of increased variance. The diffusion process (data smoothing in the spatial domain) removes fine details in order to capture the structure and salient patterns in data, but it suppresses the GAN's ability to learn high-frequency information during training. Based on these observations, we propose a data representation for GAN training, called noisy scale-space (NSS), which recursively applies smoothing with balanced noise so that high-frequency information is replaced by random data, leading to coarse-to-fine training of GANs. We evaluate NSS with DCGAN and StyleGAN2 on benchmark datasets, where NSS-based GANs outperform the state of the art in most cases. © 2023 Elsevier Ltd
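The abstract describes NSS as a recursive smoothing that replaces removed high-frequency detail with balanced noise, producing a coarse-to-fine sequence of training targets. The following is a minimal, hypothetical sketch of that idea for a 1-D signal; the function names, the Gaussian-blur implementation, and the variance-matching rule for the injected noise are assumptions of this sketch, not the paper's actual construction.

```python
import numpy as np

def gaussian_kernel(sigma):
    # Truncated Gaussian kernel, normalized to sum to 1.
    radius = max(1, int(3 * sigma))
    t = np.arange(-radius, radius + 1)
    k = np.exp(-t ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def noisy_scale_space(x, levels=4, sigma=1.0, noise_weight=0.5, seed=0):
    """Hypothetical NSS sketch: each level smooths the previous one and
    re-injects random noise scaled to the magnitude of the detail that the
    smoothing removed (a modeling assumption), so high-frequency content
    is replaced by random data rather than simply discarded."""
    rng = np.random.default_rng(seed)
    k = gaussian_kernel(sigma)
    radius = len(k) // 2
    pyramid = [x]
    current = x
    for _ in range(levels - 1):
        padded = np.pad(current, radius, mode="reflect")
        smoothed = np.convolve(padded, k, mode="valid")
        removed = current - smoothed                      # detail lost to smoothing
        noise = rng.standard_normal(current.shape) * removed.std()
        current = smoothed + noise_weight * noise
        pyramid.append(current)
    return pyramid[::-1]  # coarsest level first, for coarse-to-fine training
```

In a coarse-to-fine schedule, a GAN would be trained first on the coarsest (most smoothed, most noise-dominated) level and then progressively on finer ones, ending with the original data.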
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- College of Software > Department of Artificial Intelligence > 1. Journal Articles