Accurate and Consistent Image-to-Image Conditional Adversarial Network
- Authors
- Islam, Naeem Ul; Lee, Sungmin; Park, Jaebyung
- Issue Date
- Mar-2020
- Publisher
- MDPI
- Keywords
- generative adversarial network; convolutional neural network; consistent image-to-image translation network; autoencoders
- Citation
- ELECTRONICS, v.9, no.3
- Journal Title
- ELECTRONICS
- Volume
- 9
- Number
- 3
- URI
- https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/80766
- DOI
- 10.3390/electronics9030395
- ISSN
- 2079-9292
- Abstract
- Image-to-image translation based on deep learning has attracted interest in the robotics and vision communities because of its potential impact on terrain analysis and on image representation, interpretation, modification, and enhancement. Currently, the most successful approach to generating a translated image is a conditional generative adversarial network (cGAN) that trains an autoencoder with skip connections. Despite its impressive performance, this approach suffers from low accuracy, a lack of consistency, and imbalanced training. This paper proposes a balanced training strategy for image-to-image translation that yields an accurate and consistent network. The proposed approach uses two generators and a single discriminator. The generators translate images from one domain to the other. The discriminator takes input in three different configurations and guides both generators to produce realistic images in their corresponding domains while ensuring high accuracy and consistency. Experiments conducted on several datasets show that the proposed approach outperforms the cGAN in realistic image translation, in terms of both accuracy and consistency of training.
- Files in This Item
- There are no files associated with this item.
- Appears in
Collections - College of IT Convergence > Department of Computer Engineering > 1. Journal Articles
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.