Enhancement of Multi-Class Structural Defect Recognition Using Generative Adversarial Network (open access)
- Authors
- Shin, Hyunkyu; Ahn, Yonghan; Tae, Sungho; Gil, Heungbae; Song, Mihwa; Lee, Sanghyo
- Issue Date
- Nov-2021
- Publisher
- MDPI Open Access Publishing
- Keywords
- generative adversarial network; data augmentation; defect recognition; deep learning; convolutional neural network
- Citation
- Sustainability, v.13, no.22, pp 1 - 13
- Pages
- 13
- Indexed
- SCIE; SSCI; SCOPUS
- Journal Title
- Sustainability
- Volume
- 13
- Number
- 22
- Start Page
- 1
- End Page
- 13
- URI
- https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/108121
- DOI
- 10.3390/su132212682
- ISSN
- 2071-1050
- Abstract
- Recently, in the building and infrastructure fields, studies on defect detection methods using deep learning have been widely conducted. Robust automatic recognition of defects in buildings requires a sufficiently large training dataset for the target defects. However, it is challenging to collect sufficient data from degrading building structures. To address this data shortage and imbalance problem, in this study, a data augmentation method was developed using a generative adversarial network (GAN). To confirm the effect of data augmentation on a defect dataset of old structures, experiments were conducted comparing two scenarios. As a result, the models trained with GAN-based data augmentation improved average performance by approximately 0.16 compared to the model trained on the small dataset alone. Based on these results, the GAN-based data augmentation strategy is expected to be a reliable alternative for complementing defect datasets with an unbalanced number of objects.
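The augmentation strategy in the abstract can be illustrated with a minimal sketch: train a GAN so the generator's output distribution matches the scarce defect class, then sample synthetic examples to rebalance the dataset. This toy version is an assumption for illustration only — it uses a 1-D feature and linear generator/discriminator with hand-derived gradients, whereas the paper works with defect images and a convolutional GAN.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical scarce defect class: a 1-D measurement distributed N(4.0, 1.25).
def sample_real(n):
    return rng.normal(4.0, 1.25, size=n)

# Generator G(z) = w_g*z + b_g maps noise to samples;
# discriminator D(x) = sigmoid(w_d*x + b_d) scores real vs. fake.
w_g, b_g = 1.0, 0.0
w_d, b_d = 0.0, 0.0
lr, batch = 0.05, 64

for step in range(2000):
    # Discriminator step: push D(real) -> 1 and D(fake) -> 0.
    x_real = sample_real(batch)
    z = rng.standard_normal(batch)
    x_fake = w_g * z + b_g
    d_real = sigmoid(w_d * x_real + b_d)
    d_fake = sigmoid(w_d * x_fake + b_d)
    # Gradient of binary cross-entropy w.r.t. the logit is (prediction - target).
    g_real, g_fake = d_real - 1.0, d_fake - 0.0
    w_d -= lr * np.mean(g_real * x_real + g_fake * x_fake)
    b_d -= lr * np.mean(g_real + g_fake)

    # Generator step (non-saturating loss): minimize -log D(G(z)).
    z = rng.standard_normal(batch)
    x_fake = w_g * z + b_g
    d_fake = sigmoid(w_d * x_fake + b_d)
    gx = (d_fake - 1.0) * w_d          # dL/dx_fake via the chain rule
    w_g -= lr * np.mean(gx * z)
    b_g -= lr * np.mean(gx)

# Augment the scarce class with synthetic samples drawn from the trained generator.
synthetic = w_g * rng.standard_normal(1000) + b_g
```

In the paper's setting the same idea applies per defect class: a GAN is trained on the underrepresented class, and its synthetic images are added to the training set before fitting the CNN classifier.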
- Appears in Collections
- COLLEGE OF ENGINEERING SCIENCES > MAJOR IN BUILDING INFORMATION TECHNOLOGY > 1. Journal Articles
- COLLEGE OF ENGINEERING SCIENCES > MAJOR IN ARCHITECTURAL ENGINEERING > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.