Accelerating multi-class defect detection of building façades using knowledge distillation of DCNN-based model
- Authors
- Lee, Kisu; Lee, Sanghyo; Kim, Hayoung
- Issue Date
- Jun-2021
- Publisher
- Sustainable Building Research Center
- Keywords
- Building façade defects; Deep learning; Knowledge distillation; Model compression; Multi-class defect detection
- Citation
- International Journal of Sustainable Building Technology and Urban Development, v.12, no.2, pp. 80-95
- Pages
- 16
- Indexed
- SCOPUS
- Journal Title
- International Journal of Sustainable Building Technology and Urban Development
- Volume
- 12
- Number
- 2
- Start Page
- 80
- End Page
- 95
- URI
- https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/118799
- DOI
- 10.22712/susb.20210008
- ISSN
- 2093-761X
- 2093-7628
- Abstract
- This paper proposes a high-speed detection method for multi-class defects in residential building façades. Automated deep learning-based defect detection systems have been developed to compensate for the shortcomings of existing human-centered defect management methods for building façades. However, the superior performance of deep learning-based models often comes at the cost of longer inference times, while adopting a lightweight model instead degrades detection performance. We propose to prevent this degradation through a knowledge distillation (KD) method. This study was conducted using approximately 10,000 building façade images obtained by drones. Using these data, we compared the performance of a lightweight model trained conventionally against the same model trained with KD. As a result, mean average precision (mAP) increased by approximately 20%, and inference time decreased by a factor of approximately 2.5. © International Journal of Sustainable Building Technology and Urban Development.
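The knowledge distillation approach the abstract describes trains the lightweight student model to match the softened output distribution of a larger teacher model in addition to the ground-truth labels. A minimal sketch of a Hinton-style KD loss is below; the temperature `T`, weighting `alpha`, and exact loss formulation are illustrative assumptions, not the paper's reported settings.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, true_label, T=4.0, alpha=0.5):
    """Illustrative knowledge-distillation loss (sketch, not the paper's
    exact formulation): cross-entropy on the hard label combined with
    KL divergence between temperature-softened teacher and student
    distributions."""
    # Hard-label term: standard cross-entropy against the ground truth.
    p_student = softmax(student_logits)
    hard_loss = -math.log(p_student[true_label] + 1e-12)

    # Soft-label term: KL(teacher || student) at temperature T,
    # scaled by T^2 to keep gradient magnitudes comparable.
    p_teacher_T = softmax(teacher_logits, T)
    p_student_T = softmax(student_logits, T)
    soft_loss = (T * T) * sum(
        t * math.log((t + 1e-12) / (s + 1e-12))
        for t, s in zip(p_teacher_T, p_student_T)
    )
    return alpha * hard_loss + (1 - alpha) * soft_loss
```

During training, the teacher's logits for each façade image would be precomputed or produced in a forward pass, and the student is optimized on this combined loss; at inference time only the lightweight student runs, which is what yields the reported speedup.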
- Appears in Collections
- COLLEGE OF ENGINEERING SCIENCES > MAJOR IN BUILDING INFORMATION TECHNOLOGY > 1. Journal Articles