Visualization for Explanation of Deep Learning-Based Defect Detection Model Using Class Activation Map
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Shin, Hyunkyu | - |
dc.contributor.author | Ahn, Yong Han | - |
dc.contributor.author | Song, Mihwa | - |
dc.contributor.author | Gil, Heungbae | - |
dc.contributor.author | Choi, Jungsik | - |
dc.contributor.author | LEE, SANG HYO | - |
dc.date.accessioned | 2023-07-05T05:39:15Z | - |
dc.date.available | 2023-07-05T05:39:15Z | - |
dc.date.issued | 2023-06 | - |
dc.identifier.issn | 1546-2218 | - |
dc.identifier.issn | 1546-2226 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/113121 | - |
dc.description.abstract | Recently, convolutional neural network (CNN)-based visual inspection has been developed to detect defects on building surfaces automatically. The CNN model demonstrates remarkable accuracy in image data analysis; however, the predicted results carry uncertainty in providing accurate information to users because of the "black box" problem in the deep learning model. Therefore, this study proposes a visual explanation method to overcome the uncertainty limitation of CNN-based defect identification. The visually representative gradient-weighted class activation mapping (Grad-CAM) method is adopted to provide visually explainable information. A visualizing evaluation index is proposed to quantitatively analyze visual representations; this index reflects a rough estimate of the concordance rate between the visualized heat map and the intended defects. In addition, an ablation study, adopting three-branch combinations with VGG16, is implemented to identify performance variations by visualizing predicted results. Experiments reveal that the proposed model, combined with hybrid pooling, batch normalization, and multi-attention modules, achieves the best performance with an accuracy of 97.77%, corresponding to an improvement of 2.49% compared with the baseline model. Consequently, this study demonstrates that reliable results from an automatic defect classification model can be provided to an inspector through the visual representation of the predicted results using CNN models. | - |
dc.format.extent | 14 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Tech Science Press | - |
dc.title | Visualization for Explanation of Deep Learning-Based Defect Detection Model Using Class Activation Map | - |
dc.type | Article | - |
dc.publisher.location | United States | - |
dc.identifier.doi | 10.32604/cmc.2023.038362 | - |
dc.identifier.scopusid | 2-s2.0-85165544677 | - |
dc.identifier.wosid | 000992762700010 | - |
dc.identifier.bibliographicCitation | Computers, Materials and Continua, v.75, no.3, pp 4753 - 4766 | - |
dc.citation.title | Computers, Materials and Continua | - |
dc.citation.volume | 75 | - |
dc.citation.number | 3 | - |
dc.citation.startPage | 4753 | - |
dc.citation.endPage | 4766 | - |
dc.type.docType | Article | - |
dc.description.isOpenAccess | Y | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Materials Science | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.relation.journalWebOfScienceCategory | Materials Science, Multidisciplinary | - |
dc.subject.keywordAuthor | Defect detection | - |
dc.subject.keywordAuthor | visualization | - |
dc.subject.keywordAuthor | class activation map | - |
dc.subject.keywordAuthor | deep learning | - |
dc.subject.keywordAuthor | explanation | - |
dc.subject.keywordAuthor | visualizing evaluation index | - |
dc.identifier.url | https://www.techscience.com/cmc/v75n3/52623 | - |
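The abstract above describes a Grad-CAM-based visual explanation pipeline and a visualizing evaluation index that measures how well the heat map matches the intended defects. As a rough illustration, the sketch below computes a Grad-CAM heat map from a plain VGG16 backbone in PyTorch and an IoU-style concordance score against a binary defect mask. It is a minimal sketch under stated assumptions: the paper's actual model (hybrid pooling, batch normalization, multi-attention modules), its defect classes, the 0.5 threshold, and the exact form of the evaluation index are not given in this record, so those choices are illustrative only.

```python
# Minimal Grad-CAM sketch (PyTorch) on a plain VGG16 backbone -- illustration only.
import torch
import torch.nn.functional as F
from torchvision import models


def grad_cam_vgg16(model, image, class_idx=None):
    """Return a Grad-CAM heat map (H x W, values in [0, 1]) for a single image."""
    x = image.unsqueeze(0)                                    # [1, 3, H, W]
    feats = model.features(x)                                 # last conv-block feature map
    logits = model.classifier(torch.flatten(model.avgpool(feats), 1))
    if class_idx is None:
        class_idx = int(logits.argmax(dim=1))                 # explain the predicted class
    grads = torch.autograd.grad(logits[0, class_idx], feats)[0]
    weights = grads.mean(dim=(2, 3), keepdim=True)            # global-average-pooled gradients
    cam = F.relu((weights * feats).sum(dim=1, keepdim=True))  # weighted channel sum + ReLU
    cam = F.interpolate(cam, size=image.shape[1:], mode="bilinear",
                        align_corners=False)[0, 0].detach()
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)


def concordance_rate(cam, defect_mask, threshold=0.5):
    """Rough IoU-style overlap between the thresholded heat map and a binary
    defect mask -- an assumed stand-in for the paper's visualizing evaluation
    index, whose exact definition is not specified in this record."""
    hot = cam >= threshold
    inter = (hot & defect_mask).sum().float()
    union = (hot | defect_mask).sum().float().clamp(min=1)
    return float(inter / union)


if __name__ == "__main__":
    model = models.vgg16(weights=None).eval()   # a defect-class head would replace the ImageNet classifier
    image = torch.rand(3, 224, 224)             # placeholder surface-defect image
    mask = torch.zeros(224, 224, dtype=torch.bool)
    mask[80:140, 60:180] = True                 # placeholder ground-truth defect region

    cam = grad_cam_vgg16(model, image)
    print(f"concordance rate: {concordance_rate(cam, mask):.3f}")
```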