Improving Adversarial Robustness via Distillation-Based Purification

Full metadata record
dc.contributor.author: Koo, Inhwa
dc.contributor.author: Chae, Dong-Kyu
dc.contributor.author: Lee, Sang-Chul
dc.date.accessioned: 2024-06-12T00:00:17Z
dc.date.available: 2024-06-12T00:00:17Z
dc.date.issued: 2023-10
dc.identifier.issn: 2076-3417
dc.identifier.uri: https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/194756
dc.description.abstract: Despite their impressive performance on many vision tasks, deep neural networks are known to be vulnerable to noise intentionally added to input images. To combat such adversarial examples (AEs), improving the adversarial robustness of models has emerged as an important research topic, with work spanning adversarial training, image denoising, and adversarial purification. This paper focuses on adversarial purification, a pre-processing step that removes noise before AEs reach a classification model. Its advantage is that it can improve robustness without modifying the classifier itself, whereas other defense techniques such as adversarial training suffer from a decrease in model accuracy. Our proposed purification framework uses a Convolutional Autoencoder as its base model to capture the features of images and their spatial structure. We further improve the adversarial robustness of our purification model by distilling knowledge from teacher models. To this end, we train two Convolutional Autoencoders (teachers), one with adversarial training and the other with standard training. Then, through ensemble knowledge distillation, we transfer their ability to denoise and restore original images to the student model (the purification model). Our extensive experiments confirm that our student model achieves high purification performance (i.e., a pre-trained classification model accurately classifies the purified images). An ablation study confirms the positive effect of ensemble knowledge distillation from the two teachers on performance. (A rough code sketch of this distillation setup follows the metadata record below.)
dc.format.extent: 11
dc.language: English
dc.language.iso: ENG
dc.publisher: MDPI
dc.title: Improving Adversarial Robustness via Distillation-Based Purification
dc.type: Article
dc.publisher.location: Switzerland
dc.identifier.doi: 10.3390/app132011313
dc.identifier.scopusid: 2-s2.0-85192464467
dc.identifier.wosid: 001119484700001
dc.identifier.bibliographicCitation: APPLIED SCIENCES-BASEL, v.13, no.20, pp. 1-11
dc.citation.title: APPLIED SCIENCES-BASEL
dc.citation.volume: 13
dc.citation.number: 20
dc.citation.startPage: 1
dc.citation.endPage: 11
dc.type.docType: Article
dc.description.isOpenAccess: Y
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Chemistry
dc.relation.journalResearchArea: Engineering
dc.relation.journalResearchArea: Materials Science
dc.relation.journalResearchArea: Physics
dc.relation.journalWebOfScienceCategory: Chemistry, Multidisciplinary
dc.relation.journalWebOfScienceCategory: Engineering, Multidisciplinary
dc.relation.journalWebOfScienceCategory: Materials Science, Multidisciplinary
dc.relation.journalWebOfScienceCategory: Physics, Applied
dc.subject.keywordAuthor: adversarial robustness
dc.subject.keywordAuthor: adversarial attacks
dc.subject.keywordAuthor: adversarial purification
dc.subject.keywordAuthor: knowledge distillation
dc.subject.keywordAuthor: image classification
dc.subject.keywordAuthor: convolutional autoencoders
dc.identifier.url: https://www.mdpi.com/2076-3417/13/20/11313
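
The abstract describes the method only in prose; the PyTorch sketch below illustrates what ensemble knowledge distillation from two teacher Convolutional Autoencoders into a student purifier could look like. This is a minimal sketch, not the authors' released implementation: the layer sizes, the loss weighting alpha, the equal 0.5 averaging of the two teachers, and the random noise standing in for real adversarial examples are all assumptions.

# Minimal sketch (assumptions noted above): distill two teacher
# Convolutional Autoencoders into a student purification model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvAutoencoder(nn.Module):
    """Small convolutional autoencoder; the paper's exact layers may differ."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 3, stride=2, padding=1, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 3, stride=2, padding=1, output_padding=1), nn.Sigmoid(),
        )
    def forward(self, x):
        return self.decoder(self.encoder(x))

def distillation_step(student, teacher_adv, teacher_std, optimizer,
                      x_adv, x_clean, alpha=0.5):
    """One training step: reconstruct the clean image and imitate the
    averaged outputs of the two frozen teachers (alpha is an assumed weight)."""
    teacher_adv.eval()
    teacher_std.eval()
    with torch.no_grad():
        # Ensemble of the adversarially trained and normally trained teachers.
        teacher_target = 0.5 * (teacher_adv(x_adv) + teacher_std(x_adv))
    purified = student(x_adv)
    loss_recon = F.mse_loss(purified, x_clean)           # restore the original image
    loss_distill = F.mse_loss(purified, teacher_target)  # imitate the teacher ensemble
    loss = alpha * loss_recon + (1 - alpha) * loss_distill
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage sketch: in the paper x_adv would come from an actual attack
# (e.g., FGSM/PGD) on x_clean, and the teachers would be pre-trained.
student = ConvAutoencoder()
teacher_adv, teacher_std = ConvAutoencoder(), ConvAutoencoder()
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
x_clean = torch.rand(8, 3, 32, 32)
x_adv = (x_clean + 0.03 * torch.randn_like(x_clean)).clamp(0, 1)  # stand-in for AEs
print(distillation_step(student, teacher_adv, teacher_std, optimizer, x_adv, x_clean))

After this step the student can be used as a drop-in pre-processor: purified images are passed to an unmodified pre-trained classifier, which is the deployment setting the abstract describes.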
Appears in Collections
Seoul College of Engineering > Seoul School of Computer Science > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Chae, Dong-Kyu
COLLEGE OF ENGINEERING (SCHOOL OF COMPUTER SCIENCE)
