Compressing deep graph convolution network with multi-staged knowledge distillation
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Junghun | - |
dc.contributor.author | Jung, Jinhong | - |
dc.contributor.author | Kang, U. | - |
dc.date.accessioned | 2023-09-25T08:40:19Z | - |
dc.date.available | 2023-09-25T08:40:19Z | - |
dc.date.created | 2023-09-25 | - |
dc.date.issued | 2021-08 | - |
dc.identifier.issn | 1932-6203 | - |
dc.identifier.uri | http://scholarworks.bwise.kr/ssu/handle/2018.sw.ssu/44299 | - |
dc.description.abstract | Given a trained deep graph convolution network (GCN), how can we effectively compress it into a compact network without a significant loss of accuracy? Compressing a trained deep GCN into a compact GCN is important for deploying the model in environments with limited computing resources, such as mobile or embedded systems. However, previous works on compressing deep GCNs do not consider multi-hop aggregation, even though it is the main purpose of stacking multiple GCN layers. In this work, we propose MustaD (Multi-staged knowledge Distillation), a novel approach for compressing deep GCNs into single-layered GCNs through multi-staged knowledge distillation (KD). MustaD distills the knowledge of 1) the aggregation from multiple GCN layers and 2) the task prediction, while preserving the multi-hop feature aggregation of deep GCNs in a single effective layer. Extensive experiments on four real-world datasets show that MustaD achieves state-of-the-art performance among KD-based methods; specifically, MustaD improves accuracy by up to 4.21 percentage points over the second-best KD models. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | PUBLIC LIBRARY SCIENCE | - |
dc.relation.isPartOf | PLOS ONE | - |
dc.title | Compressing deep graph convolution network with multi-staged knowledge distillation | - |
dc.type | Article | - |
dc.identifier.doi | 10.1371/journal.pone.0256187 | - |
dc.type.rims | ART | - |
dc.identifier.bibliographicCitation | PLOS ONE, v.16, no.8 | - |
dc.description.journalClass | 1 | - |
dc.identifier.wosid | 000684737400012 | - |
dc.identifier.scopusid | 2-s2.0-85112780267 | - |
dc.citation.number | 8 | - |
dc.citation.title | PLOS ONE | - |
dc.citation.volume | 16 | - |
dc.contributor.affiliatedAuthor | Jung, Jinhong | - |
dc.identifier.url | https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0256187 | - |
dc.type.docType | Article | - |
dc.description.isOpenAccess | Y | - |
dc.relation.journalResearchArea | Science & Technology - Other Topics | - |
dc.relation.journalWebOfScienceCategory | Multidisciplinary Sciences | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
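
Below is a minimal, hypothetical PyTorch sketch of the multi-staged KD idea described in the abstract: a single-layered student that repeats parameter-free propagation to mimic the teacher's multi-hop aggregation, trained with a loss that distills both the teacher's aggregated features and its task predictions. All names here (`SingleLayerStudent`, `distillation_loss`, `alpha`, `beta`, `temp`, `adj_norm`) are illustrative assumptions, not the paper's released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SingleLayerStudent(nn.Module):
    """A single effective GCN layer whose K rounds of parameter-free
    propagation emulate the K-hop aggregation of a deep teacher."""

    def __init__(self, in_dim, hidden_dim, num_classes, k_hops):
        super().__init__()
        self.k_hops = k_hops                       # depth of aggregation to mimic
        self.lin = nn.Linear(in_dim, hidden_dim)   # the one learnable layer
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, adj_norm, x):
        h = self.lin(x)
        for _ in range(self.k_hops):                # repeated, parameter-free propagation
            h = adj_norm @ h                        # adj_norm: normalized adjacency matrix
        return h, self.classifier(h)


def distillation_loss(student_feat, teacher_feat,
                      student_logits, teacher_logits,
                      labels, train_mask, temp=2.0, alpha=0.5, beta=0.5):
    """Combine (1) feature-aggregation KD, (2) prediction KD, and supervised CE.
    alpha/beta/temp are illustrative hyperparameters, not values from the paper."""
    # 1) match the teacher's final aggregated node representations
    feat_kd = F.mse_loss(student_feat, teacher_feat)
    # 2) match the teacher's softened class distribution
    pred_kd = F.kl_div(
        F.log_softmax(student_logits / temp, dim=1),
        F.softmax(teacher_logits / temp, dim=1),
        reduction="batchmean",
    ) * temp ** 2
    # supervised loss on labeled training nodes
    ce = F.cross_entropy(student_logits[train_mask], labels[train_mask])
    return ce + alpha * feat_kd + beta * pred_kd
```

In this sketch the student keeps the teacher's receptive field (K hops) while holding only one layer's worth of parameters, which is the compression effect the abstract describes; the two KD terms correspond to the "multi-staged" distillation of aggregation and prediction.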