Attention fusion network with self-supervised learning for staging of osteonecrosis of the femoral head (ONFH) using multiple MR protocols
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Bomin | - |
dc.contributor.author | Lee, Geun Young | - |
dc.contributor.author | Park, Sung-Hong | - |
dc.date.accessioned | 2024-05-14T01:30:24Z | - |
dc.date.available | 2024-05-14T01:30:24Z | - |
dc.date.issued | 2023-09 | - |
dc.identifier.issn | 0094-2405 | - |
dc.identifier.issn | 2473-4209 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/73655 | - |
dc.description.abstract | Background: Osteonecrosis of the femoral head (ONFH) is characterized by bone cell death in the hip joint, typically presenting as severe pain in the groin. ONFH staging is commonly based on magnetic resonance imaging (MRI) and computed tomography (CT), and is important for establishing effective treatment plans. There have been some attempts to automate ONFH staging using deep learning, but few have used MR images alone. Purpose: To propose a deep learning model for MR-only ONFH staging, which avoids the additional cost and radiation exposure of acquiring CT images. Methods: We integrated information from the MR images of five different imaging protocols using a newly proposed attention fusion method composed of intra-modality attention and inter-modality attention. In addition, self-supervised learning was used to learn deep representations from a large paired MR-CT dataset. The encoder of the MR-CT translation network was used as a pretrained network for staging, which aimed to overcome the lack of annotated staging data. Ablation studies were performed to investigate the contribution of each proposed method. The area under the receiver operating characteristic curve (AUROC) was used to evaluate the performance of the networks. Results: Our model improved the performance of the four-way classification of the Association Research Circulation Osseous (ARCO) stage using MR images of the multiple protocols by 6.8 percentage points in AUROC over a plain VGG network. The ablation experiments showed that each proposed method increased AUROC: by 4.7 percentage points (self-supervised learning) and 2.6 percentage points (attention fusion). Conclusions: We have shown the feasibility of MR-only ONFH staging using self-supervised learning and attention fusion. The large amount of paired MR-CT data available in hospitals can be used to further improve staging performance, and the proposed method has the potential to be used in the diagnosis of various diseases that require staging from multiple MR protocols. © 2023 American Association of Physicists in Medicine. | - |
dc.format.extent | 13 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | John Wiley and Sons Ltd | - |
dc.title | Attention fusion network with self-supervised learning for staging of osteonecrosis of the femoral head (ONFH) using multiple MR protocols | - |
dc.type | Article | - |
dc.identifier.doi | 10.1002/mp.16380 | - |
dc.identifier.bibliographicCitation | Medical Physics, v.50, no.9, pp 5528 - 5540 | - |
dc.description.isOpenAccess | N | - |
dc.identifier.wosid | 000961586600001 | - |
dc.identifier.scopusid | 2-s2.0-85152057620 | - |
dc.citation.endPage | 5540 | - |
dc.citation.number | 9 | - |
dc.citation.startPage | 5528 | - |
dc.citation.title | Medical Physics | - |
dc.citation.volume | 50 | - |
dc.type.docType | Article; Early Access | - |
dc.publisher.location | United States | - |
dc.subject.keywordAuthor | attention fusion | - |
dc.subject.keywordAuthor | MR-only staging | - |
dc.subject.keywordAuthor | multiple MR protocols | - |
dc.subject.keywordAuthor | osteonecrosis of femoral head | - |
dc.subject.keywordAuthor | self-supervised learning | - |
dc.relation.journalResearchArea | Radiology, Nuclear Medicine & Medical Imaging | - |
dc.relation.journalWebOfScienceCategory | Radiology, Nuclear Medicine & Medical Imaging | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
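The abstract describes a two-level attention fusion: intra-modality attention re-weights features within each of the five MR protocols, and inter-modality attention weights the protocols against each other before fusion. The sketch below illustrates that idea in plain NumPy under stated assumptions; the function names, the channel-gating form of the intra-modality step, and the mean-activation scoring in the inter-modality step are hypothetical simplifications, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def intra_modality_attention(feat):
    # feat: (d,) feature vector from one protocol's encoder.
    # Hypothetical channel gating: re-weight features within a single
    # modality, with a residual connection so no channel is zeroed out.
    w = softmax(feat)
    return feat * (1.0 + w)

def inter_modality_attention(feats):
    # feats: (n_protocols, d). Score each protocol, then fuse as a
    # weighted sum so more informative protocols dominate.
    scores = feats.mean(axis=1)      # hypothetical scoring: mean activation
    alpha = softmax(scores)          # (n_protocols,) fusion weights, sum to 1
    fused = (alpha[:, None] * feats).sum(axis=0)
    return fused, alpha

# Toy example: 5 MR protocols, each encoded to a 16-dim feature vector.
rng = np.random.default_rng(0)
protocol_feats = rng.standard_normal((5, 16))
refined = np.stack([intra_modality_attention(f) for f in protocol_feats])
fused, alpha = inter_modality_attention(refined)
print(fused.shape, alpha.shape)   # fused vector and per-protocol weights
```

The fused vector would then feed a classification head for the four-way ARCO stage; in the paper, the encoders themselves are first pretrained on MR-to-CT translation before fine-tuning for staging.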