Detection and classification of intracranial haemorrhage on CT images using a novel deep-learning algorithm (open access)
- Authors
- Lee, Ji Young; Kim, Jong Soo; Kim, Tae Yoon; Kim, Young Soo
- Issue Date
- Dec-2020
- Publisher
- NATURE PORTFOLIO
- Citation
- SCIENTIFIC REPORTS, v.10, no.1, pp.1 - 7
- Indexed
- SCIE; SCOPUS
- Journal Title
- SCIENTIFIC REPORTS
- Volume
- 10
- Number
- 1
- Start Page
- 1
- End Page
- 7
- URI
- https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/144265
- DOI
- 10.1038/s41598-020-77441-z
- ISSN
- 2045-2322
- Abstract
- A novel deep-learning algorithm for artificial neural networks (ANNs), entirely different from the back-propagation method, was developed in a previous study. The purpose of this study was to assess the feasibility of using that algorithm to detect intracranial haemorrhage (ICH) and classify its subtypes, without employing a convolutional neural network (CNN). For the detection of ICH using the summation of all computed tomography (CT) images for each case, the area under the ROC curve (AUC) was 0.859, with a sensitivity of 78.0% and a specificity of 80.0%. For ICH localisation, the CT images were divided into 10 subdivisions based on intracranial height. The 41-50% subdivision yielded the best diagnostic performance for detecting ICH, with an AUC of 0.903, a sensitivity of 82.5%, and a specificity of 84.1%. For the classification of ICH into subtypes, the accuracy for subarachnoid haemorrhage (SAH) was excellent at 91.7%. This study showed that our approach can greatly reduce ICH diagnosis time in an actual emergency situation while maintaining fairly good diagnostic performance.
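The abstract reports its results in terms of sensitivity, specificity, and AUC. As a reminder of how these metrics are computed, here is a minimal, self-contained sketch; this is not the authors' code, and the function names, the rank-based AUC formulation, and the toy inputs are all illustrative assumptions.

```python
# Generic sketch of the evaluation metrics reported in the abstract.
# Not the authors' implementation; names and data are illustrative.

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity (true-positive rate) and specificity (true-negative
    rate) for binary labels 0/1."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

def auc(y_true, scores):
    """Area under the ROC curve via its rank interpretation: the
    probability that a randomly chosen positive case receives a higher
    score than a randomly chosen negative case (ties count as 0.5)."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example with hypothetical scores (not data from the paper):
sens, spec = sensitivity_specificity([1, 1, 0, 0], [1, 0, 1, 0])
print(sens, spec)                                # 0.5 0.5
print(auc([1, 1, 0, 0], [0.9, 0.4, 0.6, 0.2]))  # 0.75
```

The rank-based formulation of AUC is equivalent to integrating the ROC curve and avoids having to sweep thresholds explicitly.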
- Appears in Collections
- College of Medicine (Seoul) > Department of Neurosurgery (Seoul) > 1. Journal Articles
- College of Medicine (Seoul) > Department of Radiology (Seoul) > 1. Journal Articles
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.