Enhancing explainability in brain tumor detection: A novel DeepEBTDNet model with LIME on MRI images
- Authors
- Ullah, Naeem; Hassan, Muhammad; Khan, Javed Ali; Anwar, Muhammad Shahid; Aurangzeb, Khursheed
- Issue Date
- Jan-2024
- Publisher
- WILEY
- Keywords
- brain-tumor detection; deep learning; explainable AI; LIME; MRI
- Citation
- INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, v.34, no.1
- Journal Title
- INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY
- Volume
- 34
- Number
- 1
- URI
- https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/90704
- DOI
- 10.1002/ima.23012
- ISSN
- 0899-9457
- 1098-1098
- Abstract
- Early detection of brain tumors is vital for improving patient survival rates, yet manual analysis of extensive 3D MRI images can be error-prone and time-consuming. This study introduces the Deep Explainable Brain Tumor Deep Network (DeepEBTDNet), a novel deep learning model for binary classification of brain MRIs as tumorous or normal. Employing dualistic sub-image histogram equalization (DSIHE) for enhanced image quality, DeepEBTDNet utilizes 12 convolutional layers with leaky ReLU (LReLU) activation for feature extraction, followed by a fully connected classification layer. Transparency and interpretability are emphasized through the application of the Local Interpretable Model-Agnostic Explanations (LIME) method to explain model predictions. Results demonstrate DeepEBTDNet's efficacy in brain tumor detection, even across datasets, achieving a validation accuracy of 98.96% and a testing accuracy of 94.0%. This study underscores the importance of explainable AI in healthcare, facilitating precise diagnoses and transparent decision-making for early brain tumor identification and improved patient outcomes.
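The DSIHE preprocessing step named in the abstract can be sketched in plain NumPy. This is a minimal illustration of the general dualistic sub-image histogram equalization technique, not the authors' implementation: the image is split at the gray level where the cumulative histogram reaches 0.5, and each sub-image is equalized into its own intensity range, which tends to preserve mean brightness better than global histogram equalization. The function name and details below are illustrative assumptions.

```python
import numpy as np

def dsihe(image):
    """Sketch of dualistic sub-image histogram equalization (DSIHE)
    for an 8-bit grayscale image.

    The histogram is split at the gray level whose CDF is closest to 0.5;
    the lower sub-image is equalized onto [0, t] and the upper sub-image
    onto [t+1, 255], independently.
    """
    image = np.asarray(image, dtype=np.uint8)
    hist = np.bincount(image.ravel(), minlength=256).astype(np.float64)
    cdf = np.cumsum(hist) / hist.sum()
    # Split point: first gray level where the CDF reaches 0.5
    t = int(np.searchsorted(cdf, 0.5))

    out = np.empty_like(image)
    lower = image <= t
    upper = ~lower

    if lower.any():
        # Equalize the lower sub-image onto the range [0, t]
        h = np.bincount(image[lower].ravel(), minlength=256).astype(np.float64)
        c = np.cumsum(h) / h.sum()
        lut = np.round(c * t).astype(np.uint8)
        out[lower] = lut[image[lower]]
    if upper.any():
        # Equalize the upper sub-image onto the range [t+1, 255]
        h = np.bincount(image[upper].ravel(), minlength=256).astype(np.float64)
        c = np.cumsum(h) / h.sum()
        lut = np.round((t + 1) + c * (255 - t - 1)).astype(np.uint8)
        out[upper] = lut[image[upper]]
    return out
```

Because each sub-image keeps its own half of the intensity range, dark regions stay dark and bright regions stay bright after enhancement, which is the property that motivates DSIHE over plain histogram equalization for medical images.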
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- ETC > 1. Journal Articles