Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Mukhiddinov, Mukhriddin | - |
dc.contributor.author | Abdusalomov, Akmalbek Bobomirzaevich | - |
dc.contributor.author | Cho, Jinsoo | - |
dc.date.accessioned | 2022-05-25T08:40:06Z | - |
dc.date.available | 2022-05-25T08:40:06Z | - |
dc.date.created | 2022-05-25 | - |
dc.date.issued | 2022-05 | - |
dc.identifier.issn | 1424-8220 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/84429 | - |
dc.description.abstract | The growing aging population suffers from high levels of vision and cognitive impairment, often resulting in a loss of independence. Such individuals must perform crucial everyday tasks, such as cooking and heating, with systems and devices designed for sighted users, which do not account for the needs of persons with visual and cognitive impairment. As a result, visually impaired persons using these devices face risks related to smoke and fire. In this paper, we propose a vision-based fire detection and notification system using smart glasses and deep learning models for blind and visually impaired (BVI) people. The system enables early detection of fires in indoor environments. To perform real-time fire detection and notification, the proposed system uses image brightness and a new convolutional neural network employing an improved YOLOv4 model with a convolutional block attention module. The h-swish activation function is used to reduce the running time and increase the robustness of YOLOv4. We adapt our previously developed smart glasses system to capture images and inform BVI people about fires and other surrounding objects through auditory messages. We create a large fire image dataset of indoor fire scenes to accurately detect fires. Furthermore, we develop an object mapping approach to provide BVI people with complete information about surrounding objects and to differentiate between hazardous and nonhazardous fires. The proposed system shows an improvement over other well-known approaches in all fire detection metrics, such as precision, recall, and average precision. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | MDPI | - |
dc.relation.isPartOf | SENSORS | - |
dc.title | Automatic Fire Detection and Notification System Based on Improved YOLOv4 for the Blind and Visually Impaired | - |
dc.type | Article | - |
dc.type.rims | ART | - |
dc.description.journalClass | 1 | - |
dc.identifier.wosid | 000794492100001 | - |
dc.identifier.doi | 10.3390/s22093307 | - |
dc.identifier.bibliographicCitation | SENSORS, v.22, no.9 | - |
dc.description.isOpenAccess | Y | - |
dc.identifier.scopusid | 2-s2.0-85128722259 | - |
dc.citation.title | SENSORS | - |
dc.citation.volume | 22 | - |
dc.citation.number | 9 | - |
dc.contributor.affiliatedAuthor | Mukhiddinov, Mukhriddin | - |
dc.contributor.affiliatedAuthor | Cho, Jinsoo | - |
dc.type.docType | Article | - |
dc.subject.keywordAuthor | fire detection | - |
dc.subject.keywordAuthor | smart glasses | - |
dc.subject.keywordAuthor | blind and visually impaired | - |
dc.subject.keywordAuthor | assistive technologies | - |
dc.subject.keywordAuthor | deep learning | - |
dc.subject.keywordAuthor | object detection | - |
dc.subject.keywordAuthor | CNN | - |
dc.subject.keywordPlus | RECOGNITION SYSTEM | - |
dc.subject.keywordPlus | SMOKE DETECTION | - |
dc.subject.keywordPlus | INDOOR | - |
dc.subject.keywordPlus | SURVEILLANCE | - |
dc.subject.keywordPlus | NAVIGATION | - |
dc.subject.keywordPlus | PEOPLE | - |
dc.subject.keywordPlus | DESIGN | - |
dc.relation.journalResearchArea | Chemistry | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Instruments & Instrumentation | - |
dc.relation.journalWebOfScienceCategory | Chemistry, Analytical | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
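The abstract notes that the h-swish activation function is used to reduce the running time of the improved YOLOv4 model. The metadata does not spell out the formula; as a hedged illustration, the sketch below assumes the standard h-swish definition (as popularized by MobileNetV3), which replaces the sigmoid in swish with a cheap piecewise-linear ReLU6 term:

```python
def relu6(x: float) -> float:
    """ReLU capped at 6: min(max(x, 0), 6)."""
    return min(max(x, 0.0), 6.0)

def h_swish(x: float) -> float:
    """h-swish(x) = x * ReLU6(x + 3) / 6.

    A piecewise-linear approximation of swish (x * sigmoid(x)) that
    avoids computing an exponential, which is why it is often chosen
    to cut inference time on edge devices such as smart glasses.
    """
    return x * relu6(x + 3.0) / 6.0
```

For example, `h_swish(-3.0)` is exactly 0 and `h_swish(6.0)` is exactly 6, so the function behaves like the identity for large positive inputs and zeroes out strongly negative ones, while staying smooth enough in between to train well.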