Improved Real-Time Fire Warning System Based on Advanced Technologies for Visually Impaired People
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Abdusalomov, Akmalbek Bobomirzaevich | - |
dc.contributor.author | Mukhiddinov, Mukhriddin | - |
dc.contributor.author | Kutlimuratov, Alpamis | - |
dc.contributor.author | Whangbo, Taeg Keun | - |
dc.date.accessioned | 2022-11-11T04:40:06Z | - |
dc.date.available | 2022-11-11T04:40:06Z | - |
dc.date.created | 2022-11-08 | - |
dc.date.issued | 2022-10 | - |
dc.identifier.issn | 1424-8220 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/86002 | - |
dc.description.abstract | Early fire detection and notification techniques provide fire prevention and safety information to blind and visually impaired (BVI) people within a short period of time in emergency situations when fires occur in indoor environments. Given its direct impact on human safety and the environment, fire detection is a difficult but crucial problem. To prevent injuries and property damage, advanced technology requires appropriate methods for detecting fires as quickly as possible. In this study, to reduce the loss of human lives and property damage, we present a vision-based early flame recognition and notification approach using artificial intelligence to assist BVI people. The proposed fire alarm control system for indoor buildings can provide accurate information on fire scenes. In our proposed method, all the processes previously performed manually were automated, and the performance efficiency and quality of fire classification were improved. To perform real-time monitoring and enhance the detection accuracy of indoor fire disasters, the proposed system uses the YOLOv5m model, an improved version of the traditional YOLOv5. The experimental results show that the proposed system successfully detected and notified the occurrence of catastrophic fires with high speed and accuracy at any time of day or night, regardless of the shape or size of the fire. Finally, we compared our method with other conventional fire-detection methods, confirming its classification performance using performance evaluation metrics. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | MDPI | - |
dc.relation.isPartOf | SENSORS | - |
dc.title | Improved Real-Time Fire Warning System Based on Advanced Technologies for Visually Impaired People | - |
dc.type | Article | - |
dc.type.rims | ART | - |
dc.description.journalClass | 1 | - |
dc.identifier.wosid | 000867085900001 | - |
dc.identifier.doi | 10.3390/s22197305 | - |
dc.identifier.bibliographicCitation | SENSORS, v.22, no.19 | - |
dc.description.isOpenAccess | Y | - |
dc.identifier.scopusid | 2-s2.0-85140029634 | - |
dc.citation.title | SENSORS | - |
dc.citation.volume | 22 | - |
dc.citation.number | 19 | - |
dc.contributor.affiliatedAuthor | Abdusalomov, Akmalbek Bobomirzaevich | - |
dc.contributor.affiliatedAuthor | Mukhiddinov, Mukhriddin | - |
dc.contributor.affiliatedAuthor | Kutlimuratov, Alpamis | - |
dc.contributor.affiliatedAuthor | Whangbo, Taeg Keun | - |
dc.type.docType | Article | - |
dc.subject.keywordAuthor | fire warning system | - |
dc.subject.keywordAuthor | smart glasses | - |
dc.subject.keywordAuthor | blind and visually impaired | - |
dc.subject.keywordAuthor | YOLOv5 | - |
dc.subject.keywordAuthor | artificial intelligence | - |
dc.subject.keywordAuthor | flame classification | - |
dc.subject.keywordPlus | MODEL | - |
dc.relation.journalResearchArea | Chemistry | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Instruments & Instrumentation | - |
dc.relation.journalWebOfScienceCategory | Chemistry, Analytical | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
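The abstract notes that the proposed method was compared against conventional fire-detection methods using performance evaluation metrics. A minimal sketch of how such detection metrics (precision, recall, F1) are typically computed from true-positive, false-positive, and false-negative counts — the function name and the example counts below are illustrative, not the paper's actual results:

```python
def detection_metrics(tp: int, fp: int, fn: int) -> dict:
    """Compute precision, recall, and F1 score from a detector's
    true-positive, false-positive, and false-negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"precision": precision, "recall": recall, "f1": f1}

# Hypothetical counts: 90 fires detected, 10 false alarms, 5 missed fires.
metrics = detection_metrics(tp=90, fp=10, fn=5)
print(metrics)  # precision = 0.9, recall ≈ 0.947
```

This is the standard formulation used to evaluate object detectors such as YOLOv5; a full evaluation would additionally aggregate these counts per IoU threshold to report mAP.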