Deep Encoder-Decoder Network-Based Wildfire Segmentation Using Drone Images in Real-Time
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Muksimova, Shakhnoza | - |
dc.contributor.author | Mardieva, Sevara | - |
dc.contributor.author | Cho, Young-Im | - |
dc.date.accessioned | 2023-01-19T01:42:05Z | - |
dc.date.available | 2023-01-19T01:42:05Z | - |
dc.date.created | 2023-01-18 | - |
dc.date.issued | 2022-12 | - |
dc.identifier.issn | 2072-4292 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/86707 | - |
dc.description.abstract | Wildfire is a hazardous natural phenomenon that leads to significant human fatalities, catastrophic environmental damage, and economic losses. Over the past few years, the intensity and frequency of fires have increased worldwide. Studies have been conducted to develop various solutions to minimize forest fires. Systems for remote fire detection and monitoring have been established, showing improvements in data collection and fire characterization. However, wildfires cover vast areas, making previously proposed ground-based systems unsuitable for full coverage. Unmanned aerial vehicles (UAVs) have become the subject of active research in recent years. Deep learning-based image-processing methods demonstrate improved performance in various tasks, including detection and segmentation, which can be utilized to develop modern forest firefighting techniques. In this study, we established a novel two-pathway encoder-decoder-based model to detect and accurately segment wildfires and smoke from images captured by UAVs in real time. Our proposed nested decoder uses pre-activated residual blocks and an attention-gating mechanism, thereby improving segmentation accuracy. Moreover, to facilitate robust and generalized training, we prepared a new dataset comprising actual incidences of forest fires and smoke, varying from small to large areas. In terms of practicality, the experimental results reveal that our method significantly outperforms existing detection and segmentation methods, despite being lightweight. In addition, the proposed model is reliable and robust in detecting and segmenting wildfire and smoke in drone camera images captured from different viewpoints. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | MDPI | - |
dc.relation.isPartOf | REMOTE SENSING | - |
dc.title | Deep Encoder-Decoder Network-Based Wildfire Segmentation Using Drone Images in Real-Time | - |
dc.type | Article | - |
dc.type.rims | ART | - |
dc.description.journalClass | 1 | - |
dc.identifier.wosid | 000904394900001 | - |
dc.identifier.doi | 10.3390/rs14246302 | - |
dc.identifier.bibliographicCitation | REMOTE SENSING, v.14, no.24 | - |
dc.description.isOpenAccess | Y | - |
dc.identifier.scopusid | 2-s2.0-85144824068 | - |
dc.citation.title | REMOTE SENSING | - |
dc.citation.volume | 14 | - |
dc.citation.number | 24 | - |
dc.contributor.affiliatedAuthor | Muksimova, Shakhnoza | - |
dc.contributor.affiliatedAuthor | Mardieva, Sevara | - |
dc.contributor.affiliatedAuthor | Cho, Young-Im | - |
dc.type.docType | Article | - |
dc.subject.keywordAuthor | drone | - |
dc.subject.keywordAuthor | encoder-decoder | - |
dc.subject.keywordAuthor | forest fire and smoke segmentation | - |
dc.subject.keywordAuthor | deep-learning | - |
dc.subject.keywordPlus | FIRE | - |
dc.relation.journalResearchArea | Environmental Sciences & Ecology | - |
dc.relation.journalResearchArea | Geology | - |
dc.relation.journalResearchArea | Remote Sensing | - |
dc.relation.journalResearchArea | Imaging Science & Photographic Technology | - |
dc.relation.journalWebOfScienceCategory | Environmental Sciences | - |
dc.relation.journalWebOfScienceCategory | Geosciences, Multidisciplinary | - |
dc.relation.journalWebOfScienceCategory | Remote Sensing | - |
dc.relation.journalWebOfScienceCategory | Imaging Science & Photographic Technology | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
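
The abstract describes a nested decoder that combines pre-activated residual blocks with an attention-gating mechanism on the skip connections. As a hedged illustration only (not the authors' implementation, whose weights and layer sizes are not given here), a minimal additive attention gate in the style commonly used in attention-gated U-Net decoders can be sketched in NumPy, treating 1x1 convolutions as per-pixel linear maps over channels:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def attention_gate(x, g, W_x, W_g, psi):
    """Additive attention gate over a skip connection (illustrative sketch).

    x   : skip-connection features from the encoder, shape (C_x, H, W)
    g   : gating signal from the decoder, shape (C_g, H, W)
    W_x : (C_int, C_x) weights of a 1x1 conv applied to x
    W_g : (C_int, C_g) weights of a 1x1 conv applied to g
    psi : (1, C_int) weights of a 1x1 conv producing the attention map
    Returns x scaled per pixel by an attention coefficient in (0, 1).
    """
    # A 1x1 convolution is a linear map over channels at every pixel.
    theta = np.einsum('ic,chw->ihw', W_x, x)           # project x to C_int channels
    phi = np.einsum('ic,chw->ihw', W_g, g)             # project g to C_int channels
    f = np.maximum(theta + phi, 0.0)                   # additive fusion + ReLU
    alpha = sigmoid(np.einsum('oc,chw->ohw', psi, f))  # (1, H, W) attention map
    return x * alpha                                   # suppress irrelevant skip features

# Tiny usage example with random weights (shapes are hypothetical).
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))    # skip features: 8 channels, 4x4
g = rng.standard_normal((16, 4, 4))   # gating signal: 16 channels, 4x4
out = attention_gate(x, g,
                     W_x=rng.standard_normal((4, 8)) * 0.1,
                     W_g=rng.standard_normal((4, 16)) * 0.1,
                     psi=rng.standard_normal((1, 4)) * 0.1)
print(out.shape)  # same shape as x: (8, 4, 4)
```

Because the attention coefficient lies strictly between 0 and 1, the gate can only attenuate skip features, never amplify them, which is what lets the decoder focus on fire and smoke regions while suppressing background.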