Research Article

A COMPARATIVE STUDY OF DOUBLE-STEP DEEP LEARNING FRAMEWORK FOR BURNED AREA IDENTIFICATION AND SEVERITY ASSESSMENT IN WILDFIRES

Abstract

As wildfires grow more frequent and intense, sophisticated techniques for precise detection and damage evaluation are essential. This research examines a Double-Step Deep Learning Framework built from several U-Net variants, including MultiResUNet, to identify burned areas and estimate damage severity. Using satellite images, the study explores how the granularity of severity levels in the output masks affects results, comparing four-level and five-level severity classifications. Additionally, the Mask R-CNN model was evaluated independently for image segmentation, revealing challenges stemming from its reliance on pretrained weights and its limited spectral input. The comparative analysis illustrates how changes in the granularity of severity intervals influence model performance, offering insight into the benefits of more nuanced severity segmentation for wildfire assessment. This approach has the potential to improve the precision of damage assessments and to support more informed decision-making in wildfire management and response.
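
The double-step design described in the abstract follows the Double-Step U-Net of Farasin et al. (2020, reference 2): a first network delineates burned pixels, and a second network estimates severity, which is then discretised into the four- or five-level masks compared in this study. The sketch below is a minimal, hypothetical illustration of that composition; binary_unet, severity_unet, and the even division of [0, 1] into severity intervals are illustrative assumptions, not the authors' exact models or thresholds.

    import numpy as np

    # Hypothetical sketch of the double-step composition (after Farasin et al., 2020).
    # `binary_unet` and `severity_unet` stand in for trained segmentation models that
    # map an input image to per-pixel scores in [0, 1]; they are assumptions here.

    def bin_severity(severity_map, n_levels):
        # Discretise a continuous severity map in [0, 1] into n_levels classes.
        # Interior bin edges split [0, 1] evenly, so a 4-level scheme uses coarser
        # intervals than a 5-level one (the granularity this study compares).
        edges = np.linspace(0.0, 1.0, n_levels + 1)
        return np.digitize(severity_map, edges[1:-1])

    def double_step_inference(image, binary_unet, severity_unet, n_levels=5):
        burned_mask = binary_unet(image) > 0.5     # step 1: burned / unburned delineation
        severity = severity_unet(image)            # step 2: continuous severity estimate
        levels = bin_severity(severity, n_levels)  # 4- or 5-level classification
        return np.where(burned_mask, levels, 0)    # pixels outside the burned area stay level 0

    # Dummy stand-ins show the shapes involved; real inputs would be multispectral
    # satellite tiles and the callables would be trained U-Net variants.
    img = np.random.default_rng(0).random((256, 256))
    out = double_step_inference(img, lambda x: x, lambda x: x, n_levels=4)

Masking the severity output with the step-1 delineation is what makes the pipeline "double-step": severity is interpreted only where the first network detects burn, so errors in delineation bound the severity map.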

Supporting Institution

TÜBİTAK

Project Number

122N254

References

  1. Colomba, L., Farasin, A., Monaco, S., Greco, S., Garza, P., Apiletti, D., Baralis, E., & Cerquitelli, T. (2022). A dataset for burned area delineation and severity estimation from satellite imagery. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management (CIKM ’22) (pp. 3893–3897). Association for Computing Machinery. https://doi.org/10.1145/3511808.3557528
  2. Farasin, A., Colomba, L., & Garza, P. (2020). Double-Step U-Net: A deep learning-based approach for the estimation of wildfire damage severity through Sentinel-2 satellite data. Applied Sciences, 10(12), 4332. https://doi.org/10.3390/app10124332
  3. Finney, M. A. (1998). FARSITE: Fire Area Simulator—Model Development and Evaluation (Research Paper RMRS-RP-4, Revised 2004). U.S. Department of Agriculture, Forest Service, Rocky Mountain Research Station.
  4. Han, Y., Zheng, C., Liu, X., Tian, Y., & Dong, Z. (2024). Burned area and burn severity mapping with a transformer-based change detection model. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 17, 13866–13880. https://doi.org/10.1109/JSTARS.2024.3435857
  5. He, K., Gkioxari, G., Dollár, P., & Girshick, R. (2017). Mask R-CNN. Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2961–2969. https://doi.org/10.1109/ICCV.2017.322
  6. He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 770–778. https://doi.org/10.1109/CVPR.2016.90
  7. Ibtehaz, N., & Rahman, M. S. (2020). MultiResUNet: Rethinking the U-Net architecture for multimodal biomedical image segmentation. Neural Networks, 121, 74–87. https://doi.org/10.1016/j.neunet.2019.08.025
  8. Kamal, U., Tonmoy, T. I., Das, S., & Hasan, M. K. (2020). Automatic traffic sign detection and recognition using SegU-Net and a modified Tversky loss function with L1-constraint. IEEE Transactions on Intelligent Transportation Systems, 21(4), 1467–1479. https://doi.org/10.1109/TITS.2019.2911727

Details

Primary Language

English

Subjects

Deep Learning

Journal Section

Research Article

Publication Date

March 3, 2025

Submission Date

December 14, 2024

Acceptance Date

January 26, 2025

Published in Issue

Year 2025 Volume: 28 Number: 1

APA
Yurdakul, M. M., Bayram, B., Bakırman, T., & İlhan, H. O. (2025). A COMPARATIVE STUDY OF DOUBLE-STEP DEEP LEARNING FRAMEWORK FOR BURNED AREA IDENTIFICATION AND SEVERITY ASSESSMENT IN WILDFIRES. Kahramanmaraş Sütçü İmam Üniversitesi Mühendislik Bilimleri Dergisi, 28(1), 513–523. https://doi.org/10.17780/ksujes.1601614