This is a Preprint and has not been peer reviewed. This is version 1 of this Preprint.

Mapping California woodland-chaparral ecosystems following wildfire with diverse drone images and computer vision
Abstract
Fire is a key driver of vegetation dynamics in California's woodland-chaparral ecosystems, and its role has become ever more important in recent decades as wildfire extents and frequencies increase. Understanding post-fire vegetation transitions and the likelihood of type conversion is essential for effective land management. Remote sensing is a powerful tool for mapping vegetation cover and studying post-fire dynamics, but current approaches, which generally rely on satellite sensors, are limited by coarse spatial and temporal resolution and broad application extents. In contrast, uncrewed aerial vehicles, or "drones," offer great potential to yield low-cost, high-resolution, locally tailored data on vegetation cover and its variation across time and space. With the recent rapid development of technologies for translating raw drone imagery into ecologically relevant data, the power of drone-based research is increasing along with its analytical decision space. In this work, we apply modern methods in image processing and computer vision to generate vegetation maps from a large and diverse dataset of drone images collected under realistic operational constraints. Specifically, our imagery was collected at three study sites across three years, by multiple pilots flying different drone models with varying flight parameters. Our analytical approach uses an automated method to spatially co-register all overlapping datasets into a common reference frame. We then generate vegetation predictions within each raw image using a computer vision model and translate image-level predictions to a geospatial map based on the known positions of the drone camera. Finally, we unify all geospatial predictions from similar dates into a best available prediction for each location. Using this merged representation, we conduct change analysis across years for the landscape area common between years (approximately 100 ha at each of two study sites).
When predicting our eight vegetation classes on unseen images, we achieved 94% overall accuracy and 88% class-balanced accuracy. Change analysis revealed surprisingly little vegetation change over 3–4 years post-fire, with the key changes being shrub (re)establishment and tree resprouting. Our findings demonstrate the viability of scalable drone-based approaches for tracking vegetation change in fire-prone landscapes.
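The merging step described in the abstract, unifying overlapping geospatial predictions from similar dates into a "best available" prediction per location, can be sketched as a per-grid-cell vote. This is a minimal illustration, not the authors' implementation: the function name `merge_predictions`, the (row, col) grid-cell keys, and the majority-vote rule are all assumptions, since the abstract does not specify how the best available prediction is chosen.

```python
from collections import Counter, defaultdict

def merge_predictions(observations):
    """Merge overlapping per-image class predictions into one class per
    grid cell by majority vote (hypothetical merging rule; ties resolve
    to the first-counted label via Counter.most_common ordering).

    observations: iterable of (cell, label) pairs, where `cell` is any
    hashable grid-cell id, e.g. a (row, col) tuple at map resolution.
    """
    votes = defaultdict(Counter)
    for cell, label in observations:
        votes[cell][label] += 1
    # Keep the most frequently predicted class for each cell.
    return {cell: counts.most_common(1)[0][0] for cell, counts in votes.items()}

# Toy example: two drone images cover cell (0, 0), one covers (0, 1).
obs = [
    ((0, 0), "shrub"), ((0, 0), "shrub"), ((0, 0), "tree"),
    ((0, 1), "tree"),
]
merged = merge_predictions(obs)
# merged[(0, 0)] == "shrub"; merged[(0, 1)] == "tree"
```

In practice a merging rule might also weight each observation, for example by viewing angle or distance from the image center, rather than counting all overlapping predictions equally.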
DOI
https://doi.org/10.32942/X2H94W
Subjects
Engineering, Life Sciences
Keywords
california, chaparral, woodland, wildfire, disturbance, recovery, drone, computer vision
Dates
Published: 2025-10-07 02:50
Last Updated: 2025-10-07 02:50
License
CC BY Attribution 4.0 International
Additional Metadata
Language:
English
Data and Code Availability Statement:
All raw and processed data are available on Open Science Framework: https://osf.io/swz7h/. The code for preparing, analyzing, and visualizing data is available via Zenodo: https://zenodo.org/records/16987366.