9:30 AM - 9:45 AM
[HDS09-03] Burned Areas Mapping using PlanetScope Imagery and Improved Deep Learning Model
Keywords: Burned Area, PlanetScope, Improved U-net, Semantic Segmentation
As technology advances and societal demands grow, studies related to disaster prevention and assessment impose increasingly stringent requirements on the precision and timeliness of Burned Area Mapping (BAM).
In this context, many advanced methods and high-quality remote sensing datasets have been applied to BAM. The classical MODIS-based global fire product, MCD64A1, employs an improved threshold method based on the Normalized Burn Ratio (NBR) together with a series of optimized correction algorithms to process the extensive daily MODIS data. However, its 500 m resolution has become inadequate for many practical applications. In recent years, fine BAM products for specific regions based on medium-to-high-resolution satellites such as Landsat and Sentinel-2 have emerged. For example, Burnt-net, which combines Sentinel-2 data with an enhanced pyramid pooling module in a U-net framework, and CNN-mrg, which applies a CNN to time-series SAR imagery, have both achieved commendable discrimination accuracy. However, they remain constrained by sensor resolution and revisit cycle, so their timeliness has yet to be fully examined. Among available sources, PlanetScope, with its excellent temporal resolution (daily) and spatial resolution (3 m), is becoming a focus of BAM research. Last year, V.S. Martins and colleagues first demonstrated the potential of PlanetScope data for BAM using transfer learning from Landsat data with a U-net model, and many scholars have since explored PlanetScope-based BAM. However, as some of their results indicate, PlanetScope sensors capture only visible and near-infrared bands, and noise issues common to small satellite constellations, such as chromatic aberration and cross-track offset angles, can limit accuracy; considerable room for optimization therefore remains in PlanetScope-based BAM. In addition, the current lack of PlanetScope-based reference datasets makes validating results difficult.
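For context, the NBR underlying products such as MCD64A1 is a simple band ratio, and burn severity is often assessed through its pre-/post-fire difference (dNBR). The sketch below is only an illustration of these two formulas, not the actual MCD64A1 threshold-and-correction algorithm; the function names are ours.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR).

    Burning lowers NIR reflectance and raises SWIR reflectance,
    so NBR drops sharply over freshly burned vegetation.
    """
    nir = np.asarray(nir, dtype=float)
    swir = np.asarray(swir, dtype=float)
    return (nir - swir) / (nir + swir + 1e-10)  # epsilon avoids division by zero

def dnbr(nbr_pre, nbr_post):
    """Differenced NBR; large positive values indicate burned pixels."""
    return nbr_pre - nbr_post
```

Note that PlanetScope lacks a SWIR band, which is precisely why NBR-style indices cannot be applied to it directly and alternative cues are needed.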
To address these issues, we explore a method that makes maximal use of PlanetScope's exceptional spatial and temporal resolution, including improvements to the deep learning model and to the preprocessing of the datasets.
Specifically, to reduce the chromatic aberration and noise inherent in time-series data from small satellite constellations such as PlanetScope, and to leverage its spatiotemporal resolution to compensate for its limited spectral resolution, we design a neural network that takes data from two modalities, imagery acquired before and after a fire event, as training input. Building on these two modalities, we compute the Mean Squared Error (MSE) over additional time-series data. The resulting MSE values partly reflect chromatic aberration and also flag anomalies whose rate of change within a given time window is excessive. These values are then supplied as an additional input channel to serve as an auxiliary reference during model training.
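One plausible reading of this MSE channel, sketched below under our own assumptions (the abstract does not specify the exact formulation), is a per-pixel squared error between each scene in the time series and a pre-fire reference, averaged over time and spectral bands; all function names are hypothetical.

```python
import numpy as np

def mse_channel(series, reference):
    """Per-pixel MSE between a co-registered time series and a reference.

    series:    (T, H, W, C) stack of PlanetScope scenes
    reference: (H, W, C) pre-fire reference image
    returns:   (H, W) MSE map, usable as an extra input channel
    """
    series = np.asarray(series, dtype=float)
    reference = np.asarray(reference, dtype=float)
    sq_err = (series - reference[None, ...]) ** 2  # broadcast reference over T
    return sq_err.mean(axis=(0, 3))                # average over time and bands

# The map could then be stacked onto the post-fire image, e.g.:
# x = np.concatenate([post_fire, mse_channel(series, pre_fire)[..., None]], axis=-1)
```

Pixels with persistently high MSE would then signal either sensor artifacts (chromatic aberration, misregistration) or genuinely rapid surface change, which is what motivates feeding the map to the network as an auxiliary cue rather than thresholding it directly.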
The experimental results show that the improved model achieves notable accuracy gains over prior studies, especially at higher resolution. In addition, the dataset with the added MSE channel produced fewer False Positives (FP) than the one without it, although a slight increase in False Negatives (FN) was observed in parts of the validation set.
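For reference, the FP/FN counts discussed above come from a standard pixel-wise comparison of the predicted burned-area mask against the reference mask; a minimal sketch (function name ours):

```python
import numpy as np

def fp_fn_counts(pred, truth):
    """Count false positives and false negatives between binary masks.

    FP: pixels predicted burned that the reference marks unburned.
    FN: pixels the reference marks burned that the prediction misses.
    """
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    fp = int(np.sum(pred & ~truth))
    fn = int(np.sum(~pred & truth))
    return fp, fn
```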
Finally, to test the model's efficacy in real-world scenarios, we validated it on the latest South American imagery downloaded from the official PlanetScope website. Visual comparison confirmed that the model identified new burned areas that do not appear in any existing dataset. This study explores efficient, practical methods for applying PlanetScope data to BAM, and we expect it to play a significant role in future research on disaster detection, fire analysis, and fire warning systems.