
Detecting wildfires and their impact from satellite: Build your solution in 15 minutes

May 10, 2023

As the threat of wildfires continues to escalate around the globe, the need for effective detection and monitoring systems has become paramount. Enter satellite imagery, a powerful tool that enables us to gain a comprehensive and timely understanding of wildfire occurrences and the subsequent burned areas. With their ability to provide high-resolution images from space, satellites have revolutionised the way we detect, track, and analyse wildfires, offering invaluable insights for emergency response, land management, and environmental conservation efforts.

Introduction

In this blog post, we will explore how you can use the SpaceSense SDK to create quick solutions to measure the impact of wildfires, ultimately aiding in our ongoing battle against these destructive natural disasters. Our goal here is to show you how you can quickly get started and get actionable results in less than an hour, not to provide a finished product ready for operations.

This article provides a high-level overview of the process of building the solution. To complement it, you can find a detailed Jupyter notebook ready to be run, as well as a short tutorial video going over the code.

We’ll start by measuring the burned areas, and then detect active wildfires. The scientific literature suggests several options, so we’ll compare four different indices and see which performs best. The indices are NBR (Normalized Burn Ratio), NBR+ (Normalized Burn Ratio Plus), BAIS2 (Burned Area Index for Sentinel-2) and NDVI (Normalized Difference Vegetation Index, which is not designed for burned areas but can still be used).
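For reference, all four indices can be computed directly from the Sentinel-2 reflectance bands. Below is a minimal NumPy sketch of the formulas as they appear in the sources linked above; band choices vary slightly between publications, so treat this as an illustration rather than the notebook’s exact implementation.

```python
import numpy as np

def nbr(b8, b12):
    # Normalized Burn Ratio: contrasts near-infrared (B8) with shortwave infrared (B12)
    return (b8 - b12) / (b8 + b12)

def nbr_plus(b2, b3, b8a, b12):
    # NBR+ adds the blue (B2) and green (B3) bands to sharpen the burned-area signal
    return (b12 - b8a - b3 - b2) / (b12 + b8a + b3 + b2)

def bais2(b4, b6, b7, b8a, b12):
    # Burned Area Index for Sentinel-2, built from red-edge (B6, B7), narrow NIR (B8A),
    # red (B4) and SWIR (B12) reflectances
    return (1 - np.sqrt((b6 * b7 * b8a) / b4)) * ((b12 - b8a) / np.sqrt(b12 + b8a) + 1)

def ndvi(b4, b8):
    # Normalized Difference Vegetation Index: not burn-specific, but burned vegetation lowers it
    return (b8 - b4) / (b8 + b4)
```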

The wildfire we’ll use to test these indices occurred in France between the 12th and 25th of July 2022 (known as Landiras 1) and burned more than 10 000 hectares.

Satellite image of the burned area before and after
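To give an idea of the inputs involved, here is a small sketch of how the area of interest and the before/after acquisition dates could be described. The bounding box below is a rough placeholder around Landiras, not the exact study area used in the notebook.

```python
from datetime import date

# Rough placeholder bounding box around Landiras (Gironde, France),
# expressed as a GeoJSON polygon of (lon, lat) pairs. Not the exact study AOI.
AOI = {
    "type": "Polygon",
    "coordinates": [[
        [-0.60, 44.35], [-0.15, 44.35], [-0.15, 44.65],
        [-0.60, 44.65], [-0.60, 44.35],
    ]],
}

BEFORE_DATE = date(2022, 7, 12)  # acquisition used before the fire
AFTER_DATE = date(2022, 8, 6)    # acquisition used after the fire
```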

Detecting burned areas

For each index we are testing, we apply the index formula (found in the links above, and in the notebook) to the “before” image (here the 12th of July) and the “after” image (here the 6th of August). For each pixel, we look at the delta (the difference between the two values) between the before and after images to highlight the affected areas. We can either display a map showing this delta for all pixels, or one that only displays the pixels above a certain threshold value: this is called a mask. Finally, for visual interpretation, we can overlay the mask on the RGB image to check that it matches what we can see. Below you can see the result of this process for the NBR index.

The NBR image (left), the NBR mask (center) and the mask on the RGB image (right)
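As a sketch of that workflow, assuming nbr_before and nbr_after are 2D NumPy arrays of the index on the same pixel grid and rgb_after is the post-fire RGB image scaled to [0, 1], the delta, mask and overlay could be produced like this (the 0.3 threshold is only an illustrative starting point):

```python
import numpy as np
import matplotlib.pyplot as plt

delta_nbr = nbr_before - nbr_after        # positive where the fire lowered the index
THRESHOLD = 0.3                           # illustrative value; tune per index and scene
burn_mask = delta_nbr > THRESHOLD         # boolean mask of "burned" pixels

fig, axes = plt.subplots(1, 3, figsize=(15, 5))
axes[0].imshow(delta_nbr, cmap="inferno")
axes[0].set_title("Delta NBR")
axes[1].imshow(burn_mask, cmap="gray")
axes[1].set_title("Burn mask")
axes[2].imshow(rgb_after)
axes[2].imshow(np.ma.masked_where(~burn_mask, burn_mask),
               cmap="autumn", alpha=0.5, vmin=0, vmax=1)
axes[2].set_title("Mask over RGB")
for ax in axes:
    ax.axis("off")
plt.show()
```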

Now we can compare the performance of the indices. As you can see from the images below, NBR+ appears to be the most accurate when checking visually.

In this order: NBR, NBR+, BAIS2 and NDVI

Now let’s calculate the burned surface based on the affected pixels. The firefighter reports mention approximately 10 300 hectares burned. In the table below you can see the burned-area estimation for each index, along with its error rate. This confirms the visual interpretation, with NBR+ giving the most accurate estimation. NDVI is close behind, but that might be a lucky shot, because it very clearly classified water areas as burned.

Performance comparison table
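Converting the mask into hectares is a simple pixel count. A rough sketch, assuming 20 m Sentinel-2 pixels and the burn_mask array from the previous snippet (adjust the pixel size if your bands were resampled to 10 m):

```python
PIXEL_AREA_HA = 20 * 20 / 10_000      # hectares covered by one 20 m pixel
REPORTED_HA = 10_300                  # burned area from the firefighter reports

burned_ha = int(burn_mask.sum()) * PIXEL_AREA_HA
error_pct = 100 * abs(burned_ha - REPORTED_HA) / REPORTED_HA
print(f"Estimated burned area: {burned_ha:,.0f} ha "
      f"({error_pct:.1f}% off the reported figure)")
```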

Detecting active wildfires

We acquired a satellite image from the 17th of July 2022 in which active fires are visible, and we’ll use it to test the accuracy of our indices. We will discard the NDVI index, as it is incapable of accurately detecting the fires.

Each of these indices has a specific threshold range within which it can detect active fires; you can refer to the documentation for the technical details. If we apply the indices to the image, keep the pixel values that fall within the “active fire” range, and project that onto the RGB image, we get the following results. The red areas are the active fires.

NBR (left), NBR+ (center) and BAIS2 (right)
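As an illustration of this step, assuming index_maps holds the 2D index arrays computed on the 17 July scene and rgb_during_fire is the matching RGB image scaled to [0, 1], the active-fire pixels could be flagged and overlaid like this. The threshold values and comparison directions below are placeholders, not the documented ranges.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder rules: replace with the "active fire" ranges from each index's documentation.
FIRE_RULES = {
    "NBR": lambda m: m < -0.4,
    "NBR+": lambda m: m > 0.4,
    "BAIS2": lambda m: m > 2.0,
}

fig, axes = plt.subplots(1, 3, figsize=(18, 6))
for ax, (name, rule) in zip(axes, FIRE_RULES.items()):
    fire_mask = rule(index_maps[name])
    ax.imshow(rgb_during_fire)
    ax.imshow(np.ma.masked_where(~fire_mask, fire_mask),
              cmap="autumn", alpha=0.9, vmin=0, vmax=1)
    ax.set_title(name)
    ax.axis("off")
plt.show()
```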

As you can see, all indices detect the main fires, with NBR and NBR+ also picking up the very small fires on the right of the images, which are only one or two pixels wide.

This becomes even more apparent when we only display the index map, where the fires appear as bright yellow spots.

NBR+ image

And that is it! You now have a basic solution to start tracking wildfires and their impact, all over the world. As mentioned in the introduction, this solution can be improved for better accuracy and scalability, but this already provides you with a simple and quick way to generate very useful insights.

If you want to try it for yourself using SpaceSense’s SDK, reach out to us below!
