Gaza Damage Assessment using Earth Observation Data

Monitoring structural damage to buildings in North Gaza using Sentinel 2 satellite imagery during the recent Israeli-Palestinian conflict.

Aditya Sharma
Geek Culture


2021 Israeli-Palestinian Conflict — Background

On May 6, 2021, clashes between Palestinian protesters and Israeli police in Jerusalem marked a renewal of the Israeli-Palestinian conflict. Agitated by the forced evictions imposed on residents of the Sheikh Jarrah neighborhood, protests ultimately turned into riots as peaceful Palestinian protesters were targeted by the Israeli police [1]. The incident that incited this recent episode of the long-running conflict saw Israeli police storm the compound of the Al-Aqsa mosque (the third holiest site in Islam after Mecca and Medina) on the last day of Ramadan, firing rubber bullets, tear gas canisters, and stun grenades in response to what Israeli authorities claimed, and independent journalists reported, were rocks and stones thrown onto a nearby roadway from the mosque in opposition to the forced evictions [2]. A ceasefire was agreed on May 20, bringing an end to the missile-firing campaigns carried out by both Hamas and the Israeli military. The damage reported to have been incurred by the two sides, however, is extremely disproportionate, given Israel’s military might and its possession of the highly eulogized “Iron Dome” technology. More than 250 Palestinian civilians, including children, have been reported dead, along with 70,000 displaced, in one of the most densely populated regions in the world [3, 4].

The objective of this post is to utilize remote sensing data and assess its potential to highlight the damage caused by Israel’s campaign of bombing civilian high-rise towers and independent media buildings in Gaza city. In this analysis, Sentinel 2 multispectral imagery is used for visualizing physical changes due to the conflict as a proxy to assess structural damage.

Data

Shapefile — The entire administrative boundary of Gaza city constitutes our region of interest for this study. The shapefile was obtained through the Humanitarian Data Exchange (HDX). The information reflected in the shapefile is based on the digitization of historic maps by the Palestinian Ministry of Planning.

Satellite Data — Sentinel 2 satellite imagery provides high-resolution (10–60 meters, depending on the band) optical data at a global scale with a high revisit frequency (~5–10 days). In this analysis, two L2A images acquired 25 days apart, on April 30, 2021 and May 25, 2021, are processed using the SNAP-Python API (snappy) interface. SNAP is developed by the European Space Agency (ESA) as a common software platform for the scientific exploitation of the Sentinel-1, 2 and 3 missions. The project page of SNAP and the individual toolboxes can be found at http://step.esa.int. In a previous post, I have documented step-by-step instructions for configuring your local python environment to use the snappy module.

Level 2A satellite data is used here, which means that atmospheric correction has already been applied. Level 2A imagery is processed from Level 1C Top-of-Atmosphere (TOA) reflectance such that the energy reflected from objects on the Earth’s surface (which is what we want to study) is delineated from the atmospheric disturbances recorded by the sensor. Essentially, this preprocessing step removes the signals that account for atmospheric phenomena in the Level 1 image. Thus, Level 2 imagery is also referred to as a Bottom-of-Atmosphere (BOA) corrected reflectance product.

The Sentinel 2 Level 2A (L2A) data used in this analysis was acquired from the Copernicus Open Access Hub using the sentinelsat python module. The two Sentinel 2 images were selected as the most recently available scenes before and after the Israeli-Palestinian conflict (May 6 — May 20, 2021) with minimal cloud coverage (< 3%).
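To make the acquisition step concrete, here is a minimal sketch of that query using sentinelsat. The helper function names, credentials, and the AOI file name (`gaza_city.geojson`) are placeholders of my own, not taken from the original code; only the search criteria mirror those described above.

```python
def build_query_params(start="20210425", end="20210526", max_cloud=3.0):
    """Search criteria for the pre-/post-event Sentinel 2 scenes."""
    return {
        "date": (start, end),                    # window bracketing the conflict
        "platformname": "Sentinel-2",
        "producttype": "S2MSI2A",                # Level 2A, i.e. BOA reflectance
        "cloudcoverpercentage": (0, max_cloud),  # reject cloudy scenes
    }


def download_scenes(geojson_path, user, password):
    """Query the Copernicus Open Access Hub and download matching products."""
    from sentinelsat import SentinelAPI, read_geojson, geojson_to_wkt

    api = SentinelAPI(user, password, "https://scihub.copernicus.eu/dhus")
    footprint = geojson_to_wkt(read_geojson(geojson_path))
    products = api.query(footprint, **build_query_params())
    api.download_all(products)
    return products
```

In practice you would sort the returned products by acquisition date and keep the scene closest to each side of the May 6 — May 20 window.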

Methods

Preprocessing using SNAP-Python API

The processing pipeline can be implemented either via the SNAP Desktop GUI or the SNAP-Python API interface. However, the former requires familiarization with the software, which can be (from personal experience) a little overwhelming. Moreover, the GUI pipeline must be implemented as a two-step process, which takes up more storage and memory and, therefore, time. Using python significantly cuts down the processing time by using less memory and storage space.

The preprocessing workflow (figure) involves spatially subsetting our region of interest from the raw satellite images. Once both pre- and post-event images have been preprocessed, a new product containing the spectral bands from both images is created by overlaying and resampling the pixel values of the recently acquired image onto the geospatial raster of the earlier image. This collocation later allows us to easily calculate the change between the pre- and post-event spectral values. Before the change can be detected, however, we need to mask out the water bodies and clouds from the newly created image. The final step involves subtracting the spectral values of the recent (slave) image from the earlier (master) image to produce a single-band output in which the physical changes that occurred between the two dates are recorded. The difference between the two images is based on the spectral values of the near-infrared (NIR) band of the Sentinel 2 image.
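The core of that final differencing step can be sketched outside SNAP with numpy as a masked band subtraction. The array names and mask convention below are illustrative, not the actual snappy BandMaths expression used in the pipeline:

```python
import numpy as np


def nir_change(pre_nir, post_nir, mask):
    """Pixel-wise NIR difference (earlier/master minus recent/slave).

    pre_nir, post_nir : 2-D arrays of collocated NIR reflectance
    mask              : boolean array, True where water or cloud
                        pixels should be excluded from the change map
    """
    diff = pre_nir.astype(float) - post_nir.astype(float)
    diff[mask] = np.nan  # masked pixels carry no change information
    return diff


# toy example: a 2x2 scene where one pixel brightened after the event
# and one pixel is water
pre = np.array([[0.30, 0.30], [0.30, 0.30]])
post = np.array([[0.30, 0.45], [0.30, 0.30]])
water = np.array([[False, False], [True, False]])
change = nir_change(pre, post, water)
```

Pixels whose reflectance changed between the two dates end up with non-zero values; everything masked as water or cloud becomes NaN and drops out of the change map.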

Code: I have documented the processing workflow on Github.

Visualizing Structural Damage

Why use the near-infrared band for change detection? — Pixel-wise differences in the near-infrared (NIR, Band 8 of Sentinel 2) spectral values from the two images are used to delineate structural damage. In theory, infrared energy is emitted from all objects that have a temperature greater than absolute zero. In practice, however, this means that any spectral (energy) change detected between two separately acquired satellite images can be attributed to a multitude of sources, such as differences in time of day, vegetation patterns, moisture, and other landscape features. This is where post-processing comes into play, allowing us to delineate actual structural change from natural variation by overlaying high-resolution basemaps, roads, and building-footprint layers.

The basic underlying principle for assessing structural damage using the near-infrared band rests on blackbody theory, which relates brightness (spectral radiance) to temperature. In other words, the emissivity (ejection of photons) of an object can change depending on that object’s temperature. To understand this with an example, let’s consider a building that is present in both satellite images. The brightness, or emissivity (measured as spectral radiance), of the pixels covering that particular building is directly related to its temperature. Blackbody theory then dictates that the hotter the object gets, the more photons (units or packets of light) are emitted at shorter wavelengths. So essentially, if the brightness of the pixels covering the building (as recorded in the visible and near-infrared (VNIR) range of the electromagnetic spectrum) increases in the post-event satellite image, the temperature of the building must have also increased. It should be noted that several factors can contribute to changes in emissivity, such as color, surface roughness, moisture content, soil compaction, viewing angle, and wavelength. Therefore, the interpretation of changes in the NIR band must account for these factors. In our case, the only two factors that can change due to structural damage to buildings are color and surface roughness.
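The “hotter means shorter peak wavelength” relationship invoked above is Wien’s displacement law. A short numerical illustration (the function name is mine; the temperatures are generic examples, not measurements from this study):

```python
def peak_wavelength_nm(temp_kelvin):
    """Wien's displacement law: wavelength (in nm) at which a blackbody
    at the given temperature emits most strongly."""
    WIEN_B = 2.898e-3  # Wien's displacement constant, m·K
    return WIEN_B / temp_kelvin * 1e9  # metres -> nanometres


# a surface near ambient temperature (~300 K) peaks deep in the
# thermal infrared, while a much hotter source (e.g. ~1200 K)
# peaks at a far shorter wavelength
ambient_peak = peak_wavelength_nm(300.0)   # ~9660 nm
hot_peak = peak_wavelength_nm(1200.0)      # ~2415 nm
```

The hotter source’s emission peak shifts toward the VNIR end of the spectrum, which is the intuition behind associating increased pixel brightness with increased temperature.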

Comparison of Damage Analysis: Sentinel 2 vs. UNOSAT

The final map may be slightly misleading at first view due to other natural phenomena. For instance, in the top-right section of the upper image, the change (red color) is most likely due to a change in moisture levels in the field, not targeted strikes. The near-infrared band is sensitive to changes in temperature, so weather can also influence the results, as can the time of day at which the satellite images were acquired; this is why we want to validate our results. For validation, targeted buildings (shown in black circles), as reported by independent news media, are identified. Additionally, the independently conducted UNITAR-UNOSAT damage analysis of the region (bottom half of the figure) serves as a key indicator for assessing the accuracy of the Sentinel 2-derived results.

Further verification can be done using Sentinel 1 SAR data by checking whether the coherence, i.e., the correlation between the pre- and post-event images, changes or not. Any structural damage to buildings and streets can be localized in this manner as well. The SAR approach filters out the natural phenomena (such as vegetation and moisture changes) that might have shown up in the infrared band of Sentinel 2.
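For reference, coherence is the normalized cross-correlation of two co-registered complex SAR images. A minimal numpy sketch with synthetic data (not actual Sentinel 1 processing, which would require co-registration and windowed estimation):

```python
import numpy as np


def coherence(s1, s2):
    """Coherence between two co-registered complex SAR images:
    |<s1 s2*>| / sqrt(<|s1|^2> <|s2|^2>), a value in [0, 1].
    Intact structures stay coherent between passes; collapsed
    ones decorrelate."""
    num = np.abs(np.mean(s1 * np.conj(s2)))
    den = np.sqrt(np.mean(np.abs(s1) ** 2) * np.mean(np.abs(s2) ** 2))
    return num / den


rng = np.random.default_rng(0)
stable = rng.normal(size=100) + 1j * rng.normal(size=100)
unchanged = coherence(stable, stable)  # identical scatterers -> 1.0

# simulate a damaged area: scatterers rearranged into unrelated noise
noise = rng.normal(size=100) + 1j * rng.normal(size=100)
damaged = coherence(stable, noise)     # decorrelated -> near 0
```

Thresholding a coherence map (low coherence = likely change) is the standard way this is turned into a damage proxy.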

Conclusion

On May 25, 2021, the United Nations Institute for Training and Research Operational Satellite Applications Programme (UNOSAT) released a map of damage hotspots in North Gaza detected using Pléiades satellite imagery. In this study, we were able to use Sentinel 2 (much lower resolution than Pléiades) to detect hotspots where structural damage may have occurred. Reportedly targeted buildings are identified reasonably well using Sentinel 2 data. Overall, this analysis highlights the applicability of open-source earth observation satellite data for examining structural damage associated with natural and man-made disasters.


Geoscientist and data science enthusiast interested in applied use of remote sensing satellites.