
Exploring Wilderness Using Explainable Machine Learning in Satellite Imagery

This code accompanies the research article "Exploring Wilderness Using Explainable Machine Learning in Satellite Imagery" (2022) by Timo T. Stomberg, Taylor Stone, Johannes Leonhardt, Immanuel Weber, and Ribana Roscher; https://doi.org/10.48550/arXiv.2203.00379.

The AnthroProtect dataset with which the code has been tested can be found here: http://rs.ipb.uni-bonn.de/data/anthroprotect .

The trained model used for our publication can be downloaded here: http://rs.ipb.uni-bonn.de/downloads/asos/ .

Please cite our article if you use this code, the model, or the dataset.




This readme file is structured as follows:

  • Code Structure
  • Summary AnthroProtect Dataset
  • Summary Activation Space Occlusion Sensitivity (ASOS)
  • Setup and Requirements
  • Getting Started: Easily Predict a Sensitivity Map using a Trained Model
  • Train Your Own Model
  • Export Your Own Data

Code Structure

Within the "projects" folder there are two subfolders:

  • anthroprotect: the code used to export and preprocess the AnthroProtect dataset.
  • asos: the code used to apply the Activation Space Occlusion Sensitivity (ASOS) methodology to the dataset.

Please open the file "projects/main_config.py" and set the configurations as described in this file.

The projects' code is based on the following four sub-libraries within the library tlib:

  • tgeo: tools for Google Earth Engine, GeoTIFF files, KML files, etc.
  • tlearn: machine learning tools
  • ttorch: machine learning tools for PyTorch
  • tutils: basic utils

Summary AnthroProtect Dataset

The AnthroProtect dataset was built to investigate the appearance of wilderness and anthropogenic areas in Fennoscandia using multispectral satellite imagery. It consists of 23,919 Sentinel-2 images.

Dataset Locations:

Click here for a detailed map: http://rs.ipb.uni-bonn.de/html/anthroprotect_datset_locations.html [Plotly Technologies Inc. (2015), Carto and OpenStreetMap contributors]

Dataset Samples:

[Copernicus Sentinel data 2020.]

Summary Activation Space Occlusion Sensitivity (ASOS)

We use a neural network (NN) consisting of a U-Net and a classifier and train it using the AnthroProtect dataset:



With the trained NN, we predict activation maps and define an activation space:



Within the activation space, regions are semantically arranged, which allows a fundamental sensitivity analysis:



This way, we are able to predict sensitivity maps in any region:
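For illustration, the following is a minimal PyTorch sketch of this architecture idea: a small U-Net-style network that outputs an activation map, followed by a pooling classifier. It is not the model from the paper; the channel sizes, the 10-band input, and the average-pooling classifier are assumptions made for the sketch.

import torch
import torch.nn as nn

class ConvBlock(nn.Sequential):
    def __init__(self, in_ch, out_ch):
        super().__init__(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        )

class TinyUNetClassifier(nn.Module):
    # U-Net-style encoder/decoder producing an activation map, plus a classifier
    # that pools the map into an image-level score.
    def __init__(self, in_channels=10, act_channels=1):
        super().__init__()
        self.enc1 = ConvBlock(in_channels, 32)
        self.enc2 = ConvBlock(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = ConvBlock(64, 32)
        self.head = nn.Conv2d(32, act_channels, 1)  # produces the activation map
        self.classifier = nn.AdaptiveAvgPool2d(1)   # pools the map to an image-level score

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        activation_map = self.head(d1)
        score = self.classifier(activation_map).flatten(1)
        return activation_map, score

model = TinyUNetClassifier()
amap, score = model(torch.randn(2, 10, 256, 256))  # 2 images, 10 bands, 256 x 256 px
print(amap.shape, score.shape)  # (2, 1, 256, 256) and (2, 1)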

Setup and Requirements

Please download or clone this repository to your machine. Use the environment.yml file to set up a virtual environment with Anaconda, or use the requirements.txt file to set up a virtual environment with pip.

If you work on an Ubuntu system and use Anaconda, you can easily use the setup.sh file to set up everything including the Python paths. Just run the following command in your terminal:

source setup.sh

Enter "1) install venv from yml". Next time you can enter "2) activate venv".

Before you run setup.sh for the first time, make sure that you have installed Anaconda and the following packages, or run the following lines:

sudo apt install python3.8-venv python3-pip
sudo apt-get install libgl1-mesa-glx libegl1-mesa libxrandr2 libxss1 libxcursor1 libxcomposite1 libasound2 libxi6 libxtst6
sudo apt update
sudo apt upgrade

You can also install this repository into your own project using pip. This way you can access the whole tlib library, but not the Jupyter notebooks that reproduce our work. To install the repository, you have to install gdal and earthengine-api first:

conda install gdal
conda install -c conda-forge earthengine-api
pip install git+https://gitlab.jsc.fz-juelich.de/kiste/asos@main

After installing the repository, you can import it as tlib:

import tlib
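
Assuming the four sub-libraries listed in "Code Structure" are regular Python subpackages of tlib (an assumption based on that description), they can also be imported directly:

from tlib import tgeo, tlearn, ttorch, tutils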

Getting Started: Easily Predict a Sensitivity Map using a Trained Model

  • Download the AnthroProtect dataset (you find the link at the beginning of this readme) and unzip it.
  • Download the trained model (the link is also at the beginning of this readme). Unzip it and place the "logs" folder in a working directory of your choice. Other files (figures etc.) may be saved to this working directory later on.
  • Please open the file "projects/main_config.py" and set the configurations as described in this file.

  • Set up the repository as described in "Setup and Requirements", e.g. by running source setup.sh in your terminal.
  • Open Jupyter Lab or Jupyter Notebook, e.g. by running jupyter lab or jupyter notebook in your terminal.

If you have a Google Earth Engine account, you can predict a sensitivity map of any region of your choice:

  • Open the notebook "projects/asos/35_analyze_any_region.ipynb". Run the cells and follow the descriptions.

Otherwise (or additionally):

  • Open the notebook "projects/asos/34_analyze_samples.ipynb". You have several options to load the data and to plot them.

Train Your Own Model

  • Run python projects/asos/21_train.py in your terminal to start training a new model. You can set hyperparameters etc. in this file and in the projects/asos/config.py file.
  • During training, you can view the logs in TensorBoard by running tensorboard --logdir working_dir/ttorch_logs, where working_dir is your working directory.
  • After training, move the folder "version_x" from "ttorch_logs" to your working directory and rename it to "logs". You can remove "ttorch_logs" after that.
  • Open the "31_asos.ipynb" notebook to run ASOS. Note that this might take significantly longer than training the model.
  • With the subsequent notebooks (e.g. "34_analyze_samples.ipynb" and "35_analyze_any_region.ipynb") you can analyze the results.

Export Your Own Data

If you want to export your own data, have a look at the folder "projects/anthroprotect". You can define parameters such as the countries in the config.py file. Run the notebooks in the given order.

The workflow of the Google Earth Engine data export is sketched in the following diagram:
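
Complementary to that diagram, a minimal earthengine-api sketch of such an export could look as follows. The collection, region, date range, band selection, and export settings are illustrative assumptions and not the settings used to build AnthroProtect; the actual export is implemented in the notebooks of "projects/anthroprotect".

import ee

ee.Initialize()

# illustrative region and time range (not the AnthroProtect settings)
region = ee.Geometry.Rectangle([24.0, 66.0, 24.5, 66.3])

collection = (
    ee.ImageCollection("COPERNICUS/S2_SR")
    .filterBounds(region)
    .filterDate("2020-06-01", "2020-09-01")
    .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 10))
)
image = collection.median().select(["B2", "B3", "B4", "B8"])

task = ee.batch.Export.image.toDrive(
    image=image,
    description="example_export",
    region=region,
    scale=10,        # meters per pixel for these Sentinel-2 bands
    maxPixels=1e9,
)
task.start()  # monitor progress in the Earth Engine task manager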