From f659d9b270c20b73a1aa276548ab2b480e4bd8e6 Mon Sep 17 00:00:00 2001
From: Timo Tjaden Stomberg <timo.stomberg@uni-bonn.de>
Date: Tue, 6 Dec 2022 09:47:15 +0100
Subject: [PATCH] updated readme

---
 README.md | 65 ++++++++++++++++++++++++++++++++++++++++++++++++++-----
 1 file changed, 60 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 09cdbe4..c7ed430 100644
--- a/README.md
+++ b/README.md
@@ -14,14 +14,71 @@ The **trained model** used for our publication can be downloaded here: **http://
 
 **Content of readme:**
 
+ - **Include the Model in Your Own Project**
  - Code Structure
  - Summary AnthroProtect Dataset
  - Summary Activation Space Occlusion Sensitivity (ASOS)
- - **Setup and Requirements**
+ - **Set Up This Repository**
  - **Getting Started: Easily Predict a Sensitivity Map using a Trained Model**
  - Train Your Own Model
  - Export Your Own Data
 
+## Include the Model in Your Own Project
+
+Please download the AnthroProtect dataset and the trained model from the links mentioned above.
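+
+The examples below assume a directory layout roughly like the following; the paths are examples only, so adjust them to wherever you placed the dataset and the trained model:
+
+```console
+~/data/anthroprotect/tiles/s2    # Sentinel-2 tiles of the AnthroProtect dataset
+~/working_dir/logs               # log directory with the downloaded trained model (assumed location)
+```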
+
+Set up a conda environment, if you have not done so already:
+
+```console
+$ conda create --name asos python=3.9
+$ conda activate asos
+```
+
+Install the following packages:
+
+```console
+$ conda install -c conda-forge earthengine-api
+$ pip install git+https://gitlab.jsc.fz-juelich.de/kiste/asos@main
+```
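+
+As an optional sanity check, you can verify that the library is importable from the new environment (this simply prints where the package was installed):
+
+```console
+$ python -c "import tlib; print(tlib.__file__)"
+```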
+
+You can now load the model and the datamodule (datasets and dataloaders) using the following `load_trainer()` function:
+
+```python
+import os
+
+import torch
+import tqdm
+import tlib
+
+def load_trainer(data_path, log_path, device='cuda'):
+
+    return tlib.ttorch.train.load_trainer(
+        log_dir=log_path,
+        datamodule_folder=os.path.join(data_path, 'tiles', 's2'),
+        device=device,
+    )
+
+if __name__ == '__main__':
+
+    # please change the data_path and the log_path accordingly:
+    data_path = os.path.expanduser('~/data/anthroprotect')
+    log_path = os.path.expanduser('~/working_dir/logs')
+
+    # load trainer
+    trainer = load_trainer(data_path=data_path, log_path=log_path)
+    model = trainer.model
+    datamodule = trainer.datamodule
+
+    # get test dataset and test dataloader (works also with train and val)
+    dataset = datamodule.test_dataset
+    dataloader = datamodule.get_dataloader('test')
+    
+    # predict
+    with torch.no_grad():
+        for batch in tqdm.tqdm(dataloader):
+
+            xs, ys, files = batch['x'], batch['y'], batch['file']
+            preds = model(xs)
+            for i in range(len(xs)):
+                print(f'label: {ys[i].item()}, prediction: {preds[i].item()}')
+```
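+
+The script above uses the device configured in `load_trainer()` (default `'cuda'`). If you work on a machine without a GPU, you can pass `device='cpu'` instead. The following sketch is a minimal variation of the prediction loop that handles the device explicitly; it reuses `model` and `dataloader` from the script above and only assumes that `model` behaves like a regular `torch.nn.Module`:
+
+```python
+import torch
+import tqdm
+
+# pick the GPU if one is available, otherwise fall back to the CPU
+device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
+
+model = model.to(device)
+model.eval()  # make sure dropout/batch norm layers run in inference mode
+
+with torch.no_grad():
+    for batch in tqdm.tqdm(dataloader):
+        xs = batch['x'].to(device)   # move the input batch to the chosen device
+        preds = model(xs).cpu()      # forward pass, then bring the predictions back to the CPU
+        for y, pred in zip(batch['y'], preds):
+            print(f'label: {y.item()}, prediction: {pred.item()}')
+```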
+
 ## Code Structure
 
 Within the **"tjects" folder** there are two subfolders:
@@ -29,9 +86,7 @@ Within the **"tjects" folder** there are two subfolders:
  - **data_processing/anthroprotect:** With this code, the AnthroProtect dataset has been exported and preprocessed.
  - **experiments/asos:** With this code, the methodology Activation Space Occlusion Sensitivity (ASOS) has been applied to the dataset.
  
-Please open the file "tjects/**main_config.py**" and set the configurations as described in this file.
- 
-The tjects' code is based on the following four sub-libraries within the library tlib:
+The code in the "tjects" folder is based on the following four sub-libraries within the library **tlib**:
 
  - **tgeo:** tools for Google Earth Engine, GeoTIFF files, KML files, etc.
  - **tlearn:** machine learning tools
@@ -77,7 +132,7 @@ This way, we are able to predict **sensitivity maps** in any region:
 <img src="readme/inv_letsi.png">
 
 
-## Setup and Requirements
+## Set Up This Repository
 
 Please download or clone this repository to your machine. Use the environment.yml file to set up a virtual environment with Anaconda, or use the requirements.txt file to set up a virtual environment with pip.
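+
+For example, a minimal sketch of the Anaconda route could look as follows (the clone URL follows the pip command above; the environment name is whatever environment.yml defines, here assumed to be asos):
+
+```console
+$ git clone https://gitlab.jsc.fz-juelich.de/kiste/asos.git
+$ cd asos
+$ conda env create -f environment.yml   # creates the environment defined in environment.yml
+$ conda activate asos                   # assumed environment name; check environment.yml
+```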
 
-- 
GitLab