XAirQ

Explainable machine learning for air quality

Structure of this repository

  • The root directory contains prepare.sh, LICENSE.md, .gitignore, etc.
  • setup contains requirements files and HPC modules
  • source contains all Python scripts, ordered by topic (preprocessing, models, ...)
  • test contains tests for the Python scripts
  • resources contains descriptive .csv files needed for the analysis
  • jupyter contains Jupyter notebooks for full workflows
  • doc contains documentation materials

Prerequisites

  • Python version >= 3.8 with the virtualenv package (a manual setup sketch follows this list)
  • Debian packages: PROJ, GEOS, and graphviz. Install them on Ubuntu 20.04 with sudo apt-get install libproj-dev proj-data proj-bin; sudo apt-get install libgeos-dev; sudo apt install graphviz. If you use a conda distribution, these packages should already be installed.
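
If you prefer not to use prepare.sh, a minimal sketch of a manual environment setup is shown below. The virtual environment name and the exact requirements file name are assumptions, so check the setup directory for the actual file names; source prepare.sh remains the supported way to set up the environment.

    # Manual setup sketch (the venv name and requirements file name are assumptions)
    python3 -m venv venv_xairq
    source venv_xairq/bin/activate
    pip install -r setup/requirements.txt   # pick the matching file from setup/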

Getting started on your own machine

  • Clone the repository to JUWELS or your own Linux machine
  • Run source prepare.sh to activate the Python environment
  • Run ./run.sh to start scripts. Currently, only the "test" option is available here. In source/models/ there are some models you can run directly, e.g. with python random_forest.py (see the sketch below).
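
Taken together, a typical first session on your own machine might look like the sketch below; the repository URL placeholder and the checkout directory name are not given in this README and only illustrate the order of the commands.

    # Example session (clone URL and directory name are placeholders)
    git clone <repository URL> xairq
    cd xairq
    source prepare.sh          # activate the Python environment
    ./run.sh                   # choose the "test" option when prompted
    cd source/models
    python random_forest.py    # run one of the available models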

Code style

We try to write in PEP 8 style!

  • Take a look at the PEP 8 convention
  • Start the analysis with ./run.sh and choose the pep8 option
  • Enter your name so that all the scripts you wrote are checked
  • Be sure to state an __author__ in the scripts you wish to check (see the example below).
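
For reference, a minimal script header with the required author declaration, and a manual spot check of a single file, could look like the sketch below. The pycodestyle tool implements the PEP 8 checks; whether it is installed in the project environment is an assumption, and ./run.sh with the pep8 option remains the supported workflow.

    # A script is only picked up by the check if it declares its author near the top, e.g.:
    #   __author__ = "Your Name"
    # Optional manual spot check of a single script (pycodestyle assumed to be available):
    pycodestyle source/models/random_forest.py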

Getting started on JUWELS

  • Clone the repository to JUWELS
  • Run source prepare.sh to create the Python HPC environment for JUWELS
  • Go to /ozone-interpolation/hpc_scripts and submit jobs with sbatch train_xxx_.sh (currently only Random Forest and Neural Network); see the sketch below.
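
Put together, a JUWELS session might look like the sketch below. The clone URL is a placeholder, train_xxx_.sh stands for one of the provided job scripts, and squeue -u $USER is simply the standard Slurm command for checking job status.

    # Example JUWELS session (clone URL is a placeholder; pick an actual train_*.sh script)
    git clone <repository URL>
    source prepare.sh                    # create the Python HPC environment
    cd /ozone-interpolation/hpc_scripts  # path as given above
    sbatch train_xxx_.sh                 # submit a Random Forest or Neural Network job
    squeue -u $USER                      # monitor the submitted job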

Authors

  • Scarlet Stadtler
  • Clara Betancourt

License

This project is licensed under the MIT License.