Commit 65a38afc authored by Felix Kleinert


Merge branch 'felix_issue106_HPC_modules_for_JUWELS' of ssh://gitlab.version.fz-juelich.de:10022/toar/machinelearningtools into felix_issue106_HPC_modules_for_JUWELS
parents 725f1f35 6a6b3e9c
3 merge requests: !125 Release v0.10.0, !124 Update Master to new version v0.10.0, !97 Felix issue106 HPC modules for JUWELS
Pipeline #36389 passed
@@ -14,17 +14,19 @@ and [Network In Network (Lin et al., 2014)](https://arxiv.org/abs/1312.4400).
* Install __proj__ on your machine using the console. E.g. for opensuse / leap `zypper install proj`
* c++ compiler required for cartopy installation
-## HPC - JUWELS
+## HPC - JUWELS and HDFML
+The following instructions guide you through the installation on JUWELS and HDFML.
* Clone the repo to the HPC system (we recommend placing it in `/p/projects/<project name>`).
-* Setup venv by executing `source setup_venv.sh`. This script loads all pre-installed modules and creates a venv for
-all other packages. Furthermore, it creates two slurm/batch scripts to execute code on compute nodes. <br>
+* Setup venv by executing `source setupHPC.sh`. This script loads all pre-installed modules and creates a venv for
+all other packages. Furthermore, it creates slurm/batch scripts to execute code on compute nodes. <br>
You have to enter the HPC project's budget name (--account flag).
-* The default external data path on JUWELS is set to `/p/project/deepacf/intelliaq/<user>/DATA/toar_<sampling>`.
+* The default external data path on JUWELS and HDFML is set to `/p/project/deepacf/intelliaq/<user>/DATA/toar_<sampling>`.
<br>To choose a different location open `run.py` and add the following keyword argument to `ExperimentSetup`:
`data_path=<your>/<custom>/<path>`.
* Execute `python run.py` on a login node to download example data. The program will throw an OSError after downloading.
-* Execute `sbatch run_develgpus.bash` to verify that the setup went well.
-* Currently cartopy is not working, therefore PlotStations does not create any output.
+* Execute either `sbatch run_juwels_develgpus.bash` or `sbatch run_hdfml_batch.bash` to verify that the setup went well.
+* Currently cartopy is not working on our HPC systems, therefore PlotStations does not create any output.
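As a sketch of the default data path convention above (the user `jdoe` and sampling `hourly` below are purely illustrative values, not defaults from the repository):

```shell
# Illustrative expansion of the default external data path template.
user="jdoe"
sampling="hourly"
default_path="/p/project/deepacf/intelliaq/${user}/DATA/toar_${sampling}"
echo "$default_path"   # /p/project/deepacf/intelliaq/jdoe/DATA/toar_hourly
```

A different location is selected, as noted above, by passing `data_path=<your>/<custom>/<path>` to `ExperimentSetup` in `run.py`.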
# Security
* To use hourly data from ToarDB via JOIN interface, a private token is required. Request your personal access token and
......
#!/bin/bash
# __author__ = Felix Kleinert
# __date__ = '2020-04-06'
# This script loads the required modules for mlt which are available on JUWELS.
# Note that some other packages have to be installed into a venv (see setup_venv.sh).
module --force purge
module use $OTHERSTAGES
ml Stages/Devel-2019a
ml GCCcore/.8.3.0
ml Jupyter/2019a-Python-3.6.8
ml Python/3.6.8
ml TensorFlow/1.13.1-GPU-Python-3.6.8
ml Keras/2.2.4-GPU-Python-3.6.8
ml SciPy-Stack/2019a-Python-3.6.8
ml dask/1.1.5-Python-3.6.8
ml GEOS/3.7.1-Python-3.6.8
ml Graphviz/2.40.1
\ No newline at end of file
@@ -14,8 +14,8 @@ else
exit
fi
-echo "execute: HPC_setup/setup_venv_${hpcsys}.sh $basepath/$settingpath"
-source HPC_setup/setup_venv_${hpcsys}.sh $basepath/$settingpath
+echo "execute: HPC_setup/setup_venv_${hpcsys}.sh $basepath$settingpath"
+source HPC_setup/setup_venv_${hpcsys}.sh $basepath$settingpath
echo "execute: HPC_setup/create_runscripts_HPC.sh $hpcsys $basepath"
source HPC_setup/create_runscripts_HPC.sh $hpcsys $basepath
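The change above drops the `/` between `$basepath` and `$settingpath`. A minimal sketch of why, assuming `settingpath` already carries a leading slash (the example values below are hypothetical, not taken from the script):

```shell
basepath="/p/project/deepacf/mlt"       # hypothetical value
settingpath="/HPC_setup/settings.sh"    # hypothetical value; note leading slash
joined_old="$basepath/$settingpath"     # extra separator yields a double slash
joined_new="$basepath$settingpath"      # plain concatenation yields a clean path
echo "$joined_old"   # /p/project/deepacf/mlt//HPC_setup/settings.sh
echo "$joined_new"   # /p/project/deepacf/mlt/HPC_setup/settings.sh
```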
......
#!/bin/bash
# __author__ = Felix Kleinert
# __date__ = '2020-04-06'
# This script creates a virtual env containing all modules which are not available via slurm/easybuild (see mlt_modules.sh)
# load existing modules
source mlt_modules.sh
# create venv
python3 -m venv venv
source venv/bin/activate
# export path for site-packages
export PYTHONPATH=${PWD}/venv/lib/python3.6/site-packages:${PYTHONPATH}
pip install -r requirements_JUWELS_outcommented.txt
pip install --ignore-installed matplotlib==3.2.0
# Comment: Maybe we have to export PYTHONPATH a second time after activating the venv (after job allocation)
# source venv/bin/activate
# alloc_develgpu
# source venv/bin/activate
# export PYTHONPATH=${PWD}/venv/lib/python3.6/site-packages:${PYTHONPATH}
# srun python run.py
# create batch run scripts
source create_runscripts_HPC.sh
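The `PYTHONPATH` export above can be sketched in isolation: prepending means the venv's site-packages directory is searched before the pre-installed module packages (the `python3.6` segment matches the modules loaded in `mlt_modules.sh`):

```shell
# Prepend the venv's site-packages so pip-installed packages shadow the
# pre-installed HPC modules; ${PYTHONPATH%%:*} extracts the first entry.
export PYTHONPATH="${PWD}/venv/lib/python3.6/site-packages:${PYTHONPATH}"
echo "${PYTHONPATH%%:*}"
```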