Repository graph

Branches and tags:
  • 313-allow-non-monotonic-window-lead-times-in-helpers-statistics-py-ahead_names-definition
  • 414-include-crps-analysis-and-other-ens-verif-methods-or-plots
  • IntelliO3-ts
  • Kleinert_et_al_2022_Representing
  • develop protected
  • develop_IntelliO3-ts
  • enxhi_issue460_remove_TOAR-I_access
  • falco_issue271_feat_add-cdc-database-datahandler
  • felix_PaperVersion
  • felix_TF2_merge_test_sectorial_skill_scores
  • felix_issue267-create-wrf-chem-data-handler
  • felix_issue287_tech-wrf-datahandler-should-inherit-from-singlestationdatahandler
  • felix_issue303-wrf-dh-major-minor
  • felix_issue319-toar-statistics4wrf-dh
  • felix_issue400_feat-include-tf-probability
  • felix_issue411-feat_include-postprocessing-for-bnns
  • felix_issue412-create-ens-predictions-for-bnns
  • felix_issue420-datahandler-with-multiple-stats-per-variable
  • felix_issue_365-tech-set-ci-cd-artifact-expire-time
  • lukas_issue363_feat_calculate-toar-metrics-on-hourly-forecasts
  • v2.4.0 protected
  • v2.3.0 protected
  • v2.2.0 protected
  • v2.1.0 protected
  • Kleinert_etal_2022_initial_submission
  • v2.0.0 protected
  • v1.5.0 protected
  • v1.4.0 protected
  • v1.3.0 protected
  • v1.2.1 protected
  • v1.2.0 protected
  • v1.1.0 protected
  • IntelliO3-ts-v1.0_R1-submit
  • v1.0.0 protected
  • v0.12.2 protected
  • v0.12.1 protected
  • v0.12.0 protected
  • v0.11.0 protected
  • v0.10.0 protected
  • IntelliO3-ts-v1.0_initial-submit
Commit history (31 Mar – 17 Nov, newest first):
  • fixed: if no lazy_preprocessed_data is found, make_input_target() is executed
  • update ready to run test
  • running load_lazy_preprocessed_data excludes lazy_preprocessing, because loading of input and target is not necessary then
  • update test for train resuming
  • parser args are popped after evaluation, because experiment_date and experiment_name are passed via parser_args into the Workflow()
  • added n_epochs to parser arguments and transformation_input_target_dict as global var
  • parser arguments are not handled correctly in main()
  • added window_history_size as parser argument
  • added lazy preprocessing and pca reduction as parser arguments for sbatch
  • fix for uncompiled model when resuming training from an epoch differing from 0
  • added missing packages, try on other systems
  • separate directory for lazy_preprocessed_data, now initialized in CDCDataHandler init
  • Merge branch 'develop' into 'lukas_issue331_test_tf2_on_local_system_and_hdfml'
  • Merge branch 'master' into 'develop'
  • Merge branch '331-upgrade-code-to-tensorflow-v2' into 'lukas_issue331_test_tf2_on_local_system_and_hdfml'
  • explanation on lazy_preprocessed and lazy_data difference and more accurate names for functions
  • explanation on lazy_preprocessed and lazy_data difference and more accurate names for functions
  • transformation_input_dict = None if do_pca_reduction = True, so input does not get transformed when pca reduction is already applied
  • lazy_preprocessing enabled again
  • train_start & _end removed in run_mtfalco.py, because it should not be edited manually (unless val & test are edited as well)
  • client.close() in pca_reduction
  • fix in training_monitoring.py if validation is not available
  • NUMEXPR_MAX_THREADS set manually to 80 in run_mtfalco.py at the very beginning, so it does not drop to 8 (numexpr error)
  • do_pca_reduction and n_pca_dimensions added to the hashes and sorted them for a better overview
  • raise ValueError in remove_nan() if intersection reaches min_length (with default=0, it fails only if there is no intersection); option to disable pca_use_dask in run script
  • multiplied target variables occurrence fixed, pca_reduction should work now (get_transposed_history fixed), start and end date are parsed as arguments
  • Merge branch 'release_v1.5.0' into 'master'
  • v1.5.0 (tag)
  • release description is now updated
  • updated version number, changelog, readme and docs and added dist file
  • Merge branch 'develop' into 'release_v1.5.0'
  • Merge branch 'falco_issue323-raise-valueerror-in-remove_nan-if-data-is-nan-only' into 'develop'
  • use dim name instead of pos to be independent of dim ordering
  • added use_dask with default 'False' for pca reduction
  • update MyUnet model
  • print statements for version testing in model class
  • update UNet to TF2
  • do_pca_reduction enabled by default and printed in main and logging.info with duration
  • merged latest 331-upgrade-code-to-tensorflow-v2 changes
  • dask==2021.3.0, because afterwards support for Python 3.6 was dropped
  • changed dask version, hopefully importable in GitLab now
  • calculate_pca_reduction() in data handler changed to a multiprocessing workflow using dask with calculation steps at every 1e3th value
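The NUMEXPR_MAX_THREADS commit above relies on the fact that numexpr reads this environment variable only when it is first imported, falling back to a small thread count if CPU detection fails. A minimal sketch of that fix, assuming the placement at the very top of a run script such as run_mtfalco.py (the value 80 is taken from the commit message; the script name is this project's):

```python
import os

# Must be set BEFORE numexpr is imported for the first time (directly, or
# indirectly via pandas/xarray): numexpr reads NUMEXPR_MAX_THREADS only at
# import time and can otherwise fall back to a low default such as 8.
os.environ["NUMEXPR_MAX_THREADS"] = "80"

# Any subsequent `import numexpr` in the workflow now honors the override.
print(os.environ["NUMEXPR_MAX_THREADS"])
```

If numexpr (or a library that imports it) has already been loaded earlier in the process, setting the variable has no effect, which is why the commit places it "at the very beginning" of the script.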