Ingestion workflow adapted for data that does not pass quality control
During data ingestion from different networks, some data does not pass the automated quality control.
This is logged, but the data is then simply skipped and the workflow moves on to the next time series.
For some data collections, this means that the data has to be tediously re-extracted from the original source.
The automated quality control can also operate on data in the database (and, on top of that, produce useful plots).
Therefore, before the data (time series and station) is skipped, it should be written to the staging "database".
This is also good preparation for individual data deliveries, which should always be inserted into the staging database first.
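The adapted ingestion loop could look roughly like the following sketch. All names here (`run_quality_control`, `insert_productive`, `insert_staging`, the shape of a time series record) are placeholders, not the actual workflow code:

```python
# Hypothetical sketch of the adapted ingestion loop: instead of only logging
# and skipping a failed series, it is routed to the staging schema.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingestion")

def ingest(time_series_batch, run_quality_control, insert_productive, insert_staging):
    """Route each time series to productive or staging depending on QC."""
    for ts in time_series_batch:
        if run_quality_control(ts):
            insert_productive(ts)
        else:
            # Previously the series was only logged and then dropped;
            # now it is kept in staging for later inspection and re-processing.
            log.warning("QC failed for %s; moving to staging", ts["id"])
            insert_staging(ts)
```

The callbacks are passed in only to keep the sketch self-contained; in the real workflow these would be the existing insert routines.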
Attention!
To avoid inconsistent IDs for stations and time series, it is important that the productive and the staging schema share the same sequences for generating ID values.
This can easily be achieved by explicitly using the already existing sequences (see below: example for time series in comment #107 (comment 129078))
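The point of the shared sequence can be illustrated with a small sketch. Here `itertools.count` stands in for the database sequence; the sequence name `productive.time_series_id_seq` and the table definition in the comment are assumed PostgreSQL-style examples, not the actual schema:

```python
# Why one shared sequence avoids ID collisions between schemas.
# In PostgreSQL this would be done in the staging DDL, e.g. (assumed names):
#   CREATE TABLE staging.time_series (
#       id integer PRIMARY KEY DEFAULT nextval('productive.time_series_id_seq'),
#       ...
#   );
# so that staging reuses the already existing productive sequence.
import itertools

shared_sequence = itertools.count(start=1)  # one sequence for both schemas

def insert_time_series(schema_rows):
    """Insert a row into the given schema, drawing the ID from the shared sequence."""
    new_id = next(shared_sequence)
    schema_rows[new_id] = {}
    return new_id
```

Because both schemas draw from the same counter, an ID handed out for a staging row can never be handed out again for a productive row, so records can later be moved from staging to productive without renumbering.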