
    MLAir (v1.0) - Examples

    This notebook contains all examples as provided in Leufen et al. (2020). Please follow the installation instructions provided in the README on GitLab.

    Example 1

    The following cell imports MLAir and executes a minimalistic toy experiment. This cell is equivalent to Figure 2 in the manuscript.

    import mlair
    
    # just give it a dry run without any modifications
    mlair.run()

    Example 2

    In the following cell we use other station IDs provided as a list of strings (see also JOIN-Web interface of the TOAR database for more details). Moreover, we expand the window_history_size to 14 days and run the experiment. This cell is equivalent to Figure 3 in the manuscript.

    # our new stations to use
    stations = ['DEBW030', 'DEBW037', 'DEBW031', 'DEBW015', 'DEBW107']
    
    # expand the temporal context to 14 days (default sampling="daily")
    window_history_size = 14
    
    # restart the experiment with little customisation
    mlair.run(stations=stations, 
              window_history_size=window_history_size)
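    To illustrate what `window_history_size` means conceptually, the following is a minimal plain-Python sketch (illustrative only, not MLAir internals): with daily sampling, a history size of 14 gives each sample the current day plus the 14 preceding daily values as input.

```python
# Illustrative sketch (not MLAir code): a history window of size 14 turns a
# daily time series into overlapping input windows of 15 values each
# (the current day plus the 14 preceding days).

def make_history_windows(series, window_history_size):
    """Return one input window per day that has a full history available."""
    w = window_history_size
    return [series[t - w:t + 1] for t in range(w, len(series))]

daily_values = list(range(20))              # 20 days of toy data
windows = make_history_windows(daily_values, 14)

print(len(windows))       # 6 days have a complete 14-day history
print(len(windows[0]))    # each window holds 15 values
print(windows[0])         # [0, 1, ..., 14]
```

    Days at the start of the series that lack a full 14-day history yield no sample, which is why 20 days of data produce only 6 windows here.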

    Example 3

    The following cell loads the trained model from Example 2 and generates predictions for the two specified stations. To ensure that the model is not retrained, the keywords create_new_model and train_model are set to False. This cell is equivalent to Figure 4 in the manuscript.

    # our new stations to use
    stations = ['DEBY002', 'DEBY079']
    
    # same setting for window_history_size
    window_history_size = 14
    
    # run experiment without training
    mlair.run(stations=stations, 
              window_history_size=window_history_size, 
              create_new_model=False, 
              train_model=False)

    Example 4

    The following cell demonstrates how a user-defined model can be implemented by inheriting from AbstractModelClass. Within the __init__ method, super().__init__, set_model and set_compile_options should be called. Moreover, it is possible to register custom objects by calling set_custom_objects; these custom objects are used when re-loading the model (see also the Keras documentation). For demonstration, the loss is added as a custom object, although this is not required here because a Keras built-in function is used as the loss.

    The Keras-model itself is defined in set_model by using the sequential or functional Keras API. All compile options can be defined in set_compile_options. This cell is equivalent to Figure 5 in the manuscript.

    import keras
    from keras.losses import mean_squared_error as mse
    from keras.layers import PReLU, Input, Conv2D, Flatten, Dropout, Dense
    
    from mlair.model_modules import AbstractModelClass
    from mlair.workflows import DefaultWorkflow
    
    class MyCustomisedModel(AbstractModelClass):
    
        """
        A customised model with a 1x1 Conv, and 2 Dense layers (16, 
        output shape). Dropout is used after Conv layer.
        """
        def __init__(self, input_shape: list, output_shape: list):
        
            # set attributes _input_shape and _output_shape
            super().__init__(input_shape[0], output_shape[0])
    
            # apply to model
            self.set_model()
            self.set_compile_options()
            self.set_custom_objects(loss=self.compile_options['loss'])
    
        def set_model(self):
            x_input = Input(shape=self._input_shape)
            x_in = Conv2D(4, (1, 1))(x_input)
            x_in = PReLU()(x_in)
            x_in = Flatten()(x_in)
            x_in = Dropout(0.1)(x_in)
            x_in = Dense(16)(x_in)
            x_in = PReLU()(x_in)
            x_in = Dense(self._output_shape)(x_in)
            out = PReLU()(x_in)
            self.model = keras.Model(inputs=x_input, outputs=[out])
    
        def set_compile_options(self):
            self.initial_lr = 1e-2
            self.optimizer = keras.optimizers.SGD(lr=self.initial_lr, momentum=0.9)
            self.loss = mse
            self.compile_options = {"metrics": ["mse", "mae"]}
    
    # Make use of MyCustomisedModel within the DefaultWorkflow
    workflow = DefaultWorkflow(model=MyCustomisedModel, epochs=2)
    workflow.run()
    

    Example 5

    Embedding of a custom Run Module in a modified MLAir workflow. In comparison to Examples 1 to 4, this code example works one level of abstraction deeper: instead of calling the run method of MLAir, the user adds all stages individually and is responsible for all dependencies between the stages. Because the Workflow class is used as a context manager, all stages are connected automatically, so that additional stages can easily be plugged in. This cell is equivalent to Figure 6 in the manuscript.

    import logging
    
    class CustomStage(mlair.RunEnvironment):
        """A custom MLAir stage for demonstration."""
        def __init__(self, test_string):
            super().__init__() # always call super init method
            self._run(test_string) # call a class method
            
        def _run(self, test_string):
            logging.info("Just running a custom stage.")
            logging.info("test_string = " + test_string)
            epochs = self.data_store.get("epochs")
            logging.info("epochs = " + str(epochs))
        
        
    # create your custom MLAir workflow
    CustomWorkflow = mlair.Workflow()
    # provide stages without initialisation
    CustomWorkflow.add(mlair.ExperimentSetup, epochs=128)
    # add also keyword arguments for a specific stage
    CustomWorkflow.add(CustomStage, test_string="Hello World")
    # finally execute custom workflow in order of adding
    CustomWorkflow.run()
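    The stage/workflow pattern used above can be sketched in plain Python. The names MiniDataStore, MiniWorkflow, SetupStage and the constructor signatures below are illustrative assumptions, not MLAir's actual implementation: the point is only that stages are registered uninstantiated, share a common data store, and are instantiated in the order they were added.

```python
# Illustrative sketch of the stage/workflow pattern (not MLAir's code):
# stages share a common data store and run in the order of adding.

class MiniDataStore:
    """A minimal key-value store shared between stages."""
    def __init__(self):
        self._store = {}
    def set(self, key, value):
        self._store[key] = value
    def get(self, key):
        return self._store[key]

class MiniWorkflow:
    def __init__(self):
        self._stages = []
        self.data_store = MiniDataStore()
    def add(self, stage_cls, **kwargs):
        # stages are registered uninstantiated, together with their kwargs
        self._stages.append((stage_cls, kwargs))
    def run(self):
        # instantiate (and thereby execute) each stage in adding order
        for stage_cls, kwargs in self._stages:
            stage_cls(self.data_store, **kwargs)

class SetupStage:
    """Plays the role of ExperimentSetup: fills the data store."""
    def __init__(self, data_store, epochs):
        data_store.set("epochs", epochs)

class CustomStage:
    """Plays the role of the custom stage: reads from the data store."""
    def __init__(self, data_store, test_string):
        print("test_string =", test_string)
        print("epochs =", data_store.get("epochs"))

wf = MiniWorkflow()
wf.add(SetupStage, epochs=128)
wf.add(CustomStage, test_string="Hello World")
wf.run()   # prints: test_string = Hello World / epochs = 128
```

    Because every stage receives the same data store, a later stage can read values (such as epochs) that an earlier stage deposited, which is exactly the dependency mechanism the CustomStage above relies on.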