MLAir commit 319a0af9, authored 4 years ago by Felix Kleinert
upadet examples
parent 0d037f9d
4 merge requests: !156 include current development into release, !155 Resolve "new release v0.12.1", !152 Resolve "Update README HPC setup", !139 Draft: Resolve "KZ filter"
Pipeline #46311 passed 4 years ago (stages: test, docs, pages, deploy)
Showing 1 changed file: README.md with 39 additions and 25 deletions
...
...
@@ -29,6 +29,7 @@ install the geo packages. For special instructions to install MLAir on the Jueli
* or download the distribution file (?? .whl) and install it via `pip install <??>`. In this case, you can simply import MLAir in any python script inside your virtual environment using `import mlair`.
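As a quick check that the installation worked, a minimal sketch (assuming only that the package is importable under the name `mlair`, as described above) is:

```python
# run inside the virtual environment where MLAir was installed
import mlair

# printing the module object confirms that the import succeeded
print(mlair)
```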
# How to start with MLAir

In this section, we show three examples of how to work with MLAir. Note that for these examples MLAir was installed using
...
...
@@ -51,11 +52,15 @@ INFO: DefaultWorkflow started
INFO: ExperimentSetup started
INFO: Experiment path is: /home/<usr>/mlair/testrun_network
...
INFO: load data for DEBW001 from JOIN
INFO: load data for DEBW107 from JOIN
INFO: load data for DEBY081 from JOIN
INFO: load data for DEBW013 from JOIN
INFO: load data for DEBW076 from JOIN
INFO: load data for DEBW087 from JOIN
...
INFO: Training started
...
-INFO: DefaultWorkflow finished after 00:00:12 (hh:mm:ss)
+INFO: DefaultWorkflow finished after 0:03:04 (hh:mm:ss)
```
## Example 2
...
...
@@ -82,10 +87,12 @@ INFO: ExperimentSetup started
...
INFO: load data for DEBW030 from JOIN
INFO: load data for DEBW037 from JOIN
INFO: load data for DEBW031 from JOIN
INFO: load data for DEBW015 from JOIN
...
INFO: Training started
...
-INFO: DefaultWorkflow finished after 00:00:24 (hh:mm:ss)
+INFO: DefaultWorkflow finished after 00:02:03 (hh:mm:ss)
```
## Example 3
...
...
@@ -107,15 +114,15 @@ window_history_size = 14
-mlair.run(stations=stations, window_history_size=window_history_size, create_new_model=False, trainable=False)
+mlair.run(stations=stations, window_history_size=window_history_size, create_new_model=False, train_model=False)
```
We can see from the terminal output that no training was performed. The analysis is now carried out on the new stations.
```log
INFO: DefaultWorkflow started
...
-INFO: No training has started, because trainable parameter was false.
+INFO: No training has started, because train_model parameter was false.
...
-INFO: DefaultWorkflow finished after 00:00:06 (hh:mm:ss)
+INFO: DefaultWorkflow finished after 0:01:27 (hh:mm:ss)
```
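Putting the visible pieces of Example 3 together, a self-contained sketch using the renamed `train_model` argument could look like this; the station IDs are placeholders for illustration, while `window_history_size = 14` and the keyword arguments are taken from the diff above:

```python
import mlair

# placeholder station list; the actual stations of this example are not shown in the hunk
stations = ["DEBW030", "DEBW037", "DEBW031", "DEBW015"]

# extended temporal context, as in the context line of the hunk above
window_history_size = 14

# reuse the existing model: neither create a new one nor train it again
mlair.run(stations=stations,
          window_history_size=window_history_size,
          create_new_model=False,
          train_model=False)
```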
...
...
@@ -222,18 +229,14 @@ behaviour.
```python
from mlair import AbstractModelClass
import keras

class MyCustomisedModel(AbstractModelClass):

    def __init__(self, input_shape: list, output_shape: list):

        # set attributes shape_inputs and shape_outputs
        super().__init__(input_shape[0], output_shape[0])

        # settings
        self.dropout_rate = 0.1
        self.activation = keras.layers.PReLU

        # apply to model
        self.set_model()
        self.set_compile_options()
```
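For orientation, a hedged sketch of a manual instantiation of this class; in the default workflow MLAir derives and passes the shapes itself, and the concrete shape values below are invented for illustration:

```python
# hypothetical manual instantiation; the shape tuples are placeholders, not values used by MLAir
my_model = MyCustomisedModel(input_shape=[(15, 1, 5)], output_shape=[(4,)])

# the keras model built in set_model() is stored in the class attribute `model`
my_model.model.summary()
```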
...
...
@@ -254,34 +257,36 @@ class MyCustomisedModel(AbstractModelClass):
`self._output_shape` and storing the model as `self.model`.
```python
import keras
+from keras.layers import PReLU, Input, Conv2D, Flatten, Dropout, Dense

class MyCustomisedModel(AbstractModelClass):

    def set_model(self):
-        x_input = keras.layers.Input(shape=self._input_shape)
-        x_in = keras.layers.Conv2D(32, (1, 1), padding='same', name='{}_Conv_1x1'.format("major"))(x_input)
-        x_in = self.activation(name='{}_conv_act'.format("major"))(x_in)
-        x_in = keras.layers.Flatten(name='{}'.format("major"))(x_in)
-        x_in = keras.layers.Dropout(self.dropout_rate, name='{}_Dropout_1'.format("major"))(x_in)
-        x_in = keras.layers.Dense(16, name='{}_Dense_16'.format("major"))(x_in)
-        x_in = self.activation()(x_in)
-        x_in = keras.layers.Dense(self._output_shape, name='{}_Dense'.format("major"))(x_in)
-        out_main = self.activation()(x_in)
-        self.model = keras.Model(inputs=x_input, outputs=[out_main])
+        x_input = Input(shape=self._input_shape)
+        x_in = Conv2D(4, (1, 1))(x_input)
+        x_in = PReLU()(x_in)
+        x_in = Flatten()(x_in)
+        x_in = Dropout(0.1)(x_in)
+        x_in = Dense(16)(x_in)
+        x_in = PReLU()(x_in)
+        x_in = Dense(self._output_shape)(x_in)
+        out = PReLU()(x_in)
+        self.model = keras.Model(inputs=x_input, outputs=[out])
```
* You are free to design your model however you like. Just make sure to save it in the class attribute `model`.
* Additionally, set your custom compile options including the loss definition.
```python
+from keras.losses import mean_squared_error as mse

class MyCustomisedModel(AbstractModelClass):

    def set_compile_options(self):
        self.initial_lr = 1e-2
        self.optimizer = keras.optimizers.SGD(lr=self.initial_lr, momentum=0.9)
        self.lr_decay = mlair.model_modules.keras_extensions.LearningRateDecay(base_lr=self.initial_lr, drop=.94, epochs_drop=10)
-        self.loss = keras.losses.mean_squared_error
+        self.loss = mse
        self.compile_options = {"metrics": ["mse", "mae"]}
```
...
...
@@ -303,6 +308,15 @@ class MyCustomisedModel(AbstractModelClass):
self.compile_options = {"optimizer": keras.optimizers.Adam()}
```
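If the `compile_options` dictionary mirrors the keyword arguments of Keras' `compile` call, optimizer, loss and metrics could also be bundled in one place; a hedged sketch, not taken verbatim from the README:

```python
# hypothetical alternative inside set_compile_options: collect all compile settings in one dictionary
self.compile_options = {"optimizer": keras.optimizers.Adam(),
                        "loss": keras.losses.mean_squared_error,
                        "metrics": ["mse", "mae"]}
```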
### How to plug the customised model into the workflow?
* Make use of the `model` argument and pass `MyCustomisedModel` when instantiating a workflow.
```python
from mlair.workflows import DefaultWorkflow

workflow = DefaultWorkflow(model=MyCustomisedModel)
workflow.run()
```
## Specials for Branched Models
...
...