MLAir · Commit 08c461c4, authored 5 years ago by Felix Kleinert

update doc strings

parent f512ddb0
No related branches found. No related tags found.

3 merge requests: !125 Release v0.10.0, !124 Update Master to new version v0.10.0, !96 Felix issue114 customise flatten tail
Pipeline #35440 passed 5 years ago (stages: test, pages, deploy)
Showing 1 changed file: src/model_modules/flatten.py (22 additions, 12 deletions)
@@ -12,10 +12,17 @@ def get_activation(input_to_activate: keras.layers, activation: Union[Callable,
     This helper function is able to handle advanced keras activations as well as strings for standard activations

-    :param input_to_activate:
-    :param activation:
+    :param input_to_activate: keras layer to apply activation on
+    :param activation: activation to apply on `input_to_activate'. Can be a standard keras string or an activation layer
     :param kwargs:
     :return:

+    .. code-block:: python
+
+        input_x = ...  # your input data
+        x_in = keras.layer(<without activation>)(input_x)
+        x_act_string = get_activation(x_in, 'relu')
+        x_act_layer = get_activation(x_in, keras.layers.advanced_activations.ELU)
+
     """
     if isinstance(activation, str):
         name = kwargs.pop('name', None)
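The docstring example above is schematic (`keras.layer(<without activation>)` is a placeholder). For orientation, here is a minimal runnable sketch of the same pattern, assuming standalone Keras 2 and that the module is importable as `src.model_modules.flatten`; the input shape, layer size and the `name` keyword value are illustrative only:

    # Hedged sketch of the behaviour described in the docstring above; layer sizes,
    # the input shape and the module import path are assumptions, not taken from MLAir.
    import keras
    from src.model_modules.flatten import get_activation

    input_x = keras.layers.Input(shape=(16,))
    x_in = keras.layers.Dense(8)(input_x)                        # layer built without an activation
    x_act_string = get_activation(x_in, 'relu', name='act_str')  # standard keras activation string
    x_act_layer = get_activation(x_in, keras.layers.advanced_activations.ELU)  # advanced activation class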
@@ -37,16 +44,16 @@ def flatten_tail(input_x: keras.layers, inner_neurons: int, activation: Union[Ca
     """
     Flatten output of convolutional layers

-    :param input_x:
-    :param output_neurons:
-    :param output_activation:
-    :param name:
-    :param bound_weight:
-    :param dropout_rate:
-    :param activation:
-    :param reduction_filter:
-    :param inner_neurons:
-    :param kernel_regularizer:
+    :param input_x: Multidimensional keras layer (ConvLayer)
+    :param output_neurons: Number of neurons in the last layer (must fit the shape of labels)
+    :param output_activation: final activation function
+    :param name: Name of the flatten tail.
+    :param bound_weight: Use `tanh' as inner activation if set to True, otherwise `activation'
+    :param dropout_rate: Dropout rate to be applied between trainable layers
+    :param activation: activation to apply after conv and dense layers
+    :param reduction_filter: number of filters used for information compression on `input_x' before flatten()
+    :param inner_neurons: Number of neurons in inner dense layer
+    :param kernel_regularizer: regularizer to apply on conv and dense layers
     :return:
@@ -60,6 +67,9 @@ def flatten_tail(input_x: keras.layers, inner_neurons: int, activation: Union[Ca
+                           name='Main', bound_weight=False, dropout_rate=.3,
+                           kernel_regularizer=keras.regularizers.l2()
+                           )
         model = keras.Model(inputs=input_x, outputs=[out])
     """
     # compression layer
     if reduction_filter is None:
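Putting the newly documented parameters together, the usage example in the docstring can be read as something like the following sketch. The keyword names come from the parameter list above; the input shape, filter and neuron counts, the activation choice and the import path `src.model_modules.flatten` are assumptions for illustration:

    # Hedged sketch assembling the documented flatten_tail parameters into one call;
    # shapes, counts and the chosen activations are illustrative, not MLAir defaults.
    import keras
    from src.model_modules.flatten import flatten_tail

    input_x = keras.layers.Input(shape=(32, 1, 2))
    conv = keras.layers.Conv2D(8, (3, 1))(input_x)  # multidimensional ConvLayer output
    out = flatten_tail(conv,
                       inner_neurons=64,
                       activation=keras.layers.advanced_activations.ELU,
                       output_neurons=4,
                       output_activation='linear',
                       reduction_filter=32,
                       name='Main',
                       bound_weight=False,
                       dropout_rate=.3,
                       kernel_regularizer=keras.regularizers.l2())
    model = keras.Model(inputs=input_x, outputs=[out])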