Commit 3ace6bbf authored by lukas leufen's avatar lukas leufen
enumeration of layers was not proper: dense layer names did not continue the numbering of the preceding layers

parent 1f11c299
6 merge requests: !430 update recent developments, !413 update release branch, !412 Resolve "release v2.0.0", !390 Lukas issue362 feat branched rnn, !389 Lukas issue361 feat custom dense layers in rnn, !387 Resolve "custom dense layers in rnn"
Pipeline #92226 passed
@@ -139,9 +139,9 @@ class RNN(AbstractModelClass):  # pragma: no cover
         for layer, n_hidden in enumerate(self.dense_layer_configuration):
             if n_hidden < self._output_shape:
                 break
-            x_in = keras.layers.Dense(n_hidden, name=f"Dense_{layer + 1}",
+            x_in = keras.layers.Dense(n_hidden, name=f"Dense_{len(conf) + layer + 1}",
                                       kernel_initializer=self.kernel_initializer, )(x_in)
-            x_in = self.activation(name=f"{self.activation_name}_{layer + 1}")(x_in)
+            x_in = self.activation(name=f"{self.activation_name}_{len(conf) + layer + 1}")(x_in)
             if self.dropout is not None:
                 x_in = self.dropout(self.dropout_rate)(x_in)
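The effect of the fix can be sketched without Keras: when the dense block follows `len(conf)` earlier layers, the bare `enumerate` index restarts at zero, so the generated names collide with the names already used for the preceding layers. Offsetting by `len(conf)` keeps every layer name unique (Keras requires unique layer names within a model). The concrete values of `conf` and `dense_layer_configuration` below are hypothetical placeholders, not taken from the repository.

```python
# Hypothetical configuration: two preceding (e.g. recurrent) layers,
# followed by two dense layers appended afterwards.
conf = [32, 16]
dense_layer_configuration = [10, 5]

# Before the fix: the dense-layer index restarts at 1 and can collide
# with names already assigned to the first len(conf) layers.
names_before = [f"Dense_{layer + 1}"
                for layer, _ in enumerate(dense_layer_configuration)]

# After the fix: the index is offset by len(conf), so numbering
# continues where the earlier layers left off.
names_after = [f"Dense_{len(conf) + layer + 1}"
               for layer, _ in enumerate(dense_layer_configuration)]

print(names_before)  # ['Dense_1', 'Dense_2']
print(names_after)   # ['Dense_3', 'Dense_4']
```

The same offset is applied to the activation layer names (`f"{self.activation_name}_{len(conf) + layer + 1}"`) so that each Dense/activation pair shares a running index.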