Commit d6773ba6 authored by Michael Langguth's avatar Michael Langguth

Merge branch 'michael_issue#110_parse_hyperparameters_to_dataset_instance' into develop

parents 4f406f23 4f5e3a32
  • @langguth1 Now I remember why we use self.model_hparams_dict_load.update({"sequence_length": self.train_dataset.sequence_length}): because we removed the hyperparameter "sequence_length" from the dataset, we fall back to the default sequence_length of the TFRecords. When training the model, we need to pass this information to the model part, so we have to add it back. I think we revised this together :) Should I comment on it? (See the sketch after this thread.)

  • Author Maintainer

    Yes, I also remember now. I've also re-added it in the current working branch #116 (closed), which will (hopefully) make its way to the develop branch in a few minutes.

  • Author Maintainer

    @gong1 I can also add a comment on this if you like. Just to avoid later confusion :-)

  • @langguth1 Cool! Feel free to add a comment there!
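
To make the discussion above concrete, here is a minimal, hedged sketch of re-injecting the dataset's sequence_length into the model hyperparameter dict before the model is built. ToyDataset and ToyTrainer are hypothetical stand-ins for the project's dataset and training driver; only the update() call mirrors the line quoted in the thread.

```python
# Minimal sketch (not the project's actual classes) of why "sequence_length"
# must be copied from the dataset back into the model hyperparameter dict.

class ToyDataset:
    """Hypothetical stand-in for the TFRecord-backed dataset; it owns sequence_length."""
    def __init__(self, sequence_length=20):
        # The dataset falls back to the sequence length of the underlying records.
        self.sequence_length = sequence_length


class ToyTrainer:
    """Hypothetical stand-in for the training driver that assembles the model hparams."""
    def __init__(self, model_hparams_dict_load, train_dataset):
        self.model_hparams_dict_load = dict(model_hparams_dict_load)
        self.train_dataset = train_dataset

    def finalize_hparams(self):
        # "sequence_length" was removed from the dataset hyperparameters, so the
        # model would otherwise never see it; add it back from the dataset here.
        self.model_hparams_dict_load.update(
            {"sequence_length": self.train_dataset.sequence_length}
        )
        return self.model_hparams_dict_load


if __name__ == "__main__":
    trainer = ToyTrainer({"lr": 1e-4, "batch_size": 8}, ToyDataset(sequence_length=20))
    print(trainer.finalize_hparams())
    # -> {'lr': 0.0001, 'batch_size': 8, 'sequence_length': 20}
```

Running the script prints the merged hyperparameter dict, showing that sequence_length is restored from the dataset before the model consumes it.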
