Hi, I'm trying to migrate this sample to TL2.
However, I found in train.py : 116 that the optimizer only updates the Decoder's variables, while the Encoders' weights are restored from .npz files before training and are never updated during training.
To migrate to TL2 without pretrained Encoder weights at hand, I would need to write a model composed of 3 Encoders and 1 Decoder, then configure a tf.keras.optimizers.Adam optimizer to apply gradients to all trainable variables of the model. Is that right?
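For reference, here is a minimal sketch of what I have in mind. The `Encoder`/`Decoder` classes below are hypothetical stand-ins (plain `Dense` layers, not this repo's actual architectures); the point is only the structure: one `tf.keras.Model` wrapping 3 encoders and 1 decoder, with a single Adam optimizer applying gradients to all of the model's trainable variables.

```python
import tensorflow as tf

class Encoder(tf.keras.layers.Layer):
    """Hypothetical placeholder for the real Encoder."""
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(16)

    def call(self, x):
        return self.dense(x)

class Decoder(tf.keras.layers.Layer):
    """Hypothetical placeholder for the real Decoder."""
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(1)

    def call(self, x):
        return self.dense(x)

class FullModel(tf.keras.Model):
    """3 Encoders + 1 Decoder combined into one trainable model."""
    def __init__(self):
        super().__init__()
        self.encoders = [Encoder() for _ in range(3)]
        self.decoder = Decoder()

    def call(self, x):
        # Concatenate the three encoder outputs and decode them.
        feats = tf.concat([enc(x) for enc in self.encoders], axis=-1)
        return self.decoder(feats)

model = FullModel()
optimizer = tf.keras.optimizers.Adam(1e-3)

# One training step: gradients flow to ALL trainable variables,
# encoders included (unlike the original train.py, which only
# optimized the Decoder's variables).
x = tf.random.normal([4, 8])
y = tf.zeros([4, 1])
with tf.GradientTape() as tape:
    loss = tf.reduce_mean(tf.square(model(x) - y))
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))

# 3 encoders + 1 decoder, each contributing a kernel and a bias.
print(len(model.trainable_variables))  # → 8
```

This assumes the TL2 layers expose `trainable_variables` the same way `tf.keras` layers do, so the whole composed model can be trained end to end without the pretrained .npz weights.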