Does the dropout parameter in LeakyParallel do anything? From experimentation, even with the parameter set to 1, the model still learns.
The documentation for PyTorch's RNN, on which LeakyParallel is based, says:
> dropout – If non-zero, introduces a Dropout layer on the outputs of each RNN layer except the last layer, with dropout probability equal to `dropout`. Default: 0
Is the dropout parameter of LeakyParallel ineffective because the module consists of only a single RNN layer, which is therefore also the last layer, so the Dropout is never actually applied?
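
For reference, here is a minimal sketch of the same behavior on a plain single-layer `torch.nn.RNN` (the module this question assumes LeakyParallel wraps internally; that internal detail is an assumption here). PyTorch warns that dropout only applies between stacked recurrent layers, and two training-mode forward passes produce identical outputs even with `dropout=1`:

```python
# Minimal sketch: does dropout do anything on a single-layer RNN?
# Assumes LeakyParallel is built on torch.nn.RNN, as the question suggests.
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(5, 2, 4)  # (time steps, batch, features)

# PyTorch emits a UserWarning here: dropout expects num_layers > 1,
# since it is only applied between stacked recurrent layers.
rnn = nn.RNN(input_size=4, hidden_size=3, num_layers=1, dropout=1.0)
rnn.train()  # dropout would only be active in training mode anyway

out_a, _ = rnn(x)
out_b, _ = rnn(x)

# With a single layer there is no between-layer output to drop, so the
# two forward passes are deterministic and identical:
print(torch.equal(out_a, out_b))  # True
```

If this is right, setting `dropout` on LeakyParallel would be a silent no-op for the same reason.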