Can I finetune the model with Pytorch for "horizon_len" lower than 128? #263
Unanswered
sebassaras02 asked this question in Q&A
Replies: 0 comments
I've been trying to finetune the model on my custom dataset. The dataset contains many time series, and I wrangled the data to use a context_len of 32 (train on the previous 32 values) and a horizon_len of 4 (predict the next 4).
I am getting this error:
"The size of tensor a (128) must match the size of tensor b (4) at non-singleton dimension 1"
That is the full error output. I understand this is a shape mismatch in the loss computation, but I haven't figured out how to fine-tune with a horizon_len smaller than 128.
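The mismatch suggests the model still decodes its default 128-step horizon while the targets only cover 4 steps. If the fine-tuning loop is plain PyTorch, one common workaround is to slice the decoded horizon down to the target horizon before computing the loss. A minimal sketch of that idea, where all shapes and variable names are illustrative and not TimesFM's actual API:

```python
import torch
import torch.nn.functional as F

# Illustrative shapes: the model emits a full 128-step horizon,
# but the labels only cover the first 4 steps.
batch_size, model_horizon, target_horizon = 8, 128, 4

predictions = torch.randn(batch_size, model_horizon)  # stand-in for model output
targets = torch.randn(batch_size, target_horizon)     # ground-truth next 4 values

# Slicing the predictions to the target horizon makes the loss shapes agree,
# so gradients flow only through the horizon steps you actually supervise.
loss = F.mse_loss(predictions[:, :target_horizon], targets)
loss.backward() if predictions.requires_grad else None
```

Alternatively, if the fine-tuning script exposes a horizon_len hyperparameter, setting it to 4 there (so the loss is built with matching shapes from the start) may be the cleaner fix.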