Timestamps and embeddings..? #446
Hi - completely new to this, so sorry if this is a stupid question! I want to use chronos-2 as an encoder for some time series.

```python
from chronos import Chronos2Pipeline
import torch

# Load the model
pipeline = Chronos2Pipeline.from_pretrained("amazon/chronos-2", device_map="cuda")

# Give a list of time series.
inputs = [
    torch.tensor([1.0, 2.0, 3.0, 4.0], dtype=torch.float32),
]

# Embed
emb, _ = pipeline.embed(inputs)
```

I notice that `embed` does not take any timestamps, unlike `predict_df`. Does that matter for the embeddings? Thank you!
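For context, what I would like to end up with is one fixed-size vector per series. A rough sketch of what I have in mind, assuming `emb` comes back as a tensor of shape `(batch, seq_len, d_model)` (I have not verified this):

```python
# Assumption: emb is a per-timestep tensor of shape (batch, seq_len, d_model).
# Mean-pool over the time dimension to get one fixed-size vector per series.
series_vectors = emb.mean(dim=1)  # (batch, d_model)
print(series_vectors.shape)
```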
Replies: 2 comments 3 replies
Sorry for the bug label, couldn't add it with Q&A for some reason 😐
@vstenby Chronos models actually do not use timestamps internally. `predict_df` expects them in order to correctly format sequences as time series and to prevent incorrect use of the model (e.g., using it on non-uniformly spaced time series). You should compare the `embed` method with `predict` or `predict_quantiles`, which also do not take timestamps.
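To make the contrast concrete, here is a rough sketch (untested; the `predict_df` column names and keyword arguments below are assumptions for illustration, so check the Chronos-2 docs for the exact schema):

```python
import pandas as pd
import torch
from chronos import Chronos2Pipeline

pipeline = Chronos2Pipeline.from_pretrained("amazon/chronos-2", device_map="cuda")

# embed (and predict / predict_quantiles) work on plain tensors:
# no timestamps, just ordered values.
context = [torch.tensor([1.0, 2.0, 3.0, 4.0], dtype=torch.float32)]
emb, _ = pipeline.embed(context)

# predict_df, by contrast, takes a long-format DataFrame. The timestamp column
# lets the pipeline group rows into series and check they are uniformly spaced.
# NOTE: the "id" / "timestamp" / "target" columns and prediction_length kwarg
# are assumed names here, not a confirmed schema.
df = pd.DataFrame({
    "id": ["A"] * 4,
    "timestamp": pd.date_range("2024-01-01", periods=4, freq="D"),
    "target": [1.0, 2.0, 3.0, 4.0],
})
pred_df = pipeline.predict_df(df, prediction_length=2)
```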