- ansatz -> `QuantumPipeline` wrapper that connects the ansatz and the DL model (if any) -> `Trainer` (with or without `torch`)
- Data preprocessing techniques (no idea how to do that right now). Maybe a `lightning`-style data module (thanks a lot, `lightning`). But as `lightning` has a core dependency on `torch`, I have to reinvent(?) it, I guess.
- Any ansatz could go with any DL backbone (ideally; not sure how much is feasible, TBD)
- `torch` is used with `lightning` as an important soft dependency now!!
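
A `lightning`-style data module without the `torch` dependency could be sketched roughly like this. This is only an illustration of the idea, assuming a plain-numpy batching API; the names `DataModule` and `train_batches` are assumptions, not a committed design:

```python
# Sketch of a lightning-style data module with no torch dependency.
# The class and method names here are assumptions, not a final API.
import numpy as np


class DataModule:
    """Holds arrays and yields (optionally shuffled) mini-batches using only numpy."""

    def __init__(self, X, y, batch_size=32, shuffle=True, seed=0):
        self.X = np.asarray(X)
        self.y = np.asarray(y)
        self.batch_size = batch_size
        self.shuffle = shuffle
        self._rng = np.random.default_rng(seed)

    def train_batches(self):
        # Yield (X_batch, y_batch) pairs; reshuffle indices on each pass.
        idx = np.arange(len(self.X))
        if self.shuffle:
            self._rng.shuffle(idx)
        for start in range(0, len(idx), self.batch_size):
            sel = idx[start:start + self.batch_size]
            yield self.X[sel], self.y[sel]
```

Since it only needs numpy, the same data module would work regardless of which backend (`pennylane`, `torch`, ...) is active.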
Will add better vignettes once I have my ideas consolidated. Have a look at a basic tutorial here for some idea of how the flow might look (this is how it would look in the future; still working on making the data module, Trainer, etc. more user-friendly and expressive):
```python
# no torch
pyqit.set_backend("pennylane")

qml_model = QMLmodel(...)  # may use their own ansatz?
dm = DataModule(...)
trainer = Trainer(...)

trainer.fit(qml_model, dm)
trainer.predict(qml_model, dm_new, return_format="numpy")
# or return_format="torch" for torch tensors if torch is the backend;
# should I add pennylane tensors as well? good question!
```
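
The `return_format` handling hinted at above could look something like the following. This is only a sketch of the conversion logic; `convert_predictions` is a hypothetical helper, not part of the package:

```python
# Hypothetical sketch of how Trainer.predict's return_format argument
# could dispatch on the requested tensor type.
def convert_predictions(preds, return_format="numpy"):
    if return_format == "numpy":
        import numpy as np
        return np.asarray(preds)
    if return_format == "torch":
        # torch stays a soft dependency: only imported when actually requested.
        try:
            import torch
        except ImportError as exc:
            raise ImportError(
                "return_format='torch' requires torch to be installed"
            ) from exc
        return torch.as_tensor(preds)
    raise ValueError(f"unknown return_format: {return_format!r}")
```

Keeping the import inside the branch is what makes `torch` a soft dependency here: the `"numpy"` path never touches it.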
Switching the backend to `torch`, models can also be composed into a pipeline:

```python
pyqit.set_backend("torch")

dm = DataModule(...)
model_a = QMLmodel(**params)
model_b = QMLmodel(**params)  # or a DLModel, for that matter

trainer = Trainer(max_epochs=10, learning_rate=0.01)

pipeline = QuantumPipeline(
    [
        PipelineStage(model_a, name="stage_1", trainable=trainable_a),
        PipelineStage(model_b, name="stage_2", trainable=True),
    ],
    mode="sequential",
)
pipeline.fit(datamodule=dm, trainers=trainer, fit_mode="sequential_greedy")
preds = pipeline.predict(X_new, batch_size=8)
```

You can also train just a `QMLmodel` using a `Trainer` on its own.
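
My working assumption is that `fit_mode="sequential_greedy"` means training each trainable stage in order, freezing it, and feeding its outputs forward to the next stage. A minimal sketch of that idea, simplified to plain arrays rather than the `DataModule`/keyword interface shown above (all names and semantics here are assumptions, not the final design):

```python
# Sketch of sequential_greedy fitting: train stages one at a time, in order,
# passing each stage's transformed output on as the next stage's input.
class PipelineStage:
    def __init__(self, model, name, trainable=True):
        self.model = model
        self.name = name
        self.trainable = trainable


class QuantumPipeline:
    def __init__(self, stages, mode="sequential"):
        self.stages = stages
        self.mode = mode

    def fit(self, X, y, trainer, fit_mode="sequential_greedy"):
        # Greedy: fit stage k on the data as transformed by stages 0..k-1,
        # then treat it as fixed and move on. Non-trainable stages are skipped.
        current = X
        for stage in self.stages:
            if stage.trainable:
                trainer.fit(stage.model, current, y)
            current = stage.model.transform(current)
        return self

    def predict(self, X):
        current = X
        for stage in self.stages:
            current = stage.model.transform(current)
        return current
```

Whether stages should communicate via raw arrays, a shared `DataModule`, or backend tensors is exactly the kind of interface question still to be settled.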
Here, any `QMLmodel` or `DLModel` can be implemented by users themselves, or taken from the ones implemented in the package.
The package would then also ship a complete model zoo.