This repository was archived by the owner on Sep 1, 2024. It is now read-only.

Errors about loading a pretrained model #108

@wuliting-wlt

Description

I have run into two problems:

  1. Following the instructions in readme.md, I load a pre-trained model:

    import fairseq
    import hubert_pretraining, hubert
    ckpt_path = "/../../large_lrs3_iter5.pt"
    models, cfg, task = fairseq.checkpoint_utils.load_model_ensemble_and_task([ckpt_path])
    At this point, the following error occurs:

    omegaconf.errors.ConfigAttributeError: Key 'required_seq_len_multiple' not in 'AVHubertConfig'
        full_key: required_seq_len_multiple
        object_type=AVHubertConfig

  2. I want to fine-tune an AV-HuBERT model with Seq2Seq, following readme.md:
    $ cd avhubert
    $ fairseq-hydra-train --config-dir /path/to/conf/finetune/ --config-name large_lrs3_433h.yaml \
      task.data=/path/to/data task.label_dir=/path/to/label \
      task.tokenizer_bpe_model=/path/to/tokenizer model.w2v_path=/path/to/checkpoint/large_lrs3_iter5.pt \
      hydra.run.dir=/path/to/experiment/finetune/ common.user_dir=`pwd`
    The error is as follows:
    omegaconf.errors.ConfigKeyError: Key 'input_modality' not in 'AVHubertPretrainingConfig'
        full_key: input_modality
        object_type=AVHubertPretrainingConfig

I don't know what is causing these errors.
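For reference, both tracebacks follow the same pattern: a key stored in the checkpoint's (or the YAML's) config does not exist as a field on the structured config class being merged into, and omegaconf's strict merge rejects it. A minimal stdlib sketch of that failure mode is below; `AVHubertConfigOld` and `apply_checkpoint_cfg` are simplified, hypothetical stand-ins for the real fairseq/omegaconf machinery, not the actual AV-HuBERT classes:

```python
from dataclasses import dataclass, fields

# Hypothetical stand-in for an older AVHubertConfig that predates
# the 'required_seq_len_multiple' field added by a newer fairseq.
@dataclass
class AVHubertConfigOld:
    label_rate: int = 25
    encoder_layers: int = 24
    # note: no 'required_seq_len_multiple' field here

def apply_checkpoint_cfg(cfg, saved_cfg):
    """Mimic omegaconf's strict merge: every saved key must already
    exist as a field on the target config class."""
    known = {f.name for f in fields(cfg)}
    for key, value in saved_cfg.items():
        if key not in known:
            raise KeyError(f"Key '{key}' not in '{type(cfg).__name__}'")
        setattr(cfg, key, value)
    return cfg

# A checkpoint written by a newer fairseq carries the extra key,
# so merging it into the older config class fails.
saved = {"label_rate": 25, "required_seq_len_multiple": 2}
try:
    apply_checkpoint_cfg(AVHubertConfigOld(), saved)
except KeyError as e:
    print(e)  # message mirrors the ConfigAttributeError in the issue
```

If this is the cause, the usual symptom is a version mismatch between the fairseq that wrote the checkpoint/config and the fairseq (or avhubert code) doing the loading.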
