[TRTLLM-11508][refactor] decouple MTP num_nextn_predict_layers from max_draft_len#12341
zhaoyangwang-nvidia wants to merge 2 commits into NVIDIA:main
Description
The internal field num_nextn_predict_layers_from_model_config has been removed and replaced by num_nextn_predict_layers.
The original num_nextn_predict_layers field in MTPDecodingConfig conflated two separate concerns. It has been split into two fields with clear responsibilities:
- max_draft_len: the number of draft tokens requested per decoding step
- num_nextn_predict_layers: the number of MTP layers the model provides

Parameter Logic Per Mode
- Eagle MTP (e.g. DeepSeek-V3, where the model has only 1 MTP layer)
- Vanilla MTP (the model has multiple MTP layers): with N = max_draft_len and M = num_nextn_predict_layers, N >= M uses all M layers and produces M draft tokens
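The decoupling above can be sketched as follows. This is an illustrative sketch, not the actual TRT-LLM implementation: the two field names come from this PR, but the class layout, the helper name effective_draft_len, and the assumption that N < M simply caps the draft length at N are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class MTPDecodingConfig:
    # Sketch only: field names follow the PR description, layout is illustrative.
    max_draft_len: int              # N: draft tokens requested per decoding step
    num_nextn_predict_layers: int   # M: MTP layers the model provides

def effective_draft_len(cfg: MTPDecodingConfig) -> int:
    """When N >= M, only the model's M layers can run, so M draft tokens
    are produced. Treating N < M as capping at N is an assumption here."""
    return min(cfg.max_draft_len, cfg.num_nextn_predict_layers)

# Vanilla MTP example: N = 5 requested, model provides M = 3 layers.
cfg = MTPDecodingConfig(max_draft_len=5, num_nextn_predict_layers=3)
print(effective_draft_len(cfg))  # 3: uses M layers, produces M draft tokens
```

Keeping the two values separate lets the runtime size draft buffers from max_draft_len while the model architecture alone determines num_nextn_predict_layers.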
Test Coverage
PR Checklist
Please review the following before submitting your PR:
- PR description clearly explains what and why. If using CodeRabbit's summary, please make sure it makes sense.
- PR follows TRT-LLM CODING GUIDELINES to the best of your knowledge.
- Test cases are provided for new code paths (see test instructions).
- Any new dependencies have been scanned for license and vulnerabilities.
- CODEOWNERS updated if ownership changes.
- Documentation updated as needed.
- Update tava architecture diagram if there is a significant design change in the PR.
- The reviewers assigned automatically/manually are appropriate for the PR.
Please check this after reviewing the above items as appropriate for this PR.
GitHub Bot Help
To see a list of available CI bot commands, please comment /bot help.