Update README with recent highlights, benchmarks, and support matrix fixes
Add new Recent News entries for Qwen3, ESM2 low-precision (NVFP4/MXFP8),
Mixtral MoE, ESM2 PEFT, and Llama3 context parallelism. Add benchmark
figures for ESM2 low-precision on B300 and Llama3 70B CP on GB300. Update
support matrix with new model/recipe rows and fix stale WIP statuses.
Fix typos ("bionemo2" → "BioNeMo Framework") and remove stray character
in amplify README.
Signed-off-by: Timur Rvachov <trvachov@nvidia.com>
README.md: 16 additions, 6 deletions
@@ -43,6 +43,11 @@ cd bionemo-framework/bionemo-recipes/recipes/esm2_native_te/
 ## Recent News
 
+- 03/09/2026 [Qwen2.5 / Qwen3 model](bionemo-recipes/models/qwen/) with TE acceleration, FP8/MXFP8, KV-cache inference, and bidirectional HF checkpoint conversion.
+- 03/05/2026 [ESM2 NVFP4 and MXFP8](bionemo-recipes/recipes/esm2_native_te/README.md#low-precision-performance-benchmarks) low-precision training — up to **2,367 TFLOPS/GPU** on NVIDIA B300 at 15B scale with per-layer precision control.
+- 02/23/2026 [Mixtral MoE model](bionemo-recipes/models/mixtral/) with TE `GroupedLinear` for efficient parallel expert computation, FP8/FP4 support, and HF conversion.
+- 02/13/2026 [ESM2 PEFT recipe](bionemo-recipes/recipes/esm2_peft_te/) for LoRA fine-tuning with sequence packing support.
+- 01/14/2026 [Llama3 Context Parallelism](bionemo-recipes/recipes/llama3_native_te/README.md#performance-benchmarks) — scaling Llama 3 70B to 144K context on 36x GB300 NVL36 with ~65% MFU.
 - 10/27/2025 [CodonFM recipe](https://github.com/NVIDIA/bionemo-framework/tree/main/bionemo-recipes/recipes/codonfm_ptl_te) released! This is an accelerated version of the original [research codebase](https://github.com/NVIDIA-Digital-Bio/CodonFM) with a [scientific preprint](https://research.nvidia.com/labs/dbr/assets/data/manuscripts/nv-codonfm-preprint.pdf).
 - 09/30/2025 Megatron/NeMo 5D parallel BioNeMo Framework image v2.7 [released on NGC](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/clara/containers/bionemo-framework) for both x86 and ARM CPUs.
 - 09/01/2025 [bionemo-recipes](https://github.com/NVIDIA/bionemo-framework/tree/main/bionemo-recipes) goes live! Lightweight and portable examples with state-of-the-art training performance you can riff on to meet your needs.
@@ -61,13 +66,18 @@ A core use-case of the BioNeMo Framework is to help digital biology scientists a
@@ -113,7 +123,7 @@ BioNeMo Framework is part of a larger ecosystem of NVIDIA Biopharma products. Ge
 ## Documentation Resources
 
-- **Official Documentation:** Contents of `sub-packages` including user guides, API references, and troubleshooting, are documented on our [official documentation](https://docs.nvidia.com/bionemo-framework/latest/). Nightly builds of this documentation is available on [BioNeMo Framework GitHub Pages](https://nvidia.github.io/bionemo-framework/)
+- **Official Documentation:** Documentation for sub-packages, including user guides, API references, and troubleshooting, is available on our [official documentation](https://docs.nvidia.com/bionemo-framework/latest/). Nightly builds of this documentation are available on [BioNeMo Framework GitHub Pages](https://nvidia.github.io/bionemo-framework/).
 
 - **🚧 In-Progress Documentation 🚧:** `bionemo-recipes` documentation is currently work in progress, however the recipes are meant to be self-documented and easy to understand—we suggest you throw them into your favorite genai code assistant!
@@ -136,8 +146,8 @@ docker run --rm -it \
 #### Initializing 3rd-party dependencies as git submodules
 
-The NeMo and Megatron-LM dependencies are included as git submodules in bionemo2. The pinned commits for these submodules represent the "last-known-good" versions of these packages
-that are confirmed to be working with bionemo2 (and those that are tested in CI).
+The NeMo and Megatron-LM dependencies are included as git submodules in BioNeMo Framework. The pinned commits for these submodules represent the "last-known-good" versions of these packages
+that are confirmed to be working with BioNeMo Framework (and those that are tested in CI).
 
 To initialize these sub-modules when cloning the repo, add the `--recursive` flag to the git clone command:
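The `--recursive` behavior described in this hunk can be sketched with a throwaway local repository. This is a minimal illustration only: the repo names, the `third_party/dep` path, and the commit messages below are invented for the demo and do not reflect the actual BioNeMo layout; `protocol.file.allow=always` is needed only because the "remote" here is a local path (git 2.38+ blocks the file protocol for submodules by default).

```shell
set -e
work=$(mktemp -d)
# Identity config so the demo commits succeed in a clean environment.
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com

# A stand-in for a pinned dependency such as NeMo or Megatron-LM.
git init -q "$work/dep"
git -C "$work/dep" commit -q --allow-empty -m "last-known-good commit"

# The main repo records that dependency as a submodule at a fixed commit.
git init -q "$work/main"
git -C "$work/main" -c protocol.file.allow=always \
    submodule add -q "$work/dep" third_party/dep
git -C "$work/main" commit -q -m "pin dep submodule"

# Cloning with --recursive initializes and checks out the submodule too;
# without it, third_party/dep would be left as an empty directory.
git -c protocol.file.allow=always clone -q --recursive "$work/main" "$work/clone"
test -e "$work/clone/third_party/dep/.git" && echo "submodule checked out"
```

For a repo that was already cloned without `--recursive`, the standard `git submodule update --init --recursive` fetches and checks out the pinned submodule commits after the fact.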