MindGlide

Ultrafast segmentation of real‑world brain MRI for multiple‑sclerosis patients — any modality, any quality.
Built with PyTorch + MONAI and trained on >23 000 scans.

Usage

You can install the tool directly from the GitHub repository using pip:

pip install git+https://github.com/MS-PINPOINT/mindGlide.git

After installation, verify that the command is available by running:

mindglide --help

To segment a scan, run:

mindglide -i /path/to/input.nii.gz -o /path/to/output.nii.gz

You can adjust the --sw_batch_size parameter (default: 4) based on your available VRAM. MindGlide runs in seconds on a GPU, and typically in under 3 minutes on a CPU. Device selection (GPU or CPU) is handled automatically.
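For scripted pipelines, the same invocation can be assembled from Python. A minimal sketch; the flag names (`-i`, `-o`, `--sw_batch_size`) are taken from the usage above, and the file paths are placeholders:

```python
import shlex

def build_mindglide_cmd(inp, out, sw_batch_size=4):
    """Assemble a mindglide CLI call; lower sw_batch_size on low-VRAM GPUs."""
    return ["mindglide", "-i", inp, "-o", out,
            "--sw_batch_size", str(sw_batch_size)]

cmd = build_mindglide_cmd("/data/scan.nii.gz", "/data/seg.nii.gz", sw_batch_size=2)
print(shlex.join(cmd))
# mindglide -i /data/scan.nii.gz -o /data/seg.nii.gz --sw_batch_size 2
```

Pass the resulting list to `subprocess.run(cmd, check=True)` to execute it.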

The mindglide command also supports directories of NIfTI files as input, allowing you to process multiple scans at once without reloading the model each time.
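When you need distinct output names per scan, a per-file loop is an alternative to directory input. A sketch under the assumption that `mindglide` is on the `PATH`; note that passing the directory itself to `-i` (as described above) is faster because the model is loaded only once:

```python
import subprocess
from pathlib import Path

def segment_directory(in_dir, out_dir, run=subprocess.run):
    """Run mindglide on every NIfTI scan in in_dir, writing to out_dir.

    `run` is injectable for testing; by default it executes the CLI.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    scans = sorted(Path(in_dir).glob("*.nii.gz"))
    for scan in scans:
        run(["mindglide", "-i", str(scan), "-o", str(out / scan.name)],
            check=True)
    return scans
```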

The following table maps the segmentation codes to their corresponding region names:

| Code | Structure Name | Code | Structure Name |
|------|----------------|------|----------------|
| 0 | Background | 10 | Optic_chiasm |
| 1 | CSF | 11 | Cerebellar_vermis |
| 2 | Ventricles_3_4_5 | 12 | Corpus_callosum |
| 3 | DGM | 13 | White_matter |
| 4 | Pons | 14 | Frontal_lobe_GM |
| 5 | Brainstem | 15 | Limbic_cortex_GM |
| 6 | Cerebellum | 16 | Parietal_lobe_GM |
| 7 | Temporal_lobe | 17 | Occipital_lobe_GM |
| 8 | Temporal_horn_lateral_ventricle | 18 | Lesion |
| 9 | Lateral_ventricle | 19 | Ventral_diencephalon |
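The label map above can also be used programmatically, e.g. to pull a binary mask for one structure out of a segmentation. A dependency-free sketch (the toy 1-D "volume" stands in for a real NIfTI array):

```python
# Segmentation codes from the MindGlide label table.
MINDGLIDE_LABELS = {
    0: "Background", 1: "CSF", 2: "Ventricles_3_4_5", 3: "DGM",
    4: "Pons", 5: "Brainstem", 6: "Cerebellum", 7: "Temporal_lobe",
    8: "Temporal_horn_lateral_ventricle", 9: "Lateral_ventricle",
    10: "Optic_chiasm", 11: "Cerebellar_vermis", 12: "Corpus_callosum",
    13: "White_matter", 14: "Frontal_lobe_GM", 15: "Limbic_cortex_GM",
    16: "Parietal_lobe_GM", 17: "Occipital_lobe_GM", 18: "Lesion",
    19: "Ventral_diencephalon",
}

def binary_mask(segmentation, code):
    """Return a 0/1 mask for one structure from a flat list of voxel codes."""
    return [1 if v == code else 0 for v in segmentation]

# Example: isolate lesion voxels (code 18) in a toy volume.
voxels = [0, 18, 13, 18, 1]
print(binary_mask(voxels, 18))  # [0, 1, 0, 1, 0]
```

With real outputs you would load the NIfTI with a library such as nibabel and apply the same comparison to the voxel array.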

Run from scripts

Requirements

  • Git ≥ 2.13
  • IMPORTANT: Git LFS — one‑time setup: git lfs install
  • Docker
  • (Optional) Apptainer/Singularity for HPC environments

Clone & get the pretrained checkpoint

# 1) clone the code **and** the models sub‑repo in one step
git clone --recurse-submodules https://github.com/MS-PINPOINT/mindGlide.git
cd mindGlide

# 2) first‑time only – install Git‑LFS on your machine
#    macOS:  brew install git-lfs
#    Linux:  sudo apt install git-lfs
git lfs install

# 3) pull the large model file inside the submodule
git submodule foreach 'git lfs pull'
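
If Git LFS was not installed before cloning, the checkpoint will be a tiny text pointer file rather than the real weights. A quick sanity check; the 1 KB threshold is a heuristic (an un-pulled LFS pointer is only a few hundred bytes, while the real checkpoint is many megabytes):

```python
from pathlib import Path

def looks_like_lfs_pointer(path):
    """Heuristic check: return True if the file is an un-pulled Git LFS
    pointer stub, False if it is plausibly the real checkpoint, None if
    it does not exist yet."""
    p = Path(path)
    if not p.exists():
        return None
    if p.stat().st_size > 1024:
        return False  # plausibly the real weights
    return p.read_bytes().startswith(b"version https://git-lfs")

if looks_like_lfs_pointer("models/_20240404_conjurer_trained_dice_7733.pt"):
    print("Run: git lfs install && git submodule foreach 'git lfs pull'")
```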


If you already cloned without the flag:

git submodule update --init --recursive

This pulls the model repo from the Hugging Face Hub and places models/_20240404_conjurer_trained_dice_7733.pt in the workspace.

The trained PyTorch models are hosted on the Hugging Face Hub: https://huggingface.co/MS-PINPOINT/mindglide/tree/main

Trained models are shared in the models directory. They were trained on the datasets described in the Nature Communications (2025) paper cited below.

Run in Docker (GPU)

docker run --gpus all \
  --ipc=host --ulimit memlock=-1 -it \
  -v /data:/data \
  <Mindglide image> -i /data/<your_scan>.nii.gz -o /data/output.nii.gz

For Singularity/Apptainer, build the image once, then run:

singularity run --nv \
  --bind $PWD:/mnt \
  /path/to/mind-glide_latest.sif <your_scan>.nii.gz

Fine‑tuning

Use the scripts in scripts/ as a template. Start with a low learning rate (e.g. 1e‑3) to avoid catastrophic forgetting — shipped models were trained with 1e‑2.

Model weights

The primary checkpoint lives in this repo at:

models/_20240404_conjurer_trained_dice_7733.pt

Additional or legacy checkpoints are archived in the Hugging Face model repo.

📬 Citation

If you use MindGlide, please cite this paper:

@article{Goebl2025,
    author = {Goebl, Philipp and Wingrove, Jed and Abdelmannan, Omar and {Brito Vega}, Barbara and Stutters, Jonathan and Ramos, {Silvia Da Graca} and Kenway, Owain and Rossor, Thomas and Wassmer, Evangeline and Arnold, Douglas L. and Collins, Louis and Hemingway, Cheryl and Narayanan, Sridar and Chataway, Jeremy and Chard, Declan and Iglesias, {Juan Eugenio} and Barkhof, Frederik and Parker, Geoffrey J. M. and Oxtoby, Neil P. and Hacohen, Yael and Thompson, Alan and Alexander, Daniel C. and Ciccarelli, Olga and Eshaghi, Arman},
    title = {Enabling new insights from old scans by repurposing clinical {MRI} archives for multiple sclerosis research},
    journal = {Nature Communications},
    volume = {16},
    number = {1},
    pages = {3149},
    year = {2025},
    month = apr,
    doi = {10.1038/s41467-025-58274-8},
    pmid = {40195318},
    pmcid = {PMC11976987}
}

Acknowledgements

This study/project is funded by a UK National Institute for Health and Care Research (NIHR) Advanced Fellowship awarded to Arman Eshaghi (Award ID: NIHR302495). The views expressed are those of the author(s) and not necessarily those of the NIHR or the Department of Health and Social Care.
