
Repository Details

Automated anatomical brain label/shape analysis software (+ website)

Software

Mindboggle's open source brain morphometry platform takes in preprocessed T1-weighted MRI data, and outputs volume, surface, and tabular data containing label, feature, and shape information for further analysis. Mindboggle can be run on the command line as "mindboggle" and also exists as a cross-platform Docker container for convenience and reproducibility of results. The software runs on Linux and is written in Python 3 and Python-wrapped C++ code called within a Nipype pipeline framework. We have tested the software most extensively with Python 3.5.1 on Ubuntu Linux 14.04.


Reference

A Klein, SS Ghosh, FS Bao, J Giard, Y Hame, E Stavsky, N Lee, B Rossa, M Reuter, EC Neto, A Keshavan. 2017. Mindboggling morphometry of human brains. PLoS Computational Biology 13(3): e1005350. doi:10.1371/journal.pcbi.1005350

Help

General questions about Mindboggle, or difficulties getting started? Please search for relevant Mindboggle posts on NeuroStars, or post your own message with the tag "mindboggle".

Found a bug, big or small? Please submit an issue on GitHub.

Installation

We recommend installing Mindboggle and its dependencies as a cross-platform Docker container for greater convenience and reproducibility of results. All the examples below assume you are using this Docker container, with the container path /home/jovyan/work/ mapped to a directory on your host machine. (Alternatively, one can create a Singularity image.)
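For example, a Singularity image can typically be built directly from the Docker Hub container. This is a minimal sketch, assuming a Singularity/Apptainer installation that supports building from docker:// sources; the image file name mindboggle.simg is arbitrary:

# Build a Singularity image from the Docker Hub container (file name is arbitrary)
singularity build mindboggle.simg docker://nipy/mindboggle

The rest of this guide uses Docker commands, but the same mindboggle123/mindboggle commands can then typically be run through the image with singularity exec.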

1. Install and run Docker on your (macOS, Linux, or Windows) host machine.

2. Download the Mindboggle Docker container (copy/paste the following in a terminal window):

docker pull nipy/mindboggle

Note 1: The container includes FreeSurfer, ANTs, and Mindboggle, so it is currently over 6 GB.

Note 2: You may need to increase the memory allocated to Docker to at least 5 GB (for example, Docker for Mac is set to 2 GB of runtime memory by default).

3. Recommended: download sample data. To try out the mindboggle examples below, download and unzip the directory of example input data mindboggle_input_example.zip (455 MB). For example MRI data to preprocess with FreeSurfer and ANTs software, download and unzip example_mri_data.zip (29 MB).

4. Recommended: set environment variables for clarity in the commands below (modify these for your own setup, except for DOCK, which must not be changed -- careful, this step is tricky!):

HOST=/Users/binarybottle  # path on the local host to access/save data (mounted into the Docker container)
DOCK=/home/jovyan/work  # path to HOST from Docker container (DO NOT CHANGE)
IMAGE=$DOCK/example_mri_data/T1.nii.gz  # brain image in $HOST to process
ID=arno  # ID for brain image
OUT=$DOCK/mindboggle123_output # output path ('--out $OUT' below is optional)
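Before running anything lengthy, it can help to confirm that the mount and paths are correct by listing the input image from inside the container. This is a quick sanity-check sketch using the variables above (it assumes the container lets you run an arbitrary command such as ls):

# List the input image from inside the container to verify the $HOST:$DOCK mount
docker run --rm -v $HOST:$DOCK nipy/mindboggle ls -l $IMAGE

If the file listing appears, the host directory is mounted correctly and $IMAGE points to a readable image.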

Tutorial

To run the Mindboggle jupyter notebook tutorial, first install the Mindboggle Docker container (above) and run the notebook in a web browser as follows (replacing $HOST with the absolute path where you want to access/save data):

docker run --rm -ti -v $HOST:/home/jovyan/work -p 8888:8888 nipy/mindboggle jupyter notebook /opt/mindboggle/docs/mindboggle_tutorial.ipynb --ip=0.0.0.0 --allow-root

In the output on the command line you'll see something like:

[I 20:47:38.209 NotebookApp] The Jupyter Notebook is running at:
[I 20:47:38.210 NotebookApp] http://(057a72e00d63 or 127.0.0.1):8888/?token=62853787e0d6e180856eb22a51609b25e

You would then copy and paste the corresponding address into your web browser (in this case, http://127.0.0.1:8888/?token=62853787e0d6e180856eb22a51609b25e), and click on "mindboggle_tutorial.ipynb".

Run one command

The Mindboggle Docker container can be run as a single command to process a T1-weighted MR brain image through FreeSurfer, ANTs, and Mindboggle (skip to the next section if you wish to run recon-all, antsCorticalThickness.sh, and mindboggle separately):

docker run --rm -ti -v $HOST:$DOCK nipy/mindboggle mindboggle123 $IMAGE --id $ID

Outputs are stored in $DOCK/mindboggle123_output/ by default, but you can set a different output path with --out $OUT.
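For example, to write results to the $OUT directory defined in the installation step rather than the default location, add the optional output flag to the same command:

docker run --rm -ti -v $HOST:$DOCK nipy/mindboggle mindboggle123 $IMAGE --id $ID --out $OUT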

Run separate commands

If finer control is needed over the software in the Docker container, the following instructions outline how to run each command separately. Mindboggle currently takes output from FreeSurfer and, optionally, from ANTs. FreeSurfer version 6 or higher is recommended: it uses Mindboggle's DKT-100 surface-based atlas by default to generate corresponding labels on the cortical surfaces and in the cortical and non-cortical volumes (v5.3 also generates these surface labels by default; older versions require the "-gcs DKTatlas40.gcs" flag to do so).

  1. Enter the Docker container's bash shell to run recon-all, antsCorticalThickness.sh, and mindboggle commands:

    docker run --rm -ti -v $HOST:$DOCK -p 5000:5000 nipy/mindboggle
    
  2. Recommended: reset environment variables as above within the Docker container:

    DOCK=/home/jovyan/work  # path to HOST from Docker container
    IMAGE=$DOCK/example_mri_data/T1.nii.gz  # input image on HOST
    ID=arno  # ID for brain image
    

3. FreeSurfer generates labeled cortical surfaces, and labeled cortical and noncortical volumes. Run recon-all on a T1-weighted IMAGE file (and optionally a T2-weighted image), and set the output ID name as well as the $FREESURFER_OUT output directory:

FREESURFER_OUT=$DOCK/freesurfer_subjects

recon-all -all -i $IMAGE -s $ID -sd $FREESURFER_OUT
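If you also have a T2-weighted image for the same subject, recon-all can use it to refine the pial surfaces. This is a hedged sketch assuming FreeSurfer 6's -T2/-T2pial options; the T2.nii.gz path is hypothetical and not part of the example data:

T2IMAGE=$DOCK/example_mri_data/T2.nii.gz  # hypothetical T2-weighted image

recon-all -all -i $IMAGE -T2 $T2IMAGE -T2pial -s $ID -sd $FREESURFER_OUT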

4. ANTs provides brain volume extraction, segmentation, and registration-based labeling. antsCorticalThickness.sh generates transforms and segmentation files used by Mindboggle, and is run on the same IMAGE file and ID as above, with $ANTS_OUT as the output directory. TEMPLATE points to the OASIS-30_Atropos_template folder already installed in the Docker container (backslashes split the command for readability):

ANTS_OUT=$DOCK/ants_subjects
TEMPLATE=/opt/data/OASIS-30_Atropos_template

antsCorticalThickness.sh -d 3 -a $IMAGE -o $ANTS_OUT/$ID/ants \
  -e $TEMPLATE/T_template0.nii.gz \
  -t $TEMPLATE/T_template0_BrainCerebellum.nii.gz \
  -m $TEMPLATE/T_template0_BrainCerebellumProbabilityMask.nii.gz \
  -f $TEMPLATE/T_template0_BrainCerebellumExtractionMask.nii.gz \
  -p $TEMPLATE/Priors2/priors%d.nii.gz \
  -u 0
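Once antsCorticalThickness.sh finishes, it is worth confirming that the segmentation file Mindboggle uses (see Example 3 below) was written with the expected name:

ls $ANTS_OUT/$ID/antsBrainSegmentation.nii.gz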

5. Mindboggle can be run on data preprocessed by recon-all and antsCorticalThickness.sh as above by setting:

FREESURFER_SUBJECT=$FREESURFER_OUT/$ID
ANTS_SUBJECT=$ANTS_OUT/$ID
OUT=$DOCK/mindboggled  # output folder

Or it can be run on the mindboggle_input_example preprocessed data by setting:

EXAMPLE=$DOCK/mindboggle_input_example
FREESURFER_SUBJECT=$EXAMPLE/freesurfer/subjects/arno
ANTS_SUBJECT=$EXAMPLE/ants/subjects/arno
OUT=$DOCK/mindboggled  # output folder

Example Mindboggle commands:

To learn about Mindboggle's command options, type this in a terminal window:

mindboggle -h

Example 1: Run Mindboggle on data processed by FreeSurfer but not ANTs:

mindboggle $FREESURFER_SUBJECT --out $OUT

Example 2: Same as Example 1, but also generate output for visualizing surface data with roygbiv:

mindboggle $FREESURFER_SUBJECT --out $OUT --roygbiv

Example 3: Take advantage of ANTs output as well ("\" splits for readability):

mindboggle $FREESURFER_SUBJECT --out $OUT --roygbiv \
    --ants $ANTS_SUBJECT/antsBrainSegmentation.nii.gz

Example 4: Generate only volume (no surface) labels and shapes:

mindboggle $FREESURFER_SUBJECT --out $OUT \
    --ants $ANTS_SUBJECT/antsBrainSegmentation.nii.gz \
    --no_surfaces
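The examples above can also be run from the host in a single step, without entering the container's bash shell, by passing the mindboggle command directly to docker run. This sketch follows the pattern of the mindboggle123 command earlier; $FREESURFER_SUBJECT and $OUT must then be set on the host (for example, as in step 5):

docker run --rm -ti -v $HOST:$DOCK nipy/mindboggle mindboggle $FREESURFER_SUBJECT --out $OUT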

Visualize output

To visualize Mindboggle output with roygbiv, start the Docker container (step 1 of "Run separate commands" above), then run roygbiv on an output directory:

roygbiv $OUT/$ID

and open a browser to localhost:5000.
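Alternatively, roygbiv can be launched in one step from the host by combining the docker run from step 1 with the roygbiv command. This is a sketch; it assumes $OUT/$ID already contains Mindboggle output and that port 5000 is forwarded as shown:

docker run --rm -ti -v $HOST:$DOCK -p 5000:5000 nipy/mindboggle roygbiv $OUT/$ID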

Currently roygbiv only shows summarized data; per-vertex visualization is one of our goals (in the meantime, try Paraview for per-vertex data).

Appendix: processing

The following steps are performed by Mindboggle (the function names in parentheses refer to the code on GitHub):

  1. Create hybrid gray/white segmentation from FreeSurfer and ANTs output (combine_2labels_in_2volumes).

  2. Fill hybrid segmentation with FreeSurfer- or ANTs-registered labels.

  3. Compute volume shape measures for each labeled region (e.g., volume per FreeSurfer or ANTs label).

  4. Compute surface shape measures for every cortical mesh vertex (e.g., surface area, mean curvature, geodesic depth, travel depth, and FreeSurfer curvature, convexity, and thickness).

  5. Extract cortical surface features (folds, sulci, and fundi).

  6. For each cortical surface label/sulcus, compute shape measures.

  7. Compute statistics (stats_per_label in compute.py) for each shape measure in #4 for each label/feature:

    • median
    • median absolute deviation
    • mean
    • standard deviation
    • skew
    • kurtosis
    • lower quartile
    • upper quartile

Appendix: output

Example output data can be found on Mindboggle's examples site on osf.io. By default, output files are saved in $HOME/mindboggled/SUBJECT, where $HOME is the home directory and SUBJECT is a name identifying the scanned subject's brain. Volume files are in NIfTI format, surface meshes are in VTK format, and tables are comma-delimited. Each file contains integers that correspond to anatomical labels or features (0-24 for sulci). All output data are in the original subject's space. The following listing includes outputs from most, but not all, optional arguments.

Folder       Contents                                               Format
labels/      number-labeled surfaces and volumes                    .vtk, .nii.gz
features/    surfaces with features: sulci, fundi                   .vtk
shapes/      surfaces with shape measures (per vertex)              .vtk
tables/      tables of shape measures (per label/feature/vertex)    .csv

mindboggled / $SUBJECT /

  labels /
    freesurfer_wmparc_labels_in_hybrid_graywhite.nii.gz: hybrid segmentation filled with FS labels
    ants_labels_in_hybrid_graywhite.nii.gz: hybrid segmentation filled with ANTs + FS cerebellar labels
    [left,right]_cortical_surface /
      freesurfer_cortex_labels.vtk: DKT cortical surface labels

  features / [left,right]_cortical_surface /
    folds.vtk: (unidentified) depth-based folds
    sulci.vtk: sulci defined by DKT label pairs in depth-based folds
    fundus_per_sulcus.vtk: fundus curve per sulcus -- UNDER EVALUATION --
    cortex_in_MNI152_space.vtk: cortical surfaces aligned to an MNI152 template

  shapes / [left,right]_cortical_surface /
    area.vtk: per-vertex surface area
    mean_curvature.vtk: per-vertex mean curvature
    geodesic_depth.vtk: per-vertex geodesic depth
    travel_depth.vtk: per-vertex travel depth
    freesurfer_curvature.vtk: FS curvature files converted to VTK
    freesurfer_sulc.vtk: FS sulc (convexity) files converted to VTK
    freesurfer_thickness.vtk: FS thickness files converted to VTK

  tables /
    volume_per_freesurfer_label.csv: volume per FS label
    volumes_per_ants_label.csv: volume per ANTs label
    [left,right]_cortical_surface /
      label_shapes.csv: per-label surface shape statistics
      sulcus_shapes.csv: per-sulcus surface shape statistics
      fundus_shapes.csv: per-fundus surface shape statistics -- UNDER EVALUATION --
      vertices.csv: per-vertex surface shape statistics
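After a run completes, the output tables can be inspected directly on the host. For example, assuming Mindboggle was run with --out $OUT as in the examples above (so output appears under $HOST/mindboggled on the host) and that the hemisphere folder follows the [left,right]_cortical_surface pattern listed here:

ls $HOST/mindboggled/arno/tables/left_cortical_surface/
head -n 3 $HOST/mindboggled/arno/tables/left_cortical_surface/label_shapes.csv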
