
PyNets™


About

A Fully-Automated Workflow for Reproducible Graph Analysis of Functional and Structural Connectomes

PyNets harnesses the power of the Nipype, Nilearn, and Networkx packages to automatically generate a variety of 'meta-connectomic' solutions on a subject-by-subject basis, using any combination of graph hyperparameters. PyNets utilities can be integrated with any existing preprocessing workflow, and a Docker container is provided to facilitate fully reproducible executions.

Learn more about Nipype: http://nipype.readthedocs.io/
Learn more about Nilearn: http://nilearn.github.io/
Learn more about Networkx: https://networkx.github.io/
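
To give a sense of the kind of workflow PyNets automates, below is a minimal, hypothetical sketch of a single functional graph-analysis step written directly against Nilearn and Networkx. This is not PyNets' internal code; the input path, sphere radius, and threshold are placeholder assumptions. PyNets runs many such combinations (atlases, node sizes, model estimators, thresholds) within one Nipype-managed workflow per subject.

# Illustrative sketch only -- not PyNets' internals. The input path, sphere
# radius, and threshold below are placeholder assumptions.
import numpy as np
import networkx as nx
from nilearn import datasets
from nilearn.input_data import NiftiSpheresMasker
from nilearn.connectome import ConnectivityMeasure

func_img = '/path/to/preprocessed_func.nii.gz'  # hypothetical preprocessed fMRI image

# Define nodes from the Dosenbach 2010 coordinate atlas and extract node time series
atlas = datasets.fetch_coords_dosenbach_2010()
coords = np.vstack((atlas.rois['x'], atlas.rois['y'], atlas.rois['z'])).T
masker = NiftiSpheresMasker(seeds=coords, radius=4, standardize=True)
time_series = masker.fit_transform(func_img)

# Estimate a partial-correlation connectivity matrix
conn_measure = ConnectivityMeasure(kind='partial correlation')
conn_matrix = conn_measure.fit_transform([time_series])[0]

# Proportionally threshold (keep the strongest 20% of edges) and build a graph
np.fill_diagonal(conn_matrix, 0)
cutoff = np.percentile(np.abs(conn_matrix), 80)
adjacency = np.where(np.abs(conn_matrix) >= cutoff, conn_matrix, 0)
G = nx.from_numpy_array(adjacency)

# Compute a couple of example graph measures
print('Density:', nx.density(G))
print('Average clustering:', nx.average_clustering(G))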


1. Installation:

PyNets is available for both Python 2 and Python 3. We recommend using Python 3.

# Clone the PyNets repo and install dependencies
git clone https://github.com/dPys/PyNets.git
cd /path/to/PyNets
python setup.py install

# Or, from within the cloned PyNets directory:
pip install -e .

# Or install from PyPI (recommended):
pip install pynets

To install using the included Dockerfile, ensure that Docker (https://www.docker.com/) is installed, and then run:

BUILDIR=$(pwd)
mkdir -p ${BUILDIR}/pynets_images
docker build -t pynets_docker .

docker run -ti --rm --privileged \
    -v /tmp:/tmp \
    -v /var/tmp:/var/tmp \
    pynets_docker

To further convert this into a Singularity container (e.g. for use on HPC):

docker run -ti --rm \
    --privileged \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v ${BUILDIR}/pynets_images:/output \
    filo/docker2singularity "pynets_docker:latest"
2. Usage:

See pynets_run.py -h for a complete list of help options.

Example A) You have a preprocessed (minimally: normalized and skull-stripped) functional fMRI dataset called "filtered_func_data_clean_standard.nii.gz", to which you assign an arbitrary subject id of 997. You wish to analyze a whole-brain network using the nilearn atlas 'coords_dosenbach_2010', thresholding the connectivity graph proportionally to retain the strongest 20% of connections, and using partial correlation for model estimation:

pynets_run.py -i '/Users/dPys/PyNets_examples/997/filtered_func_data_clean_standard.nii.gz' -id '997' -a 'coords_dosenbach_2010' -mod 'partcorr' -thr '0.20'

Example B) Building upon the previous example, let's say you now wish to analyze the Default network for this same dataset, but also using the 264-node parcellation scheme from Power et al. 2011 called 'coords_power_2011'. You wish to threshold the graph iteratively to achieve a target density of 0.3, define your node radius at two resolutions (2 and 4 mm), fit a sparse inverse covariance model in addition to partial correlation, and plot the results:

pynets_run.py -i '/Users/dPys/PyNets_examples/997/filtered_func_data_clean_standard.nii.gz' -id '997' -a 'coords_dosenbach_2010,coords_power_2011' -n 'Default' -dt -thr '0.3' -ns '2,4' -mod 'partcorr,sps' -plt

Example C) Building upon the previous examples, let's say you now wish to analyze the Default and Executive Control networks for this subject, but this time based on a custom atlas (DesikanKlein2012.nii.gz), defining your nodes as parcels (as opposed to spheres). You wish to fit a partial correlation model, iterate the pipeline over a range of densities (i.e. 0.05-0.10 with a 1% step), and prune disconnected nodes:

pynets_run.py -i '/Users/dPys/PyNets_examples/997/filtered_func_data_clean_standard.nii.gz' -id '997' -ua '/Users/dPys/PyNets_example_atlases/DesikanKlein2012.nii.gz' -n 'Default,Cont' -mod 'partcorr' -dt -min_thr 0.05 -max_thr 0.10 -step_thr 0.01 -parc -p 1

Example D) Building upon the previous examples, let's say you now wish to create a subject-specific atlas based on the subject's unique spatial-temporal profile. In this case, you can specify the path to a binarized mask within which to perform spatially-constrained spectral clustering, and you want to try this at multiple resolutions of k clusters/nodes (i.e. k=50, 100, 150). You again wish to define your node radius at both 2 and 4 mm, fit both partial correlation and sparse inverse covariance models, iterate the pipeline over a range of densities (i.e. 0.05-0.10 with a 1% step), prune disconnected nodes, and plot your results:

pynets_run.py -i '/Users/dPys/PyNets_examples/997/filtered_func_data_clean_standard.nii.gz' -id '997' -cm '/Users/dPys/PyNets_example/997_grey_matter_mask_bin.nii.gz' -ns '2,4' -mod 'partcorr,sps' -k_min 50 -k_max 150 -k_step 50 -dt -min_thr 0.05 -max_thr 0.10 -step_thr 0.01 -p 1 -plt

Example E) You wish to generate a structural connectome, using probabilistic tractography applied to bedpostx outputs with 5000 streamlines, constrained to a white-matter waypoint mask and avoiding ventricular CSF as defined by the subject's T1 image. You wish to use atlas parcels as defined by both DesikanKlein2012 and AALTzourioMazoyer2002, iterate over a range of densities (i.e. 0.05-0.10 with a 1% step), prune disconnected nodes, and plot your results:

pynets_run.py -dwi /Users/PSYC-dap3463/Downloads/bedpostx_s002.bedpostX -id s002 -ua '/Users/PSYC-dap3463/Applications/PyNets/pynets/atlases/DesikanKlein2012.nii.gz,/Users/PSYC-dap3463/Applications/PyNets/pynets/atlases/AALTzourioMazoyer2002' -s 5000 -thr 0.1 -dt -min_thr 0.05 -max_thr 0.10 -step_thr 0.01 -p 1 -parc -anat '/Users/PSYC-dap3463/Downloads/s002/s002_anat_T1.nii.gz'
3. Interpreting outputs:

PyNets outputs various csv files and pickled pandas dataframes within the same subject directory in which the input image was fed into the workflow. Files with the suffix '_neat.csv' within each atlas subdirectory contain the graph measures extracted for that subject from that atlas, using all of the graph hyperparameters listed in the titles of those files. Files with the suffix '_mean.csv' within the base subject directory contain averages/weighted averages of each graph measure across all hyperparameters specified at runtime.
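
For example, these outputs might be aggregated with pandas as in the minimal sketch below; the subject directory and glob patterns are hypothetical, since actual filenames encode the atlas and hyperparameters used at runtime.

# Hypothetical sketch: aggregate PyNets '_neat.csv' outputs for one subject.
# The subject directory and glob patterns below are assumptions.
import glob
import pandas as pd

subject_dir = '/Users/dPys/PyNets_examples/997'

# Collect per-atlas graph measures across all hyperparameter combinations
frames = []
for path in glob.glob(subject_dir + '/*/*_neat.csv'):
    df = pd.read_csv(path)
    df['source_file'] = path  # remember which atlas/hyperparameter file it came from
    frames.append(df)

if frames:
    measures = pd.concat(frames, ignore_index=True)
    print(measures.head())

# The '_mean.csv' files in the base subject directory hold (weighted) averages
# of each measure across hyperparameters
for path in glob.glob(subject_dir + '/*_mean.csv'):
    print(path, pd.read_csv(path).shape)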

  • Generate a glass brain plot for a functional or structural connectome
  • Visualize adjacency matrices for structural or functional connectomes
  • Input a path to a diffusion-weighted dataset or bedpostx directory to estimate a structural connectome
  • Visualize communities of restricted networks
  • Use connectograms to visualize community structure (including link communities)

Happy Netting!

Please cite all uses with reference to the GitHub repository: https://github.com/dPys/PyNets
