neurita / pypes

Reusable neuroimaging pipelines using nipype

Home Page: http://neuro-pypes.readthedocs.io

License: Other

Makefile 0.38% Python 98.46% MATLAB 1.15%
nipype neuroimaging preprocessing pet fmri dti ica plotting

pypes's Introduction

neuro_pypes


Reusable neuroimaging pipelines using nipype and state-of-the-art neuroimaging preprocessing software.

The objectives of this project are:

  • easy-to-use, complete, reusable, and configurable pre-processing pipelines, and
  • high-quality results with the minimum manual intervention in the images.

Documentation

Please have a look at the documentation here.

Citation

If you use Pypes, please cite it:

Alexandre M. Savio, Michael Schutte, Manuel Graña and Igor Yakushev
Pypes: Workflows for Processing Multimodal Neuroimaging Data
Front. Neuroinform., April 2017, vol. 11, pages 25
ISSN: 1662-5196
DOI: 10.3389/fninf.2017.00025

https://doi.org/10.3389/fninf.2017.00025

...or the BibTeX:

@ARTICLE{10.3389/fninf.2017.00025,
AUTHOR={Savio, Alexandre M. and Schutte, Michael and Graña, Manuel and Yakushev, Igor},
TITLE={Pypes: Workflows for Processing Multimodal Neuroimaging Data},
JOURNAL={Frontiers in Neuroinformatics},
VOLUME={11},
PAGES={25},
YEAR={2017},
URL={http://journal.frontiersin.org/article/10.3389/fninf.2017.00025},
DOI={10.3389/fninf.2017.00025},
ISSN={1662-5196}
}

Also consider citing Nipype and all the other open source tools you use.

License

Licensed under the Apache License, Version 2.0.

pypes's People

Contributors

alexsavio, ikdripp, schutte


pypes's Issues

Adapt to BIDS

I was wondering if you would be interested in making a containerized version of pypes that takes BIDS datasets as inputs. Basically a BIDS App (more here and here). It would improve exposure of pypes and help users get started. WDYT?

Add GIFT to neuro_pypes first as CLI option

  • be brave
  • decide how to include GIFT in the library (variable in config file and environment variable for CLI)
  • create a subcommand without many parameters in the neuro_pypes/cli/cli.py file
  • make a list of all the parameters we can set for GIFT
  • Add these parameters to the subcommand
    ... more to come?
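The subcommand described above could be sketched as follows. This is only an illustration: the command and parameter names and the `GIFT_DIR` environment variable are assumptions, and stdlib `argparse` stands in for whatever CLI framework `neuro_pypes/cli/cli.py` actually uses.

```python
import argparse
import os

def build_gift_parser():
    # Minimal 'gift' subcommand with few parameters, reading the GIFT
    # install path from an environment variable as the issue suggests
    # (the variable name GIFT_DIR is an assumption).
    parser = argparse.ArgumentParser(prog="neuro_pypes")
    subparsers = parser.add_subparsers(dest="command")
    gift = subparsers.add_parser("gift", help="run GIFT ICA on preprocessed fMRI")
    gift.add_argument("input_dir", help="folder with preprocessed fMRI images")
    gift.add_argument("--n-components", type=int, default=20,
                      help="number of ICA components")
    gift.add_argument("--gift-dir", default=os.environ.get("GIFT_DIR"),
                      help="GIFT install path (falls back to $GIFT_DIR)")
    return parser
```

The full parameter list from the task above would then be added to the `gift` subparser incrementally.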

FDG-PET Template for spatial normalization (instead of SPM 15O PET Template)

Rosa et al., 2014 used a neurodegeneration-specific FDG-PET template for spatial normalization:
https://www.ncbi.nlm.nih.gov/pubmed/24952892

Akdemir et al. analysed the effects of spatial normalization using the 15O-PET SPM template, the Rosa et al., 2014 neurodegenerative FDG-PET template, and their own institutional FDG-PET template derived from brain-healthy subjects: "The Effect of Spatial Normalization of Brain 18F-FDG PET-MR Images Using SPM and Different PET Templates"
http://jnm.snmjournals.org/content/58/supplement_1/1260

It could be interesting to implement these FDG-PET templates in Pypes. I will contact them to ask whether they have made their templates public!

NameError: ("Input formula couldn't be processed: ...)

Hi,

Thank you for sharing your code. I am trying to use your package for PET-MRI co-registration. Using the "spm_anat_preproc" and "spm_mrpet_preproc" functions, I am running the PETPVC pipeline on MR/PET images, but I am getting the following error.

ERROR:nipype.workflow:

Traceback (most recent call last):
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/plugins/multiproc.py", line 70, in run_node
result['result'] = node.run(updatehash=updatehash)
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 480, in run
result = self._run_interface(execute=True)
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 564, in _run_interface
return self._run_command(execute)
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 644, in _run_command
result = self._interface.run(cwd=outdir)
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/interfaces/base/core.py", line 521, in run
runtime = self._run_interface(runtime)
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/interfaces/utility/wrappers.py", line 144, in _run_interface
out = function_handle(**args)
File "/home/masoomeh/PET/pypes/neuro_pypes/interfaces/nilearn/image.py", line 34, in wrapped
res_img = f(*args, **kwargs)
File "", line 26, in math_img
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nilearn-0.4.2-py3.5.egg/nilearn/image/image.py", line 793, in math_img
result = eval(formula, data_dict)
File "", line 1, in
NameError: ("Input formula couldn't be processed, you provided 'img / nan',", "name 'nan' is not defined").

I checked the code and could not find where to modify the formula or where the "val" is defined. I would be grateful for any clue to solve this error!
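The traceback shows nilearn's `math_img` trying to `eval()` the string `'img / nan'`: the grey-matter normalization value was already NaN when the formula string was built, and `eval` has no name `nan` defined. A minimal guard, assuming the cause is a NaN normalization value (the function name here is hypothetical, not the actual neuro_pypes code):

```python
import math

def build_norm_formula(norm_value):
    # If the normalization value is NaN (e.g. from an empty grey-matter
    # mask) or zero, string formatting would produce "img / nan", which
    # nilearn's math_img cannot eval -- fail early with a clear message.
    if norm_value is None or math.isnan(norm_value) or norm_value == 0:
        raise ValueError(
            "normalization value is NaN or zero; check the grey-matter "
            "mask and the image it is averaged over")
    return "img / {:.6f}".format(norm_value)
```

In practice this means the error is probably upstream of `math_img`: the grey-matter average used for intensity normalization came out as NaN, which is worth checking in the segmentation/mask outputs.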

There is no resting-state fMRI workflow

We need an attach_rest_preprocessing function for:
https://github.com/Neurita/pypes/blob/master/pypes%2Fdatasets.py#L38

This workflow should do the basic preprocessing of resting state fMRI data.
This is a list of the main steps that should be implemented: http://fcp-indi.github.io/docs/user/preproc.html

We will have to identify which steps can be done with the standard tools, i.e., SPM, FSL or AFNI. If you ask me, I may know what these tools are capable of.

Other steps will have to be implemented in Python as Nipype nodes. We may be able to reuse or adapt code already done in other modules such as:

Lazy group template pipelines

  1. Normalize FDG signal images: normalized to the average GM FDG signal, in subject space (pet_recon_pvc_norm.nii.gz). Do we also have this in group-template MNI space? Not yet!

ALSO:

we will need pet_recon_norm.nii.gz (same as above, but without PVC) in the group template

For yesterday!! ;)

Improve input Crumb definition

As a neuro_pypes user I want to be able to more easily configure the input paths.

What about putting this in the config:

input.anat.image: anat_hc.nii.gz
input.pet.image: pet_fdg.nii.gz
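The dotted keys proposed above could be expanded into the nested structure an input-path builder needs. A minimal sketch, not the actual neuro_pypes config code:

```python
def expand_dotted_keys(flat):
    """Turn {'input.anat.image': 'anat_hc.nii.gz'} into nested dicts,
    so config consumers can look up config['input']['anat']['image']."""
    nested = {}
    for dotted, value in flat.items():
        node = nested
        *parents, leaf = dotted.split(".")
        for key in parents:
            node = node.setdefault(key, {})
        node[leaf] = value
    return nested
```

A reader like this would let users declare only the file names per modality, with the pipeline composing the full input paths itself.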

Improve visibility of Software Choice

I think it would be better for the user if you could state more clearly which software you use for which step. Is that possible (e.g., in the form of more comments)?

Files needed/wrong file names

We have now:

media/user/external/Projects/MRPET15_Final/out/first_/Subj/mrpet/std_template
wpet_recon_rbv_pvc.nii → in MNI PVC PET using SPM PET template
wpet_recon_rbv_pvc_intnormed.nii → MNI PVC PET with signal normalization to grey matter using SPM PET template
pet_recon_grptemplate.nii → should be standard template in MNI space
wtissues_brain_mask.nii

media/user/external/Projects/MRPET15_Final/out/first_/Subj/mrpet
Hammers_mith_atlas_n30r83_SPM5_pet_recon.nii → in PET space
Head_MPRAGE_highContrast_pet_recon.nii → in PET space
pet_recon_pvc_norm.nii.gz → PET space & signal normalized to GM
/media/iripp/seagate_external/Projects/MRPET15_Final/out/first_/Subj/pet
pet_recon_mni.nii → MNI spaced using SPM PET template
media/user/external/Projects/MRPET15_Final/out/first_/Subj/pet/grp_template
pet_recon_grptemplate.nii → MNI space, but cropped!
WE NEED:

  1. pet_recon_grptemplate.nii with pvc
  2. pet_recon_grptemplate.nii with pvc & gm normalization
  3. pet_recon_grptemplate.nii with gm normalization

Fix the documentation

Change MPRAGE to structural or T1-weighted image.
Change any 'structural' reference to DTI to 'diffusion' or 'diffusion-tensor'.

Workflow configuration file

As some other projects, such as C-PAC and pypreprocess, are trying to do, it is a nice idea to have as input one big-ish configuration file that defines the whole workflow the user wants to run.

For this, it helps to fix names for input files and folders, as Matlab tools such as REST and DPARSF do.
I think this is not strictly necessary, but a hint about the input folder structure and file names must be given to the software, in the same way as C-PAC does.

The main idea here is to build blocks of nipype nodes with fixed node names, to make it possible to configure each bunch of nodes for a typical task. These tasks can be, e.g., MPRAGE segmentation, resting-state fMRI, DTI and PET pre-processing, and also further post-processing tasks like group analysis and comparisons, as well as connectivity.

Note that the substitutions must be generic names, so if we need to compare different workflow parameters, we will have to create entirely different workflows for each set of parameters in the search grid.

A first step could be a configuration manager that reads an input file and, from it, configures and sets up the workflows.

This configuration file should have as keys the names of the processing nodes.

  • Proposal 1:
    For the parametrization of each node, an idea would be to add a config parameter to each function that outputs an interface object. This config parameter holds the whole set of parameters contained in the configuration file; each function reads whatever parameter names it needs.
    This is nipype with node names set up.
    The main issue is to keep the documentation for each node parameter options.
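Proposal 1 above, where config keys are named after processing nodes and each interface-factory function reads only the parameters it needs, could look roughly like this. All names are hypothetical, and `setter` stands in for assigning to a nipype node's `inputs`:

```python
def configure_node(node_name, config, setter):
    # Pick every "node_name.param" entry out of the flat config and hand
    # it to `setter` (which in nipype would effectively do
    # setattr(node.inputs, param, value)). Keys for other nodes are
    # ignored, so each factory function reads only what it needs.
    prefix = node_name + "."
    applied = {}
    for key, value in config.items():
        if key.startswith(prefix):
            param = key[len(prefix):]
            setter(param, value)
            applied[param] = value
    return applied
```

Documenting each node's parameter options would then amount to documenting the accepted `node_name.param` keys per pipeline, which is the maintenance issue mentioned above.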

Crash with some nilearn stuff

	 Executing node main_workflow.spm_mrpet_preproc.petpvc.intensity_norm_gm.norm_img in dir: /home/iripp/data/raw/wd/main_workflow/spm_mrpet_preproc/petpvc/intensity_norm_gm/_diagnosis_healthy_stdpet_session_id_session_0_subject_id_zb_251552/norm_img
171219-14:30:26,208 workflow INFO:
	 [Job finished] jobname: rbvpvc.a491 jobid: 37
171219-14:30:26,276 workflow INFO:
	 Running node "norm_img" ("nipype.interfaces.utility.wrappers.Function").
171219-14:30:26,282 workflow INFO:
	 [Job finished] jobname: gunzip_pet.a490 jobid: 47
171219-14:30:28,323 workflow ERROR:
	 Node norm_img.a492 failed to run on host ferrari.
171219-14:30:28,323 workflow ERROR:
	 Saving crash info to /home/iripp/data/raw/wd/main_workflow/log/crash-20171219-143028-iripp-norm_img.a492-2728bc62-70b8-4417-8914-d7682a9cc766.pklz
Traceback (most recent call last):
  File "/home/iripp/software/nipype/nipype/pipeline/plugins/multiproc.py", line 51, in run_node
    result['result'] = node.run(updatehash=updatehash)
  File "/home/iripp/software/nipype/nipype/pipeline/engine/nodes.py", line 407, in run
    self._run_interface()
  File "/home/iripp/software/nipype/nipype/pipeline/engine/nodes.py", line 517, in _run_interface
    self._result = self._run_command(execute)
  File "/home/iripp/software/nipype/nipype/pipeline/engine/nodes.py", line 650, in _run_command
    result = self._interface.run()
  File "/home/iripp/software/nipype/nipype/interfaces/base.py", line 1089, in run
    runtime = self._run_interface(runtime)
  File "/home/iripp/software/nipype/nipype/interfaces/utility/wrappers.py", line 137, in _run_interface
    out = function_handle(**args)
  File "/home/iripp/software/pypes/neuro_pypes/interfaces/nilearn/image.py", line 49, in wrapped
    'decorator.'.format(out_file))
OSError: The file /home/iripp/data/raw/wd/main_workflow/spm_mrpet_preproc/petpvc/_diagnosis_healthy_stdpet_session_id_session_0_subject_id_zb_251552/rbvpvc/rpet_fdg_rbv_pvc_intnormed.nii.gz already exists, please add a presuffix to thedecorator.
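The crash above is a filename collision: the nilearn wrapper in neuro_pypes refuses to overwrite an existing output file, and the message asks for a "presuffix". A sketch of what such a helper might do — the real decorator API in `neuro_pypes/interfaces/nilearn/image.py` is not shown here, so the name and semantics are assumptions:

```python
import os

def with_presuffix(path, presuffix):
    # Insert a distinguishing prefix before the basename so that two
    # nodes writing into the same working directory cannot collide on
    # the same output filename.
    dirname, base = os.path.split(path)
    return os.path.join(dirname, presuffix + base)
```

With distinct presuffixes per node, the `rpet_fdg_rbv_pvc_intnormed.nii.gz` outputs of the `rbvpvc` and `norm_img` steps would no longer share a path.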

Error because of new nipype version

File "/home/iripp/anaconda3/lib/python3.6/site-packages/nipype/pipeline/engine/workflows.py", line 188, in connect
    for edge in self._graph.in_edges_iter(destnode):
AttributeError: 'DiGraph' object has no attribute 'in_edges_iter'
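This is a networkx API break, not a nipype bug per se: networkx 2.0 removed the `*_iter` methods, and `in_edges()` now returns an iterable view. Upgrading nipype (or pinning networkx < 2.0) fixes it. The shape of the change, illustrated with stub classes rather than networkx itself:

```python
class OldDiGraph:
    # Mimics the networkx 1.x API, where in_edges_iter() was the
    # iterator variant of in_edges().
    def in_edges_iter(self, node):
        yield ("src", node)

class NewDiGraph:
    # Mimics networkx >= 2.0, where in_edges() returns an iterable view
    # and in_edges_iter() no longer exists.
    def in_edges(self, node):
        return [("src", node)]

def iter_in_edges(graph, node):
    # Version-agnostic wrapper: call whichever method exists. This is
    # essentially the fix that newer nipype versions apply.
    if hasattr(graph, "in_edges_iter"):
        return graph.in_edges_iter(node)
    return graph.in_edges(node)
```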

How to spatially normalize PET images without involving T1-weighted MRI?

The documentation describes:
"Steps:
1.Spatially normalize FDG-PET to MNI using SPM12 Normalize.
There is a group template option for PET: first a group template is created, then all subjects are normalized to this group template."

I have some questions:

  1. All FDG-PET images use the SPM12 Normalise module to warp PET images to MNI space. Is the group template created by computing the average image of all warped PET images? What is the tissue probability map used in this process? Is it /tpm/TPM.nii?
  2. Are all subjects normalized to the template image created in step 1 using SPM8?
  3. How can PET images of novel tracers be normalized without involving T1-weighted MRI?

Replace fsl.IsotropicSmooth with Nilearn's smooth

# TODO in fmri.warp (and other places).
# I already tried this, but there is something that FSL does that Nilearn doesn't:
# When I look at the results in fslview the contrast is better with FSL,
# with Nilearn the image is very dark.

# smooth = setup_node(Function(function=smooth_img,
#                              input_names=["in_file", "fwhm"],
#                              output_names=["out_file"],
#                              imports=['from pypes.interfaces.nilearn import ni2file']),
#                      name="smooth_fmri")
# smooth.inputs.fwhm = get_config_setting('fmri_smooth.fwhm', default=8)
# smooth.inputs.out_file = "smooth_{}.nii.gz".format(wf_name)
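One concrete thing worth checking when swapping implementations: both FSL's smoothing and nilearn's `smooth_img` take the kernel width as FWHM in mm and internally convert it to a sigma in voxels, so a disagreement in that conversion (or in the voxel sizes read from the header) would produce visibly different outputs. The darker display in fslview could also just be a different output datatype or display range, which is worth ruling out first. The standard conversion:

```python
import math

# FWHM = 2 * sqrt(2 * ln 2) * sigma  (approximately 2.3548 * sigma)
FWHM_TO_SIGMA = 1.0 / (2.0 * math.sqrt(2.0 * math.log(2.0)))

def fwhm_to_sigma_vox(fwhm_mm, voxel_size_mm):
    # Gaussian kernel sigma in voxels for a given FWHM in mm,
    # e.g. an 8 mm FWHM on 2 mm isotropic voxels is ~1.7 voxels.
    return fwhm_mm * FWHM_TO_SIGMA / voxel_size_mm
```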

Crazy names in the output of mrpet/std_template

My dear Alex,

Please clean the list of issues, you are such a lazy bastard. ;)

Also, wtf is this file pet_recon_grptemplate.nii doing in the std_template output? Could you check the names of these files? They are horrible and I do not know if any human being is able to understand them. ;)

Have a look at this:
/media/iripp/seagate_external/Projects/MRPET15_trial/out/Session2_nii/.../mrpet/std_template/pet_recon_grptemplate.nii
/media/iripp/seagate_external/Projects/MRPET15_trial/out/Session2_nii/.../mrpet/std_template/wpet_recon_rbv_pvc.nii
/media/iripp/seagate_external/Projects/MRPET15_trial/out/Session2_nii/.../mrpet/std_template/wpet_recon_rbv_pvc_intnormed.nii
/media/iripp/seagate_external/Projects/MRPET15_trial/out/Session2_nii/.../mrpet/std_template/wtissues_brain_mask.nii
/media/iripp/seagate_external/Projects/MRPET15_trial/out/Session2_nii/.../mrpet/std_template/y_rmHead_MPRAGE_highContrast_corrected.nii

Also:

/media/iripp/seagate_external/Projects/MRPET15_trial/out/Session1_nii/.../mrpet/tissues/csf_Head_ep2d_pace_rest.nii
/media/iripp/seagate_external/Projects/MRPET15_trial/out/Session1_nii/.../mrpet/tissues/gm_Head_ep2d_pace_rest.nii
/media/iripp/seagate_external/Projects/MRPET15_trial/out/Session1_nii/.../mrpet/tissues/wm_Head_ep2d_pace_rest.nii

Crash of 01preprocessing.py

traits.trait_errors.TraitError: The trait 'in_file' of a GunzipInputSpec instance is an existing file name, but the path '/home/iripp/data/raw/raw/healthy_stdpet/nb_141659/session_0/pet_fdg.nii.gz' does not exist.
Error setting node input:
Node: gunzip_pet
input: in_file
results_file: /home/iripp/data/raw/wd/main_workflow/_diagnosis_healthy_stdpet_session_id_session_0_subject_id_nb_141659/selectfiles/result_selectfiles.pklz
value: /home/iripp/data/raw/raw/healthy_stdpet/nb_141659/session_0/pet_fdg.nii.gz

pypes cannot find TPM.nii

When attempting to run the script: Error 'str' object has no attribute 'get' at line 44, in spm_tpm_priors_path, in file "/home/iripp/anaconda3/envs/neuro/lib/python3.6/site-packages/neuro_pypes-1.0.1-py3.6.egg/neuro_pypes/utils/environ.py".
