pni-lab / pumi


PUMI: neuroimaging Pipelines Using Modular workflow Integration

Home Page: https://pumi.readthedocs.io/en/latest/

License: GNU General Public License v3.0

Languages: Python 80.78%, Shell 3.65%, Roff 15.58%
Topics: dwi, neuroimaging, nipype, preprocessing, resting-state-fmri

pumi's People

Contributors: englertr, khoffschlag, mberjano, mhmadabs, spisakt


pumi's Issues

tweak fsl parameters

The brain extractions are not satisfactory.
The vertical gradient (FSL BET's -g parameter) probably needs to be adjusted.
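
A possible starting point, as a rough sketch only: expose BET's fractional threshold and vertical gradient on the node so they can be tuned per dataset. The node name and parameter values below are illustrative placeholders, not validated defaults.

# Sketch only: make BET's -f and -g tunable; values are placeholders.
from nipype.interfaces import fsl
from PUMI.engine import NestedNode as Node

bet = Node(fsl.BET(), name='brain_extraction')
bet.inputs.frac = 0.3                # fractional intensity threshold (FSL's -f)
bet.inputs.vertical_gradient = -0.3  # vertical gradient of the threshold (FSL's -g)
bet.inputs.mask = True               # also write the binary brain mask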

Update Dockerfile

Do not install the latest software versions for FSL etc.; instead, pin the versions with which we have tested PUMI.

Missing:

  • Templateflow
  • ANTs
  • HD-Bet
  • GPU support

revise QC.py sinking structure

Only do this after prior discussion

QC outputs need to be sinked into a dedicated folder structure to support easy quality inspection.

output and QC sink folder naming template

OUTPUT SINK:
<sink_dir>/sub-<subject_id>/<modality>/<source_entities>_<suffix>.<extension>
ex:
pumi-results/sub-001/anat/sub-001_T1w_brain.nii.gz
pumi-results/sub-001/anat/sub-001_T1w_brain_mask.nii.gz
pumi-results/sub-001/func/sub-001_task-rest_bold_mc.nii.gz

<suffix>: always defined by the subworkflow (we might be able to re-define it)


QC SINK:
<qc_dir>/<modality>/<subworkflow_name>/<source_entities>_<suffix>.<extension>

<source_entities>: must contain the subject id

ex:
qc/anat/bet/sub-001_T1w_brain.ppm
qc/anat/bet/sub-001_T1w_brain_mask.ppm
pumi-results/qc/func/mc/sub-001_task-rest_bold_mc_rot.png
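
For illustration only, this is how the two templates above would render for the example files. All variable names here are assumptions made for this sketch, not PUMI API.

# Illustrative rendering of the two naming templates above.
sink_dir, qc_dir = 'pumi-results', 'qc'
modality, subworkflow = 'anat', 'bet'
subject, entities, suffix, ext = 'sub-001', 'sub-001_T1w', 'brain', 'nii.gz'

output_path = f'{sink_dir}/{subject}/{modality}/{entities}_{suffix}.{ext}'
qc_path = f'{qc_dir}/{modality}/{subworkflow}/{entities}_{suffix}.ppm'

print(output_path)  # pumi-results/sub-001/anat/sub-001_T1w_brain.nii.gz
print(qc_path)      # qc/anat/bet/sub-001_T1w_brain.ppm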

add **kwargs

  • Pass **kwargs to the most straightforward node constructor (sketched below).
  • If there is more than one straightforward option: pass multiple dicts that follow the convention <software_tool>_kwargs.
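
A minimal sketch of the two cases, assuming a BET-style pipeline. The function and parameter names (fsl_kwargs, hd_bet_kwargs) are illustrative assumptions, not existing PUMI code; decorators are omitted for brevity.

# Sketch of the proposed **kwargs convention (decorators omitted for brevity).
from nipype.interfaces import fsl
from PUMI.engine import NestedNode as Node


def bet_single_tool(wf, **kwargs):
    # one obvious node: forward **kwargs straight into its constructor
    bet = Node(fsl.BET(**kwargs), name='bet')
    wf.connect('inputspec', 'in_file', bet, 'in_file')
    wf.connect(bet, 'out_file', 'outputspec', 'out_file')


def bet_multi_tool(wf, fsl_kwargs=None, hd_bet_kwargs=None, **kwargs):
    # several candidate tools: one <software_tool>_kwargs dict per tool
    bet_fsl = Node(fsl.BET(**(fsl_kwargs or {})), name='bet_fsl')
    wf.connect('inputspec', 'in_file', bet_fsl, 'in_file')
    wf.connect(bet_fsl, 'out_file', 'outputspec', 'out_file')
    # hd_bet_kwargs would be forwarded to an HD-BET node in the same way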

core enhancement: default fields as workflow properties?

The decorated function gets access to a PUMI.NestedWorkflow instance, passed in the function signature as wf.

PUMI workflows accept "default" strings in some cases, like:

wf.connect('inputspec', 'in_file', despike, 'in_file')

Let's discuss if it makes sense to accept wf.inputspec or pass inputspec as a function parameter instead.

This applies to outputspec and sinker, too.
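
For the discussion, here are the variants side by side. wf.inputspec as a workflow property and the explicit inputspec parameter are hypothetical APIs, not current PUMI behaviour; only the string-based form exists today.

# Variants under discussion; options A and B are hypothetical.
from nipype.interfaces import afni
from PUMI.engine import NestedNode as Node

despike = Node(afni.Despike(), name='despike')

# current: the default field is addressed by a magic string
# wf.connect('inputspec', 'in_file', despike, 'in_file')

# option A (hypothetical): expose the default nodes as workflow properties
# wf.connect(wf.inputspec, 'in_file', despike, 'in_file')

# option B (hypothetical): pass inputspec into the decorated function explicitly
# def my_pipeline(wf, inputspec, **kwargs):
#     wf.connect(inputspec, 'in_file', despike, 'in_file')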

qc-pipelines can only sink png files

The standard qc-pipeline regex substitution not only adjusts the path to the desired format, but also replaces the filename with 'sub-<subject_id>.png'.

It should be possible to sink other formats as well (i.e., the actual file extension should be caught and appended to the end).
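
The idea, sketched with a plain re.sub. The pattern here is illustrative; the actual DataSink substitution rules in QC.py will differ.

# Sketch of capturing the real extension instead of hard-coding '.png'.
import re

src = 'qc/anat/bet/sub-001_T1w_brain_mask.ppm'
# keep the subject id, drop the rest of the file name, preserve the extension
dst = re.sub(r'(sub-[^_/]+)[^/]*?\.([a-zA-Z0-9]+(?:\.gz)?)$', r'\1.\2', src)
print(dst)  # qc/anat/bet/sub-001.ppm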

BET with Functional Data

Run the bet_subworkflow on functional data and compare the results with those of RPN-Signature.

test dockerfile

Check if the container created from the Dockerfile really works.
To do this, run a pipeline in the container.

add porting-notes to the documentation

Notes on porting from the old PUMI version to the new one:
I will use the terms "NP" for "new PUMI" and "OP" for "old PUMI".

  • Use NP NestedNodes and NestedWorkflow instead of nipype nodes and workflows
    • You only have to change the nipype-node-import to from PUMI.engine import NestedNode as Node and the nipype-workflow-import to from PUMI.engine import NestedWorkflow as Workflow
  • Every workflow of NP must use a (suitable) decorator.
    • Anatomy-related workflows use "AnatPipeline", functional ones "FuncPipeline", and QC workflows use "QcPipeline"
    • You have to specify the inputspec_fields and the outputspec_fields when using the decorator
    • It's possible to specify your own DataSinker regex substitutions with regexp_sub=[]
    • You can deactivate the default regex substitutions with default_regexp_sub=False (not advised unless you really know what you are doing)
    • Example usage with comments can be seen in https://github.com/pni-lab/PUMI/blob/main/PUMI/pipelines/anat/segmentation.py at the qc_bet definition
  • Don't create workflow objects in your NP workflow methods. The decorator will create a workflow object which you can access via "wf".
    • Example: wf.connect('inputspec', 'in_file', plot, 'registered_brain')
  • Don't create inputspec, outputspec, and SinkDir objects in NP. The decorator will handle this.
    • You can use these in connect statements via the strings "inputspec", "outputspec" and "sinker".
      • Example 1: wf.connect(bet, 'out_file', 'outputspec', 'out_file')
      • Example 2: wf.connect(bet, 'out_file', 'sinker', 'out_file')
    • The inputspec_fields that you specify in the decorator are the IdentityInterface fields of the inputspec object in OP
    • The outputspec_fields that you specify in the decorator are the IdentityInterface fields of the outputspec object in OP
  • The first parameter of your workflow has to be "wf" and the last one "**kwargs". You do not need to pass wf yourself; the decorator will handle this.
  • Currently you have to pass "qc_dir=wf.qc_dir" when creating a qc object. We will fix this in the future. This can be seen in https://github.com/pni-lab/PUMI/blob/main/PUMI/pipelines/anat/segmentation.py in the bet_fsl workflow.
  • Don't use MapNodes. In most cases it's enough to change MapNode to Node and delete the iterfields, but verify that this makes sense when doing it!
  • You may stumble across connection statements like this in OP (note the "outputspec.[...]" and "inputspec.[...]"): wf_mc.connect(mytmpfilt, 'outputspec.func_tmplfilt', fmri_qc_mc_dspk_nuis_bpf, 'inputspec.func'). Don't explicitly mention inputspec/outputspec in NP when connecting nested workflows. The NP equivalent would be: wf_mc.connect(mytmpfilt, 'func_tmplfilt', fmri_qc_mc_dspk_nuis_bpf, 'func').

Also note the coding conventions mentioned in the NP README (https://github.com/pni-lab/PUMI). A minimal NP pipeline skeleton following these notes is sketched below.
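
A minimal skeleton that follows the porting notes above. The decorator import location (PUMI.engine) and the node/field details are assumptions for illustration, not a verbatim copy of an existing NP pipeline.

# Minimal NP-style skeleton; import location and node details are assumptions.
from nipype.interfaces import fsl
from PUMI.engine import AnatPipeline, NestedNode as Node


@AnatPipeline(inputspec_fields=['in_file'],
              outputspec_fields=['out_file', 'brain_mask'])
def bet_example(wf, **kwargs):
    # wf, inputspec, outputspec and the sinker already exist; just add nodes
    bet = Node(fsl.BET(mask=True), name='bet')

    wf.connect('inputspec', 'in_file', bet, 'in_file')
    wf.connect(bet, 'out_file', 'outputspec', 'out_file')
    wf.connect(bet, 'mask_file', 'outputspec', 'brain_mask')
    wf.connect(bet, 'out_file', 'sinker', 'out_file')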

adapt code to new style convention

Only possible after the discussion about the style and naming convention

Adapt the code (especially for connecting the nodes and pipelines) to the new convention.

BET node

only after the iterable vs. mapnode discussion

Take the BET node from the old PUMI repo and "port" it to the new PUMI.
Adjust it based on the (possible) decision to use iterables instead of MapNodes for subjects.
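
A rough sketch of the iterables alternative; the subject list and node name are placeholders, not existing PUMI code.

# Sketch: iterate over subjects with iterables instead of MapNode/iterfield.
from nipype.interfaces.utility import IdentityInterface
from PUMI.engine import NestedNode as Node

subjects = ['001', '002', '003']

subject_source = Node(IdentityInterface(fields=['subject_id']), name='subject_source')
subject_source.iterables = [('subject_id', subjects)]
# every node downstream of subject_source is expanded once per subject, so the
# BET node itself can stay a plain Node without iterfields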

repo folder structure

We need the following dirs:

examples: example pipelines (the current mapnode and the iterable pipelines must be moved here), mostly for "education" purposes, or to be looked up during development
pipelines: here we will have a collection of the actual PUMI-based pipelines for "production" purposes (like RPN-signature)
scripts: any helper scripts, + later maybe a wrapper for the docker command that runs pumi-pipelines

De-facing

An anatomical PUMI pipeline that uses pydeface or a similar tool to de-face an anatomical image; a rough wrapper sketch follows the task list below.

  • get in touch with Balint to discuss which de-face software to use

  • create QC image (bet_wf)

  • create a test
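
A rough wrapper sketch, assuming pydeface ends up being the chosen tool. Since no dedicated nipype interface is assumed here, the generic CommandLine interface is used; the arguments and output naming are placeholders to be checked against the tool's documentation.

# Rough sketch: wrap the pydeface CLI with nipype's generic CommandLine interface.
from nipype.interfaces.base import CommandLine
from PUMI.engine import NestedNode as Node

deface = Node(CommandLine(command='pydeface'), name='deface')
deface.inputs.args = 'sub-001_T1w.nii.gz'  # placeholder input image
# pydeface is expected to write a *_defaced image next to the input
# (verify output naming against the tool's docs before relying on it)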

fix Dockerfile

There are some problems with the Dockerfile, especially if you want to run PUMI in PyCharm with a docker remote interpreter.
