
superres-mri's Introduction

A modular, portable, reusable and reproducible processing pipeline software for fetal brain MRI super-resolution

The essence of this project is to develop a modular BIDS App for fetal brain MRI super-resolution that interfaces with the open-source C++ Medical Image Analysis Laboratory Super-Resolution ToolKit (MIALSRTK), a set of tools compiled and distributed as a Docker image that provides a complete solution to the processing pipeline.

This project is built on standards, tools, and frameworks developed to address the reproducibility and transparency challenges in the neuroimaging field:

It investigates a novel framework, the Self-contained Interfaceable No-nonsense Application (SINAPP) framework, designed to make the pipeline stages modular and reusable within the BIDS App framework.

Achievements during the previous Brainhack

Brainhack Open Geneva 2019 - March Friday 22nd and Saturday 23rd 2019

Project 03: Package a portable and reproducible software for motion-robust MRI super-resolution

Team: Seirios, Brenda, Snezana, Niloufar and Sébastien

Work done:

  • Created notebook/brainhack.ipynb
  • Implemented a Nipype interface that calls the NLMdenoising program from the containerized MIALSRTK library
  • Implemented a simple workflow that uses the BIDSDataGrabber interface to load data, which is then connected to the NLMdenoising interface
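The pattern behind that workflow (grab a subject's T2w scans from a BIDS dataset, then hand each scan to the containerized denoiser) can be sketched in plain Python. This is an illustration only, not the project's actual Nipype code; the `mialsrtkNLMDenoising` program name and the `-i` flag are assumptions, so check the MIALSRTK image's documentation for the real invocation.

```python
"""Sketch: BIDS-style file grabbing plus a docker command builder.
The inner program name and flags are hypothetical."""
from pathlib import Path


def grab_t2w(bids_dir: str, subject: str = "sub-01") -> list:
    """Mimic BIDSDataGrabber: find the subject's T2w NIfTI files."""
    anat = Path(bids_dir) / subject / "anat"
    return sorted(str(p) for p in anat.glob(f"{subject}_*_T2w.nii.gz"))


def denoise_command(image: str, mount_point: str = "/fetaldata") -> list:
    """Build a docker invocation for the containerized denoiser.
    `mialsrtkNLMDenoising` is an assumed entrypoint name."""
    return [
        "docker", "run", "--rm",
        "-v", f"{Path(image).parent}:{mount_point}",
        "sebastientourbier/mialsuperresolutiontoolkit:latest",
        "mialsrtkNLMDenoising",  # assumption: actual program name may differ
        "-i", f"{mount_point}/{Path(image).name}",
    ]
```

In the real project this logic lives in Nipype interfaces, which add caching, provenance, and graph execution on top of the same idea.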

See the final presentation

Brainhack Global 2019 - November Thursday 7th, Friday 8th and Saturday 9th 2019

Project 05: SINAPP (Modular BIDS App) for Motion-robust Super-Resolution MRI

Team: Priscille, Hamza, Manik, Olivier, Guillaume and Sébastien

Work done:

  • Refactored the code to build the project as a Python package, pymialsrtk
  • Implemented the pipeline for the NLMdenoising SINAPP together with its BIDS App parser
  • Created the SINAPP Dockerfile for pipeline containerization
  • Adopted a Docker-out-of-Docker solution to run, from within a first container (the SINAPP Docker container), the NLMdenoising program contained in a second container (the MIALSRTK Docker image)
  • Started integrating and adapting the notebooks and interfaces implemented by Priscille and Hamza without the Docker framework (an interface that calls the MIALSRTK program directly rather than the program in the containerized MIALSRTK library)

See the final presentation

Bug and support

This project is conducted with transparency in mind. Please use GitHub Issues (https://github.com/brainhack-ch/superres-mri/issues) to report and discuss bugs and new features.

Installation

  • Clone this repository:
git clone https://github.com/brainhack-ch/superres-mri.git <Your/installation/dir>
  • Install the supermri-env conda environment with all the packages and dependencies:
conda env create -f environment.yml
  • Download the latest version of the mialsuperresolutiontoolkit Docker image:
docker pull sebastientourbier/mialsuperresolutiontoolkit:latest

Run the SINAPP example

This SINAPP example, which performs NLM denoising, was developed during BrainHack Global Geneva 2019 and provides a proof of feasibility of the envisioned framework. Details about the solutions found to run a Docker container inside another container are available at #5. Its architecture is illustrated by the figure below:

We provide two scripts in sinapps/example_scripts that facilitate building and testing this example. The script build_sinapp-core_and_sinapp-nlmdenoise.sh shows how we build the sinapp-nlmdenoise Docker image. The script run_sinapp-nlmdenoise.sh shows how to execute it.

Instructions

  • Edit the path defined by superes-mri_dir in build_sinapp-core_and_sinapp-nlmdenoise.sh to your <Your/installation/dir>.
  • Edit the path defined by bids_dir in the run_sinapp-nlmdenoise.sh script to the location of your BIDS dataset (<Your/BIDS/dir>).
  • In a terminal, go to your <Your/installation/dir>:
cd <Your/installation/dir>
  • Build the SINAPP:
sh sinapps/example_scripts/build_sinapp-core_and_sinapp-nlmdenoise.sh
  • Run the SINAPP on all T2w scans of sub-01 in <Your/BIDS/dir>:
sh sinapps/example_scripts/run_sinapp-nlmdenoise.sh

The results are generated in <Your/BIDS/dir>/derivatives/superres-mri/sub-01/nipype/sinapp_nlmdenoise.

Want to help with the development of the entire collection of SINAPPs?

Please check GitHub issue #7, dedicated to the management, tracking, and discussion of the development of the SINAPP collection for the MIALSRTK library.

Running the notebooks

  • Go to the notebooks directory in the cloned repo:
cd <Your/installation/dir>/notebooks
  • Activate the conda environment supermri-env:
conda activate supermri-env
  • Launch the Jupyter notebook server:
jupyter notebook

License

This project is licensed under the BSD 3-Clause License - see the LICENSE file for details.

Contributors

  • Priscille
  • Hamza
  • Manik
  • Olivier
  • Guillaume
  • Seirios
  • Brenda
  • Snezana
  • Niloufar

Instigators

  • Sebastien Tourbier
  • Michael Dayan

See also the list of contributors who participated in this project.


superres-mri's Issues

Silent failure of nipype with Docker

When running a SINAPP using Docker, the script reports no error but fails to run the executable inside Docker if:

  • the Docker engine is not running
  • the relevant Docker image is not present

This should be detected as an error.
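One possible fix is a preflight check that verifies the Docker engine is reachable and the required image is present before the workflow is launched. This is a sketch, not project code; the `docker_ready` helper name is hypothetical, but `docker info` and `docker image inspect` are standard CLI subcommands that fail when the engine is down or the image is absent.

```python
"""Sketch of a Docker preflight check for the silent-failure issue."""
import shutil
import subprocess


def docker_ready(image: str) -> bool:
    """Return True only if the docker CLI exists, the engine answers,
    and `image` is available locally."""
    if shutil.which("docker") is None:
        return False
    try:
        # `docker info` fails if the engine is not running.
        subprocess.run(["docker", "info"],
                       check=True, capture_output=True, timeout=30)
        # `docker image inspect` fails if the image is absent locally.
        subprocess.run(["docker", "image", "inspect", image],
                       check=True, capture_output=True, timeout=30)
    except (subprocess.CalledProcessError, subprocess.TimeoutExpired):
        return False
    return True
```

The workflow runner could call this once at startup and abort with an explicit error message instead of failing silently mid-run.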

Error: BIDSDataGrabber with pybids >= 0.9

The BIDSDataGrabber interface does not work with pybids >= 0.9:

Jupyter Notebook error:

191108-16:17:18,426 nipype.workflow INFO:
	 Workflow bids_demo settings: ['check', 'execution', 'logging', 'monitoring']
191108-16:17:18,441 nipype.workflow INFO:
	 Running serially.
191108-16:17:18,446 nipype.workflow INFO:
	 [Node] Setting-up "bids_demo.bids_grabber" in "/home/brainhacker/data/derivatives/mialsrtk/bids_demo/bids_grabber".
191108-16:17:18,450 nipype.workflow INFO:
	 [Node] Running "bids_grabber" ("nipype.interfaces.io.BIDSDataGrabber")
191108-16:17:18,899 nipype.workflow WARNING:
	 Storing result file without outputs
191108-16:17:18,901 nipype.workflow WARNING:
	 [Node] Error on "bids_demo.bids_grabber" (/home/brainhacker/data/derivatives/mialsrtk/bids_demo/bids_grabber)
191108-16:17:18,903 nipype.workflow ERROR:
	 Node bids_grabber failed to run on host bh-p05-03.
191108-16:17:18,903 nipype.workflow ERROR:
	 Saving crash info to /home/brainhacker/superres-mri/notebooks/crash-20191108-161718-brainhacker-bids_grabber-2e444a58-15b6-4bc4-b0c8-a98a3b1476bb.pklz
Traceback (most recent call last):
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/nipype/pipeline/plugins/linear.py", line 48, in run
    node.run(updatehash=updatehash)
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 479, in run
    result = self._run_interface(execute=True)
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 569, in _run_interface
    return self._run_command(execute)
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 662, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/nipype/interfaces/base/core.py", line 384, in run
    outputs = self.aggregate_outputs(runtime)
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/nipype/interfaces/base/core.py", line 461, in aggregate_outputs
    predicted_outputs = self._list_outputs()  # Predictions from _list_outputs
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/nipype/interfaces/io.py", line 2836, in _list_outputs
    derivatives=self.inputs.index_derivatives)
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/bids/layout/layout.py", line 226, in __init__
    index_metadata=index_metadata)
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/bids/layout/layout.py", line 556, in add_derivatives
    self.derivatives[pipeline_name] = BIDSLayout(deriv, **kwargs)
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/bids/layout/layout.py", line 211, in __init__
    indexer.index_metadata()
  File "/home/brainhacker/miniconda3/envs/supermri-env/lib/python3.7/site-packages/bids/layout/index.py", line 284, in index_metadata
    file_md.update(pl)
ValueError: dictionary update sequence element #0 has length 4; 2 is required

191108-16:17:18,905 nipype.workflow INFO:
	 ***********************************
191108-16:17:18,905 nipype.workflow ERROR:
	 could not run node: bids_demo.bids_grabber
191108-16:17:18,906 nipype.workflow INFO:
	 crashfile: /home/brainhacker/superres-mri/notebooks/crash-20191108-161718-brainhacker-bids_grabber-2e444a58-15b6-4bc4-b0c8-a98a3b1476bb.pklz
191108-16:17:18,907 nipype.workflow INFO:
	 ***********************************
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-6-288982abd1f9> in <module>
     68 
     69 wf.connect(preparePaths, "docker_T2ws_paths", nlmDenoise, "input_images")
---> 70 res = wf.run()

~/miniconda3/envs/supermri-env/lib/python3.7/site-packages/nipype/pipeline/engine/workflows.py in run(self, plugin, plugin_args, updatehash)
    597         if str2bool(self.config['execution']['create_report']):
    598             self._write_report_info(self.base_dir, self.name, execgraph)
--> 599         runner.run(execgraph, updatehash=updatehash, config=self.config)
    600         datestr = datetime.utcnow().strftime('%Y%m%dT%H%M%S')
    601         if str2bool(self.config['execution']['write_provenance']):

~/miniconda3/envs/supermri-env/lib/python3.7/site-packages/nipype/pipeline/plugins/linear.py in run(self, graph, config, updatehash)
     69 
     70         os.chdir(old_wd)  # Return wherever we were before
---> 71         report_nodes_not_run(notrun)

~/miniconda3/envs/supermri-env/lib/python3.7/site-packages/nipype/pipeline/plugins/tools.py in report_nodes_not_run(notrun)
     93                 logger.debug(subnode._id)
     94         logger.info("***********************************")
---> 95         raise RuntimeError(('Workflow did not execute cleanly. '
     96                             'Check log for details'))
     97 

RuntimeError: Workflow did not execute cleanly. Check log for details
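Until the incompatibility is resolved, one workaround is to detect an incompatible pybids at startup instead of crashing mid-workflow. The `<0.9` bound comes from the issue above; the `check_pybids` helper name is hypothetical.

```python
"""Sketch: warn early about a pybids version known to break
BIDSDataGrabber, rather than failing inside the workflow."""
from importlib import metadata
from typing import Optional


def check_pybids(max_minor: int = 9) -> Optional[str]:
    """Return a warning string if an incompatible pybids is installed,
    or None if the version is compatible or pybids is absent."""
    try:
        version = metadata.version("pybids")
    except metadata.PackageNotFoundError:
        return None
    try:
        major, minor = (int(x) for x in version.split(".")[:2])
    except ValueError:
        return None  # unparseable version string: skip the check
    if (major, minor) >= (0, max_minor):
        return (f"pybids {version} is installed; BIDSDataGrabber is known "
                f"to fail with pybids >= 0.{max_minor} "
                f"(pin pybids<0.{max_minor}).")
    return None
```

Alternatively, pinning the dependency in environment.yml avoids the problem at install time.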

Update README

  • Running the Python notebook does not work
  • Add instructions for the Docker container (mial-srtk)

Implementation of SINAPPs

List of SINAPPs to be implemented:

  • NLMdenoising
  • CorrectSliceIntensity
  • SliceBySliceN4BiasFieldCorrection
  • SliceBySliceCorrectBiasField
  • CorrectSliceIntensity
  • IntensityStandardization
  • HistogramNormalization
  • ImageReconstruction
  • TVSuperResolution

Tasks involved for each SINAPP:

  • Check whether interfaces were already implemented by Priscille and Hamza
  • Adapt them to the SINAPP framework, or implement an interface for each program of the containerized MIALSRTK library listed above.
  • Creating each SINAPP entails creating the following files, stored under sinapps/<sinapp-name>/ (see sinapps/nlmdenoise/ for an example):
    • The run.py script, which provides a BIDS App argument parser, creates the workflow, and executes the SINAPP.
    • The Dockerfile for containerization, which describes the computing environment used to execute the workflow, based on an Ubuntu 16.04 LTS system. A conda environment.yml is used to facilitate installation of the Python environment.
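The BIDS App parser that each run.py provides can be sketched with argparse, following the standard BIDS Apps command line (positional bids_dir, output_dir, and analysis_level, plus --participant_label). This is a minimal illustration, not the project's actual parser.

```python
"""Minimal sketch of a SINAPP run.py argument parser,
following the BIDS Apps command-line convention."""
import argparse


def get_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="SINAPP BIDS App entry point (illustrative)")
    parser.add_argument("bids_dir",
                        help="Root directory of the BIDS dataset")
    parser.add_argument("output_dir",
                        help="Directory where derivatives are written")
    parser.add_argument("analysis_level", choices=["participant"],
                        help="Processing stage (participant-level only here)")
    parser.add_argument("--participant_label", nargs="+", default=[],
                        help="Subject label(s) to process, e.g. 01")
    return parser
```

run.py would then build the Nipype workflow from the parsed arguments and call its run() method.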

Docker-out-of-Docker (DooD) Solution

Brainhack Global Geneva 2019 Challenge

Build and run a Docker image that executes a Nipype workflow, which in turn should run another Docker image containing the C++ NLMdenoising program of the MIALSRTK library.

Road Map:

  • Creation of the first SINAPP workflow with a BIDS App parser (Python, not yet containerized)
  • Creation of the Dockerfile
  • Build, test, and make it work
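The core of the Docker-out-of-Docker pattern is bind-mounting the host's Docker socket into the SINAPP container, so that `docker run` issued inside it starts sibling containers on the host engine rather than nested ones. The sketch below only builds the outer command; the `sinapp-nlmdenoise` image name matches the example scripts, while the mount point and positional arguments are illustrative assumptions.

```python
"""Sketch: outer `docker run` command for the DooD pattern.
Command is constructed, not executed."""


def dood_command(bids_dir: str) -> list:
    """Build the host-side command that launches the SINAPP container
    with access to the host Docker engine."""
    return [
        "docker", "run", "--rm",
        # Hand the host Docker engine to the SINAPP container, so calls
        # to `docker` inside it create sibling containers on the host:
        "-v", "/var/run/docker.sock:/var/run/docker.sock",
        "-v", f"{bids_dir}:/bids_dataset",  # assumed mount point
        "sinapp-nlmdenoise",
        "/bids_dataset", "/bids_dataset/derivatives", "participant",
    ]
```

One caveat of this approach: paths passed to the inner container must be host paths, not paths inside the SINAPP container, since the inner container is started by the host engine.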
