nipy / nibabel

Python package to access a cacophony of neuro-imaging file formats

Home Page: http://nipy.org/nibabel/

License: Other

Makefile 0.30% Python 99.25% MATLAB 0.11% PowerShell 0.13% Shell 0.20%
python neuroimaging brain-imaging nifti gifti cifti-2 streamlines data-formats dicom tck

nibabel's Introduction


NIPY

Neuroimaging tools for Python.

The aim of NIPY is to produce a platform-independent Python environment for the analysis of functional brain imaging data using an open development model.

In NIPY we aim to:

  1. Provide an open source, mixed language scientific programming environment suitable for rapid development.
  2. Create software components in this environment to make it easy to develop tools for MRI, EEG, PET and other modalities.
  3. Create and maintain a wide base of developers to contribute to this platform.
  4. Maintain and develop this framework as a single, easily installable bundle.

NIPY is the work of many people. We list the main authors in the file AUTHOR in the NIPY distribution, and other contributions in THANKS.

Website

Current information can always be found at the NIPY project website.

Mailing Lists

For questions on how to use nipy or on making code contributions, please see the neuroimaging mailing list:

https://mail.python.org/mailman/listinfo/neuroimaging

Please report bugs at github issues:

https://github.com/nipy/nipy/issues

You can see the list of current proposed changes at:

https://github.com/nipy/nipy/pulls

Code

You can find our sources and single-click downloads on GitHub.

Tests

To run nipy's tests, you will need to install the pytest Python testing package:

pip install pytest

Then:

pytest nipy

You can run the doctests along with the other tests with:

pip install pytest-doctestplus

Then:

pytest --doctest-plus nipy

Installation

See the latest installation instructions.

License

We use the 3-clause BSD license; the full license is in the file LICENSE in the nipy distribution.

nibabel's People

Contributors

anibalsolon, arokem, bcipolli, chrispycheng, cindeem, demianw, dimitripapadopoulos, djarecka, effigies, fmorency, grlee77, htwangtw, jond01, kaczmarj, kastman, larsoner, marccote, matthew-brett, mgxd, mih, moloney, mscheltienne, orduek, pauldmccarthy, rmarkello, robbisg, satra, unidesigner, yarikoptic, zvibaratz


nibabel's Issues

Extend data package installation commands to allow in-place use

Data packages refer to the nipy/nipy/utils/data.py data packages. We were going to move this stuff to nibabel, to allow dicom and other tests.

Add commands to data packages, perhaps in setup.py, of form:

python setup.py set_inplace_path
python setup.py unset_inplace_path

to allow easy use of data packages inplace.

nipy.tools cleanout

mynipy is no longer useful, because we now typically have only one path to the nipy code, instead of the branch-per-directory approach of bzr.

Support for partial read/write access

This is closely related to issue #9, but asking for more general support of partial access to image data. Algorithms processing huge files might only need to have access to one slice/volume/voxel at a time. It would allow for implementing leaner tools if NiBabel would support partial read from images and partial write to images, i.e. either incrementally writing to a file (e.g. appending volumes), or iterating over a file and modifying its content in-place.
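A hedged sketch of what partial read can look like at the numpy level: `np.memmap` with a byte offset reads only one volume's worth of data from a raw file. The file layout and names here are fabricated for illustration, not nibabel's API:

```python
import os
import tempfile

import numpy as np

# Fabricated raw file: 3 volumes of shape (2, 3, 4), float32, volume-major.
vol_shape, n_vols = (2, 3, 4), 3
data = np.arange(n_vols * np.prod(vol_shape), dtype=np.float32)
data = data.reshape((n_vols,) + vol_shape)

fname = os.path.join(tempfile.mkdtemp(), 'raw4d.dat')
data.tofile(fname)

def read_volume(fname, v, shape, dtype=np.float32, hdr_offset=0):
    """Memory-map one volume out of a raw 4D file, touching only its bytes."""
    itemsize = np.dtype(dtype).itemsize
    offset = hdr_offset + v * int(np.prod(shape)) * itemsize
    return np.array(np.memmap(fname, dtype=dtype, mode='r',
                              offset=offset, shape=shape))

vol1 = read_volume(fname, 1, vol_shape)
assert np.array_equal(vol1, data[1])
```

The same offset arithmetic, plus a header-supplied data offset, is what an image-level partial reader would do under the hood.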

Provide logging messages for normal operations/actions happening behind the curtain

Sometimes it is interesting to see what nibabel is doing internally. Right now, even if I set all logging levels to 0 (when every log message should be printed, right?):

In [2]: nib.imageglobals.logger.level
Out[2]: 0

In [3]: nib.imageglobals.log_level
Out[3]: 0

In [4]: nib.imageglobals.error_level
Out[4]: 0

no messages of any kind are produced while nibabel does the job for me (scales the data, computes the qform, etc). It would be great if there were a way, at least in debug mode (the default, I guess ;-) ), to spit out messages about the actions that were taken. I don't want to have to trace the whole code execution to figure out what is going on.

Thank you in advance
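One likely explanation for the behavior above: in the standard `logging` module a level of 0 is `logging.NOTSET`, which means "defer to the parent logger" (whose effective level defaults to WARNING), not "print everything". A minimal stdlib sketch of making a library logger's debug messages visible; the logger name `somelib` is a stand-in, not nibabel's actual logger name:

```python
import io
import logging

# 'somelib' stands in for whatever logger name the library exposes.
logger = logging.getLogger('somelib')
logger.setLevel(logging.DEBUG)   # DEBUG (10), not 0/NOTSET

# Attach a handler so the messages actually go somewhere we can see.
buf = io.StringIO()
handler = logging.StreamHandler(buf)
handler.setFormatter(logging.Formatter('%(name)s:%(levelname)s:%(message)s'))
logger.addHandler(handler)

# A message like the ones the reporter wants nibabel to emit.
logger.debug('scaling data by slope=2.0, inter=0.0')
```

With this in place, `buf.getvalue()` contains the formatted debug line.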

Recoder for mapping dtypes to header doesn't support data.dtype

img = nibabel.load('somefile')
dat = img.get_data().copy()
affine = img.get_affine()  # not dat.get_affine() - dat is a plain ndarray
newimg = nibabel.Nifti1Image(dat, affine)

raises an error:
/usr/local/python2.6.2/lib/python2.6/site-packages/nibabel/analyze.pyc in set_data_dtype(self, datatype)
836 except KeyError:
837 raise HeaderDataError(
--> 838 'data dtype "%s" not recognized' % datatype)
839 dtype = self._data_type_codes.dtype[code]
840 # test for void, being careful of user-defined types

HeaderDataError: data dtype "float32" not recognized

/usr/local/python2.6.2/lib/python2.6/site-packages/nibabel/analyze.py(838)set_data_dtype()
837 raise HeaderDataError(
--> 838 'data dtype "%s" not recognized' % datatype)
839 dtype = self._data_type_codes.dtype[code]

This is because set_data_dtype is not passed np.float32 but dtype('float32'), which is not a key in _dtdefs.
Possible fix: add np.dtype(np.float32) (and friends) to _dtdefs so these lookups pass?
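The suggested fix can be sketched with a plain dict: registering both the scalar type and its `np.dtype` as keys makes either form resolve. The names below are hypothetical, in the spirit of nibabel's `_dtdefs` rather than its actual code:

```python
import numpy as np

# Hypothetical code/name/type table in the spirit of nibabel's _dtdefs.
_defs = [(4, 'int16', np.int16), (16, 'float32', np.float32)]

# Key the lookup by code, name, scalar type AND np.dtype(scalar type),
# so both np.float32 and dtype('float32') are accepted.
code_of = {}
for code, name, scalar in _defs:
    for key in (code, name, scalar, np.dtype(scalar)):
        code_of[key] = code

arr = np.zeros(3, dtype=np.float32)
assert code_of[arr.dtype] == 16   # dtype('float32') now resolves
assert code_of[np.float32] == 16  # the scalar type still does too
```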

setting a tolerance for precision loss when writing

When an image is written out by nibabel, it tries to fit the range of the data into the datatype of the file. In certain instances this could result in a loss of precision. The goal would be to allow setting a tolerance (with appropriate defaults), so that users can control when the loss leads to an exception.

see discussion in #115

Need API for loading multiple images

Some image formats can contain more than one image in a single file.

By 'more than one image' I mean images where the shape, affine, data-type or other meta-information can differ across the images. For example nifti and friends can contain 4D images, but all volumes in the 4D series have the same meta-information, and so these are not multiple images.

New ecat writers not endian safe?

I think this may be for @cindeem and @fmorency

The PPC and sparc buildbots are failing on the ECAT tests:

http://nipy.bic.berkeley.edu/builders/nibabel-py2.6-osx-10.4/builds/71/steps/shell_1/logs/stdio
http://nipy.bic.berkeley.edu/builders/nibabel-py2.6-sparc/builds/17/steps/shell_1/logs/stdio

with an error that looks very much like a byte-order problem:

======================================================================
ERROR: test_save (nibabel.tests.test_ecat.TestEcatImage)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/buildslave/nd-bb-slave-sparc/nibabel-py2_6-sparc/build/nibabel/tests/test_ecat.py", line 206, in test_save
    other = self.image_class.load(tmp_file)
  File "/home/buildslave/nd-bb-slave-sparc/nibabel-py2_6-sparc/build/nibabel/ecat.py", line 1037, in load
    return klass.from_filename(filespec)
  File "/home/buildslave/nd-bb-slave-sparc/nibabel-py2_6-sparc/build/nibabel/spatialimages.py", line 411, in from_filename
    return klass.from_file_map(file_map)
  File "/home/buildslave/nd-bb-slave-sparc/nibabel-py2_6-sparc/build/nibabel/ecat.py", line 905, in from_file_map
    hdr_fid)
  File "/home/buildslave/nd-bb-slave-sparc/nibabel-py2_6-sparc/build/nibabel/ecat.py", line 563, in __init__
    self.subheaders = self._get_subheaders()
  File "/home/buildslave/nd-bb-slave-sparc/nibabel-py2_6-sparc/build/nibabel/ecat.py", line 591, in _get_subheaders
    buf=tmpdat))
  File "/usr/lib/pymodules/python2.6/numpy/core/records.py", line 400, in __new__
    strides=strides)
TypeError: buffer is too small for requested array

Have you any idea what the problem is? I can set you up with access to the PPC machine if you send me your ssh public key by email.

Return native endian data as byteorder "="

Numpy linalg <= 1.6.1 has a problem with explicit endian codes for native endian. Specifically, if the endian code is '<' for an array, and the system is little endian, then numpy linalg will blow up doing - say - an SVD. The fix is to make the returned array = endian (implicit native endian).

See :

nipy/nipy#130
numpy/numpy#235
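The conversion itself is a one-liner in numpy; a sketch of normalizing an array with an explicit little-endian dtype (as might come off disk) to native byte order:

```python
import numpy as np

# An array carrying an explicit '<' (little-endian) dtype code.
arr = np.arange(4, dtype=np.dtype('<i4'))

# Convert to native ('=') byte order. On a little-endian host the bytes
# are unchanged; only the dtype stops carrying an explicit endian code,
# which is what old numpy linalg needed.
native = arr.astype(arr.dtype.newbyteorder('='))

assert native.dtype.isnative
assert np.array_equal(native, arr)
```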

Support to load single volumes from 4D images

Dave Flitney requested a leaner way to get access to parts of a 4D image (NIfTI in his case), i.e. the ability to get the data of a single volume or timeseries. He sent the following sketch:

class FSLNiftiImage(nb.Nifti1Image):

    @memoized
    def get_fileobj(self):
        fileobj = allopen(self.get_filename())
        return fileobj

    def get_volume(self, v):
        dtype = self.get_data_dtype()
        s = self.get_header().get_data_shape()
        shape = (s[0], s[1], s[2])
        # offset must be in bytes, so scale by the dtype item size
        offset = (self.get_header().get_data_offset() +
                  v * np.prod(shape) * dtype.itemsize)
        data = array_from_file(shape, dtype, self.get_fileobj(), offset)
        return data

Usage:

image = nb.load(fname)
image.__class__ = FSLNiftiImage
v27 = image.get_volume(27)

test_a2f_big_scalers fails on amd64 and sparc

on amd64 debian:

======================================================================
FAIL: nibabel.tests.test_utils.test_a2f_big_scalers
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/pymodules/python2.7/nose/case.py", line 187, in runTest
    self.test(*self.arg)
  File "/home/yoh/proj/nipy/nipy-suite/install/lib/python2.7/site-packages/nibabel/tests/test_utils.py", line 171, in test_a2f_big_scalers
    assert_array_equal(data_back, [-128, 0, 127])
  File "/usr/lib/pymodules/python2.7/numpy/testing/utils.py", line 686, in assert_array_equal
    verbose=verbose, header='Arrays are not equal')
  File "/usr/lib/pymodules/python2.7/numpy/testing/utils.py", line 618, in assert_array_compare
    raise AssertionError(msg)
AssertionError: 
Arrays are not equal

(mismatch 66.6666666667%)
 x: array([0, 0, 0], dtype=int8)
 y: array([-128,    0,  127])

on sparc similar but different:

======================================================================
FAIL: nibabel.tests.test_utils.test_a2f_big_scalers
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/pymodules/python2.6/nose/case.py", line 183, in runTest
    self.test(*self.arg)
  File "/home/yoh/proj/nipy/nipy-suite/install/lib/python2.6/site-packages/nibabel/tests/test_utils.py", line 171, in test_a2f_big_scalers
    assert_array_equal(data_back, [-128, 0, 127])
  File "/usr/lib/pymodules/python2.6/numpy/testing/utils.py", line 677, in assert_array_equal
    verbose=verbose, header='Arrays are not equal')
  File "/usr/lib/pymodules/python2.6/numpy/testing/utils.py", line 609, in assert_array_compare
    raise AssertionError(msg)
AssertionError: 
Arrays are not equal

(mismatch 66.6666666667%)
 x: array([ 0,  0, -1], dtype=int8)
 y: array([-128,    0,  127])

Bool should be saved as int8

At the moment:

In [1]: import nibabel as nib
In [2]: hdr = nib.Nifti1Header()
In [5]: hdr.set_data_dtype(np.bool)
---------------------------------------------------------------------------
HeaderDataError                           Traceback (most recent call last)
/home/mb312/ in ()
----> 1 hdr.set_data_dtype(np.bool)

/home/mb312/usr/local/lib/python2.6/site-packages/nibabel/analyze.pyc in set_data_dtype(self, datatype)
    576         except KeyError:
    577             raise HeaderDataError(
--> 578                 'data dtype "%s" not recognized' % datatype)
    579         dtype = self._data_type_codes.dtype[code]
    580         # test for void, being careful of user-defined types

HeaderDataError: data dtype "<type 'bool'>" not recognized

But surely np.bool could be punned to int8?
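The punning itself is a lossless cast in numpy; a sketch of the workaround a user can apply before handing data to nibabel:

```python
import numpy as np

mask = np.array([True, False, True])

# bool -> int8 is lossless (True -> 1, False -> 0), and int8 is a valid
# Analyze/NIfTI data type, so save the cast array instead.
as_i8 = mask.astype(np.int8)

assert as_i8.dtype == np.int8
assert np.array_equal(as_i8, [1, 0, 1])
assert np.array_equal(as_i8.astype(bool), mask)  # round-trips exactly
```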

Windows test errors in 1.0.0 release

======================================================================
ERROR: nibabel.nicom.tests.test_csareader.test_csas0
----------------------------------------------------------------------
Traceback (most recent call last):
  File "c:\python25\lib\site-packages\nose-1.0.0-py2.5.egg\nose\case.py", line 187, in runTest
    self.test(*self.arg)
  File "C:\Python25\Lib\site-packages\nibabel\nicom\tests\test_csareader.py", line 33, in test_csas0
    csa_info = csa.read(csa_str)
  File "C:\Python25\Lib\site-packages\nibabel\nicom\csareader.py", line 100, in read
    up_str.unpack('64si4s3i')
  File "C:\Python25\Lib\site-packages\nibabel\nicom\structreader.py", line 86, in unpack
    values = pkst.unpack_from(self.buf, self.ptr)
error: unpack_from requires a buffer of at least 84 bytes

======================================================================
ERROR: nibabel.nicom.tests.test_csareader.test_csa_params
----------------------------------------------------------------------
Traceback (most recent call last):
  File "c:\python25\lib\site-packages\nose-1.0.0-py2.5.egg\nose\case.py", line 187, in runTest
    self.test(*self.arg)
  File "C:\Python25\Lib\site-packages\nibabel\nicom\tests\test_csareader.py", line 49, in test_csa_params
    csa_info = csa.read(csa_str)
  File "C:\Python25\Lib\site-packages\nibabel\nicom\csareader.py", line 100, in read
    up_str.unpack('64si4s3i')
  File "C:\Python25\Lib\site-packages\nibabel\nicom\structreader.py", line 86, in unpack
    values = pkst.unpack_from(self.buf, self.ptr)
error: unpack_from requires a buffer of at least 84 bytes

======================================================================
ERROR: nibabel.nicom.tests.test_csareader.test_ice_dims
----------------------------------------------------------------------
Traceback (most recent call last):
  File "c:\python25\lib\site-packages\nose-1.0.0-py2.5.egg\nose\case.py", line 187, in runTest
    self.test(*self.arg)
  File "C:\Python25\Lib\site-packages\nibabel\nicom\tests\test_csareader.py", line 86, in test_ice_dims
    csa_info = csa.read(csa_str)
  File "C:\Python25\Lib\site-packages\nibabel\nicom\csareader.py", line 100, in read
    up_str.unpack('64si4s3i')
  File "C:\Python25\Lib\site-packages\nibabel\nicom\structreader.py", line 86, in unpack
    values = pkst.unpack_from(self.buf, self.ptr)
error: unpack_from requires a buffer of at least 84 bytes

======================================================================
ERROR: test (nibabel.tests.test_image_load_save.test_filename_save)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\Python25\Lib\site-packages\nibabel\testing\_paramtestpy2.py", line 54, in run_parametric
    testgen.next()
  File "C:\Python25\Lib\site-packages\nibabel\tests\test_image_load_save.py", line 236, in test_filename_save
    nib.save(img, fname)
  File "C:\Python25\Lib\site-packages\nibabel\loadsave.py", line 70, in save
    img.to_filename(filename)
  File "C:\Python25\Lib\site-packages\nibabel\spatialimages.py", line 440, in to_filename
    self.to_file_map()
  File "C:\Python25\Lib\site-packages\nibabel\spm99analyze.py", line 285, in to_file_map
    import scipy.io as sio
ImportError: No module named scipy.io

======================================================================
ERROR: test_data_hdr_cache (nibabel.tests.test_spm2analyze.TestSpm2AnalyzeImage)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\Python25\Lib\site-packages\nibabel\testing\_paramtestpy2.py", line 54, in run_parametric
    testgen.next()
  File "C:\Python25\Lib\site-packages\nibabel\tests\test_analyze.py", line 370, in test_data_hdr_cache
    img.to_file_map(fm)
  File "C:\Python25\Lib\site-packages\nibabel\spm99analyze.py", line 285, in to_file_map
    import scipy.io as sio
ImportError: No module named scipy.io

======================================================================
ERROR: test_data_hdr_cache (nibabel.tests.test_spm99analyze.TestSpm99AnalyzeImage)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\Python25\Lib\site-packages\nibabel\testing\_paramtestpy2.py", line 54, in run_parametric
    testgen.next()
  File "C:\Python25\Lib\site-packages\nibabel\tests\test_analyze.py", line 370, in test_data_hdr_cache
    img.to_file_map(fm)
  File "C:\Python25\Lib\site-packages\nibabel\spm99analyze.py", line 285, in to_file_map
    import scipy.io as sio
ImportError: No module named scipy.io

======================================================================
FAIL: nibabel.tests.test_data.test_data_path(['C:\\Python25\\share\\nipy', 'C:\\Documents and Settings\\Matthew\\_nipy'], ['C:\\Python25\\share\\nipy', 'C:\\Documents and Settings\\Matthew\\.nipy'])
----------------------------------------------------------------------
Traceback (most recent call last):
  File "c:\python25\lib\site-packages\nose-1.0.0-py2.5.egg\nose\case.py", line 187, in runTest
    self.test(*self.arg)
AssertionError: ['C:\\Python25\\share\\nipy', 'C:\\Documents and Settings\\Matthew\\_nipy'] != ['C:\\Python25\\share\\nipy', 'C:\\Documents and Settings\\Matthew\\.nipy']

----------------------------------------------------------------------
Ran 8090 tests in 4.156s

FAILED (SKIP=10, errors=6, failures=1)

1.1.0: reports having data_scale and has Nones as scale/offset

$> python -c 'import nibabel as nb; print; print nb.__version__; im=nb.load("/home/research/nirs/pilot/YHIE040301/pilot+ep2d_bold_AXIAL_PILOT_IRE.feat/thresh_zstat17.nii.gz"); h=im.get_header(); print h.has_data_slope; print h.get_slope_inter(); print h' 

1.1.0
True
(None, None)
<class 'nibabel.nifti1.Nifti1Header'> object, endian='<'
sizeof_hdr      : 348
data_type       : 
db_name         : 
extents         : 0
session_error   : 0
regular         : r
dim_info        : 0
dim             : [ 3 64 64 32  1  1  1  1]
intent_p1       : 0.0
intent_p2       : 0.0
intent_p3       : 0.0
intent_code     : none
datatype        : float32
bitpix          : 32
slice_start     : 0
pixdim          : [ 1.      3.4375  3.4375  4.      1.      1.      1.      1.    ]
vox_offset      : 352.0
scl_slope       : 0.0
scl_inter       : 0.0
slice_end       : 0
slice_code      : unknown
xyzt_units      : 10
cal_max         : 0.0
cal_min         : 0.0
slice_duration  : 0.0
toffset         : 0.0
glmax           : 0
glmin           : 0
descrip         : FSL3.3
aux_file        : 
qform_code      : scanner
sform_code      : scanner
quatern_b       : -0.00603597983718
quatern_c       : -0.247024014592
quatern_d       : 0.968989014626
qoffset_x       : 106.474998474
qoffset_y       : 113.766998291
qoffset_z       : -13.3468999863
srow_x          : [ -3.43723011e+00  -1.07043998e-04  -4.98629995e-02   1.06474998e+02]
srow_y          : [  2.06087008e-02  -3.01797009e+00  -1.91482997e+00   1.13766998e+02]
srow_z          : [ -0.03757     -1.64568996   3.51153994 -13.34689999]
intent_name     : 
magic           : n+1

Float 128 usually not saved correctly

The NIfTI standard allows us to save float128, but numpy's float128 is almost invariably not what NIfTI imagines float128 to be; it's usually the Intel 80-bit extended type padded to 128 bits.

So we would save a confusing file that loads fine in numpy but is completely wrong everywhere else. We should probably raise an error for float128, or at least do so when we're not running on an IBM S/360 or whatever else has a real float128.
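Detecting the situation is straightforward: `np.finfo` exposes the mantissa width of the platform's long double, so a writer can refuse or warn when it is not a true IEEE binary128. This check is a sketch of the idea, not nibabel's actual code:

```python
import numpy as np

def longdouble_is_ieee_binary128():
    """True only if np.longdouble is a real quad float (112 stored
    mantissa bits).

    On x86, numpy's "float128" is usually the 80-bit extended type padded
    to 128 bits (nmant == 63), which is NOT what NIfTI's float128 means.
    """
    return np.finfo(np.longdouble).nmant == 112

# On almost all common platforms this is False, so a writer should
# raise or warn rather than emit a misleading float128 NIfTI file.
print(longdouble_is_ieee_binary128())
```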

Allow decrease of vox_offset

See discussion here: http://mail.scipy.org/pipermail/nipy-devel/2012-February/007357.html

Here was some discussion from that thread:

If we know that, on load, vox_offset does indeed point to the end of
the extensions, then it would be safe to shrink vox_offset.

This seems like a reasonable compromise. Do you see any downsides to this approach?

No - other than complicating the interface a bit. One idea might be
to make a 'protect_offset' keyword to the image constructor, which
would be None by default, and set to True iff vox offset was larger
than that predicted by the size of the header (possibly taking into
account rounding up to the nearest 16 byte memory block).

In trackvis.py: ValueError: The truth value of an array with more than one element is ambiguous.

ValueError: The truth value of an array with more than one element is ambiguous.
Use a.any() or a.all()

Hi,
I got this error when providing scalars (a numpy array) within a streamlines tuple, in the write function at trackvis.py:284:
"if scalars:"

I think it should be:
"if scalars is not None:"

I suspect the same error will happen at trackvis.py:290 with
"if props:"

I'm currently using Python 2.7
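The report is right about the cause: the truth value of a multi-element array is ambiguous in numpy, while an `is not None` test only checks for presence and never touches the values. A minimal demonstration:

```python
import numpy as np

scalars = np.array([0.5, 1.5])

# "if scalars:" raises for any array with more than one element.
try:
    bool(scalars)
    raised = False
except ValueError:
    raised = True
assert raised

# "if scalars is not None:" is always safe.
assert (scalars is not None) is True
assert (None is not None) is False
```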

Add dicom public and private data repositories

Use nipy data package mechanism for now

Collect releasable data into public package, probably hosted on sourceforge, as in:

http://nipy.sourceforge.net/data-packages/

With each dataset, include successful conversion by SPM or dcm2nii or other. Write comparison utilities of form:

our_converted_img = nibabel.dicom.nifti_from_files(example_data.some_data.dicom_files)
desired_img = nibabel.load(example_data.some_data.converted_fname)
assert_true(np.allclose(as_canonical(desired_img), as_canonical(our_converted_img)))

Collect unreleasable data in private package, probably on Dropbox or similar, with restricted access. Collect datasets known to cause problems.

parrec2nii with wildcard input

When parrec2nii is executed with *.PAR in a directory, if one of the PAR files cannot be converted (a scout image, for example), it stops converting the remaining PAR files in the directory that could be converted.

If given a list of files via a wildcard, parrec2nii should report the error for a file that is not convertible but continue converting the ones that can be.
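The requested behavior is the usual collect-errors-and-continue loop; a language-level sketch, where the `convert` callable stands in for parrec2nii's per-file conversion (an assumption, not its real API):

```python
def convert_all(paths, convert):
    """Run convert() on each path; record failures but keep going."""
    errors = []
    for path in paths:
        try:
            convert(path)
        except Exception as exc:   # an unconvertible file (e.g. a scout)
            errors.append((path, str(exc)))
    return errors

# A fake converter that chokes on scout files, for demonstration.
def fake_convert(path):
    if 'scout' in path:
        raise ValueError('not convertible')

errs = convert_all(['a.PAR', 'scout.PAR', 'b.PAR'], fake_convert)
assert [p for p, _ in errs] == ['scout.PAR']   # the others still converted
```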

Test failures for latest pydicom

0.9.7. Mostly of form:

======================================================================
ERROR: nibabel.nicom.tests.test_dicomwrappers.test_wrapper_from_data
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/pymodules/python2.7/nose/case.py", line 187, in runTest
    self.test(*self.arg)
  File "/usr/lib/pymodules/python2.7/numpy/testing/decorators.py", line 146, in skipper_func
    return f(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/nibabel/nicom/tests/test_dicomwrappers.py", line 91, in test_wrapper_from_data
    np.dot(didr.DPCS_TO_TAL, dw.get_affine()),
  File "/usr/local/lib/python2.7/dist-packages/nibabel/nicom/dicomwrappers.py", line 282, in get_affine
    ipp = self.image_position
  File "/usr/local/lib/python2.7/dist-packages/nibabel/onetime.py", line 43, in __get__
    val = self.getter(obj)
  File "/usr/local/lib/python2.7/dist-packages/nibabel/nicom/dicomwrappers.py", line 578, in image_position
    return ipp + np.dot(Q, vox_trans_fixes[:,None]).ravel()
TypeError: unsupported operand type(s) for *: 'Decimal' and 'float'

http://travis-ci.org/#!/matthew-brett/nibabel/jobs/1282322
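The underlying incompatibility, and the obvious fix of coercing pydicom's `Decimal` values to float before doing arithmetic on them, can be shown with the stdlib alone:

```python
from decimal import Decimal

# Values like those pydicom 0.9.7 returns for ImagePositionPatient.
ipp = [Decimal('106.475'), Decimal('113.767'), Decimal('-13.347')]

# Decimal * float raises TypeError, matching the traceback above.
try:
    ipp[0] * 0.5
    raised = False
except TypeError:
    raised = True
assert raised

# Coercing to float first restores the arithmetic.
ipp_f = [float(x) for x in ipp]
assert abs(ipp_f[0] * 0.5 - 53.2375) < 1e-9
```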

Set stacklevel=2 for deprecationwarnings

Particularly in spatialimages.py

Allows us to see what code was raising the warning. Of form:

def __array__(self):
    """Return data as a numpy array."""
    warnings.warn('Please use get_data instead',
                  DeprecationWarning,
                  stacklevel=2)
    return self.get_data()

update warning on missing dicom

when installing nibabel, we get the warning:

Missing optional package "dicom"

I think this should be changed to:

Missing import "dicom" supplied by package "pydicom" (or something along those lines)

Unfriendly error when converting nifti-specific dtypes to Analyze

For example:

import numpy as np
import nibabel as nib
# uint16 only supported by nifti, not Analyze
img = nib.Nifti1Image(np.zeros((2,3,4), dtype=np.uint16), np.eye(4))
aimg = nib.analyze.AnalyzeImage.from_image(img)

raises a "datacode 512 not recognized" error, which is not very helpful.
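Until the error message improves, the user-side workaround is to cast to a dtype both formats support before converting; the cast is safe whenever the values fit. A numpy-only sketch:

```python
import numpy as np

data = np.zeros((2, 3, 4), dtype=np.uint16)

# Analyze has no uint16 code; int32 holds the full uint16 range exactly,
# so cast before building the Analyze image from the NIfTI one.
assert np.iinfo(np.uint16).max <= np.iinfo(np.int32).max
as_i32 = data.astype(np.int32)

assert as_i32.dtype == np.int32
assert np.array_equal(as_i32, data)
```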

ecat framenumber bug

Line 477 in ecat.py:

        mlist_n = np.where(mlist[:,0] == id)[0][0]

should be:

        mlist_n = np.where(tmp == id)[0][0]

replace mlist[:,0] with tmp

Failing assert biggest_gap > 1 on big-endian while unittesting

I was troubleshooting PyMVPA test/build failures on some exotic architectures and decided to test nibabel as well on s390 (big-endian, AFAIK). I got a new failure (first in that cleaner-array-writers PR #73 branch, and then verified in master), so I am just filing a bug report:

test_to_from_fileobj (nibabel.tests.test_wrapstruct.TestWrapStruct) ... ok

======================================================================
FAIL: nibabel.tests.test_floating.test_floor_exact
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/pymodules/python2.7/nose/case.py", line 187, in runTest
    self.test(*self.arg)
  File "/home/yoh/nibabel/nibabel/tests/test_floating.py", line 97, in test_floor_exact
    assert_equal(int_flex(iv-1, t), iv-1)
  File "/home/yoh/nibabel/nibabel/tests/test_floating.py", line 87, in <lambda>
    int_flex = lambda x, t : as_int(floor_exact(x, t))
  File "/home/yoh/nibabel/nibabel/casting.py", line 356, in floor_exact
    assert biggest_gap > 1
AssertionError

pdb-ing

-> assert biggest_gap > 1
(Pdb) print biggest_gap
0.015625
(Pdb) print val
162259276829213363391578010288127
(Pdb) print flt_type
<type 'numpy.float128'>
(Pdb) print biggest_gap
0.015625

Reset of qform and sform codes on save

Andrew Connolly pointed out the following misbehavior:

import numpy as np
import nibabel as nib

from nose.tools import assert_equal

img = nib.Nifti1Image(np.zeros((2,3,4)), None)
hdr = img.get_header()
hdr['qform_code'] = 3
hdr['sform_code'] = 4
nib.save(img, 'afile.nii')
img_back = nib.load('afile.nii')
hdr_back = img_back.get_header()
# These succeed
assert_equal(hdr_back['qform_code'], 3)
assert_equal(hdr_back['sform_code'], 4)

nib.save(img_back, 'afile2.nii')
# These fail - the save has modified the image in memory
assert_equal(hdr_back['qform_code'], 3)
assert_equal(hdr_back['sform_code'], 4)
img_back_again = nib.load('afile2.nii')
hdr_back_again = img_back_again.get_header()
# These fail - the saved image is also modified
assert_equal(hdr_back_again['qform_code'], 3)
assert_equal(hdr_back_again['sform_code'], 4)

'Nifti1Header' object has no attribute '_structarr' upon copy

As reported originally on nipy-user
https://groups.google.com/forum/?fromgroups=#!topic/nipy-user/tDl4w2oN75U

    header_ = evds.a.imghdr.copy()
  File "/usr/lib/pymodules/python2.6/nibabel/nifti1.py", line 561, in copy
    self.binaryblock,
  File "/usr/lib/pymodules/python2.6/nibabel/wrapstruct.py", line 210, in binaryblock
    return self._structarr.tostring()
AttributeError: 'Nifti1Header' object has no attribute '_structarr'

and it is probably related to the trigger -- storing this header into HDF5 using our PyMVPA HDF5 backend under nibabel 1.2.0 and then reloading it with nibabel 1.4.0 -- so per se it might not even be nibabel's issue... but that is yet to be proven ;)

CSA slice normal can be not quite orthogonal

I have a data set where the slice normal in the Siemens CSA header is not quite orthogonal to the directions in the "ImageOrientationPatient". This causes the assertion on line 165 in nibabel.nicom.dicomwrappers to fail.

>>> from nibabel.nicom.dicomwrappers import wrapper_from_data
>>> dcm_wrp = wrapper_from_data(dcm)
>>> dcm_wrp.rotation_matrix
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
/home/moloney/<ipython-input-13-3a0219ff8c80> in <module>()
----> 1 dcm_wrp.rotation_matrix

/usr/global/lib/python2.7/site-packages/nibabel-1.2.0-py2.7.egg/nibabel/onetime.pyc in __get__(self, obj, type)
     41            return self.getter
     42 
---> 43        val = self.getter(obj)
     44        #print "** setattr_on_read - loading '%s'" % self.name  # dbg

     45        setattr(obj, self.name, val)

/usr/global/lib/python2.7/site-packages/nibabel-1.2.0-py2.7.egg/nibabel/nicom/dicomwrappers.pyc in rotation_matrix(self)
    163         assert np.allclose(np.eye(3),
    164                            np.dot(R, R.T),
--> 165                            atol=1e-6)
    166         return R
    167 

AssertionError: 

>>> dcm_wrp.slice_normal
array([ 0.        ,  0.43994078,  0.89802679])
>>> iop = dcm_wrp.image_orient_patient
>>> iop = np.array([float(x) for col in iop for x in col]).reshape(iop.shape)
>>> np.cross(*iop.T[:]) - dcm_wrp.slice_normal
array([  4.39939181e-17,  -1.59949810e-06,   7.80545210e-07])
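One possible relaxation is to rebuild the normal from ImageOrientationPatient itself: the cross product of the row and column cosines is orthogonal to both by construction, unlike a separately stored CSA slice normal. A sketch with made-up (exact) direction cosines:

```python
import numpy as np

# Row and column direction cosines as columns, in the style of
# ImageOrientationPatient (fabricated values, chosen to be exact).
iop = np.array([[1.0, 0.0, 0.0],
                [0.0, 0.6, -0.8]]).T   # shape (3, 2)

# Cross product of the two in-plane directions: exactly orthogonal to both.
normal = np.cross(iop[:, 0], iop[:, 1])
normal = normal / np.linalg.norm(normal)

# The resulting rotation matrix passes the R @ R.T == I check that the
# stored CSA normal failed.
R = np.column_stack([iop[:, 0], iop[:, 1], normal])
assert np.allclose(R @ R.T, np.eye(3), atol=1e-6)
```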

Allow char codes for dtypes

At the moment, using codes like 'i2', or 'f4' for setting header dtypes doesn't work:

In [7]: import nibabel as nib

In [8]: hdr = nib.Nifti1Header()

In [9]: hdr.set_data_dtype('i2')
---------------------------------------------------------------------------
HeaderDataError                           Traceback (most recent call last)
/home/mb312/ in ()
----> 1 hdr.set_data_dtype('i2')

/home/mb312/usr/local/lib/python2.6/site-packages/nibabel/analyze.pyc in set_data_dtype(self, datatype)
    576         except KeyError:
    577             raise HeaderDataError(
--> 578                 'data dtype "%s" not recognized' % datatype)
    579         dtype = self._data_type_codes.dtype[code]
    580         # test for void, being careful of user-defined types

It would be helpful if it did.
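The normalization the request asks for is exactly what `np.dtype` does; a sketch of how a setter could canonicalize its argument before the lookup (again with a hypothetical table, not nibabel's `_dtdefs`):

```python
import numpy as np

# Hypothetical dtype -> code table; nibabel's real table is richer.
codes = {np.dtype(np.int16): 4, np.dtype(np.float32): 16}

def set_data_dtype(datatype):
    """Accept 'i2', np.int16, dtype('int16'), ... by canonicalizing first."""
    dt = np.dtype(datatype)          # 'i2' -> dtype('int16'), etc.
    try:
        return codes[dt]
    except KeyError:
        raise ValueError('data dtype "%s" not recognized' % datatype)

assert set_data_dtype('i2') == 4
assert set_data_dtype('f4') == 16
assert set_data_dtype(np.float32) == 16
```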

Writing trackvis file which contains scalars is not idempotent

The first time a trackvis file containing scalars is saved, if the scalars dtype is not float32, the test "if scalars.dtype != f4dt:" is true, so the scalars are converted to float32 ("scalars = scalars.astype(f4dt)") and concatenated to the "pts" array to be saved ("pts = np.c_[pts, scalars]"). That works great. But the second time (i.e. if you load a trackvis file that already contains float32 scalars), there is no concatenation, meaning that no scalars are saved, which causes an exception the next time the file is loaded.

Unindent line 402 once to fix it.
https://github.com/nipy/nibabel/blob/master/nibabel/trackvis.py#L402

Nibabel Nifti and Analyze images do not pickle

In [1]: import nibabel

In [2]: img = nibabel.load('/usr/share/fsl/data/standard/MNI152_T1_1mm.nii.gz')

In [3]: import pickle

In [4]: s = pickle.dumps(img)
PicklingError: Can't pickle <class 'nibabel.analyze.ImageArrayProxy'>: it's not found as nibabel.analyze.ImageArrayProxy

Pickling is useful for many operations in Python frameworks, for instance hashing with an MD5, or parallel computing (e.g. with multiprocessing).
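The file-backed array proxy is what fails to pickle. A common workaround is to pull everything into plain in-memory objects first; on the nibabel side that would mean rebuilding the image from `np.asarray(img.dataobj)` and `img.affine` (an assumption about the workaround, not a library feature). The principle in a numpy-only sketch:

```python
import pickle

import numpy as np

# Stand-ins for an image's loaded data and affine.
data = np.random.default_rng(0).random((2, 3, 4)).astype(np.float32)
affine = np.eye(4)

# Plain ndarrays pickle without trouble, so an image rebuilt from
# in-memory arrays (rather than a file-backed proxy) is picklable.
blob = pickle.dumps({'data': data, 'affine': affine})
restored = pickle.loads(blob)

assert np.array_equal(restored['data'], data)
assert np.array_equal(restored['affine'], affine)
```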

Docstring for EcatImage __init__ incorrect

I think it's not true, as the docstring states, that subheader or mlist can be None on input to the init for EcatImage.

Is there any way of making a sensible default for subheader or mlist if None was passed?

If so, could these change so that None is the default for these two parameters, maybe simplifying the creation of ECAT images?
