fzj-inm1-bda / siibra-python
Software interfaces for interacting with brain atlases - Python client
License: Apache License 2.0
When selecting parcellations and spaces from an atlas object, the atlas.spaces and atlas.parcellations members should behave like the main brainscapes.spaces and brainscapes.parcellations glossaries, in that they allow autocompletion and need not be used with list indices.
siibra-explorer also reads the MRI-scale data from precomputed URLs. These URLs need to be included in the parcellation definitions and forwarded.
In the feature_useVolumeSrc branch, several places use the dict union operator, e.g.
siibra-python/siibra/region.py
Lines 442 to 444 in f9f1e34
These operators were introduced in Python 3.9 (ref: https://www.python.org/dev/peps/pep-0584/) and break tests and compatibility for Python 3.7 and 3.8.
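A Python 3.7/3.8-compatible replacement is straightforward: dict unpacking and dict.update produce the same results as the `|` and `|=` operators on all supported versions.

```python
a = {"x": 1, "y": 2}
b = {"y": 3, "z": 4}

# Equivalent to `a | b` (PEP 584, Python 3.9+); right operand wins on conflicts.
merged = {**a, **b}
assert merged == {"x": 1, "y": 3, "z": 4}

# Equivalent to `a |= b`:
a.update(b)
assert a == {"x": 1, "y": 3, "z": 4}
```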
The cached_get implementation in request uses all kwargs for creating the hash. This includes the authentication token for EBRAINS queries, which is not elegant, as it will refuse to use the cached data if HBP_AUTH_TOKEN is no longer set.
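One possible fix, sketched below with a hypothetical `cache_key` helper and exclusion list (neither is siibra's actual code): derive the hash only from query-relevant kwargs, so a changed or unset token does not invalidate the cache.

```python
import hashlib
import json

# Assumption: the real kwarg names carrying credentials may differ.
EXCLUDED_KWARGS = {"headers", "token", "msg_hdr"}

def cache_key(url, **kwargs):
    # Build a stable key from the URL and the non-credential kwargs only.
    relevant = {k: v for k, v in sorted(kwargs.items())
                if k not in EXCLUDED_KWARGS}
    payload = json.dumps([url, relevant], sort_keys=True, default=str)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()
```

With this scheme, the same query with and without an auth token maps to the same cache entry.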
for detail, see: FZJ-INM1-BDA/siibra-explorer#942
Currently the CI unit tests are failing due to invalid EBRAINS KG tokens. This is likely a consequence of the EBRAINS KG migration to the new identity management service, which needs to be fixed in the CI configuration.
The ReceptorQuery class caches extracted features in a class instance list to avoid redundant queries for different parcellations / region selections (since an extractor object is constructed for every feature query). However, during loading, region names are decoded with the parcellation of the first object constructed. This can lead to problems if a query is later issued with a new extractor object whose parcellation defines the same region, for example a different version of the same parcellation.
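A possible direction for a fix, sketched with a hypothetical structure (not the actual class): key the shared cache by parcellation id, so features decoded against one parcellation version are never reused for another.

```python
class ReceptorQuery:
    _cache = {}  # parcellation_id -> extracted features

    def __init__(self, parcellation_id, extract):
        # `extract` stands in for the expensive EBRAINS query + region decoding.
        if parcellation_id not in self._cache:
            self._cache[parcellation_id] = extract(parcellation_id)
        self.features = self._cache[parcellation_id]
```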
version/hash: 6831d2d
to reproduce:
import siibra as sb
atlas = sb.atlases.MULTILEVEL_HUMAN_ATLAS
atlas.select_region('4p left')
regionprop = sb.region.RegionProps(
    atlas.selected_region,
    sb.spaces.MNI152_2009C_NONL_ASYM
)
expected result: returns regionprop
actual result: AttributeError:
[siibra:WARNING] No mask could be computed for Area 4p (PreCG) - left hemisphere
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/xiao/dev/projects/siibra-api/venv/lib/python3.6/site-packages/siibra/region.py", line 488, in __init__
    M = np.asanyarray(mask.dataobj)
AttributeError: 'NoneType' object has no attribute 'dataobj'
Some diagnosis suggests that the ParcellationMap is indexed by the non-hemispheric region (i.e., the parent node of 4p left).
Given a bounding box, I tried to extract a region from the BigBrain histological space and the corresponding isocortex segmentation.
Since I used the same bounding box for both, I would expect the resulting volumes to be aligned (up to the precision of the cortex segmentation), but they are not.
This is the code I used:
import numpy as np
import brainscapes as bs
from nilearn import plotting
from cloudvolume import Bbox
bb_img = bs.bigbrain.BigBrainVolume(bs.spaces.BIG_BRAIN_HISTOLOGY.url)
# Define two points in BigBrain space
p0 = np.array((-3.979, -61.256, 3.906))
p1 = np.array((5.863, -55.356, -2.487))
def mm_to_vox(p0, p1, img, mip):
    # Bounding box needs to be defined in voxel space, so we need to apply the inverse affine matrix to the points
    inv_aff = np.linalg.inv(img.affine(mip))
    p0_vox = np.dot(inv_aff[:3, :3], p0) + inv_aff[:3, -1]
    p1_vox = np.dot(inv_aff[:3, :3], p1) + inv_aff[:3, -1]
    return p0_vox, p1_vox
# Read region from BigBrain
mip = 0
p0_vox, p1_vox = mm_to_vox(p0, p1, img=bb_img, mip=mip)
# Define bounding box
bbox = Bbox(p0_vox, p1_vox)
img = bb_img.Image(clip=bbox, mip=mip, force=True)
mask_url = "https://neuroglancer.humanbrainproject.eu/precomputed/BigBrainRelease.2015/classif/"
mask_img = bs.bigbrain.BigBrainVolume(mask_url)
# Bounding box needs to be redefined, as mask_img has different voxel space
p0_vox, p1_vox = mm_to_vox(p0, p1, img=mask_img, mip=mip)
# Define bounding box
bbox = Bbox(p0_vox, p1_vox)
mask = mask_img.Image(clip=bbox, mip=mip, force=True)
# Plot on top of each other. I would expect these images to be aligned, since I used the same bounding box, but they are not.
plotting.view_img(mask,
                  bg_img=img,
                  cmap="gray",
                  vmin=0,
                  vmax=255,
                  resampling_interpolation="nearest",
                  symmetric_cmap=False)
I could imagine that there is an issue with how the bounding box coordinates are rounded internally to match the voxel space.
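A small numeric illustration of this suspicion (the voxel sizes below are assumptions for illustration, not the actual BigBrain resolutions): snapping a fractional voxel coordinate to an integer index shifts the effective origin by a different amount in volumes with different voxel sizes, so the two clips no longer align.

```python
# Same physical point, two hypothetical voxel grids.
point_mm = -3.979
offsets = []
for voxel_size_mm in (0.02, 0.3):            # e.g. template vs. segmentation layer
    vox = point_mm / voxel_size_mm           # fractional voxel coordinate
    snapped_mm = round(vox) * voxel_size_mm  # origin after integer snapping
    offsets.append(snapped_mm - point_mm)
# The residual shifts differ between the two volumes, i.e. a misalignment
# of up to half a voxel of the coarser volume.
```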
Currently, siibra-python only supports the multilevel human atlas.
In order to reach feature parity with the existing siibra-explorer, siibra-python will need to support rodent atlases (Waxholm rat and Allen mouse)
Hi,
when installing siibra via the pip command, I think the wrong version of numpy gets installed alongside:
Successfully installed numpy-1.16.6
even when the newest numpy version was previously installed.
When trying to import siibra the following error occurs:
ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject
After manually installing the newest numpy version on top of the siibra installation, the siibra import works.
Best
L
It would be useful if siibra tried to recover from a non-writeable cache directory, e.g. by using tempfile.mkdtemp, similar to how matplotlib handles the same situation.
It would streamline the notebooks in the EBRAINS lab.
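A minimal sketch of such a fallback (`get_cache_dir` is a hypothetical helper, not siibra's API): probe the preferred location for writability and fall back to a throwaway temporary directory for the session.

```python
import os
import tempfile

def get_cache_dir(preferred):
    try:
        os.makedirs(preferred, exist_ok=True)
        # Probe writability explicitly; os.access can be misleading on
        # network filesystems.
        probe = os.path.join(preferred, ".write_test")
        with open(probe, "w"):
            pass
        os.remove(probe)
        return preferred
    except OSError:
        # Non-writable (e.g. read-only home in a hosted notebook):
        # use a throwaway per-session directory instead of failing.
        return tempfile.mkdtemp(prefix="siibra-cache-")
```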
following from #38
to reproduce:
import siibra as sb
atlas=sb.atlases.MULTILEVEL_HUMAN_ATLAS
# deliberately not selecting a region
# atlas.select_region("V1")
features = atlas.get_features(
    sb.modalities.GeneExpression,
    gene=sb.features.gene_names.MAOA)
expected result: raise an exception (need to select a region), or use a more efficient way of producing the mask
actual result: tries to calculate masks for every subregion, and eventually freezes/OOMs
The names of the functions affine and Image of the BigBrainVolume class are misleading. Based on the naming, both are easily confused with an attribute or a property. I would not expect an entity called affine to be a function or to accept an argument. In addition, Image does not conform to the naming convention for functions or attributes; it looks more like a nested class. I would suggest renaming the functions so the name implies what they are doing, for example buildAffine or getImage.
There is still a misconfiguration with or misinterpretation of the neuroglancer volumeSrc maps, which causes the cortical layer maps to be misaligned with the BigBrain template.
We are running a mirror of the predefined configurations, which is accessed if the main configuration repo is not reachable. Since v0.2 introduced breaking changes to the configurations, it is important that the mirror has the latest changes.
We need to check whether the mirroring of configurations is working.
The volumeSrc JSON definitions and data structures in siibra-python are not very clean. In particular, we need to discuss and revise this deep structure (dict of dict of lists):
siibra-python/siibra/parcellation.py
Line 268 in 8e89d3c
It is unclear and not documented what the 'key' (index to second dict hierarchy) is supposed to be and how it is interpreted. It is in general difficult to work with and understand such a deep structure, and it is unclear how to deal with multiple volume sources for the same space - which one to choose then?
Some first thoughts:
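As a discussion aid, one possible flattening is a flat list of typed records instead of the nested dicts; all names below are placeholders, not a settled design. Each volume source then carries its space, type, and role explicitly, and the "which one to choose" question is answered by an explicit default flag.

```python
from dataclasses import dataclass

@dataclass
class VolumeSrc:
    space_id: str
    volume_type: str          # e.g. "neuroglancer/precomputed", "nii"
    url: str
    is_default: bool = False  # disambiguates multiple sources per space

def sources_for_space(sources, space_id):
    # Selection becomes a trivial, self-documenting filter.
    return [s for s in sources if s.space_id == space_id]
```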
Minimal reproduction:
import siibra as sb
atlas=sb.atlases.MULTILEVEL_HUMAN_ATLAS
atlas.select_region('hoc1 left')
connpr1=atlas.get_features(sb.modalities.ConnectivityProfile) # takes a very long time
connpr2=atlas.get_features(sb.modalities.ConnectivityProfile) # also takes a very long time
expected behaviour: second call should return cached result, rather than fetch result from scratch
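The expected behaviour can be sketched with functools.lru_cache; `fetch_connectivity_profile` below is a hypothetical stand-in for the remote query, not siibra's actual function.

```python
import functools

calls = {"n": 0}

def fetch_connectivity_profile(region_key, modality):
    # Stand-in for the expensive remote query.
    calls["n"] += 1
    return "profile(%s, %s)" % (region_key, modality)

@functools.lru_cache(maxsize=None)
def get_features_cached(region_key, modality):
    # Repeated calls with the same arguments return the memoised result
    # instead of fetching from scratch.
    return fetch_connectivity_profile(region_key, modality)
```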
From a discussion with codemart, the following idea came up:
Currently, connectivity types are split into profiles and matrices. This makes no sense, since both are the same modality in different formats (profiles are in principle just rows of the matrices). The format is determined by the concept used to issue the query: querying with a brain region object yields a profile; querying with a parcellation yields a matrix.
Instead, the modality should be split into: functional connectivity, structural connection strengths, structural connection lengths (cf. the HBP deliverable document about the multiscale connectome with some thought on this).
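The proposed split could be expressed as a simple enumeration (the names below are placeholders taken from the discussion, not an agreed API):

```python
from enum import Enum

class ConnectivityModality(Enum):
    # One modality per kind of connectivity; the returned format (profile
    # vs. matrix) is then decided by the query concept, not the modality.
    FUNCTIONAL = "functional connectivity"
    STRUCTURAL_STRENGTH = "structural connection strengths"
    STRUCTURAL_LENGTH = "structural connection lengths"
```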
from etienne:
ImportError Traceback (most recent call last)
<ipython-input-1-0026f1f8a7a3> in <module>
----> 1 import siibra
2 siibra.logger.setLevel('INFO')
3 siibra.__version__
/run/media/etienne/DATA/Toolbox/BraiNets/siibra-python/siibra/__init__.py in <module>
27 from os import path
28
---> 29 from .space import REGISTRY as spaces
30 from .parcellation import REGISTRY as parcellations
31 from .atlas import REGISTRY as atlases
/run/media/etienne/DATA/Toolbox/BraiNets/siibra-python/siibra/space.py in <module>
14
15 from .commons import create_key
---> 16 from .config import ConfigurationRegistry
17 from .retrieval import download_file
18 from .bigbrain import BigBrainVolume
/run/media/etienne/DATA/Toolbox/BraiNets/siibra-python/siibra/config.py in <module>
16 from . import logger,__version__
17 from .commons import create_key
---> 18 from gitlab import Gitlab
19
20 # Until openminds is fully supported,
ImportError: cannot import name 'Gitlab' from 'gitlab' (/home/etienne/anaconda3/lib/python3.7/site-packages/gitlab/__init__.py)
version/hash: 6831d2d
to reproduce:
import siibra as sb
atlas=sb.atlases.MULTILEVEL_HUMAN_ATLAS
atlas.select_region('123 left')
[siibra:ERROR] Cannot select region. The spec "123 left" is not unique. It matches: Ch 123 (Basal Forebrain) - left hemisphere, Ch 123 (Basal Forebrain) - left hemisphere
After select_parcellation, get_features is expected to get features from the selected parcellation. 0.0.9.dev3 currently does not.
>>> import siibra as sb
[siibra:INFO] Selected parcellation "Julich-Brain Cytoarchitectonic Maps 2.5"
[siibra:INFO] Version: 0.0.9.dev3
>>> atlas = sb.atlases.MULTILEVEL_HUMAN_ATLAS
>>> atlas.select_parcellation(sb.parcellations['minds/core/parcellationatlas/v1.0.0/94c1125b-b87e-45e4-901c-00daee7f2579-25'])
[siibra:INFO] Selected parcellation "Julich-Brain Cytoarchitectonic Maps 2.5"
>>> atlas.selected_parcellation.id
'minds/core/parcellationatlas/v1.0.0/94c1125b-b87e-45e4-901c-00daee7f2579-25'
>>> atlas.select_region('hoc1 left')
[siibra:INFO] Selected region Area hOc1 (V1, 17, CalcS) - left hemisphere
Area hOc1 (V1, 17, CalcS) - left hemisphere
>>> conn=atlas.get_features(sb.modalities.ConnectivityProfile)
>>> conn[1].parcellation.id
'minds/core/parcellationatlas/v1.0.0/94c1125b-b87e-45e4-901c-00daee7f2579'
>>> conn[1].parcellation.name
'Julich-Brain Cytoarchitectonic Maps 1.18'
>>>
The first time in an empty environment, importing the siibra package takes quite a while, since it pulls the configuration and some initial data, including a list of gene names. This should be avoided; loading of required data should follow a lazy strategy.
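A minimal sketch of such a lazy strategy (`LazyGeneNames` is a hypothetical helper, not siibra's actual registry): the expensive download runs on first attribute access rather than at import time.

```python
class LazyGeneNames:
    def __init__(self, loader):
        self._loader = loader    # function performing the slow fetch
        self._names = None       # nothing downloaded yet

    def _load(self):
        if self._names is None:
            self._names = self._loader()
        return self._names

    def __getattr__(self, key):
        # Only called when normal lookup fails, i.e. on first access
        # to a gene name such as `gene_names.MAOA`.
        if key in self._load():
            return key
        raise AttributeError(key)
```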
A short example to reproduce the error:
import siibra as bs
from siibra import parcellations
atlas_id = 'juelich/iav/atlas/v1.0.0/1'
atlas = bs.atlases[atlas_id]
parcelaltion_id = 'minds/core/parcellationatlas/v1.0.0/94c1125b-b87e-45e4-901c-00daee7f2579-273'
atlas.select_parcellation(parcelaltion_id, force=True)
reg_name = 'cerebellar nuclei'
selected_region = atlas.find_regions(reg_name)[0]
atlas.select_region(selected_region)
space_id = 'minds/core/referencespace/v1.0.0/dafcffc5-4826-4bf1-8ff6-46b8a31ff8e2'
r_props = selected_region.spatialprops(bs.spaces[space_id], force=True)
This results in IndexError: list index out of range
Changing: atlas.select_parcellation(parcelaltion_id, force=True)
To: atlas.select_parcellation(parcelaltion_id, force=False)
Returns a valid result without an error. So using an experimental parcellation version prevents the creation of RegionProps.
When creating RegionProps for the following input (see also the code below):
I get the following error: TypeError: Non-integer label_image types are ambiguous
This error also occurs with other combinations of parcellation/space/region.
atlas = REGISTRY.MULTILEVEL_HUMAN_ATLAS
# Julich-Brain Probabilistic Cytoarchitectonic Maps (v2.5)
atlas.select_parcellation('minds/core/parcellationatlas/v1.0.0/94c1125b-b87e-45e4-901c-00daee7f2579-25')
# Region 'Area PFm (IPL) - right hemisphere'
selected_region = atlas.regiontree.find('Area PFm (IPL) - right hemisphere')
atlas.select_region(selected_region[0])
# MNI Colin 27
space_id = 'minds/core/referencespace/v1.0.0/7f39f7be-445b-47c0-9791-e971c0b6d992'
r_props = regionprops.RegionProps(atlas, find_space_by_id(atlas, space_id))
The meshes are currently maintained in neuroglancer mesh format on the atlas VMs. The library needs to be able to forward the site URL, read them, and also convert them to commonly used Python data structures.
relevant trello card: https://trello.com/c/69JRHucS/65-provide-access-to-labelled-meshes-via-python-and-api-to-support-new-julich-brain-version-and-freesurfer
hash/version: e155a16
to reproduce:
import siibra as sb
atlas=sb.atlases.MULTILEVEL_HUMAN_ATLAS
assert atlas.selected_region is None
expected result: assertion passes
actual result: assertion fails; selected_region seems to default to the selected parcellation
environment:
> python --version
3.6.9
> pip --version
pip 21.0.1 from /home/xiao/dev/tmp/siibra-python/venv/lib/python3.6/site-packages/pip (python 3.6)
to reproduce:
> pip install siibra==0.0.9.dev1
> python
>>> import siibra as sb
error
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/xiao/dev/tmp/siibra-python/venv/lib/python3.6/site-packages/siibra/__init__.py", line 26, in <module>
    with open(path.join(PKG_DIR,"..","VERSION"),"r") as f:
FileNotFoundError: [Errno 2] No such file or directory: '/home/xiao/dev/tmp/siibra-python/venv/lib/python3.6/site-packages/siibra/../VERSION'
To reproduce:
import siibra
from nilearn import plotting
atlas = siibra.atlases['human']
atlas.select(parcellation='julich 2.5', region='v1 left')
v1_mask = atlas.selected_region.build_mask("mni152")
plotting.plot_roi(v1_mask)
Expected: binary mask for left hemisphere's V1
Obtained: binary mask for V1 in both hemispheres.
For continuous maps, using get_regional_map, the output is correct.
Computing the surface area of a region is computationally expensive (marching cubes), but is constant for the same region over time. The spatial properties of brain regions should be precomputed and stored with the parcellation to avoid costly recomputation on the client side.
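A sketch of the precompute-and-store pattern (the file layout and helper below are assumptions, not the actual configuration format): expensive properties are computed once, written next to the parcellation, and later calls only read the stored JSON.

```python
import json
import os

def load_or_compute_props(region_key, store_path, compute):
    # Serve the stored value if it exists.
    if os.path.exists(store_path):
        with open(store_path) as f:
            cache = json.load(f)
        if region_key in cache:
            return cache[region_key]
    else:
        cache = {}
    # Otherwise run the expensive computation (e.g. marching cubes)
    # once and persist the result for future sessions.
    cache[region_key] = compute(region_key)
    with open(store_path, "w") as f:
        json.dump(cache, f)
    return cache[region_key]
```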
e.g. https://search.kg.ebrains.eu/instances/Dataset/33af3a71-4bd9-4ac6-9570-95769c8e1e49 seems not to be fetched (or the region is not parsed properly)
This is mainly a reminder to address https://jugit.fz-juelich.de/t.dickscheid/brainscapes-configurations/-/issues/2, because the current config points to an older version of Julich-Brain.
Hi,
I tried to install siibra using pip on two separate systems (Ubuntu 20.04 and Windows 10, both using the Python coming with Anaconda). In both cases, the installation process exits with errors.
From my Ubuntu system:
Building wheel for psutil (setup.py) ... error
ERROR: Command errored out with exit status 1:
command: /home/mario/anaconda3/envs/hbpatlas/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-r42v1ccb/psutil_1bc071388f984cd48c81348fa3ceeec2/setup.py'"'"'; __file__='"'"'/tmp/pip-install-r42v1ccb/psutil_1bc071388f984cd48c81348fa3ceeec2/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-5f77gmol
cwd: /tmp/pip-install-r42v1ccb/psutil_1bc071388f984cd48c81348fa3ceeec2/
Complete output (41 lines):
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-3.8
creating build/lib.linux-x86_64-3.8/psutil
copying psutil/_pssunos.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_pslinux.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_psosx.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_compat.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_exceptions.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_psbsd.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_common.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/__init__.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_pswindows.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_psaix.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_psposix.py -> build/lib.linux-x86_64-3.8/psutil
creating build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_sunos.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_linux.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_posix.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_misc.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_system.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_connections.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_unicode.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_aix.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_bsd.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_osx.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/__init__.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/__main__.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_process.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_memory_leaks.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_contracts.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_windows.py -> build/lib.linux-x86_64-3.8/psutil/tests
running build_ext
building 'psutil._psutil_linux' extension
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/psutil
gcc -pthread -B /home/mario/anaconda3/envs/hbpatlas/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -DPSUTIL_POSIX=1 -DPSUTIL_VERSION=543 -DPSUTIL_LINUX=1 -DPSUTIL_ETHTOOL_MISSING_TYPES=1 -I/home/mario/anaconda3/envs/hbpatlas/include/python3.8 -c psutil/_psutil_common.c -o build/temp.linux-x86_64-3.8/psutil/_psutil_common.o
unable to execute 'gcc': No such file or directory
error: command 'gcc' failed with exit status 1
----------------------------------------
ERROR: Failed building wheel for psutil
Running setup.py clean for psutil
Building wheel for posix-ipc (setup.py) ... error
ERROR: Command errored out with exit status 1:
command: /home/mario/anaconda3/envs/hbpatlas/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-r42v1ccb/posix-ipc_be0f64e0767d45f49caaff0f1cfc8c1a/setup.py'"'"'; __file__='"'"'/tmp/pip-install-r42v1ccb/posix-ipc_be0f64e0767d45f49caaff0f1cfc8c1a/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-ne9azowm
cwd: /tmp/pip-install-r42v1ccb/posix-ipc_be0f64e0767d45f49caaff0f1cfc8c1a/
Complete output (23 lines):
******************************************************************************
* Setup can't determine if it needs to link to the realtime libraries on your
* system, so it will default to 'no' which may not be correct.
*
* Please report this message and your operating system info to the package
* maintainer listed in the README file.
******************************************************************************
******************************************************************************
* Setup can't determine the value of PAGE_SIZE on your system, so it will
* default to 4096 which may not be correct.
*
* Please report this message and your operating system info to the package
* maintainer listed in the README file.
******************************************************************************
running bdist_wheel
running build
running build_ext
building 'posix_ipc' extension
creating build
creating build/temp.linux-x86_64-3.8
gcc -pthread -B /home/mario/anaconda3/envs/hbpatlas/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/mario/anaconda3/envs/hbpatlas/include/python3.8 -c posix_ipc_module.c -o build/temp.linux-x86_64-3.8/posix_ipc_module.o
unable to execute 'gcc': No such file or directory
error: command 'gcc' failed with exit status 1
----------------------------------------
ERROR: Failed building wheel for posix-ipc
Running setup.py clean for posix-ipc
Failed to build psutil posix-ipc
Installing collected packages: psutil, posix-ipc, patsy, pandas, nibabel, networkx, matplotlib, json5, imageio, fpzip, fastremap, DracoPy, compressed-segmentation, cloud-files, args, statsmodels, scikit-image, python-gitlab, nilearn, memoization, cloud-volume, clint, appdirs, anytree
Running setup.py install for psutil ... error
ERROR: Command errored out with exit status 1:
command: /home/mario/anaconda3/envs/hbpatlas/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-r42v1ccb/psutil_1bc071388f984cd48c81348fa3ceeec2/setup.py'"'"'; __file__='"'"'/tmp/pip-install-r42v1ccb/psutil_1bc071388f984cd48c81348fa3ceeec2/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-n2s2go14/install-record.txt --single-version-externally-managed --compile --install-headers /home/mario/anaconda3/envs/hbpatlas/include/python3.8/psutil
cwd: /tmp/pip-install-r42v1ccb/psutil_1bc071388f984cd48c81348fa3ceeec2/
Complete output (41 lines):
running install
running build
running build_py
creating build
creating build/lib.linux-x86_64-3.8
creating build/lib.linux-x86_64-3.8/psutil
copying psutil/_pssunos.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_pslinux.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_psosx.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_compat.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_exceptions.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_psbsd.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_common.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/__init__.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_pswindows.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_psaix.py -> build/lib.linux-x86_64-3.8/psutil
copying psutil/_psposix.py -> build/lib.linux-x86_64-3.8/psutil
creating build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_sunos.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_linux.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_posix.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_misc.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_system.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_connections.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_unicode.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_aix.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_bsd.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_osx.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/__init__.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/__main__.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_process.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_memory_leaks.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_contracts.py -> build/lib.linux-x86_64-3.8/psutil/tests
copying psutil/tests/test_windows.py -> build/lib.linux-x86_64-3.8/psutil/tests
running build_ext
building 'psutil._psutil_linux' extension
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/psutil
gcc -pthread -B /home/mario/anaconda3/envs/hbpatlas/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -DPSUTIL_POSIX=1 -DPSUTIL_VERSION=543 -DPSUTIL_LINUX=1 -DPSUTIL_ETHTOOL_MISSING_TYPES=1 -I/home/mario/anaconda3/envs/hbpatlas/include/python3.8 -c psutil/_psutil_common.c -o build/temp.linux-x86_64-3.8/psutil/_psutil_common.o
unable to execute 'gcc': No such file or directory
error: command 'gcc' failed with exit status 1
----------------------------------------
ERROR: Command errored out with exit status 1: /home/mario/anaconda3/envs/hbpatlas/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-r42v1ccb/psutil_1bc071388f984cd48c81348fa3ceeec2/setup.py'"'"'; __file__='"'"'/tmp/pip-install-r42v1ccb/psutil_1bc071388f984cd48c81348fa3ceeec2/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-n2s2go14/install-record.txt --single-version-externally-managed --compile --install-headers /home/mario/anaconda3/envs/hbpatlas/include/python3.8/psutil Check the logs for full command output.
command
docker build -t brainscape .
results in:
Sending build context to Docker daemon 220.2MB
Step 1/6 : FROM python:3.8-alpine
---> 8744555ae7bb
Step 2/6 : RUN apk update
---> Using cache
---> 6e57a4254f4a
Step 3/6 : RUN apk add make automake gcc g++ subversion python3-dev
---> Using cache
---> 1d26bffbff6d
Step 4/6 : ADD . /brainscapes_client
---> fd8a352f7c37
Step 5/6 : WORKDIR /brainscapes_client
---> Running in b970d5d565ba
Removing intermediate container b970d5d565ba
---> 4d6c90a6a6de
Step 6/6 : RUN pip install -r requirements.txt
---> Running in bd29d796eb5f
Collecting requests
Downloading requests-2.24.0-py2.py3-none-any.whl (61 kB)
Collecting nibabel
Downloading nibabel-3.2.0-py3-none-any.whl (3.3 MB)
Collecting anytree
Downloading anytree-2.8.0-py2.py3-none-any.whl (41 kB)
Collecting pandas
Downloading pandas-1.1.3.tar.gz (5.2 MB)
Installing build dependencies: started
Installing build dependencies: still running...
Installing build dependencies: still running...
Installing build dependencies: still running...
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: still running...
Getting requirements to build wheel: finished with status 'done'
Preparing wheel metadata: started
Preparing wheel metadata: finished with status 'done'
ERROR: Could not find a version that satisfies the requirement PIL (from -r requirements.txt (line 5)) (from versions: none)
ERROR: No matching distribution found for PIL (from -r requirements.txt (line 5))
The command '/bin/sh -c pip install -r requirements.txt' returned a non-zero code: 1
Trying to get a template for a space throws a RuntimeError (see example below)
Example:
import siibra
template = siibra.spaces.BIG_BRAIN.get_template().fetch()
results in a RuntimeError:
RuntimeError: Could not resolve template image for Big Brain. This is most probably due to a misconfiguration of the volume src.
The API for select_* should be consistent: currently select_region returns the selected region object; the same should happen for select_parcellation.
In siibra-explorer, parcellations are (sometimes) grouped according to, say, functional modes or fiber bundles. Should siibra-python also have this functionality?
If yes to above, how should this be implemented?
pinging @dickscheid @marcenko @skoehnen
When passing a bounding box to the clip argument of the Image function of BigBrainVolume, it expects the bounding box to be specified in the voxel space of the specified resolution, which is highly inconvenient for users. The bounding box should be defined in millimeter space, and the class should do the conversion itself (which it can, using its affine matrix).
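A sketch of the proposed conversion (`mm_bbox_to_voxels` is a hypothetical helper, not the current API): transform the millimeter corner points through the inverse affine and round outward so the requested physical extent is fully covered.

```python
import numpy as np

def mm_bbox_to_voxels(p0_mm, p1_mm, affine):
    # Map mm coordinates into voxel space via the inverse affine.
    inv = np.linalg.inv(affine)
    corners = np.array([p0_mm, p1_mm], dtype=float)
    vox = (inv[:3, :3] @ corners.T).T + inv[:3, 3]
    # Round outward so the clip never loses part of the requested extent.
    lo = np.floor(vox.min(axis=0)).astype(int)
    hi = np.ceil(vox.max(axis=0)).astype(int)
    return lo, hi
```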
as of 63f5141
to reproduce, run examples/Basic access to maps and data features.ipynb
expected result
assert len(receptors) > 0
actual result:
len(receptors) == 0 # True
(EBRAINS user fousekjan brought this to my attention, as his EBRAINS token returned 0 results)
Will retrieve a list of gene acronyms from Allen Atlas now.
This may take a minute.
Traceback (most recent call last):
  File "aioserver.py", line 25, in <module>
    import brainscapes
  File "/webjugex/brainscapes/brainscapes/__init__.py", line 24, in <module>
    from .atlas import REGISTRY as atlases
  File "/webjugex/brainscapes/brainscapes/atlas.py", line 21, in <module>
    from . import parcellations, spaces, features, logger
  File "/webjugex/brainscapes/brainscapes/features/__init__.py", line 30, in <module>
    extractor_types,gene_names,modalities = __init__()
  File "/webjugex/brainscapes/brainscapes/features/__init__.py", line 22, in __init__
    from .genes import AllenBrainAtlasQuery
  File "/webjugex/brainscapes/brainscapes/features/genes.py", line 65, in <module>
    class AllenBrainAtlasQuery(FeatureExtractor):
  File "/webjugex/brainscapes/brainscapes/features/genes.py", line 123, in AllenBrainAtlasQuery
    +"This may take a minute."))
  File "/usr/local/lib/python3.7/json/__init__.py", line 348, in loads
    return _default_decoder.decode(s)
  File "/usr/local/lib/python3.7/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/lib/python3.7/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
Happens quite seldom, but we should maybe catch the error if it happens.
---------- old comments ---------
Dickscheid, Timo
This is a very general thing. How shall we deal with a situation where the remote repositories could not be reached? Brainscapes works offline with cached data, but without a network connection on initial setup it doesn't quite make sense to use brainscapes. To be discussed, any opinions?
To provide a suggestion, I would distinguish some cases:
However, 2. has implications for the cache. We need to make sure we do not store an empty cache item that will be re-opened later on. If no cache item is generated, brainscapes will try to run the query again next time.
To reproduce:
import siibra
atlas = siibra.atlases['rat']
nc = atlas.get_region('neocortex')
for r in siibra.get_features(nc, 'ebrains'):
print(r.name)
Expected behavior: Returns only rat datasets.
Current behaviour: includes several human datasets, e.g. maps from the Julich-Brain parcellation.
for example, julich brain v2.5 is the next version of julich brain v1.18.
This is, I believe, currently missing.
Together with versioning metadata, we should also introduce a shortname for the version (not just for the parcellation).
commit: 3eb030e
reproduce:
import siibra as sb
atlas=sb.atlases.MULTILEVEL_HUMAN_ATLAS
atlas.select_region('frontal-II gapmap left')
expected result:
Frontal-II (GapMap) - left hemisphere
or
Frontal-II (GapMap) left
actual result:
FrontalII (GapMap) left