hdfgroup / hdf5-json

Specification and tools for representing HDF5 in JSON

Home Page: https://hdf5-json.readthedocs.io

License: Other

Languages: Python 99.90%, Dockerfile 0.10%
Topics: cloud-computing, datacube, file-format, hdf5, json, machine-learning

hdf5-json's Introduction


h5json

Specification and tools for representing HDF5 in JSON.

Introduction

This repository contains a specification (as a BNF grammar and JSON Schema) and a Python package for working with HDF5 content in JSON. The package's CLI utilities can convert any HDF5 file to JSON, or a JSON file (following the specification described here) to HDF5.

The package is also useful for any Python application that needs to translate between HDF5 objects and JSON serializations. In addition to the utilities provided, the package is used by the HDF Server (a RESTful web service for HDF5), and HDF Product Designer (an application for planning HDF5 file content).
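For orientation, the JSON side of the mapping is an ordinary document with top-level "groups" and "datasets" collections keyed by UUID, plus a "root" entry naming the root group, as the examples later on this page show. A minimal sketch using only the standard library (the UUID is arbitrary):

```python
import json

# Minimal HDF5/JSON-style document: an empty root group and nothing else.
ROOT = "31c7d987-47ce-4a03-92f7-2d4b3f0e5fb5"

doc = {
    "apiVersion": "1.0.0",
    "groups": {
        ROOT: {
            "alias": ["/"],
            "attributes": [],
            "links": [],
        }
    },
    "root": ROOT,
}

text = json.dumps(doc, sort_keys=True, indent=4)
print(text)
```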

Websites

Reporting bugs (and general feedback)

Create new issues at http://github.com/HDFGroup/hdf5-json/issues for any problems you find.

For general questions/feedback, please post on the HDF Forum.

hdf5-json's People

Contributors

ajelenak, anabiman, gheber, jreadey, lewismc, mivade, xarthisius


hdf5-json's Issues

Use templates for code generation utilities

The code generation files (for FORTRAN in particular) are rather difficult to grok due to lots of multi-line docstrings that break the normal Pythonic indentation flow. At the expense of an extra dependency such as Jinja2, this could be cleaned up considerably.

I don't know enough about what is going on here to want to try to take this on, so this is more of a suggestion than a bug.
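To illustrate the idea with the standard library alone (Jinja2 would additionally offer loops, conditionals, and template inheritance), a sketch with a made-up template:

```python
from string import Template

# Illustrative only: a tiny code-generation template. The generated
# method name and docstring are made up for the example.
func_tmpl = Template(
    "def get_${name}(self):\n"
    '    """Return the ${name} of this object."""\n'
    "    return self._${name}\n"
)

source = func_tmpl.substitute(name="shape")
print(source)
```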

Can we allow H5S_UNLIMITED or 0 in dims_array?

dims_array can't specify unlimited while maxdims_array can.

dims_array ::= positive_integer_array
maxdims_array ::= "[" maxdims_list "]"
maxdims_list ::= maxdim ("," maxdim)*
maxdim ::= positive_integer | "H5S_UNLIMITED"

Why not allow H5S_UNLIMITED in dims_array? How can I handle a case like the one below in h5json:

 netcdf foo {    // example netCDF specification in CDL

 dimensions:
 lat = 10, lon = 5, time = unlimited;

 variables:
   int     lat(lat), lon(lon), time(time);
   float   z(time,lat,lon), t(time,lat,lon);
   double  p(time,lat,lon);
   int     rh(time,lat,lon);

   lat:units = "degrees_north";
   lon:units = "degrees_east";
   time:units = "seconds";
   z:units = "meters";
   z:valid_range = 0., 5000.;
   p:_FillValue = -9999.;
   rh:_FillValue = -1;

 data:
   lat   = 0, 10, 20, 30, 40, 50, 60, 70, 80, 90;
   lon   = -140, -118, -96, -84, -52;
 }

time is defined but doesn't contain any data (i.e., its current size is 0).
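A sketch of how a relaxed grammar could be checked in Python, assuming the proposal that dims entries may be 0 to model an empty unlimited dimension (the function name is illustrative):

```python
UNLIMITED = "H5S_UNLIMITED"

def check_shape(dims, maxdims):
    """Validate dims against maxdims under the proposed relaxation.

    dims entries must be non-negative integers (0 models an empty
    unlimited dimension); maxdims entries may be a positive integer
    no smaller than the current size, or "H5S_UNLIMITED".
    """
    if len(dims) != len(maxdims):
        return False
    for d, m in zip(dims, maxdims):
        if not isinstance(d, int) or d < 0:
            return False
        if m == UNLIMITED:
            continue
        if not isinstance(m, int) or m < d:
            return False
    return True
```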

testing h5 file equivalence...

To verify my installation of hdf5-json I ran:

python h5tojson.py ../data/hdf5/tall.h5 > tall.json
python jsontoh5.py tall.json tall.h5

But then to check that they were correct I ran:

diff tall.h5 ../data/hdf5/tall.h5

and I got:

Binary files tall.h5 and ../data/hdf5/tall.h5 differ

I am assuming the contents are effectively equivalent modulo some irrelevant timestamp or something... What is the correct way to check that the process succeeded?
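Since object creation order, timestamps, and free-space layout can differ, a byte-level diff will almost always report differences; h5diff from the HDF5 tools is the usual way to compare content. Another option is to compare the two files' JSON dumps after stripping volatile fields — a sketch (the ignored key names are illustrative, and a full comparison would also need to canonicalize the per-file UUID keys):

```python
def same_content(json_a, json_b, ignore=("id", "created", "lastModified")):
    """Compare two parsed HDF5/JSON documents, skipping volatile keys."""
    def strip(node):
        if isinstance(node, dict):
            return {k: strip(v) for k, v in sorted(node.items())
                    if k not in ignore}
        if isinstance(node, list):
            return [strip(v) for v in node]
        return node
    return strip(json_a) == strip(json_b)
```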

Error after switching to Apple M1

hello,

I just switched to M1 and h5tojson stopped working with an unknown error:

Traceback (most recent call last):
  File "/opt/homebrew/Caskroom/miniconda/base/bin/h5tojson", line 8, in <module>
    sys.exit(main())
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.8/site-packages/h5json/h5tojson/h5tojson.py", line 244, in main
    dumper.dumpFile()
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.8/site-packages/h5json/h5tojson/h5tojson.py", line 198, in dumpFile
    self.dumpGroups()
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.8/site-packages/h5json/h5tojson/h5tojson.py", line 105, in dumpGroups
    item = self.dumpGroup(uuid)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.8/site-packages/h5json/h5tojson/h5tojson.py", line 90, in dumpGroup
    attributes = self.dumpAttributes('groups', uuid)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.8/site-packages/h5json/h5tojson/h5tojson.py", line 55, in dumpAttributes
    attr_list = self.db.getAttributeItems(col_name, uuid)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.8/site-packages/h5json/hdf5db.py", line 1226, in getAttributeItems
    item = self.getAttributeItemByObj(obj, name, False)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.8/site-packages/h5json/hdf5db.py", line 1176, in getAttributeItemByObj
    shape_json = self.getShapeItemByAttrObj(attrObj)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.8/site-packages/h5json/hdf5db.py", line 853, in getShapeItemByAttrObj
    if obj.shape is None or obj.get_storage_size() == 0:
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5a.pyx", line 443, in h5py.h5a.AttrID.get_storage_size
  File "h5py/defs.pyx", line 353, in h5py.defs.H5Aget_storage_size
RuntimeError: Unspecified error in H5Aget_storage_size (return value ==0)

I'm not sure if this is related to the switch to M1 or something else, but it used to work on my old laptop.
It would be great if someone could report their results when using an M1.

Installation problem with h5serv

I am trying to follow these instructions to install h5serv: https://h5serv.readthedocs.io/en/latest/Installation/ServerSetup.html
I used the git command to download the source code. I think there's a problem with this line, where it says: From here cd to "server" (or "h5serv-master/server" if you extracted from ZIP file).
Instead of "server", the subfolder inside h5serv is also called h5serv, and that is where app.py is.
When I run app.py from there, I get the following message:

$ python app.py
Traceback (most recent call last):
  File "app.py", line 40, in <module>
    import h5serv.config as config
ImportError: No module named h5serv.config

Now I can go to the source code and remove every instance of "h5serv." from all the Python files, after which app.py runs successfully, but I never see the message "Starting event loop on port: 5000".
What am I doing wrong?

h5tojson fails to convert TES-Aura_L2-O3-Nadir_r0000011015_F05_07.he5.

The test file is at ftp://ftp.hdfgroup.uiuc.edu/pub/outgoing/NASAHDF/TES-Aura_L2-O3-Nadir_r0000011015_F05_07.he5

Traceback (most recent call last):
  File "h5tojson.py", line 232, in <module>
    main()
  File "h5tojson.py", line 229, in main
    dumper.dumpFile()    
  File "h5tojson.py", line 184, in dumpFile
    self.dumpDatasets()
  File "h5tojson.py", line 146, in dumpDatasets
    item = self.dumpDataset(uuid)
  File "h5tojson.py", line 135, in dumpDataset
    value = self.db.getDatasetValuesByUuid(uuid)
  File "../lib/hdf5db.py", line 1715, in getDatasetValuesByUuid
    values = dset[slices].tolist()  # just dump to list
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/tmp/pip_build_root/h5py/h5py/_objects.c:2405)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/tmp/pip_build_root/h5py/h5py/_objects.c:2362)
  File "/usr/local/lib/python2.7/site-packages/h5py/_hl/dataset.py", line 467, in __getitem__
    self.id.read(mspace, fspace, arr, mtype)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/tmp/pip_build_root/h5py/h5py/_objects.c:2405)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/tmp/pip_build_root/h5py/h5py/_objects.c:2362)
  File "h5py/h5d.pyx", line 175, in h5py.h5d.DatasetID.read (/tmp/pip_build_root/h5py/h5py/h5d.c:3004)
  File "h5py/_proxy.pyx", line 150, in h5py._proxy.dset_rw (/tmp/pip_build_root/h5py/h5py/_proxy.c:1777)
  File "h5py/_proxy.pyx", line 319, in h5py._proxy.needs_bkg_buffer (/tmp/pip_build_root/h5py/h5py/_proxy.c:3304)
  File "h5py/_proxy.pyx", line 316, in h5py._proxy.needs_bkg_buffer (/tmp/pip_build_root/h5py/h5py/_proxy.c:3243)
KeyError: 'Conversion function not found (No appropriate function for conversion path)'

"jsontoh5 data/json/comp_complex.json" error

jsontoh5 data/json/comp_complex.json command fails with this error:

ValueError: setting an array element with a sequence.

The error comes when converting this dataset's value (relevant HDF5/JSON content only) to a NumPy array:

        "a4afebb4-9b72-11ec-b0a3-8c8590747994": {
            "alias": [
                "/phony_compound_var"
            ],
            "shape": {
                "class": "H5S_SIMPLE",
                "dims": [
                    2
                ]
            },
            "type": {
                "class": "H5T_COMPOUND",
                "fields": [
                    {
                        "name": "yy",
                        "type": {
                            "base": {
                                "class": "H5T_COMPOUND",
                                "fields": [
                                    {
                                        "name": "x",
                                        "type": {
                                            "class": "H5T_COMPOUND",
                                            "fields": [
                                                {
                                                    "name": "i",
                                                    "type": {
                                                        "base": "H5T_STD_I16LE",
                                                        "class": "H5T_INTEGER"
                                                    }
                                                },
                                                {
                                                    "name": "j",
                                                    "type": {
                                                        "base": "H5T_STD_I32LE",
                                                        "class": "H5T_INTEGER"
                                                    }
                                                }
                                            ]
                                        }
                                    },
                                    {
                                        "name": "y",
                                        "type": {
                                            "base": {
                                                "base": "H5T_IEEE_F64LE",
                                                "class": "H5T_FLOAT"
                                            },
                                            "class": "H5T_ARRAY",
                                            "dims": [
                                                2
                                            ]
                                        }
                                    }
                                ]
                            },
                            "class": "H5T_ARRAY",
                            "dims": [
                                2
                            ]
                        }
                    }
                ]
            },
            "value": [
                [
                    [
                        [
                            [
                                1,
                                200000
                            ],
                            [
                                -100000.285657,
                                3.1415926
                            ]
                        ],
                        [
                            [
                                2,
                                400000
                            ],
                            [
                                200000.151617,
                                273.15
                            ]
                        ]
                    ]
                ],
                [
                    [
                        [
                            [
                                3,
                                600000
                            ],
                            [
                                -200000.285657,
                                6.1415926
                            ]
                        ],
                        [
                            [
                                4,
                                800000
                            ],
                            [
                                400000.151617,
                                476.15
                            ]
                        ]
                    ]
                ]
            ]
        }

The jsontoh5 version of the value as a Python object is:

[((((1, 200000), (-100000.285657, 3.1415926)), ((2, 400000), (200000.151617, 273.15))),), ((((3, 600000), (-200000.285657, 6.1415926)), ((4, 800000), (400000.151617, 476.15))),)]

The NumPy dataset's value (from h5py) is:

[([((1, 200000), [-1.00000286e+05,  3.14159260e+00]), ((2, 400000), [ 2.00000152e+05,  2.73150000e+02])],), ([((3, 600000), [-2.00000286e+05,  6.14159260e+00]), ((4, 800000), [ 4.00000152e+05,  4.76150000e+02])],)]

They differ in how the H5T_ARRAY components of the dataset's value are converted: jsontoh5 exclusively uses tuples, while NumPy expects lists for the array components in this case. The dataset's NumPy dtype in both cases is the same:

dtype([('yy', [('x', [('i', '<i2'), ('j', '<i4')]), ('y', '<f8', (2,))], (2,))])
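A minimal reproduction of the working form, using the dtype and values quoted above (a sketch; the numbers are the issue's own):

```python
import numpy as np

# The dataset's dtype as reported in the issue: nested compounds plus
# two levels of H5T_ARRAY (subarray) components.
dt = np.dtype([("yy",
                [("x", [("i", "<i2"), ("j", "<i4")]),
                 ("y", "<f8", (2,))],
                (2,))])

# NumPy accepts tuples for compound records and lists for subarray
# components -- the mixed form h5py returns. Per the issue, an all-tuple
# value (what jsontoh5 builds) instead raises
# "setting an array element with a sequence".
value = [
    ([((1, 200000), [-100000.285657, 3.1415926]),
      ((2, 400000), [200000.151617, 273.15])],),
    ([((3, 600000), [-200000.285657, 6.1415926]),
      ((4, 800000), [400000.151617, 476.15])],),
]

arr = np.array(value, dtype=dt)
```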

Support large files

h5tojson.py and jsontoh5.py can't convert files whose size is comparable to the amount of physical memory on the machine the converter is running on.
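One possible direction, sketched with plain Python (the function name and chunking scheme are hypothetical): stream each dataset's value to the output chunk by chunk instead of materializing the whole value in memory.

```python
import json

def write_json_array(stream, chunks):
    """Write a JSON array to `stream` one chunk at a time.

    `chunks` is an iterable of lists (e.g. hyperslab reads of a large
    dataset); only one chunk is held in memory at a time. Illustrative
    sketch only -- the real converters would need this for every
    dataset value they emit.
    """
    stream.write("[")
    first = True
    for chunk in chunks:
        for item in chunk:
            if not first:
                stream.write(", ")
            stream.write(json.dumps(item))
            first = False
    stream.write("]")
```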

Investigate the source of roundtrip JSON -> HDF5 -> JSON difference

This is the input HDF5/JSON:

{
    "apiVersion": "1.0.0",
    "datasets": {
        "6497e74e-6e8e-4290-bee9-535f5a66f665": {
            "alias": [
                "/a"
            ],
            "attributes": [
                {
                    "creationProperties": {
                        "nameCharEncoding": "H5T_CSET_UTF8"
                    },
                    "description": "",
                    "id": "d2922e88-a7c1-4013-bce5-b6f2f78baa36",
                    "name": "DIMENSION_LIST",
                    "shape": {
                        "class": "H5S_SIMPLE",
                        "dims": [
                            1
                        ],
                        "maxdims": [
                            1
                        ]
                    },
                    "type": {
                        "base": {
                            "base": "H5T_STD_REF_OBJ",
                            "class": "H5T_REFERENCE"
                        },
                        "class": "H5T_VLEN"
                    },
                    "value": [
                        [
                            "datasets/edd5c2fe-db6d-4916-9e68-069c4df80005"
                        ]
                    ]
                }
            ],
            "creationProperties": {
                "layout": {
                    "class": "H5D_CONTIGUOUS"
                }
            },
            "description": "",
            "shape": {
                "class": "H5S_SIMPLE",
                "dims": [
                    0,
                    3
                ],
                "maxdims": [
                    "H5S_UNLIMITED",
                    3
                ]
            },
            "type": {
                "base": "H5T_IEEE_F32LE",
                "class": "H5T_FLOAT"
            }
        },
        "edd5c2fe-db6d-4916-9e68-069c4df80005": {
            "alias": [
                "/m"
            ],
            "attributes": [
                {
                    "creationProperties": {
                        "nameCharEncoding": "H5T_CSET_UTF8"
                    },
                    "description": "",
                    "id": "3d1ad24b-514c-41d8-b019-fe89933c7505",
                    "name": "CLASS",
                    "shape": {
                        "class": "H5S_SCALAR"
                    },
                    "type": {
                        "charSet": "H5T_CSET_ASCII",
                        "class": "H5T_STRING",
                        "length": 16,
                        "strPad": "H5T_STR_NULLTERM"
                    },
                    "value": "DIMENSION_SCALE"
                },
                {
                    "creationProperties": {
                        "nameCharEncoding": "H5T_CSET_UTF8"
                    },
                    "description": "",
                    "id": "dee3c381-0f35-4d84-8b34-71aa8c8deb21",
                    "name": "REFERENCE_LIST",
                    "shape": {
                        "class": "H5S_SIMPLE",
                        "dims": [
                            1
                        ],
                        "maxdims": [
                            1
                        ]
                    },
                    "type": {
                        "class": "H5T_COMPOUND",
                        "fields": [
                            {
                                "name": "dataset",
                                "type": {
                                    "base": "H5T_STD_REF_OBJ",
                                    "class": "H5T_REFERENCE"
                                }
                            },
                            {
                                "name": "index",
                                "type": {
                                    "base": "H5T_STD_I32LE",
                                    "class": "H5T_INTEGER"
                                }
                            }
                        ]
                    },
                    "value": [
                        [
                            "datasets/6497e74e-6e8e-4290-bee9-535f5a66f665",
                            0
                        ]
                    ]
                }
            ],
            "creationProperties": {
                "layout": {
                    "class": "H5D_CONTIGUOUS"
                }
            },
            "description": "",
            "shape": {
                "class": "H5S_SIMPLE",
                "dims": [
                    0
                ],
                "maxdims": [
                    "H5S_UNLIMITED"
                ]
            },
            "type": {
                "base": "H5T_IEEE_F32LE",
                "class": "H5T_FLOAT"
            }
        }
    },
    "groups": {
        "31c7d987-47ce-4a03-92f7-2d4b3f0e5fb5": {
            "alias": [
                "/"
            ],
            "attributes": [],
            "description": "Group: /",
            "links": [
                {
                    "class": "H5L_TYPE_HARD",
                    "collection": "datasets",
                    "id": "6497e74e-6e8e-4290-bee9-535f5a66f665",
                    "title": "a"
                },
                {
                    "class": "H5L_TYPE_HARD",
                    "collection": "datasets",
                    "id": "edd5c2fe-db6d-4916-9e68-069c4df80005",
                    "title": "m"
                }
            ]
        }
    },
    "id": "31c7d987-47ce-4a03-92f7-2d4b3f0e5fb5",
    "root": "31c7d987-47ce-4a03-92f7-2d4b3f0e5fb5"
}

h5tojson fails with TypeError

Hi,

I am currently trying to convert an HDF5 file to JSON but running into an issue with types as below:

  File "/home/hdfs/envs/parser_env/lib/python3.8/site-packages/h5json-1.1.3-py3.8.egg/h5json/h5tojson/h5tojson.py", line 244, in main
    dumper.dumpFile()
  File "/home/hdfs/envs/parser_env/lib/python3.8/site-packages/h5json-1.1.3-py3.8.egg/h5json/h5tojson/h5tojson.py", line 204, in dumpFile
    print(json.dumps(self.json, sort_keys=True, indent=4))
  File "/usr/local/lib/python3.8/json/__init__.py", line 234, in dumps
    return cls(
  File "/usr/local/lib/python3.8/json/encoder.py", line 201, in encode
    chunks = list(chunks)
  File "/usr/local/lib/python3.8/json/encoder.py", line 431, in _iterencode
    yield from _iterencode_dict(o, _current_indent_level)
  File "/usr/local/lib/python3.8/json/encoder.py", line 405, in _iterencode_dict
    yield from chunks
  File "/usr/local/lib/python3.8/json/encoder.py", line 405, in _iterencode_dict
    yield from chunks
  File "/usr/local/lib/python3.8/json/encoder.py", line 405, in _iterencode_dict
    yield from chunks
  File "/usr/local/lib/python3.8/json/encoder.py", line 325, in _iterencode_list
    yield from chunks
  File "/usr/local/lib/python3.8/json/encoder.py", line 438, in _iterencode
    o = _default(o)
  File "/usr/local/lib/python3.8/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type bytes is not JSON serializable

I understand that this is because JSON expects strings and not bytes, but I am not sure what in the HDF5 file is causing this error, so I can't tell what to convert to a string before the JSON conversion. Do you know any workarounds for this?
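One workaround is a json.JSONEncoder subclass whose default() method decodes bytes before dumping. A sketch, assuming the bytes are roughly UTF-8; adjust the codec and error handling to match your data:

```python
import json

class BytesEncoder(json.JSONEncoder):
    """Decode bytes values to str so json.dumps can serialize them."""

    def default(self, o):
        if isinstance(o, bytes):
            # errors="replace" keeps the dump going even for odd bytes.
            return o.decode("utf-8", errors="replace")
        return super().default(o)

text = json.dumps({"name": b"temperature"}, cls=BytesEncoder)
```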

h5tojson tool adds fillValue creation property when it is not specified.

I think fillValue should not appear if it is not specified.

The JSON output below is from the following file:
ftp://ftp.hdfgroup.uiuc.edu/pub/outgoing/NASAHDF/GSSTF_NCEP.3.2008.12.31.he5

         "7bf1e82a-324d-11e5-b18e-005056008d34": {
            "alias": [
                "/HDFEOS INFORMATION/StructMetadata.0"
            ], 
            "creationProperties": {
                "allocTime": "H5D_ALLOC_TIME_LATE", 
                "fillTime": "H5D_FILL_TIME_IFSET", 
                "fillValue": "", 
                "layout": {
                    "class": "H5D_CONTIGUOUS"
                }

Values truncated for fixed-width null-terminated strings

See testWriteFixedNullTermStringAttribute in hdf5dbTest.py - values whose string length equals the type length have the last character silently truncated.

An equivalent program using the C library does not have this problem.

security bug in pip version?

I installed from pip
Typing the command
h5tojson.py outputs/autoencoder_model.h5

gives me a bunch of error messages including

from: can't read /var/mail/h5json

Why is it trying to read from my mail folder?

The package built from scratch doesn't have this problem. Please update the pip release.

Create hdf5-json validator.

I hope someone can write an hdf5-json validation tool in Python. It should also check that valid uuids are used for dim scales within a JSON file.
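A possible starting point, checking only that the root id and every hard link's id resolve to an object (a sketch against the JSON structure shown elsewhere on this page; a real validator would also verify the document against the published JSON Schema):

```python
def check_ids(doc):
    """Return a list of dangling object ids in an HDF5/JSON document.

    Checks that `root` and every hard link's `id` resolve to an object
    in the corresponding top-level collection.
    """
    problems = []
    if doc.get("root") not in doc.get("groups", {}):
        problems.append(("root", doc.get("root")))
    for grp_id, grp in doc.get("groups", {}).items():
        for link in grp.get("links", []):
            if link.get("class") != "H5L_TYPE_HARD":
                continue
            collection = doc.get(link.get("collection"), {})
            if link.get("id") not in collection:
                problems.append((grp_id, link.get("id")))
    return problems
```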

Support Python 3

New projects should be using modern versions of Python, which makes this library not possible to use at the moment. The main issue I see so far is the use of the legacy print statement instead of the modern print function. Additional problem areas may arise, but these should be easily found using an automated testing tool such as tox.

I am willing to help with this, and in fact have started work on this.

No strings attached?

What about compound field names and enumeration keys?

ascii_string ::= TBD
char_string ::= TBD <-- Do we need this?
utf8_string ::= TBD
hdf5_path_name ::= TBD
hdf5_path_name_list ::= TBD
url ::= TBD
url_fragment ::= TBD
url_path ::= TBD

link_name and unicode

The grammar has this for link_name:

  ascii_string_wo_slash | unicode_string_wo_slash 

What is the intended distinction here? The JSON RFC says that all strings should be in Unicode using UTF-8.

h5tojson fails with SNDR data file

Running h5tojson.py fails with "Segmentation Fault" with the file: SNDR.SNPP.ATMS.20150515T0012.m06.g003.L1B_sample.v00_00_01.T.150901132017.nc.

File is available here: s3://hdfgroup/data/hdf5test/SNDR.SNPP.ATMS.20150515T0012.m06.g003.L1B_sample.v00_00_01.T.150901132017.nc.

Arrays & Co.

byte_array ::= TBD
maxdims_array ::= TBD
non_negative_integer ::= TBD
non_negative_integer_array ::= TBD
positive_integer ::= TBD
positive_integer_array ::= TBD

h5tojson_test raises "no attribute 'get_create_plist'" exception while converting several test files

After cloning and running "sudo python setup.py install", I ran "python testall.py".
h5tojson_test failed while converting the following files with the error AttributeError: 'h5py.h5t.TypeCompoundID' object has no attribute 'get_create_plist'

  • committed_type.h5
  • compound_committed.h5
  • namedtype.h5
  • sample.h5

Traceback (most recent call last):
  File "../../h5json/h5tojson/h5tojson.py", line 248, in <module>
    main()
  File "../../h5json/h5tojson/h5tojson.py", line 244, in main
    dumper.dumpFile()
  File "../../h5json/h5tojson/h5tojson.py", line 202, in dumpFile
    self.dumpDatatypes()
  File "../../h5json/h5tojson/h5tojson.py", line 184, in dumpDatatypes
    item = self.dumpDatatype(uuid)
  File "../../h5json/h5tojson/h5tojson.py", line 173, in dumpDatatype
    attributes = self.dumpAttributes('datatypes', uuid)
  File "../../h5json/h5tojson/h5tojson.py", line 55, in dumpAttributes
    attr_list = self.db.getAttributeItems(col_name, uuid)
  File "/usr/lib/python2.7/site-packages/h5json-1.1.3-py2.7.egg/h5json/hdf5db.py", line 1219, in getAttributeItems
    for name in obj.attrs:
  File "/usr/lib/python2.7/site-packages/h5py-2.9.0-py2.7-linux-x86_64.egg/h5py/_hl/attrs.py", line 257, in __iter__
    cpl = self._id.get_create_plist()
AttributeError: 'h5py.h5t.TypeCompoundID' object has no attribute 'get_create_plist'

Environment:
Fedora release 30
h5py 2.9.0
HDF5 1.10.4
Python 2.7.16 (default, Apr 30 2019, 15:54:43)
[GCC 9.0.1 20190312 (Red Hat 9.0.1-0.10)]
sys.platform linux2
sys.maxsize 9223372036854775807
numpy 1.16.3

Add a command line option to jsontoh5 to specify the library version when creating a file

Currently every HDF5 file produced with jsontoh5 will use the latest file format, H5F_LIBVER_LATEST: https://github.com/HDFGroup/hdf5-json/blob/develop/h5json/hdf5db.py#L167. This creates a problem for software that does not yet, or never will, use the latest version of the HDF5 library.

jsontoh5 should have something like a --libver option with either earliest or latest as values. I don't think that h5py yet supports all the new options described here: https://portal.hdfgroup.org/display/HDF5/H5P_SET_LIBVER_BOUNDS.
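A sketch of what the option could look like with argparse (the wiring into jsontoh5 is hypothetical; h5py's File constructor does accept a libver keyword):

```python
import argparse

# Sketch of the proposed option for the jsontoh5 command line.
parser = argparse.ArgumentParser(prog="jsontoh5")
parser.add_argument(
    "--libver",
    choices=["earliest", "latest"],
    default="latest",
    help="HDF5 library version bounds for the output file",
)

args = parser.parse_args(["--libver", "earliest"])
# The value would then be passed through when creating the file, e.g.:
#   h5py.File(filename, "w", libver=args.libver)
```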

jsontoh5.py fails on converting json file

Hello,
I'm converting an hdf5 file produced by keras to json which works fine.
However, upon converting back from this JSON file to HDF5 I am running into errors with a call within h5json/hdf5db.py, in the function toRef(...):
at line 1745, in toRef, the loop "for item in data:" fails because data is None (NoneType object is not iterable).

Is there a current issue in handling hdf5 files generated by Keras?

h5tojson.py should exit with a non-zero error code when the file does not exist

I am using h5tojson.py as a command line script. When I attempt to run it with a file that is missing, it returns a 0 exit code with the output:

hdf5-json/util# python h5tojson.py hello
Cannot find file: hello
hdf5-json/util# echo $?
0

This does not follow the pattern that other command line utilities use.

hdf5-json/util# echo $?
2
hdf5-json/util# cat notafile
cat: notafile: No such file or directory
hdf5-json/util# echo $?
1

I think that if the script cannot find the file, it should not return a 0 status code. Thoughts?
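Agreed; a minimal sketch of the pattern (the function name and message are illustrative, not the script's actual code):

```python
import os
import sys

def main(filename):
    """Return a conventional exit status: 1 when the file is missing."""
    if not os.path.isfile(filename):
        print("Cannot find file: %s" % filename, file=sys.stderr)
        return 1
    # ... do the actual conversion here ...
    return 0
```

The script's entry point would then call sys.exit(main(sys.argv[1])) so the shell sees the non-zero status.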

Provide an array type example of compound dataset.

Please provide a compound dataset example that matches the following h5dump output:

              DATASET "Sensor" {
                  DATATYPE  H5T_COMPOUND {
                     H5T_IEEE_F64LE "Time";
                     H5T_ARRAY { [4] H5T_IEEE_F32LE } "Concentration";
                     H5T_ARRAY { [8] H5T_STD_I8LE } "Species";
                  }
                  DATASPACE  SIMPLE { ( 15 ) / ( H5S_UNLIMITED ) }
               }

The current example doesn't cover the above case.
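For reference, here is an illustrative (unverified) HDF5/JSON rendering of that datatype and dataspace, following the conventions of the other JSON examples on this page:

```json
{
    "type": {
        "class": "H5T_COMPOUND",
        "fields": [
            {
                "name": "Time",
                "type": {
                    "base": "H5T_IEEE_F64LE",
                    "class": "H5T_FLOAT"
                }
            },
            {
                "name": "Concentration",
                "type": {
                    "base": {
                        "base": "H5T_IEEE_F32LE",
                        "class": "H5T_FLOAT"
                    },
                    "class": "H5T_ARRAY",
                    "dims": [4]
                }
            },
            {
                "name": "Species",
                "type": {
                    "base": {
                        "base": "H5T_STD_I8LE",
                        "class": "H5T_INTEGER"
                    },
                    "class": "H5T_ARRAY",
                    "dims": [8]
                }
            }
        ]
    },
    "shape": {
        "class": "H5S_SIMPLE",
        "dims": [15],
        "maxdims": ["H5S_UNLIMITED"]
    }
}
```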
