First - thanks for your efforts to get this standardized!
I found that when using more than ~8,500 channels, initializing the miniDAS object fails with the error below. I suspect (but have not rigorously tested) that the lat/lon/elevation arrays become too large for HDF5 attributes, in which case maybe they need to be stored as data arrays instead? This can be reproduced even in the test functions, by cranking up nchannels in conftest.py to >= 8500.
```
---> 27 md = miniDAS.from_numpy(file="data_miniDAS/test6.h5", data=data.T.astype('float32'), meta=meta, force=True)

File ~/...../miniDAS/format.py:145, in miniDAS.from_numpy(cls, file, data, meta, compress, force)
    138 dataset = container.create_dataset(
    139     "miniDAS",
    140     chunks=True,
    141     data=data,
    142     compression="lzf" if compress else False,
    143 )
    144 dataset.attrs["version"] = cls.version
--> 145 dataset.attrs.update(asdict(meta))
    147 return cls(dataset, meta)

File ~/anaconda3/envs/dasrcn/lib/python3.8/_collections_abc.py:832, in MutableMapping.update(self, other, **kwds)
    830 if isinstance(other, Mapping):
    831     for key in other:
    ...

File h5py/_objects.pyx:55, in h5py._objects.with_phil.wrapper()
File h5py/h5a.pyx:50, in h5py.h5a.create()
RuntimeError: Unable to create attribute (object header message is too large)
```
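For what it's worth, here is a minimal sketch of the workaround I have in mind: store the large per-channel coordinate arrays as datasets next to the data, rather than as attributes. This is not the actual miniDAS API; the file name, the `coordinates` group, and the dataset names are made up for illustration. (HDF5 keeps attributes in the object header, which is capped at 64 KiB in the default file format, so ~8,500 float64 values overflow it.)

```python
import numpy as np
import h5py

nchannels = 20_000  # well above the ~8,500 where attribute storage breaks

# Hypothetical per-channel coordinates, stand-ins for the real metadata.
latitudes = np.linspace(-10.0, 10.0, nchannels)
longitudes = np.linspace(100.0, 120.0, nchannels)
elevations = np.zeros(nchannels, dtype="float32")

with h5py.File("example.h5", "w") as container:
    dataset = container.create_dataset(
        "miniDAS",
        data=np.zeros((nchannels, 100), dtype="float32"),
        chunks=True,
    )
    # Small scalar metadata still fits comfortably in the object header.
    dataset.attrs["version"] = "1.0"
    # Large per-channel arrays go into sibling datasets, which have no
    # header-size limit (and could be chunked/compressed if desired).
    coords = container.create_group("coordinates")
    coords.create_dataset("latitude", data=latitudes)
    coords.create_dataset("longitude", data=longitudes)
    coords.create_dataset("elevation", data=elevations)
```

Reading them back is then just `container["coordinates/latitude"][:]`.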
Thanks, Daniel, for testing.
It was decided (yesterday) to discontinue this miniDAS format in favour of the official IRIS format, which is upcoming during 2023.
The GitHub page has been updated accordingly.
Thanks for the update. I guess there's a webinar very soon discussing your DAS month and the formats? I could have waited a few more hours. In any case, thanks for putting this forward initially!