comp.lang.idl-pvwave archive
Messages from Usenet group comp.lang.idl-pvwave, compiled by Paulo Penteado

Writing large datasets in HDF5 file fails [message #80062] Wed, 09 May 2012 11:11
d.rowenhorst@gmail.co
Messages: 10
Registered: August 2008
Junior Member
Running IDL 8.1 on Mac 10.7.3.
I can't write a single dataset to an HDF5 file that is larger than 2^32-1 bytes. If I try, I get an error from the H5D_WRITE procedure. I can have multiple datasets in one file that add up to more than 4.3 GB, but an individual dataset cannot be more than 4.3 GB.
So with float data I can write 2^32/4 - 1 values, but not 2^32/4; with double data, 2^32/8 - 1 values, but not more than that.
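
For reference, those limits all work out to the same ~4.3 GB (2^32-byte) ceiling per dataset; that is just my reading of the behaviour above, nothing documented:

; Quick arithmetic check (sketch): element counts just under / at an assumed 2^32-byte limit
nbytes_limit = 2ULL^32
PRINT, nbytes_limit/4 - 1   ; last FLOAT count that works:  1073741823
PRINT, nbytes_limit/4       ; first FLOAT count that fails: 1073741824
PRINT, nbytes_limit/8 - 1   ; last DOUBLE count that works:  536870911
PRINT, nbytes_limit/8       ; first DOUBLE count that fails: 536870912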

Can anyone check whether this is platform-dependent, or am I missing something more fundamental here? Here is my test code:

PRO Testhdf5file
  filename = 'test.hdf5'
  npoints = 2ll^31-1            ; This works
  ;npoints = 2ll^31             ; This does not work
  data = BINDGEN(npoints)       ; byte test data, 1 byte per element

  ; Create the file, datatype, and dataspace, then write the whole array at once
  file_id = H5F_CREATE(filename)
  datatype_id = H5T_IDL_CREATE(data)
  dataspace_id = H5S_CREATE_SIMPLE([npoints])
  temp_ds_id = H5D_CREATE(file_id, 'Fake Data', datatype_id, dataspace_id)
  H5D_WRITE, temp_ds_id, data   ; error occurs here when npoints is too large
  H5D_CLOSE, temp_ds_id
  H5F_CLOSE, file_id
END
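
In case it is useful, here is a workaround I have been considering (an untested sketch; it assumes IDL's H5S_SELECT_HYPERSLAB and the MEMORY_SPACE_ID/FILE_SPACE_ID keywords to H5D_WRITE behave like the underlying HDF5 C calls, and I have not confirmed it gets around the limit): write the dataset in slabs that are each well under 4 GB.

PRO TestHDF5Slabbed
  ; Sketch of a slab-at-a-time write; names and slab size are arbitrary.
  filename = 'test_slabbed.hdf5'
  npoints  = 2ULL^31            ; total number of byte elements
  slab     = 2ULL^28            ; elements per H5D_WRITE call (256 MB of bytes)

  file_id      = H5F_CREATE(filename)
  datatype_id  = H5T_IDL_CREATE(0B)             ; byte datatype from a scalar
  dataspace_id = H5S_CREATE_SIMPLE([npoints])   ; full-size dataspace in the file
  ds_id        = H5D_CREATE(file_id, 'Fake Data', datatype_id, dataspace_id)

  FOR offset = 0ULL, npoints-1, slab DO BEGIN
    count = slab < (npoints - offset)           ; handle a short final slab
    chunk = BINDGEN(count)                      ; stand-in data for this slab
    mem_space_id = H5S_CREATE_SIMPLE([count])
    ; Select the [offset, offset+count) region of the file dataspace
    H5S_SELECT_HYPERSLAB, dataspace_id, [offset], [count], /RESET
    H5D_WRITE, ds_id, chunk, MEMORY_SPACE_ID=mem_space_id, FILE_SPACE_ID=dataspace_id
    H5S_CLOSE, mem_space_id
  ENDFOR

  H5D_CLOSE, ds_id
  H5S_CLOSE, dataspace_id
  H5T_CLOSE, datatype_id
  H5F_CLOSE, file_id
END

If the limit is in the single H5D_WRITE call rather than in the file itself, slab-at-a-time writes like this should sidestep it, but I would still like to know why the single call fails.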