I have created an HDF5 file in MATLAB containing a matrix of size 1 x 19,000,000. The file is 150 MB.
My question is how to find the best chunk size and deflate level. After experimenting, I found that a chunk size of 1 x 1,000,000 with deflate set to 7 produces a 100 MB file.
My second problem is that I am unable to read this file efficiently in Python.
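There is no single "perfect" chunk size; it depends on the access pattern and the data. One practical approach is to sweep a few chunk sizes and compression levels and compare the resulting file sizes. A minimal sketch of such a sweep using h5py (file names and the small stand-in array are illustrative, not from the original post):

```python
import os
import numpy as np
import h5py

# Small stand-in for the real 1 x 19,000,000 matrix, just to keep the sweep fast.
data = np.sin(np.linspace(0, 100, 1_000_000)).reshape(1, -1)

for chunk_len in (10_000, 100_000, 1_000_000):
    for level in (4, 7, 9):
        fname = f"probe_{chunk_len}_{level}.h5"
        with h5py.File(fname, "w") as f:
            # "gzip" in h5py is the same deflate filter MATLAB's 'Deflate' uses.
            f.create_dataset("rawdata", data=data,
                             chunks=(1, chunk_len),
                             compression="gzip", compression_opts=level)
        print(f"chunk={chunk_len:>9}  level={level}  "
              f"size={os.path.getsize(fname)} bytes")
        os.remove(fname)
```

Larger chunks and higher deflate levels usually shrink the file but slow down partial reads, since HDF5 must decompress a whole chunk to access any element inside it.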
Matlab
h5create('Xn.h5','/rawdata',size(data),'ChunkSize',[1 1000000],'Deflate',7)
h5write('Xn.h5','/rawdata',data)   % h5create only allocates the dataset; h5write stores the data
Python
import h5py
filename = 'Xn.h5'
f = h5py.File(filename, 'r')
print("Keys: %s" % f.keys())
I expected Python to handle the data as smoothly as MATLAB does, but that is not what happened.
Accessing the dataset handle with f[list(f.keys())[0]] is fast, but this takes 2 hours:
a_group_key = list(f.keys())[0]
data1 = list(f[a_group_key])
As was pointed out to me, the data1 = list(f[a_group_key]) command converts the returned dataset into a Python list, and I was advised to execute exactly what was suggested: f[list(f.keys())[0]]