How to read extremely large HDF5 data and resolve an out-of-memory issue?

13 views (last 30 days)
I need to read several datasets from a 2 TB HDF5 file for further computation.
If I simply code it as follows,
variable1 = h5read('path to .h5 file', 'path to dataset')
it would require ~500 GB of array memory.
Is there a good way to solve this problem?
Thanks!
1 comment
Jonas on 3 May 2022
Edited: Jonas on 3 May 2022
does this help? https://de.mathworks.com/matlabcentral/answers/423693-read-and-divide-hdf5-data-into-chunks
maybe you could also split the data using Python beforehand; I saw several such scripts when googling
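
The linked approach boils down to reading the dataset block by block with h5read's start/count arguments. A minimal sketch, assuming a 2-D dataset at the hypothetical path '/mydataset'; tune blk to the memory you can spare:

file = 'path to .h5 file';
ds   = '/mydataset';                      % hypothetical dataset path
info = h5info(file, ds);
sz   = info.Dataspace.Size;               % full dataset size, e.g. [nRows nCols]
blk  = 1e4;                               % columns to read per block
for c = 1:blk:sz(2)
    n = min(blk, sz(2) - c + 1);
    block = h5read(file, ds, [1 c], [sz(1) n]);   % read only this block
    % ... reduce/accumulate 'block' here instead of keeping everything ...
end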


Answers (1)

ROSEMARIE MURRAY on 3 May 2022
You could use a fileDatastore with the read function h5read, which would allow you to specify a certain amount to read at a time.
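
A minimal sketch of that idea, using fileDatastore in 'partialfile' read mode so each call to read returns one block rather than the whole file; the dataset path '/data' and the block size are assumptions to adjust for your file:

fds = fileDatastore('path to .h5 file', ...
    'ReadFcn', @readChunk, 'ReadMode', 'partialfile');
while hasdata(fds)
    chunk = read(fds);          % one block at a time instead of ~500 GB
    % ... process 'chunk' here ...
end

function [data, userdata, done] = readChunk(filename, userdata)
% userdata carries the next start column between successive reads
info = h5info(filename, '/data');      % hypothetical dataset path
sz   = info.Dataspace.Size;            % e.g. [nRows nCols]
blk  = 1e4;                            % columns per read (tune to memory)
if isempty(userdata)
    userdata = 1;                      % first call: start at column 1
end
c = userdata;
n = min(blk, sz(2) - c + 1);
data = h5read(filename, '/data', [1 c], [sz(1) n]);
userdata = c + n;
done = userdata > sz(2);               % tell the datastore when the file is exhausted
end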

Release

R2022a

