Compress only selected variables when saving to .mat
I have two variables, data and meta, which I save together in a compressed .mat file (version '-v7'). The data variable is usually about 800 MB uncompressed, while meta is not even 1 MB. I have many of these .mat files, and sometimes I just need to loop through all the meta variables. However, since the file is compressed, loading the meta variable alone still takes a long time, i.e. about as long as loading both variables.
Is it possible to selectively compress specific variables in a .mat file? Are there alternative data designs that would work better?
Note: I already have a single overall meta variable that is basically the concatenation of all the smaller ones, but I will need to abandon this approach because it does not scale well in size or performance.
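For example, would a sidecar design like the following make sense? (The file names are made up; this is just a sketch of the idea.)
% Hypothetical sidecar layout: one big compressed file plus a tiny meta file.
save('run001_data.mat', 'data', '-v7');   % ~800 MB, compressed
save('run001_meta.mat', 'meta', '-v7');   % < 1 MB, loads quickly
% A metadata-only sweep then never has to touch the big files:
metaFiles = dir('*_meta.mat');
for k = 1:numel(metaFiles)
    s = load(metaFiles(k).name, 'meta');
    % ... use s.meta ...
end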
6 comments
per isakson
on 25 Jun 2014
Edited: per isakson
on 25 Jun 2014
I've spent too much time experimenting with the low- and high-level HDF5 APIs of MATLAB and some alternatives. I'm not sure my "results" are relevant to your use case.
My use case:
- many hundreds of 1 MB time series
- read performance is much more important than write performance
- typically reads entire time series
My conclusions:
- the low-level HDF5 API is not worth the trouble (in my case)
- the system cache is important to performance (buy more RAM)
- store data as double only when necessary
- chunking comes at a high performance price
I think it is difficult to recommend anything without knowing a bit more about the internal structure of the 800 MB and 1 MB variables, together with descriptions of some typical "queries".
"However, it means changing my API in many places, and risk of introducing bugs."
I guess the documentation refers to
Tool Name: h5repack
Purpose:
Copies an HDF5 file to a new file with or without compression
and/or chunking.
Whether h5repack can help depends on your queries.
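For example, removing the compression filter from a -v7.3 (HDF5-based) MAT-file from within MATLAB might look roughly like this. The file names are made up, and I have not checked whether h5repack keeps the MATLAB header that lives in the HDF5 user block, so treat it as a sketch only.
% Sketch: strip all filters (i.e. decompress) by repacking the file.
% Assumes the HDF5 command-line tools are installed and on the system path.
status = system('h5repack -f NONE results_v73.mat results_v73_uncompressed.mat');
if status ~= 0
    warning('h5repack failed; check that the HDF5 tools are available.');
end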
Answers (1)
Jeremy
on 23 Jan 2015
I know this is old, but I have been doing similar work recently, since I could tell it was taking longer than it should to load a small portion of the data using the matfile method. I also spent two days learning how to use all the low-level HDF5 commands, only to find that they did not really help on the read side.
Then I realized that the issue originates on the write side, with the compression that is applied there. The savefast utility on the File Exchange saves using the high-level HDF5 commands, which does NOT compress the data. It didn't quite work for me since I am saving complex numbers, but I was able to use the same approach, and I am now saving uncompressed v7.3 files and reading small portions of my matrix over 100 times faster!
If your matrix is just real numbers, you should be able to create a v7.3 file with your metadata and then use the simple high-level h5write command to save additional variables to the same file uncompressed.
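A rough sketch of that approach, with placeholder file, dataset, and block-size values (untested against your data, and assuming data is a plain double matrix):
% 1) Save the small metadata normally, creating a -v7.3 (HDF5-based) file.
save('result.mat', 'meta', '-v7.3');
% 2) Add the big real-valued matrix as a plain, uncompressed HDF5 dataset.
%    Omitting the ChunkSize/Deflate options means no compression filter is applied.
h5create('result.mat', '/data', size(data));
h5write('result.mat', '/data', data);
% 3) Later: grab just the metadata quickly ...
m = load('result.mat', 'meta');
% ... and read a small block of the matrix without decompressing anything.
%    (h5read is needed here because the dataset was not written by save,
%     so load will not return it as a normal MATLAB variable.)
block = h5read('result.mat', '/data', [1 1], [1000 10]);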