Strategy / process for working with large (300 x 300 x 10,000) matrices that aren't fitting in memory
Hey MATLAB community - could use your help understanding how to efficiently work with multiple large matrices.
Shape of the data
I have 10 matrices, each 300 x 300 x 10,000, stored as single precision.
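For scale, the footprint can be worked out directly (a rough calculation, assuming 4 bytes per single-precision element and no copies):

```matlab
% Approximate memory footprint of the data set.
bytesPerMatrix = 300 * 300 * 10000 * 4;   % single = 4 bytes/element, 3.6e9 bytes (~3.35 GiB)
totalBytes     = 10 * bytesPerMatrix;     % ~33.5 GiB for all 10 matrices
fprintf('Per matrix: %.2f GiB, total: %.2f GiB\n', ...
    bytesPerMatrix / 2^30, totalBytes / 2^30);
```

So the full data set is roughly twice my RAM even before any intermediate arrays are created.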
Current (probably incorrect) approach
I'm storing the 10 matrices in a structure S, where each matrix is assigned to its own field (e.g., S.var1 is a 300 x 300 x 10,000 single-precision matrix).
I need to perform various operations on these matrices that range from simple (like cropping and rotating) to complex (like calculating correlations and covariance of certain points along all the dimensions).
Needless to say, I commonly run into memory errors on my system and I've started to consider alternative approaches like: datastores, tall arrays, mapreduce, and distributed arrays.
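One direction I've been experimenting with (a sketch only; the file and variable names are placeholders, and I've shrunk the third dimension so it runs quickly) is keeping each matrix in its own v7.3 MAT-file and using matfile to read just a slab at a time instead of loading everything:

```matlab
% Sketch: store one matrix per MAT-file; v7.3 files are HDF5-based,
% which is what allows matfile to do partial reads/writes.
A = rand(300, 300, 100, 'single');   % stand-in for one 300x300x10000 matrix
save('var1.mat', 'A', '-v7.3');      % '-v7.3' is required for partial I/O
clear A

m    = matfile('var1.mat');          % handle only; no data read yet
slab = m.A(:, :, 1:10);              % reads just 10 slices from disk
out  = mean(slab, 3);                % operate on the in-memory slab
```

The idea would be to loop over slabs like this for the slice-wise operations, though I'm not sure how well it extends to the correlation/covariance calculations that need points from every slice.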
I'd appreciate any guidance or perspective on the best strategy for working with matrices of this size.
PS - for reference my system has 16GB of RAM, and ~100GB of free space on an SSD. I realize upgrading my computer may be the simplest option, but would like to avoid that if at all possible.