Running a large array
29 views (last 30 days)
Hi,
I am trying to load an array of size 362x332x75x1032 (34.7 GB), which exceeds the maximum array size preference (16.0 GB). How do I go about loading it?
It is a variable in a .nc (netCDF) dataset.
0 comments
Answers (3)
Star Strider
about 9 hours ago
I have rarely needed to use them, so I have little experience with them.
2 comments
Star Strider
about 6 hours ago
I was not aware that it was a CDF file. There is one set of functions for working with netCDF files and another set for working with CDF files; they may have options that would work. (I rarely use CDF files, so I do not have extensive experience with them or the functions that read them.)
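A quick way to tell which function set applies is to inspect the file first. A minimal sketch, assuming a hypothetical file name `mydata.nc` (the netCDF functions are `ncinfo`/`ncdisp`/`ncread`; the parallel CDF set is `cdfinfo`/`cdfread`):

```matlab
% Inspect the file's variables and their sizes before reading anything.
info = ncinfo('mydata.nc');          % netCDF metadata (no data loaded)
disp({info.Variables.Name});          % list the variable names

% For a true CDF file, the parallel call would be:
% info = cdfinfo('mydata.cdf');
```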
John D'Errico
about 5 hours ago
Edited: John D'Errico
about 5 hours ago
Memory is cheap. Get more memory.
I'm sorry, but if you want to work with big data, you will often need sufficient resources to handle that data. No matter what, working with huge arrays without sufficient memory will be slow. So your next question will be: how can I make my code run faster? Again ... get sufficient memory. Or solve smaller problems.
2 comments
Walter Roberson
about 1 hour ago
Did the professor assign the hardware and say "you must run it on this hardware" ?
If not, then:
- if it is your own hardware, then there is always the option of upgrading it
- if it is university hardware, then there is always negotiating with the university to obtain an upgrade
Voss
about 3 hours ago
You can read and process the file in sections, one section at a time, in a loop, by specifying the start and count input arguments to ncread.
And/or specify the stride argument to read only every so many values instead of all of them.
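A minimal sketch of that chunked approach, assuming a hypothetical file `mydata.nc` containing a variable named `var` of size 362x332x75x1032 (the file and variable names are placeholders): the loop reads slabs along the 4th dimension via the `start` and `count` arguments of `ncread`, so only one slab is in memory at a time.

```matlab
fname = 'mydata.nc';   % hypothetical file name
vname = 'var';         % hypothetical variable name
chunk = 100;           % slab thickness along the 4th dimension
n     = 1032;          % length of the 4th dimension

for t = 1:chunk:n
    count = min(chunk, n - t + 1);
    % Inf in count means "read to the end of that dimension".
    slab = ncread(fname, vname, [1 1 1 t], [Inf Inf Inf count]);
    % ... process the 362x332x75xcount slab here, keep only results ...
end

% Alternatively, subsample with the stride argument, e.g. every 4th
% step of the 4th dimension (258 values out of 1032):
% sub = ncread(fname, vname, [1 1 1 1], [Inf Inf Inf 258], [1 1 1 4]);
```

Each slab here is roughly 34.7 GB * 100/1032, about 3.4 GB, which fits comfortably under the 16 GB limit.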
0 comments