MATLAB performance when handling large arrays
I have code which, depending on how many iterations I choose, may end up with arrays in excess of 5,000,000-by-3. I soon started running into "out of memory" errors because of the size of the individual large matrices.
I preallocated my matrices beforehand, so all the memory should have been allocated up front. Still, I would sometimes get memory problems, but more interestingly, the simulation got gradually slower as it progressed; it eventually reached 100%, but at a seemingly ever-slower pace.
I worked around it by using smaller arrays: after a set number of steps, I assign those matrices to other fixed matrices (which are not accessed in every loop) and then restart the "looping array". For example, say A is an array accessed in every loop. After the first 100 iterations, I assign A to B (which is fixed and not accessed again). I clean out A, then use it to fill in steps 101-200, assign that portion to, say, array C, and so on. So A is the only "dynamic" variable here.
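Roughly, the pattern looks like this (the chunk size, the rand call standing in for my real per-step computation, and the cell array replacing the separate B, C, ... matrices are all just illustrative):

chunkSize = 100;              % iterations per refill of the "looping array"
nChunks   = 10;               % total iterations = nChunks * chunkSize
A = zeros(chunkSize, 3);      % small dynamic array, reused every chunk
results = cell(nChunks, 1);   % fixed storage, written to once per chunk

for c = 1:nChunks
    for k = 1:chunkSize
        A(k, :) = rand(1, 3); % stand-in for the per-iteration work
    end
    results{c} = A;           % hand the finished chunk to fixed storage
    A(:) = 0;                 % "clean out" A before the next chunk
end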
That fixed the issue and it runs much better now; I am just curious why this happened in the first place. Can anyone shed some light?
Answers (1)
Jan
on 19 Aug 2012
A [5e6 x 3] double array needs 120 MB of RAM (5e6 * 3 elements * 8 bytes per double). That alone should not cause out-of-memory errors. To understand the cause of your problems, we would have to see the code.
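For reference, a quick way to confirm that footprint in MATLAB (the variable name A is just for illustration):

A = zeros(5e6, 3);                     % preallocated [5e6 x 3] double array
info = whos('A');
fprintf('%.0f MB\n', info.bytes/1e6)   % prints 120 MB: 5e6 * 3 * 8 bytes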