Append data to matfile using parallel method
Hi:
I have a lot of data that needs to be saved into a test.mat file; below is my test code:
x=rand(10000,1);
save('test.mat','x');
for i=1:1:100
eval(['va_',num2str(i),'=rand(10000,1);'])
eval(['save(','''','test.mat','''',',','''','va_',num2str(i),'''',',','''','-append','''',')'])
end
The problem is that this is only test code. In my real situation:
1. the number of variables is very large (up to va_10000).
2. the data in each 'va_i' is very large (up to 2e6-by-1).
As a result, even though I have upgraded my drive to a 960 EVO SSD, the saving time is still very long.
Is there any way to change the code to save in parallel, so that I can reduce the cost?
Thanks!
Yu
6 comments
2e6 elements per array for 10000 arrays... assuming double type, this requires in the region of 160 Gigabytes of memory. Does your computer have that much memory?
Do all of those arrays need to be in MATLAB memory at the same time? How do you want to process the data in that file? Would multiple files be acceptable instead?
"is there anyway to improve the code into parallel saving?"
Avoiding eval would be a start.
The MATLAB JIT engine does a lot of optimization if it can... but not if you use eval.
Yu Li
on 13 Sep 2018
Walter Roberson
on 13 Sep 2018
function para_save(i)
% Build the variable name dynamically in a struct field instead of eval,
% then save the struct's fields as individual variables.
varname = sprintf('va_%d', i);
savestruct.(varname) = rand(10000,1);
save('test.mat', '-struct', 'savestruct', '-append');
end
However, MATLAB does not promise that you can have multiple simultaneous save() to the same file.
Walter Roberson
on 13 Sep 2018
Is the size and data type of each variable the same?
Is the data likely to be compressible?
Yu Li
on 13 Sep 2018
Answers (2)
Steven Lord
el 13 de Sept. de 2018
0 votes
Consider writing each variable to a different file in such a way that when you want to use them later on you can construct a datastore using that collection of files and make a tall array from the datastore.
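The one-file-per-variable approach could look something like the following sketch. The folder name, file pattern, and loop bound are illustrative assumptions, not from the thread, and the computation is a stand-in:

```matlab
% Hypothetical sketch: save each variable to its own MAT file, then read
% the collection back lazily through a datastore wrapped in a tall array.
outdir = 'va_files';                    % illustrative folder name
if ~exist(outdir, 'dir'), mkdir(outdir); end

for i = 1:100                           % up to 10000 in the real case
    x = rand(10000, 1);                 % stand-in for the real computation
    save(fullfile(outdir, sprintf('va_%d.mat', i)), 'x');
end

% Later: build a datastore over the files and make a tall array from it.
ds = fileDatastore(fullfile(outdir, '*.mat'), ...
    'ReadFcn', @(f) getfield(load(f), 'x'));
t = tall(ds);                           % out-of-memory processing from here
```

Because each file is written independently, the writing loop above could also run inside parfor (wrapping the save call in a helper function), since no two workers ever touch the same file.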
1 comment
Yu Li
on 13 Sep 2018
Walter Roberson
on 13 Sep 2018
0 votes
You cannot write to a single MAT file in parallel. If writing in parallel to one MAT file is a hard requirement, then your problem cannot be solved.
If computation of the items is expensive, then do the computation in parallel, writing to different MAT files (potentially one per parallel worker rather than one per variable). Afterwards, merge the files together in a serial loop.
With the data not being compressible, either write in binary yourself or else use the '-v7.3' format together with the '-nocompression' option so that save() does not spend time compressing the output.
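A minimal sketch of this compute-in-parallel, merge-serially pattern, assuming the Parallel Computing Toolbox is available. The file names, variable counts, and chunk assignment are illustrative, and the rand call stands in for the real computation:

```matlab
% Hypothetical sketch: workers write disjoint part files, client merges.
nvars  = 100;                            % up to 10000 in the real case
nfiles = 4;                              % roughly one file per worker
chunks = discretize(1:nvars, nfiles);    % assign each variable to a file

parfor c = 1:nfiles
    s = struct();
    for i = find(chunks == c)
        s.(sprintf('va_%d', i)) = rand(2e6, 1);  % stand-in computation
    end
    partsave(sprintf('part_%d.mat', c), s);      % helper needed: save()
end                                              % is opaque inside parfor

% Serial merge: append each part's variables into the final file.
for c = 1:nfiles
    s = load(sprintf('part_%d.mat', c));
    if c == 1
        save('merged.mat', '-struct', 's', '-v7.3');
    else
        save('merged.mat', '-struct', 's', '-append');
    end
end

function partsave(fname, s)
save(fname, '-struct', 's', '-v7.3', '-nocompression');
end
```

The helper function is needed because calling save() directly in a parfor body violates parfor's transparency rules; wrapping it in a function is the usual workaround.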
2 comments
Yu Li
on 13 Sep 2018
Walter Roberson
on 13 Sep 2018
Overall saving time might not increase, under the assumption that calculation of the arrays is expensive. If the average rate of generation is slower than the time required to save one variable, then using parfor for the calculation and merging the files afterwards can potentially save time.
Another approach, when the calculations are expensive, is to use a pollable data queue: calculate the results in parallel and send them back to the client process, which does all of the saving.
If the average rate of generation is faster than the time to save one variable, then you are probably bandwidth-limited in writing to the SSD, and increasing the number of simultaneous writers will not increase the bandwidth.
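The pollable-queue approach could be sketched as follows, assuming the Parallel Computing Toolbox. The variable count, timeout, and helper name are illustrative, and rand again stands in for the real computation; only the client ever touches test.mat:

```matlab
% Hypothetical sketch: workers compute and send results back through a
% parallel.pool.PollableDataQueue; the client alone writes the MAT file.
q = parallel.pool.PollableDataQueue;
nvars = 100;                              % up to 10000 in the real case

for i = 1:nvars
    parfeval(@computeAndSend, 0, q, i);   % fire-and-forget worker tasks
end

saved = 0;
while saved < nvars
    [msg, ok] = poll(q, 60);              % {index, data}, 60 s timeout
    if ~ok, error('timed out waiting for workers'); end
    tmp = struct(sprintf('va_%d', msg{1}), msg{2});
    if saved == 0
        save('test.mat', '-struct', 'tmp', '-v7.3');
    else
        save('test.mat', '-struct', 'tmp', '-append');
    end
    saved = saved + 1;
end

function computeAndSend(q, i)
send(q, {i, rand(2e6, 1)});               % stand-in computation
end
```

Appending one variable at a time keeps the client's memory footprint to a single 2e6-by-1 array, at the cost of one save() call per result.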