MATLAB Answers

Memory usage very high

Julia Rhyins on 22 Nov 2019
Commented: Walter Roberson on 25 Nov 2019
I always have problems with MATLAB (R2019b) using too much memory (way more than the variables I have saved). Currently I'm running a function to extract data from a number of structures. I paused the function because the amount of RAM being used just doesn't make any sense. Task Manager says that MATLAB is using 4.7 GB of memory, even though I'm not running anything right now. The total size of all the variables in my workspace is ~0.055 GB and I have no figure windows open. The only two programs I have running on my computer are MATLAB and Task Manager. Is there any reason that MATLAB would be using so much memory, and is there a way for me to reduce it?



Answers (1)

Jan on 22 Nov 2019
How do you observe the memory consumption? The Task Manager displays the memory reserved for MATLAB. If MATLAB allocates memory and releases it afterwards, it is not necessarily freed immediately. As long as no other application asks for the memory, it is efficient to keep it in this state.
Does it cause any trouble that the OS reserves 4.7 GB of RAM for MATLAB? Why do you say that this is "too much" memory?
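On Windows, the memory command lets you compare MATLAB's own accounting with what the Task Manager shows. A minimal sketch — the field names below are from the documented return structure of memory:

```matlab
% Windows only: inspect MATLAB's own view of its memory
[userview, systemview] = memory;
userview.MemUsedMATLAB          % bytes used by the MATLAB process
userview.MaxPossibleArrayBytes  % largest contiguous array MATLAB could create now
```

This shows how much of the reserved memory MATLAB itself accounts for, and how much contiguous space is still available for new arrays.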
Although the current consumption of memory is small, growing arrays can need much more memory. Example:
x = [];
for k = 1:1e6
   x(k) = k;   % grows x by one element on every iteration
end
Although the final array x occupies only 8 MB of RAM (plus about 100 bytes for the header), the intermediate need for RAM is much higher: sum(1:1e6)*8 bytes ≈ 4 terabytes. Explanation: if x = [1], the next step x(2) = 2 duplicates the former array and appends a new element. Although the intermediately used memory is released, there is no guaranteed time limit for the freeing.
Can you post some code which reproduces the problem?

  6 Comments

Jan on 23 Nov 2019
@Julia: The Task Manager tells you how much memory is reserved for MATLAB. Loading a file might reserve space for disk caching - this detail is not documented, so I'm only guessing. This would mean that the consumption shown in the Task Manager has no "real" meaning. As soon as another application needs the memory, it can be redistributed.
What does "it will almost definitely crash my computer" exactly mean? What happens? Do you get error messages? If so, which one?
In MATLAB, the clear commands are usually just a waste of time.
By the way, path is an important MATLAB command. Do not use this name for a variable, to avoid serious trouble during debugging.
Use fullfile instead of creating the file name manually. Replace
sprintf('%s\\%s',idx(i).folder,idx(i).name)
by
fullfile(idx(i).folder, idx(i).name)
Loading MAT files can create figures and other hidden data, e.g. stored persistently in functions or in user-defined classes. Checking the sizes of the variables in the workspace is not enough to exclude such side effects.
Your original code shows exactly what I mentioned about the need for pre-allocation. Replace
snrs = [];
for i = 1:length(idx)
snrs = [snrs;SNRdB];
end
by
snrs = zeros(1, numel(idx));
for k = 1:numel(idx)
   snrs(k) = SNRdB;
end
numel is more stable than length, because the latter chooses the longest dimension, while numel does exactly what is wanted. For vectors the result is the same, but in real code length is applied to matrices too often. Using numel is clear and direct.
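The difference matters as soon as the input is not a vector. A small sketch:

```matlab
A = zeros(3, 5);   % a 3-by-5 matrix
length(A)          % 5  - only the longest dimension
numel(A)           % 15 - the total number of elements
```

For the 1-by-N struct array returned by dir, both give the same answer, but numel states the intention directly.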
Using "i" as a loop counter is discouraged by MathWorks to avoid confusion with the imaginary unit 1i. Well, this might be a question of taste.
Julia Rhyins on 25 Nov 2019
Someone pointed out to me that .mat files are compressed, so when a file is loaded it may be larger than what I see in the file explorer. I think this may be my problem, because each file that I load contains a couple of figure handles. In terms of the error, I think there is some initial memory warning, but it is quickly buried in the command window by continuous printing of 'Warning: Error updating line. Update failed for unknown reason'.
Walter Roberson on 25 Nov 2019
"The total size of all the variables in my workspace is ~0.055gb and I have no figure windows open."
"Each file that I load contains a couple of figure handles."
There is a contradiction there. The only way to load figure handles is to create figure windows from them. Those figures might not be visible, but they are open. And if you do not close those figures after you are finished with them, then the memory for the (possibly invisible) figures will add up.
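If such files have to be loaded anyway, the side-effect figures can be found and closed afterwards. A sketch, with a hypothetical file name; findall also returns figures whose HandleVisibility is 'off', which a plain close all would miss:

```matlab
S = load('mydata.mat');                   % hypothetical file containing figure handles
figs = findall(groot, 'Type', 'figure');  % all open figures, including hidden ones
close(figs)                               % release their memory
```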

