I've been struggling with an MCR-compiled executable that runs out of memory after 2-3 weeks of continuous operation (we use the MCR for real-time continuous measurements and analysis).
After hours and hours of debugging I was able to get rid of the memory leak by removing annotations from my figures. I've submitted a service request to MathWorks, but they have so far been unable to reproduce the problem (the request is still open, so all hope is not lost).
The fact that MathWorks cannot reproduce my findings troubles me. Could my memory problems be related to "external" components (e.g. the Java installation)?
I'm therefore hoping that some of you will try running the script below and let me know what you see. The Niters variable determines the number of loop iterations. It's currently set to 1000, but feel free to modify it. The example is based on suggestions from MathWorks, and Niters has to be large enough to see a clear trend.
The script uses the memory() function, which is only available in later MATLAB versions. On MATLAB R2012a (which is what we currently use) I see a roughly 20 MB increase in MATLAB's used memory after the test run. This memory is never released - you have to quit MATLAB to get it back. I've also run a few tests on R2012b. There the problem is smaller, but I still see a 3-5 MB increase in used memory.
Anyway... The script is presented below. Apologies if the formatting gets mangled. If I comment out the two annotation lines, the problem goes away on my setup.
clear all; close all; clc;

t = clock;
Niters = 1000;
memused = zeros(1,Niters);
for iter = 1:Niters
    fprintf('Iteration %d of %d\n',iter,Niters);
    % The two annotation lines mentioned above (representative calls,
    % since the originals were lost in the forum formatting; commenting
    % these two lines out removes the leak on my setup):
    annotation('textbox',[0.2 0.2 0.3 0.1],'String','test');
    annotation('arrow',[0.3 0.5],[0.3 0.5]);
    % Record MATLAB's memory usage on each iteration, then clean up.
    info = memory;
    memused(iter) = info.MemUsedMATLAB;
    close all; clc;
    clearvars -except Niters memused iter t
end
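To see the trend after a run, the recorded memused vector can be plotted. This is a minimal sketch of my own (not part of the MathWorks-suggested test); it assumes memused still holds the per-iteration MemUsedMATLAB values in bytes:

plot((memused - memused(1)) / 2^20);
xlabel('Iteration');
ylabel('Memory growth since iteration 1 (MB)');
title(sprintf('Net growth over %d iterations: %.1f MB', ...
    numel(memused), (memused(end) - memused(1)) / 2^20));

On a leak-free run this curve should flatten out; on my R2012a setup it keeps climbing.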