On 14 Dec, 19:09, "Alex Berkovich" <alex...@tx.technion> wrote:
> Hello all,
>
> I'm running a MEX function that calls C code (compiled in the same file as the MEX function itself).
> When a simulation is done, MATLAB doesn't seem to release the memory it uses: I see a growing amount of memory being used by MATLAB (via Task Manager).
> It reaches the limit of my computer and then either crashes or simply continues to run forever.
>
> Any suggestion on how to solve this problem is very appreciated.
Any data structures you allocate with malloc-family functions (or with new in C++) must be freed by the corresponding calls to free or delete before your MEX function returns; otherwise the memory leaks a little on every call, which matches the steadily growing footprint you're seeing.
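A minimal sketch of that discipline inside the gateway function (the buffer name and computation are hypothetical; this compiles with MATLAB's `mex` command, not as a standalone program):

```c
#include "mex.h"
#include <stdlib.h>

void mexFunction(int nlhs, mxArray *plhs[],
                 int nrhs, const mxArray *prhs[])
{
    size_t n = mxGetNumberOfElements(prhs[0]);

    /* Scratch buffer for intermediate results -- never returned to MATLAB. */
    double *scratch = (double *)malloc(n * sizeof(double));
    if (scratch == NULL)
        mexErrMsgTxt("out of memory");

    /* ... fill scratch and run the computation here ... */

    /* Pair every malloc with a free before returning; a missing free
       here leaks on every call and grows MATLAB's memory use. */
    free(scratch);
}
```

If an early return or mexErrMsgTxt can fire between the malloc and the free, make sure every exit path frees the buffer first (mexErrMsgTxt does not unwind plain malloc'd memory).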
As for the mxCreateXXXX functions, only use them for data you intend to return to the MATLAB workspace. I've never fully understood how the MATLAB memory management system works (it seems to be based on some Java-style garbage collector), and I've always ended up with segmentation faults when I called mxFree on local variables inside the MEX function. (mxFree is meant for memory obtained from mxMalloc; an mxArray from mxCreateXXXX should be released with mxDestroyArray instead, which may explain those crashes.)
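A short sketch of that split, under the assumption that one array is returned and one is purely local (again, buildable only with MATLAB's `mex` compiler):

```c
#include "mex.h"

void mexFunction(int nlhs, mxArray *plhs[],
                 int nrhs, const mxArray *prhs[])
{
    /* Returned to the workspace: MATLAB takes ownership of plhs[0],
       so do NOT destroy or free it yourself. */
    plhs[0] = mxCreateDoubleMatrix(3, 3, mxREAL);

    /* Purely local temporary: MATLAB never sees it, so release it
       before returning -- with mxDestroyArray, not mxFree, since it
       came from an mxCreateXXXX call. */
    mxArray *tmp = mxCreateDoubleMatrix(3, 3, mxREAL);
    /* ... use tmp as workspace ... */
    mxDestroyArray(tmp);
}
```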