> Hi,
>
> I work on a program using Monte-Carlo simulations, so I manipulate huge
> data: vectors of more than 20 million elements.
> I want to estimate (and understand) the maximum memory consumption of my
> algorithm, in order to prevent "Out of memory" errors during computation
> and to find the maximum number of elements I can process.
>
> It's easy to see, between instructions, how much memory is needed for
> the data in the workspace.
> But it's more difficult to estimate how much memory is needed to execute
> built-in functions or complex instructions like:
> Vector1 = Vector1 + exp(param1 + param2*rand(20e6,1)).*(param3<Vector2)
> where Vector1 and Vector2 are both 20-million-element vectors.
>
> I had hoped that the undocumented feature "profile memory on" would
> help me estimate memory, but it seems it does not account for complex
> instructions or built-in functions (for example sort).
>
> Is there a way to profile memory more efficiently in MATLAB?
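A rough estimate you can do by hand: with doubles at 8 bytes per element, every 20e6-element double array costs about 160 MB, and a logical mask about 20 MB (1 byte per element). The one-liner above needs several such arrays alive at the same time (the rand output, the exp result, the comparison mask, the sum), so its peak is a few times the size of Vector1 on top of the workspace itself. One way to see this, sketched below under the assumption that param1, param2 and param3 are scalars, is to split the statement into explicit steps so each temporary shows up in whos and can be cleared as soon as it is no longer needed:

% Sketch only: split the one-liner so each temporary is a named,
% inspectable variable (sizes assume 20e6 elements).
tmp  = rand(20e6,1);              % ~160 MB of doubles
tmp  = exp(param1 + param2*tmp);  % param2*tmp creates another ~160 MB
                                  % temporary before the result replaces tmp
mask = (param3 < Vector2);        % logical, ~20 MB
Vector1 = Vector1 + tmp .* mask;  % tmp.*mask is one more ~160 MB temporary
clear tmp mask                    % release the temporaries explicitly
whos                              % check per-variable memory in the workspace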
Have you tried breaking 'Vector1' into smaller parts and inserting code breaks?
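If processing in smaller parts is acceptable for your simulation, a minimal sketch of that idea follows. It assumes param1, param2 and param3 are scalars, and that drawing the random numbers chunk by chunk (same distribution, different stream of values than a single rand(20e6,1) call) is fine for your Monte-Carlo results. The temporaries then never grow beyond chunkSize elements:

% Sketch only: evaluate the update block by block so the rand output,
% the exp result and the logical mask stay at chunkSize elements each.
n         = numel(Vector1);            % e.g. 20e6
chunkSize = 1e6;                       % tune to the memory you can spare

for k = 1:chunkSize:n
    idx = k:min(k+chunkSize-1, n);     % current block (handles the last, shorter one)
    Vector1(idx) = Vector1(idx) + ...
        exp(param1 + param2*rand(numel(idx),1)) .* (param3 < Vector2(idx));
end

With chunkSize = 1e6, each double temporary is about 8 MB instead of about 160 MB, at the cost of looping 20 times over the data.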