The Math Forum



Math Forum » Discussions » Software » comp.soft-sys.matlab


Topic: Memory consumption
Replies: 2   Last Post: Jul 3, 2012 2:57 AM



Posts: 16
Registered: 6/18/12
Re: Memory consumption
Posted: Jul 2, 2012 1:05 PM

On Mon, 02 Jul 2012 07:50:30 -0700, SLt wrote:

> Hi,
> I work on a program using Monte-Carlo simulations, so I manipulate huge
> data: vectors of more than 20 million elements.
> I want to estimate (and understand) the maximum memory consumption of my
> algorithm in order to prevent "Out of memory" errors during computation,
> and to find the maximum number of elements I can process.
> It's easy to see between instructions how much memory is needed for
> the data in the workspace.
> But it's more difficult to estimate how much memory is needed to execute
> built-in functions or complex instructions like:
> Vector1 = Vector1 + exp(param1 + param2*rand(20e6,1)).*(param3<Vector2)
> where Vector1 and Vector2 are both 20-million-element vectors.
> I had hoped that the undocumented feature "profile memory on" could
> help me estimate memory, but it seems that it doesn't account for
> complex instructions or built-in functions (for example, sort).
> Is there a way to profile memory more effectively in MATLAB?

Have you tried breaking 'Vector1' into smaller parts and inserting code breaks? That one-liner can create several 20e6-element double temporaries at once (the rand result, the exp result, the elementwise product, the sum), each around 160 MB, so the peak can be several times the size of the named variables.
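As a rough sketch of that idea (the variable names and the 1e6 chunk size here are illustrative, not from the original post), the update can be applied slice by slice so each temporary is only chunk-sized rather than 20e6 elements:

```
% Process the 20e6-element update in fixed-size slices so the temporaries
% created by rand(), exp(), and the comparison are only chunkSize long.
n         = 20e6;
chunkSize = 1e6;               % tune to the memory you can spare
Vector1   = zeros(n,1);
Vector2   = rand(n,1);         % placeholder data for the sketch
param1 = 0.1; param2 = 0.2; param3 = 0.5;

for k = 1:chunkSize:n
    idx = k:min(k+chunkSize-1, n);
    Vector1(idx) = Vector1(idx) + ...
        exp(param1 + param2*rand(numel(idx),1)) .* (param3 < Vector2(idx));
end
```

The result is identical in distribution to the one-shot version (each element still gets its own random draw), but peak temporary memory drops from roughly 160 MB per temporary to chunkSize*8 bytes each, at the cost of some loop overhead.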



© The Math Forum at NCTM 1994-2018. All Rights Reserved.