Newsgroup: comp.soft-sys.matlab

Topic: spmd and parfor with large broadcast variables
Replies: 2   Last Post: Dec 31, 2012 3:11 AM

Edric Ellis

Posts: 691
Registered: 12/7/04
Re: spmd and parfor with large broadcast variables
Posted: Jun 15, 2012 2:55 AM

"Benjamin " <bfiles@usc.edu> writes:

> I'm new to the idea of parallel computing with matlab on a cluster. I
> understand that using broadcast variables, like
>
> matlabpool 10
> bigd = loadBigData(); % loads a large matrix of class double
> r = zeros(100,1);
> parfor ii = 1:100,
> r(ii) = doSomething(ii,bigd);
> end
>
> causes a lot of extra network traffic. My first question is whether
> the variable bigd is sent over the network 10 times (i.e. once per
> lab) or 100 times (once per loop iteration).


Data broadcast into a PARFOR loop is sent once per lab - so in your
example, bigd is transferred 10 times (once per lab), not 100 times
(once per iteration). Even that can be more transfer than you need,
though, because the data is re-sent each time a PARFOR loop runs. You
could for example use my Worker Object Wrapper

<http://www.mathworks.com/matlabcentral/fileexchange/31972>

to minimise the data transfer. If 'loadBigData' can run on the labs
(i.e. they have access to the data files), you could do:

% each lab evaluates @loadBigData once and caches the result
bigdWrapper = WorkerObjWrapper(@loadBigData, {});
r = zeros(100,1);
parfor ii = 1:100
    % access the per-lab cached value - the matrix is not broadcast
    r(ii) = doSomething(ii, bigdWrapper.Value);
end

otherwise, load the data on the client and create bigdWrapper like so:

bigdWrapper = WorkerObjWrapper(loadBigData());
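
For instance, a minimal sketch of that client-side variant (assuming
loadBigData runs on the client, the pool is already open, and
doSomething is your function from above):

bigd = loadBigData();                  % runs once, on the client
bigdWrapper = WorkerObjWrapper(bigd);  % value shipped once to each lab
r = zeros(100,1);
parfor ii = 1:100
    r(ii) = doSomething(ii, bigdWrapper.Value);
end

Either way, later PARFOR loops on the same pool can keep using
bigdWrapper.Value without the matrix being broadcast again.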

Cheers,

Edric.


