
Topic: GPU and Neural Network ... is this operation legitimate?
Replies: 3   Last Post: Nov 20, 2012 6:12 AM


condor

Posts: 30
Registered: 3/2/10
GPU and Neural Network ... is this operation legitimate?
Posted: Nov 20, 2012 4:36 AM

Hi, I need to train many simple neural networks... thousands of them.

I would like to use a parallel for loop on the GPU.

The FIRST problem is that I am not able to train the net on the GPU:

% Net creation
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize);
% Set up division of data for training, validation, testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Training setup
net.trainFcn = 'trainscg';
net.trainParam.showWindow = false;
net.trainParam.showCommandLine = false;
% Send input and target to the GPU
input_A = gpuArray(input);
target_A = gpuArray(target);
% Train
[net_new, ~] = train(net, input_A, target_A);

I receive this error:
Error using network/train (line 293)
Number of samples (rows of gpuArrays) of data arguments do not match.
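
For comparison, here is a minimal sketch of the 'useGPU' option that train accepts as of R2012b (I am assuming that toolbox version; here input and target stay as ordinary CPU matrices and train handles the GPU transfer itself; untested for my case):

% Minimal sketch, assuming the R2012b Neural Network Toolbox 'useGPU' option.
% input and target remain ordinary matrices (one column per sample);
% train moves the calculation to the GPU internally.
[net_gpu, ~] = train(net, input, target, 'useGPU', 'yes');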

QUESTION:
Is it possible to train the net on the GPU (I want every core to train a different net)? If yes, what am I doing wrong?
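
To clarify what I mean by "every core trains a different net", here is a rough sketch of the pattern I am after (numNets, inputs{k} and targets{k} are placeholders for my real data; note that this parfor runs on the CPU workers of a matlabpool, not on individual GPU cores):

% Rough sketch, assuming the Parallel Computing Toolbox with an open matlabpool.
% numNets, inputs{k} and targets{k} are placeholders for the real data sets.
% Each worker trains one independent net (on a CPU worker, not a single GPU core).
nets = cell(1, numNets);
parfor k = 1:numNets
    net_k = fitnet(10);
    net_k.trainFcn = 'trainscg';
    net_k.trainParam.showWindow = false;
    nets{k} = train(net_k, inputs{k}, targets{k});
end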


