time_max = 20;
timestep = 0.1;
t = 0:timestep:time_max-timestep;
x = zeros(10, size(t));

for i = 1:1:time_max/timestep
    for index = 1:1:20
        Input = I(index);
        if (i > (1/timestep))
            Input = 0;
        end
        w = x(index, i+1);
        signal = 0;
        for k = 1:1:10
            if (k ~= index)
                signal = signal + x(k, i);
            end
        end
        x(index, i+1) = -A*x(index,i) + (B - x(index,i))*(w + I) - x(index,i)*signal;
    end
end

%plot(t,x(1,:))
%plot(t,x(2,:))
%plot(t,x(3,:))
Hey, I'm writing this code and it won't run correctly, and I have no idea why. Some background in case it helps: x is meant to have ten columns, with the number of rows equal to time_max/timestep. The loops go through each column within a row to compute a value, then move on to the next row. Also, I is a 1-D array of size 10 holding constants.
First, there is a warning on the initialization x = zeros(10,size(t)); -- "Input arguments must be scalar."
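From reading the docs, I think size(t) returns a 1x2 vector rather than a scalar, which might be what zeros is complaining about. A minimal sketch of what I suspect the fix is (assuming I want one column per timestep):

```matlab
t = 0:0.1:19.9;            % 200 samples
% size(t) returns the vector [1 200], so zeros(10, size(t))
% passes a non-scalar as a dimension argument.
x = zeros(10, length(t));  % length(t) is the scalar 200
```

Is that the right way to preallocate here, or should I be using numel(t)?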
Second, and this is what really concerns me, there is an error on the line x(index,i+1) = -A*x(... which says -- "Undefined function 'plus' for input arguments of type 'cell'."
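The only way I can see 'plus' being undefined for cells is if one of the variables in that expression (maybe I?) accidentally became a cell array. A minimal sketch of how I think that error can arise, assuming a cell array built with braces:

```matlab
I = {1, 2, 3};   % cell array, perhaps created by accident
% Indexing with parentheses, I(2), returns a 1x1 cell, not a number,
% so arithmetic on it raises the 'plus'-on-cell error.
% Indexing with braces, I{2}, extracts the contents instead.
val = I{2} + 1;  % this form does ordinary arithmetic
```

Does that sound like a plausible cause, or can this error come from somewhere else?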
Does anybody have some idea why this is happening? Thanks in advance. PS: This is my first post, so I'm not sure whether this kind of debugging question is OK.