"inti" wrote in message <firstname.lastname@example.org>...
> Hi all,
> I set a sending period (T) of 100 ms, but in my log file the interval between sent signals varies. For example, 437 is followed by 477, a difference of less than 100 ms. For this reason I interpolated the data with the interp1 function to get an estimated constant interval T.
>
> The problem is that after applying interp1 I get 4320 rows of data, while my spreadsheet RES contains 392.
>
> My program:
>
> s = data(:,3) + data(:,4)/1000;                        % convert to seconds + fractions
> t = datenum(2014,07,18,data(:,1),data(:,2),s)*86400;   % serial date (days) to seconds
> t = t - t(1);                                          % time relative to the first sample
> sample = interp1(t, res(:,1:3), 0:0.01:t(end));        % I fixed a step to obtain estimated data every 100 ms
>
> Is my program correct, please? Is it normal that the resampled data has more rows than the original data?
Think about what you did.
What is the third argument you passed to interp1? What step size does 0:0.01:t(end) actually take?
What is t(1) after the subtraction? Why would you start interpolating at 0, and what do you expect interp1 to return there?
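To see where the row count comes from, it helps to count the query points. The third argument 0:0.01:t(end) steps by 0.01 s, i.e. 10 ms, not 100 ms, so it produces roughly ten times as many rows as a true 100 ms grid would. A minimal sketch of that arithmetic (in Python, with a hypothetical log duration of about 43.2 s chosen only to match the reported row counts):

```python
def n_samples(step, t_end):
    """Number of points in a MATLAB-style vector 0:step:t_end (inclusive endpoint)."""
    return int(round(t_end / step)) + 1

t_end = 43.2  # hypothetical total log duration in seconds

print(n_samples(0.01, t_end))  # 10 ms step  -> 4321 points, close to the reported 4320 rows
print(n_samples(0.1, t_end))   # 100 ms step -> 433 points, close to the original 392 rows
```

So the interpolation itself is not producing extra data; the query vector simply asks for a sample every 10 ms. Changing the step to 0:0.1:t(end) would give a grid at the intended 100 ms period.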