Views expressed in these public forums are not endorsed by
Drexel University or The Math Forum.



Calculating a smooth 90% limit for differences in a time series
Posted:
Feb 28, 2014 8:38 AM


I have 50 data sets. Each set has three related time series: fast, medium, slow. My end purpose is simple: I want to generate a number that indicates the relative degree of change of a time series at each point. That relative degree of change should range between 0 and 1 for all the time series and all the data sets. The scales of the data sets range from 0.0001 to 100.
To accomplish this, I calculate the differences in a time series, delta(ts) = ts(t) - ts(t-1). Now I am trying to calculate an upper limit covering 90% of those deltas. In other words, I want to draw a smooth line over those differences such that only about 10% of the differences exceed it. I will then use that 90% limit as the maximum when normalizing the differences to the 0-1 range. Is this the best way to do this?
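One way to sketch the approach described above, assuming a rolling 90th-percentile of the absolute differences serves as the "smooth line" (the window size and the use of absolute values are my assumptions, not stated in the original post):

```python
import numpy as np

# Toy series standing in for one of the 50 data sets
rng = np.random.default_rng(0)
ts = np.cumsum(rng.normal(size=500))

# delta(ts) = ts(t) - ts(t-1); absolute value so "degree of change" is magnitude
deltas = np.abs(np.diff(ts))

# Rolling 90th percentile over a trailing window: a smooth upper limit that
# roughly 10% of the deltas exceed. Window length is an assumed tuning knob.
window = 50
limit = np.array([
    np.quantile(deltas[max(0, i - window + 1): i + 1], 0.90)
    for i in range(len(deltas))
])

# Normalize each delta by its local limit; clip so anything above the
# limit maps to 1, giving a relative degree of change in [0, 1]
relative_change = np.clip(deltas / limit, 0.0, 1.0)
```

Because the limit is computed per series, the same code applies unchanged across all 50 data sets regardless of their scale (0.0001 to 100), which is what makes a quantile-based limit attractive here.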
I've been working on this for months, mostly with linear, programmatic methods, with no success. And trying to get it to work across 50 data sets is killing me. I'm sure there has to be an elegant mathematical way to do this. I can't be the first guy in town trying to normalize the relative degree of change of a time series.
Any help or directions for research are greatly appreciated! Obviously my math skills are weak, so examples would be most helpful. Thank you everyone for your time and brain power!



