Grouping time stamp data into intervals
I have a set of time stamps with a sampling frequency of 1000 Hz (a sample every 0.001 s). I want any stretch of time stamps that is continuous for more than 10 seconds (10,000 data points) to be taken as an interval. Is this possible? I attached an example data file of time stamps.
8 comments
>> u=unique(t);
>> length(u)==length(t)
ans =
logical
1
>>
There are no timestamps that are repeated in the file.
Systematically Neural
on 16 Oct 2018
dpb
on 16 Oct 2018
Did you try any of those methods? Any of them looks like it should work to find the sequences in the input vector.
I still don't know what your expected output is, though.
How about a very small artificial sample that illustrates what you have in mind, exactly?
"have more specific time stamps so perhaps the derivative is off?"
You don't show anything that you've actually tried, so we can't infer, but...
NB: the example dataset at the link is integer-valued and you have floating point. There's rounding with floating point, so a test for "identically equal" with == or find is not robust; you need to incorporate a tolerance to ensure you find the values you intend to match.
Jonas below used a sizeable tolerance of 10% of the difference; in this case, given that you're looking for a fixed delta, that will work. In the other thread I showed an example using eps that will be in the actual neighborhood of the rounding error that will actually be present.
If you tried some of those methods, or just straight diff and find, you undoubtedly ran into this problem; I did a test while trying to figure out the earlier question of "repeated" values, and something like 30-40% of the locations that should have been considered connected weren't, between the two. The moral of that story is never to forget that FP is not exact. It's particularly true with a number like 0.001, which cannot be stored exactly because it is not representable as a binary fraction.
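Putting the tolerance idea together with diff and find, here is a minimal sketch of one way to extract the intervals asked for in the original question. It assumes t is a sorted vector of timestamps sampled nominally at dt = 0.001 s; the variable names and the chosen tolerance are illustrative, not from the thread:

```matlab
dt  = 0.001;
tol = 100*eps(max(t));                 % tolerance for FP rounding; adjust to taste
isStep = abs(diff(t(:)) - dt) <= tol;  % true where consecutive stamps are dt apart
edges  = diff([false; isStep; false]); % +1 at run starts, -1 just past run ends
runStart = find(edges ==  1);          % first index of each continuous run
runStop  = find(edges == -1) - 1;      % last index of each run (in isStep)
nSamples = runStop - runStart + 2;     % k consecutive steps span k+1 samples
keep = nSamples > 10000;               % runs longer than 10 s at 1 kHz
intervals = [t(runStart(keep)), t(runStop(keep)+1)]  % [start, stop] times per row
```

Each row of intervals is the start and end time of one qualifying run; tol may need to be larger if the timestamps were written to a text file with limited precision.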
Systematically Neural
on 17 Oct 2018
Edited: Systematically Neural
on 17 Oct 2018
I don't have any idea what you mean by "the TMW toolbox", but FP precision and rounding are inherent in FP storage by definition.
There are ways to minimize the magnitude of it. One of the prime examples of the differences possible in seemingly innocuous calculations is with data such as you show, where one can do something like
dt=0.001;
t=[0:dt:10]; % generate a time vector for 10 sec @ 1 kHz
as compared to
t=linspace(0,10,1000*10+1); % the same nominal vector built with linspace
Actual numerics by example--
dt=0.001;
t1=[0:dt:10];
t2=linspace(0,10,1000*10+1);
t3(1)=0; for i=2:10001, t3(i)=t3(i-1)+dt; end  % running summation of dt
t4(1)=0; for i=2:10001, t4(i)=(i-1)*dt; end    % product (i-1)*dt
Let's compare...
NNZ(method1==method2):

            linspace   summation   product
colon           8658          19      9032
linspace                      17      8663
summation                             18
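For reference, a pairwise count like the table above could be produced with a sketch along these lines (the loop structure and variable names are my assumption, not code from the post):

```matlab
T = [t1(:), t2(:), t3(:), t4(:)];   % one column per construction method
names = {'colon','linspace','summation','product'};
n = numel(names);
agree = zeros(n);                   % agree(i,j): # of exactly equal elements
for i = 1:n
  for j = i+1:n
    agree(i,j) = nnz(T(:,i) == T(:,j));
  end
end
array2table(agree, 'VariableNames', names, 'RowNames', names)
```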
Below is the magnitude of the difference at the endpoint, in units of FP precision (eps) at that value. All actually round correctly at the end except for the simplistic addition, which compounds the rounding of dt at every step. The differences are in the intermediate values, in just how the rounding error is split, excepting the summation, which compounds the error.
>> [[t1(end);t2(end);t3(end);t4(end)]-10]/eps(10)
ans =
0
0
-58
0
>>
What the correlation-like table shows is that colon is most like summation, except that it "fixes up" the end point, while linspace is more like, but not exactly the same as, the product.
What the last shows is that one may need a fairly large tolerance by the end of the series if the timestamps are computed one way but the comparison values are calculated with a different technique.
The other issue is the precision of the data file: whether full-precision values are stored and read back, or whether there's rounding in a text format that can cause issues.
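As a small illustration of that last point (an assumed example, not from the thread): writing timestamps out with a limited number of decimals and reading them back can change the stored doubles, so exact comparison may fail afterwards:

```matlab
t  = (0:0.001:10).';          % timestamps at 1 kHz
s  = compose("%.4f", t);      % round-trip through text with four decimals
tr = str2double(s);           % read the text back into doubles
maxErr = max(abs(tr - t))     % maximum round-trip discrepancy
nExact = nnz(tr == t)         % elements that still compare exactly equal
```

Any nonzero maxErr here comes purely from the text round-trip, which is another reason to compare timestamps with a tolerance rather than ==.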