Yeah, regularly sampled data is the goal almost always, and great when it's available! The main times I deal with non-uniformly sampled data are with jitter, missing data, etc.
I guess I'm not totally sure what the canonical way would be; probably convolution with the N'th derivative of a Gaussian smoothing kernel, where the smoothing response is chosen by frequency analysis, or something along those lines. You could also just smooth the signal and then differentiate it numerically (probably equivalent but less efficient). I would personally go for this Bayesian filtering approach, or some kind of local polynomial approximation like splines or the Savitzky-Golay filter people are talking about in this comment section, because it would probably be easier to deal with missing data etc.
Noise is added here:
```
import numpy as np

# Generate noisy sinusoidal data at randomly spaced, sorted time points
np.random.seed(0)
t = np.sort(np.random.uniform(0.0, 10.0, 100))
noise_std = 0.01
y = np.sin(t) + noise_std * np.random.randn(len(t))
true_first_derivative = np.cos(t)
true_second_derivative = -np.sin(t)
```
Changing noise_std will change the magnitude of the added noise. Hope that helps!
Ah, true! Now I see it on the image too; I was looking at it on my phone and it did not look like that. I will definitely give it a try! Thank you very much.
Thanks so much! Yeah, this was also a key reason I like this approach. Quite often we end up with repeated values due to quantisation of the signal, or timing differences, or whatever, and we get exactly the problem you describe: either massive gradients or 0 gradient and nothing in between. With the KF approach you can just strip out the repeated values and run the filter with them missing, and it's fine. In the quantisation case you can approximate the quantisation observation noise std as resolution/sqrt(12) and it all just works nicely. If you have any sample data for some fun problems and don't mind sharing, then let me know and we could add some demos to the library!
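A tiny numerical check of that resolution/sqrt(12) rule: the error of an ideal rounding quantiser is roughly uniform over one resolution step, and a uniform distribution of width `resolution` has standard deviation `resolution/sqrt(12)`. The resolution value here is made up for illustration.

```python
import numpy as np

# Standard deviation of a uniform error over one quantisation step
resolution = 0.5  # hypothetical sensor resolution
quant_noise_std = resolution / np.sqrt(12)

# Empirical check on a ramp signal crossing many quantisation levels:
# the quantisation error is a sawtooth, uniform over [-res/2, res/2]
x = np.linspace(0.0, 100.0, 100_000)
quantised = np.round(x / resolution) * resolution
empirical_std = (quantised - x).std()
```

The empirical spread of the quantisation error should land very close to the analytic value, which is what makes it usable directly as an observation-noise std in the filter.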
Sounds like a fun project! I've not spent much time on ensemble KFs, but my mate Sam (https://github.com/samDuffield/) did a lot of work on them for high-dimensional datasets during his PhD. Is your dataset specifically high-dimensional, and so not something you'd use an unscented filter for?
Glad you like it! This library will not generate a set of convolutional filter coefficients for you, if that is what you are after. I'm sure it would be possible to do some fairly nasty maths to get out some kind of equivalent convolutional kernel for a given tuning, or you could wrap an optimiser round it and try to walk your coefficients to something equivalent. I would say, though, that the juice would almost certainly not be worth the squeeze. The Kalman filter is easily lightweight enough to run in real time itself (it was developed for exactly this task); I've deployed several in real-time embedded scenarios on a range of platforms (inc. microcontrollers), and it has the added advantage of handling jitter in input timing etc.
Imagine you have a speed sensor, e.g. on your car, and you would like to calculate the jerk of your motion (the 2nd derivative of speed, useful in a range of driving comfort metrics etc.). The speed sensor on your car is probably not all that accurate: it will give a slightly randomly wrong output, and it may not give that output at exactly 10 times per second; you will have some jitter in the rate you receive data. If you naively attempt to calculate jerk by applying central differences to the signal twice (using np.gradient twice), you will amplify the noise in the signal and end up with something that looks totally wrong, which you will then have to post-process and maybe resample to get at the rate you want. If instead of np.gradient you use kalmangrad.grad, you will get a nice smooth jerk signal (and a fixed-up speed signal too).
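A quick sketch of that failure mode on synthetic data (this does not use kalmangrad itself; all the numbers are made up for illustration):

```python
import numpy as np

# Simulated ~10 Hz speed trace with a little measurement noise
np.random.seed(0)
t = np.arange(0.0, 10.0, 0.1)
speed = np.sin(t)                        # "true" speed profile
noisy = speed + 0.05 * np.random.randn(len(t))

# Naive jerk: two applications of central differences
accel = np.gradient(noisy, t)
jerk = np.gradient(accel, t)

# For reference, the true jerk (2nd derivative of sin) is -sin
true_jerk = -np.sin(t)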
There are many ways to do this kind of thing, but I personally like this one as it's fast, can be run online, and, if you want, you can get uncertainties in your derivatives too :)
I'd been researching Kalman filters to smooth out some sampling values (working on mobile: anything from accelerometer values to voice activation detection), but hadn't got around to revising the mathematics, so I appreciate the explanation. Out of curiosity, what other ways might this be achieved? I haven't seen much else beyond Kalman filters.
You could almost certainly construct a convolutional kernel that computes smoothed derivatives of your function via the derivative of a Gaussian smoothing kernel (that kind of technique is mostly used for images, if I remember correctly); in fact I reckon this might work nicely https://docs.scipy.org/doc/scipy/reference/generated/scipy.n... although you would need to enforce equally spaced inputs with no missing data. Alternatively, you could set up an optimisation problem in which you optimise the values of your N'th derivative on some set of points, then integrate them and minimise their distance to your input data; that would probably also work well, but it would be annoying to regularise your lowest derivative and the whole thing might be quite slow. You could also do B-splines or other local low-order polynomial methods... the list goes on and on!
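A minimal sketch of the Gaussian-derivative idea, assuming the truncated link above refers to `scipy.ndimage.gaussian_filter1d` (whose `order` argument convolves with a derivative of a Gaussian). Note that it assumes uniformly spaced samples; the sigma and noise level here are illustrative choices:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Uniformly sampled noisy signal (this method needs regular spacing)
np.random.seed(0)
dt = 0.01
t = np.arange(0.0, 10.0, dt)
y = np.sin(t) + 0.01 * np.random.randn(len(t))

# Convolve with the first derivative of a Gaussian (order=1); divide
# by dt to convert from per-sample to per-unit-time units
sigma_samples = 20
dy = gaussian_filter1d(y, sigma=sigma_samples, order=1) / dt
```

The result tracks the true derivative cos(t) closely away from the boundaries, with a small attenuation bias that grows with sigma; choosing sigma is exactly the frequency-analysis trade-off mentioned above.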
Kalman filters are usually the way to go because, under linear-Gaussian assumptions, they are provably optimal in the sense of minimising the mean-squared estimation error. As for alternatives, I'm not sure if people actually do this, but I think Savitzky-Golay filters could be used for the same purpose.
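For what it's worth, a sketch of the Savitzky-Golay route using `scipy.signal.savgol_filter`, which fits local polynomials and can return their derivatives directly via the `deriv` and `delta` arguments (uniform sampling assumed; the window and noise level here are illustrative choices):

```python
import numpy as np
from scipy.signal import savgol_filter

# Uniformly sampled noisy signal
np.random.seed(0)
dt = 0.01
t = np.arange(0.0, 10.0, dt)
y = np.sin(t) + 0.01 * np.random.randn(len(t))

# Local cubic fits over a 101-sample window; deriv=2 evaluates the
# second derivative of each fitted polynomial, delta scales it to
# per-unit-time units
d2y = savgol_filter(y, window_length=101, polyorder=3, deriv=2, delta=dt)
```

Away from the boundaries this recovers the true second derivative -sin(t) with modest noise, though unlike the Kalman filter it needs regular spacing and handles missing data awkwardly.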
High-dimensional GAs are a relatively hot topic of research, but not something we have aimed to target with this library (mostly because it's not our main research area). My personal preference for working with high-d algebras would be Stéphane Breuils and Vincent Nozick's GARAMON library (https://github.com/vincentnozick/garamon) (https://link.springer.com/article/10.1007/s00006-019-0987-7); it is efficient and easy to use, and the authors are very friendly and helpful. You could also try out TbGAL (https://github.com/Prograf-UFF/TbGAL); I haven't used it myself, but the work of Leandro Fernandes is typically high quality.