Location: Seoul
Remote: Yes
Willing to relocate: No
Tech: LLMs, Python, ML, research
CV: https://www.linkedin.com/in/zhanibek-omarov/
Email: zhanibek [dot] om @ gmail
Summary:
I'm a physics PhD doing ML research and engineering. Can adapt and learn fast.
GPU support is one area where Julia's modern design truly shines, as it allows seamless reusability and composability. I wish the language would get the recognition and adoption it deserves.
This is a nice parallel with software engineering stuff. I often find that whoever does lots of tinkering (building) can better judge which tool to use and why.
Linux tinkering, be it window managers or bash scripts, might sound like a waste of time to most, but it offers the intangible experience of knowing which software/stacks to choose and why. That is, such tinkering gives you 'intuition' about today's vast array of tools.
Scikit-learn has a bunch of machine learning routines, including k-NN, gradient boosting, decision trees, and so on. There's even a model zoo there. I'd say Bayesian Gaussian mixture models are in the same class of algorithms. Bayesian models have their own peculiarities and statistical foundations, but it isn't far-fetched to imagine one inside scikit-learn, considered just another machine learning routine.
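As a rough sketch of what "routine" means here, assuming scikit-learn is installed (the toy dataset and model choices are mine, purely for illustration), all of these estimators share the same fit/predict interface:

    # k-NN, a decision tree, and gradient boosting behind one interface.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=200, random_state=0)

    for model in (KNeighborsClassifier(),
                  DecisionTreeClassifier(),
                  GradientBoostingClassifier()):
        model.fit(X, y)                      # same call for every routine
        print(type(model).__name__, model.score(X, y))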
Yes, there is a BayesianGaussianMixture class in sklearn.mixture [0]. There's also sklearn.gaussian_process [1], which offers Gaussian process classification and regression, Bayesian learning algorithms that can be thought of as analogs of the support vector machine. Rasmussen's Gaussian Processes for Machine Learning (2006) is a great introduction, despite being a little old [2].
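To make [0] and [1] concrete, here is a minimal sketch (assuming a recent scikit-learn; the blob dataset and the hyperparameters are arbitrary choices of mine):

    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.mixture import BayesianGaussianMixture

    X, y = make_blobs(n_samples=300, centers=3, random_state=0)

    # [0] Variational Bayesian Gaussian mixture: given more components
    # than needed, it effectively prunes the surplus ones.
    bgm = BayesianGaussianMixture(n_components=10, random_state=0).fit(X)
    print("active components:", int(np.sum(bgm.weights_ > 1e-2)))

    # [1] Gaussian process classification on the same data.
    gpc = GaussianProcessClassifier(random_state=0).fit(X, y)
    print("GPC training accuracy:", gpc.score(X, y))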
The analogy with physics, and basically with differential equations, brings in familiar concepts from the "spherical cow in a vacuum" world. If we want to get technical, then acceleration doesn't matter either, because jerk (the acceleration of acceleration) matters even more. We could keep going deeper and deeper into derivatives at no gain. You can introduce mass, air resistance, and whatever else you want, but at the end of the day this is all just an analogy. Getting into the technicalities of jargon has no practical benefit.
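For reference, the chain of derivatives being alluded to is just

    v = dx/dt,  a = dv/dt = d^2x/dt^2,  jerk = da/dt = d^3x/dt^3,

and the fourth, fifth, and sixth derivatives are sometimes named snap, crackle, and pop.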
When people say "velocity" they're really referring to all the "derivatives" at once. Just like when I say my car is faster: it is both faster and accelerates faster and all of those things. A Tesla is faster than a normal bicycle. "Faster" and "velocity" have a mental association, and that's all that matters. Changing the term from "velocity" to "acceleration" and whatnot just changes the label, not the practical meaning of what people want to convey.
As if acceleration doesn't have a direction. And how many dimensions does software development have, anyway? Replacing one analogy (distance) with another (mass) isn't going to help.
They typically play a role if you care about how smooth the transition between two accelerations is (e.g. in vehicles; see the sketch below).
Similarly, designers and engineers look at derivatives of curvature (1/radius) when they want to achieve smooth transitions between curved surfaces (e.g. car bodies).
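To make the vehicle point concrete, here is a quick numerical sketch (plain numpy; the jerk limit and timings are made-up numbers): bounding the jerk makes acceleration ramp up smoothly instead of stepping, which is exactly what passengers feel.

    import numpy as np

    dt = 0.01
    t = np.arange(0.0, 2.0, dt)
    j_max = 2.0                              # assumed jerk limit, m/s^3
    jerk = np.where(t < 1.0, j_max, -j_max)  # ramp accel up, then back down

    accel = np.cumsum(jerk) * dt             # a(t) = integral of jerk
    vel = np.cumsum(accel) * dt              # v(t) = integral of accel

    print("peak accel:", accel.max())        # ~2.0 m/s^2, reached gradually
    print("final velocity:", vel[-1])        # ~2.0 m/s, with no accel jumps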
I cannot speak to motion, but in surface continuity you also look at higher "derivatives" if you want really, really smooth transitions between two curves. So you make the curvature comb of a curve's curvature comb tangential at the transition, or so. There they just call the transitions G0, G1, G2, G3, G4, and so on.
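A small sketch of the bottom of that hierarchy, using sympy (the curves are a toy example of mine, not CAD code): a straight line meeting a circular arc matches position and tangent (G0, G1) but not curvature (G2), since curvature jumps from 0 to 1/r at the join.

    import sympy as sp

    t = sp.symbols("t")
    r = sp.symbols("r", positive=True)

    def curvature(x, y):
        # Signed curvature of a planar parametric curve (x(t), y(t)).
        dx, dy = sp.diff(x, t), sp.diff(y, t)
        ddx, ddy = sp.diff(x, t, 2), sp.diff(y, t, 2)
        return (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** sp.Rational(3, 2)

    # Segment A: straight line along the x-axis, reaching the origin at t=0.
    kA = curvature(t, sp.Integer(0))

    # Segment B: circle of radius r leaving the origin at t=0, tangent to A.
    kB = curvature(r * sp.sin(t), r * (1 - sp.cos(t)))

    print("line curvature at join:", kA.subs(t, 0))              # 0
    print("arc curvature at join:", sp.simplify(kB).subs(t, 0))  # 1/r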
I cannot judge whether snap, crackle and pop are things people actually use when they talk about those derivatives in motion.