That was a fun read. While it might be unsurprising to some, it's a testament to the breadth of applications of modern machine learning frameworks.
I enjoyed this sentence in particular.
> This should look familiar, because it is a neural network layer with no activation function and no bias.
I thought it should look familiar because it's matrix multiplication. That it looks like a neural network layer first and foremost to some is maybe a sign of the times.
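For the avoidance of doubt, here's a rough sketch of what I mean (my own toy code, assuming NumPy; none of the names come from the article): the DFT really is just a fixed matrix applied to the signal.

```python
import numpy as np

# Rough sketch (mine, not from the article): the DFT is y = A x for a fixed
# complex matrix A. Nothing framework-specific about it.
def dft_matrix(n: int) -> np.ndarray:
    k = np.arange(n)
    # A[j, k] = exp(-2*pi*i * j*k / n)
    return np.exp(-2j * np.pi * np.outer(k, k) / n)

x = np.random.randn(8)
A = dft_matrix(8)

# Plain matrix-vector multiplication matches np.fft.fft up to float error.
assert np.allclose(A @ x, np.fft.fft(x))
```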
> it's a testament to the breadth of applications of modern machine learning frameworks.
More like a testament to the breadth of applications of linear algebra. It is absolutely remarkable what we're able to compute and analyze in the form of y = A x (Hilbert spaces are a wild invention).
But it really isn't a testament to modern ML frameworks in any way. The Fourier transform has been easy to compute/fit in this exact way (Fourier = linear problem + solving linear problems by optimization) by modern-at-the-time frameworks for over two centuries.
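To illustrate "linear problem + solving linear problems by optimization" concretely (again my own sketch, assuming NumPy, nothing from the article): you can recover the DFT matrix from input/output pairs with plain least squares, no ML framework involved.

```python
import numpy as np

# Sketch of "Fourier = linear problem + solving it by optimization" (my own
# toy example): recover the DFT matrix from input/output pairs with
# ordinary least squares.
rng = np.random.default_rng(0)
n, m = 8, 256

X = rng.standard_normal((m, n)).astype(complex)  # random inputs, one per row
Y = np.fft.fft(X, axis=1)                        # their Fourier transforms

# Solve X @ W = Y in the least-squares sense; because the map is exactly
# linear, W is (the transpose of) the DFT matrix, not an approximation.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

k = np.arange(n)
A = np.exp(-2j * np.pi * np.outer(k, k) / n)
assert np.allclose(W.T, A)
```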