It must have been quite painstaking to hand-code this neural network into WebGL shaders. It would have been easier had the browser vendors just implemented the WebCL standard.
This seems like a throwback to the pre-CUDA "GPGPU" era, when people were implementing numerical algorithms in OpenGL to be able to leverage GPUs for general purpose computing.
Yeah, it was really painful. To speed up the process, I first implemented the network on the CPU, so that I could quickly verify the GPU implementation against it.
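The verification approach described above can be sketched roughly like this. This is a hypothetical NumPy stand-in, not the author's actual code: in the real setup, `gpu_forward` would read back the shader's output texture, and the key idea is comparing it against a trusted CPU reference with a loose tolerance, since float32 GPU arithmetic won't match a float64 reference bit-for-bit.

```python
import numpy as np

def cpu_forward(x, W, b):
    # Trusted reference implementation in float64: slow but easy to get right.
    return np.maximum(W @ x + b, 0.0)  # dense layer + ReLU

def gpu_forward(x, W, b):
    # Stand-in for the shader output: same math in float32,
    # mimicking the reduced precision of a GPU pipeline.
    x32, W32, b32 = (a.astype(np.float32) for a in (x, W, b))
    return np.maximum(W32 @ x32 + b32, np.float32(0.0))

rng = np.random.default_rng(0)
x = rng.standard_normal(64)
W = rng.standard_normal((32, 64))
b = rng.standard_normal(32)

ref = cpu_forward(x, W, b)
out = gpu_forward(x, W, b)
# Loose tolerances absorb the float32-vs-float64 rounding differences.
assert np.allclose(ref, out, rtol=1e-4, atol=1e-5)
```

Checking layer by layer this way localizes bugs to a single shader instead of debugging the whole network's output at once.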