I find neural networks pretty interesting, but most of the useful examples I know of require so much training data that it becomes hard to follow the whole process end to end. Over 20 years ago in school I used a DOS program to train an NN to do a simple table lookup. That isn't very exciting, of course, since a plain array would obviously do the job much better.
I am wondering: are there any small, self-contained NN examples that actually do something useful, yet are small enough that I can play with an implementation in my favourite programming language?
One of the simpler data sets available is MNIST, a collection of labelled hand-written digits (60,000 training and 10,000 test images, each 28×28 pixels in greyscale). These days it should be easy to build and train a 3-layer NN (input, one hidden layer, output) that classifies those images with decent accuracy.
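Before wiring up MNIST itself (which needs the data files downloaded), here is a minimal sketch of the same 3-layer structure in plain Python, trained with backpropagation on XOR instead of digit images. The class name `TinyNet` and all hyperparameters (3 hidden units, learning rate, epoch count) are my own illustrative choices, not anything standard; the point is just that forward pass plus gradient updates fit in a few dozen lines. Swapping the 2-input/1-output shapes for 784 inputs and 10 outputs gives the MNIST version of the same idea.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyNet:
    """A 3-layer net: inputs -> one hidden layer -> one sigmoid output."""

    def __init__(self, n_in=2, n_hidden=3, seed=1):
        rng = random.Random(seed)
        # Each row's last weight is the bias term for that unit.
        self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)]
                   for _ in range(n_hidden)]
        self.w2 = [rng.uniform(-1, 1) for _ in range(n_hidden + 1)]

    def forward(self, x):
        # Append a constant 1.0 input so the bias is just another weight.
        self.h = [sigmoid(sum(w * v for w, v in zip(row, x + [1.0])))
                  for row in self.w1]
        self.o = sigmoid(sum(w * v for w, v in zip(self.w2, self.h + [1.0])))
        return self.o

    def train(self, data, lr=0.5, epochs=10000):
        for _ in range(epochs):
            for x, target in data:
                o = self.forward(x)
                # Gradient of squared error through the output sigmoid.
                d_o = (o - target) * o * (1.0 - o)
                # Backpropagate to the hidden units.
                d_h = [d_o * self.w2[j] * self.h[j] * (1.0 - self.h[j])
                       for j in range(len(self.h))]
                # Gradient-descent updates (bias is the last weight).
                for j, hj in enumerate(self.h + [1.0]):
                    self.w2[j] -= lr * d_o * hj
                for j in range(len(self.w1)):
                    for i, xi in enumerate(x + [1.0]):
                        self.w1[j][i] -= lr * d_h[j] * xi

xor = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
       ([1.0, 0.0], 1.0), ([1.1 - 0.1, 1.0], 0.0)]
xor[3] = ([1.0, 1.0], 0.0)
net = TinyNet()
net.train(xor)
for x, t in xor:
    print(x, "->", net.forward(x))
```

XOR is the classic "too small for a lookup table to be interesting, too non-linear for a single layer" test case, which is why it shows up in so many NN tutorials; whether this particular run nails all four patterns depends on the random seed, since a net this small can get stuck in a local minimum.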