Good example, but in this case it's actually very easy to find weights for the neural network that would do rounding with any desired precision (look for the "stack of towers" method in the article).
Yeah, I think I picked rounding because the article primed me to think about it (the proof presented is essentially based on the ability to create steps, and rounding is my go-to example of a step function). However, any such network we could construct would contain a finite number of steps, while the actual rounding function has infinitely many.
In practice, if we needed such a network, we would probably have a restricted domain, so there would be only a finite number of discontinuities, and a sufficiently large network could approximate the function to arbitrary precision.
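As a minimal sketch of that idea (my own toy example, not from the article): on the restricted domain [0, 3), round(x) has just three discontinuities, and a single hidden layer of steep sigmoid units, one per jump, already gets close away from the jump points.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def approx_round(x, k=200.0):
    """Approximate round(x) on [0, 3) with one hidden layer.

    Each hidden unit is a steep sigmoid contributing a near-step of
    height 1 at one of the three discontinuities (0.5, 1.5, 2.5).
    Larger k sharpens the steps, improving precision near the jumps.
    """
    thresholds = np.array([0.5, 1.5, 2.5])
    return sigmoid(k * (x - thresholds)).sum()

for x in [0.1, 0.9, 1.2, 2.1, 2.9]:
    print(f"x={x}: approx={approx_round(x):.4f}, exact={round(x)}")
```

Away from the jumps the error is tiny, but no finite k (and no finite number of units) removes the small transition band around each discontinuity, which is the point about exact step functions being out of reach.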
We would not be able to do this with more exotic functions that are densely discontinuous, such as the function:
f(x) = 0 if x is rational
f(x) = 1 if x is irrational
There is no way a neural network can reasonably model that function.