I think this gets into what you consider to be “information.” Random noise is high entropy and thus high information in one sense, and zero information in another.
Well, the information used in the article is classical Shannon information, so the former. Though I suspect that the entropy of what we can actually "randomly" type is not that high.
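To make that concrete, here's a minimal sketch of a first-order (per-character) Shannon entropy estimate. The sample strings are made up for illustration; a real keyboard-mash would need measuring, and a unigram estimate like this even overstates its entropy, since it ignores sequential patterns like alternating hands or repeated home-row runs.

```python
import math
import random
import string
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Empirical Shannon entropy in bits per character (unigram model)."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical keyboard-mashing sample vs. uniformly random characters of the same length.
mashed = "asdfjkl;asdfjkl;asdkfjaslkdfjasl;dkfj"
uniform = "".join(random.choices(string.ascii_lowercase + ";", k=len(mashed)))

print(f"mashed:  {shannon_entropy(mashed):.2f} bits/char")
print(f"uniform: {shannon_entropy(uniform):.2f} bits/char")
```

The mashed string concentrates on a handful of home-row keys, so its per-character entropy comes out well below the roughly log2(27) ≈ 4.75 bits/char ceiling the uniform sample approaches.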