
What if you are typing not English text, but a series of random letters? That gets you to 5-6 bits per letter.
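A rough check of that figure (my own sketch, not from the thread): a uniformly random choice over an alphabet of size N carries log2(N) bits, so 26 lowercase letters give about 4.7 bits, and widening the character set to include digits or punctuation pushes toward 5-6.

```python
import math

def bits_per_symbol(n: int) -> float:
    """Bits per symbol for a uniformly random choice over an alphabet of size n."""
    return math.log2(n)

print(bits_per_symbol(26))  # 26 lowercase letters -> ~4.70 bits
print(bits_per_symbol(62))  # letters + digits     -> ~5.95 bits
```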



I think this gets into what you consider to be “information.” Random noise is high entropy and thus high information in one sense, and zero information in another.


Well, the information measure used in the article is classical Shannon information, so the former. Though I suspect that the entropy of what we can actually "randomly" type is not that high.
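That suspicion can be tested empirically (a sketch of my own, with a made-up keyboard-mash string for illustration): estimate Shannon entropy from character frequencies. Human "random" typing favors home-row keys and alternating hands, so the empirical entropy tends to fall below the uniform log2(26) ≈ 4.7 bits.

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Empirical Shannon entropy of a string, in bits per character."""
    counts = Counter(text)
    total = len(text)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Hypothetical keyboard mash: heavy on home-row keys, so well below 4.7 bits.
mashed = "asdfjklasdfjkasdkfjaslkdfjsad"
print(shannon_entropy(mashed))
```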



