>How does storing state in a microscopic state turn a computer into a microscope? I assure you that no computer hardware works by looking at stored state with a lens and using computer vision algorithms to determine the state of the device.
Oh sorry. It's a somewhat strange concept I admit, but it's served me well. There are a couple of ways to motivate it. Let us say that a computer has 4 GB of memory (roughly 10^10 bits). That is a vast amount of data. If that were printed onto a sheet of paper, with each bit a dot about the smallest the naked human eye can see (call it 0.1 mm square), the sheet would be about 10 meters on a side (10^5 dots * 10^-4 m = 10 m per side), or roughly 100 square meters.
The actual size of a 4 GB RAM chip is more like 1 mm^2, i.e. about 1 mm on a side. That is a linear reduction factor of 10,000.
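To make the arithmetic concrete, here is a quick back-of-the-envelope sketch in Python. The 10^10 bits, the 0.1 mm dot, and the ~1 mm die are the rounded assumptions from the paragraphs above, not measured values:

    # Rough check of the printout numbers above.
    BITS = 10**10                 # ~4 GB, rounded to an order of magnitude
    DOT_M = 0.1e-3                # one printed bit: 0.1 mm, in meters

    dots_per_side = BITS ** 0.5                  # 10^5 dots along each edge
    sheet_side_m = dots_per_side * DOT_M         # 10^5 * 10^-4 m = 10 m
    print(f"printed sheet: {sheet_side_m:.0f} m per side, "
          f"{sheet_side_m ** 2:.0f} m^2")        # 10 m per side, 100 m^2

    chip_side_m = 1e-3                           # ~1 mm on a side, as claimed
    print(f"linear reduction: {sheet_side_m / chip_side_m:,.0f}x")  # 10,000x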
Let us say that a physical screen displays at the density of our original printout. This implies that the software sitting between main memory and the display is essentially functioning as a microscope, making visible to the naked eye a "sheet of paper" that is 10,000x (linearly) smaller than the eye can resolve.
But if we consider information rather than data, the magnification is at least an order of magnitude higher. (A character rendered 10 px on a side takes about 100 bits on screen, whereas the codepoint behind it takes around 8 bits.) And of course, if we consider hard drives rather than main memory, realize that a TB is roughly 1000x main memory: 1000 sheets of paper, each 10 meters on a side, or a single combined sheet about 316 meters on a side.
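The same kind of sketch covers the second half of the argument. The figures here use the same order-of-magnitude rounding as the text (1 TB taken as ~10^13 bits), so they are illustrative rather than exact:

    # Information-vs-data expansion and the hard-drive comparison.
    glyph_bits = 10 * 10          # a character rendered 10x10 px, 1 bit/pixel
    codepoint_bits = 8            # the stored codepoint, roughly one byte
    print(f"rendering expansion: ~{glyph_bits / codepoint_bits:.0f}x")  # ~12x

    ram_bits = 10**10             # main memory, as before
    tb_bits = 10**13              # 1 TB ~ 8e12 bits, rounded as in the text
    sheets = tb_bits / ram_bits                    # ~1000 printed sheets
    one_sheet_side_m = (sheets * 10 ** 2) ** 0.5   # all sheets merged into one
    print(f"{sheets:.0f} sheets of 10 m x 10 m, or one sheet "
          f"~{one_sheet_side_m:.0f} m on a side")  # ~1000 sheets, ~316 m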
>Oh sorry. It's a somewhat strange concept I admit, but it's served me well...(What follows is a brilliant analogy where I was expecting total insanity.)
You know. I think I'm going to use that in the future. Thanks.