Throughput is directly related to bandwidth. Modern radios have a spectral efficiency of 5-15 bps/Hz (depending on range, noise, and multipath), meaning that a 6 MHz white space channel is worth 30-90 Mbps, a 20 MHz 802.11 channel can give 100-300 Mbps, and the entire 2.4 GHz band theoretically has 410-1230 Mbps of capacity.
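For anyone who wants to check the arithmetic: it's just channel bandwidth times spectral efficiency. A quick Python sketch below; the ~82 MHz width used for the 2.4 GHz ISM band is inferred from the 410-1230 Mbps figures quoted above, not an official number.

```python
def throughput_mbps(bandwidth_mhz, bps_per_hz):
    # MHz * (bits/s)/Hz = Mbit/s, so the units work out directly.
    return bandwidth_mhz * bps_per_hz

for name, bandwidth_mhz in [("6 MHz white space channel", 6),
                            ("20 MHz 802.11 channel", 20),
                            ("~82 MHz 2.4 GHz ISM band", 82)]:
    low, high = throughput_mbps(bandwidth_mhz, 5), throughput_mbps(bandwidth_mhz, 15)
    print(f"{name}: {low}-{high} Mbps")
```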
Very interesting. What is currently limiting the number of bits per Hz? IANAP, but why can't you just lock in a very specific frequency range, blip Morse code at a very high clock speed (say, 100 MHz), and get a huge bps/Hz?
The only way to transmit information over a wave of fixed frequency is to mess with that wave (if the wave is really fixed, it doesn't carry any information). When you mess with the wave, you necessarily make its frequency fluctuate. So you have to send your signal within a range of frequencies.
Intuitively, the more information you want to transmit, the more you have to mess with the frequencies. Hence the bps/Hz thing.
(Disclaimer: I don't know much about the subject. Corrections, clarifications, and additional details are welcome.)
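To make the intuition above a bit more concrete, here's a rough numpy sketch (the sample rate and pulse lengths are arbitrary, chosen only for illustration): the shorter a pulse is in time, i.e. the faster you try to signal, the more bandwidth its energy occupies.

```python
import numpy as np

fs = 1e9                # assumed sample rate: 1 GHz
n = 2 ** 20             # about 1 ms of samples
t = np.arange(n) / fs
freqs = np.fft.rfftfreq(n, d=1 / fs)

def bandwidth_90(pulse_duration):
    """Frequency (Hz) below which 90% of a rectangular pulse's energy lies."""
    pulse = (t < pulse_duration).astype(float)
    power = np.abs(np.fft.rfft(pulse)) ** 2
    cumulative = np.cumsum(power) / power.sum()
    return freqs[np.searchsorted(cumulative, 0.90)]

for duration in (1e-6, 1e-7, 1e-8):     # 1 us, 100 ns, 10 ns pulses
    print(f"{duration * 1e9:6.0f} ns pulse -> 90% of energy below "
          f"{bandwidth_90(duration) / 1e6:.1f} MHz")
```

Each 10x shorter pulse occupies roughly 10x more bandwidth, which is the time-bandwidth tradeoff behind "more information means more frequencies".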
That's not the only way to transmit information with radio waves. What's wrong with this: to send binary data, you either send the pure signal (a 1) or send nothing at all (a 0). As long as the sender's and receiver's clocks run at the same rate, it should work (e.g. at 100 MHz).
Given that this isn't the way it works, there must be some major flaw with it.
When you take a Fourier transform of a signal of finite duration, you notice that there's more in it than the fundamental frequency. Basically, when you cut off a signal and then resume it, over and over, that signal ceases to be a single frequency.
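Here's a rough numpy sketch of that (all parameters are arbitrary, just for illustration): a 100 MHz carrier switched on and off by random bits at 10 Mbps, as in the scheme proposed above, and a look at where the power ends up in the spectrum.

```python
import numpy as np

fs = 1e9                               # assumed sample rate: 1 GHz
f_carrier = 100e6                      # the "very specific frequency" being blipped
bit_rate = 10e6                        # 10 Mbps of on/off keying
samples_per_bit = int(fs / bit_rate)   # 100 samples per bit
n_bits = 10_000
n = n_bits * samples_per_bit           # 1,000,000 samples, ~1 ms of signal

t = np.arange(n) / fs
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=n_bits)
gate = np.repeat(bits, samples_per_bit).astype(float)  # hold each bit for one bit period
signal = gate * np.sin(2 * np.pi * f_carrier * t)      # on-off keyed carrier

freqs = np.fft.rfftfreq(n, d=1 / fs)
power = np.abs(np.fft.rfft(signal)) ** 2

def power_within(half_width_hz):
    """Fraction of total power within +/- half_width_hz of the carrier."""
    band = np.abs(freqs - f_carrier) <= half_width_hz
    return power[band].sum() / power.sum()

print(f"power within +/- 1 kHz of the carrier:  {power_within(1e3):.1%}")
print(f"power within +/- the bit rate (10 MHz): {power_within(bit_rate):.1%}")
```

On a run like this, only about half the power sits in the carrier line itself; most of the rest is smeared across a band on the order of the bit rate around it. Blipping faster just spreads the signal over proportionally more spectrum.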