Can someone help me understand "intuitively" why light is able to transmit so much more data than electrons over a wire (is that even correct)? I guess this is a request for an ELI5.
Is it because with light, you can basically cram infinite photons along a pipe and they can all travel without any effect on each other and the "medium" itself (of course, no medium)? Whereas with electrons in a wire, they start to interfere with each other, the harmonics / capacitance of the wire, and more? Or is there a natural minimum pulse length that electrons cannot go beyond in a wire?
And to get out all the bandwidth at the end (when using light), you just split out the wavelengths with as many narrow-band filters as you need, then send that to a decoder that turns it into the usual bottlenecked Ethernet at your favorite router?
I can't speak to whether or not you can get the same bandwidth with wire/electricity as you can with lasers/fiber, but there are also a couple of practical things here:
Electrical signals attenuate much faster in a wire than light does in fiber, so covering the same distance takes a lot more energy (physics folks can explain why far better than I can). That's why you can send 10GbE about 100 m over twisted pair, but many kilometers over fiber, from the same SFP+ port (i.e., with the same power draw). The first sketch after these two points puts rough numbers on this.
Long wires make good antennas, so you have a lot more sources of interference on long runs, meaning the signal gets lost in the noise more easily if there happens to be lightning, power lines, big motors, etc. nearby (see also the energy point). Dealing with this takes some combination of signaling schemes across multiple wires, shielding, and signal processing -- more complex than just "on and off" from your laser. (Other things like capacitance and inductance in wires add to the complexity here too.) The second sketch below shows the multi-wire trick.
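To put rough numbers on the energy/attenuation point, here's a toy loss-budget calculation. The 20 dB budget and both attenuation figures are ballpark assumptions (0.2 dB/km is typical for single-mode fiber at 1550 nm; copper loss varies a lot with frequency and cable grade), not anyone's specs:

```python
# Rough loss-budget comparison: how far can a signal travel before
# cumulative attenuation eats the whole budget? Ballpark figures only.

LINK_BUDGET_DB = 20.0          # assumed: transmit power minus receiver sensitivity

# Illustrative attenuation figures (both are assumptions for this sketch):
FIBER_LOSS_DB_PER_KM = 0.2     # typical single-mode fiber at 1550 nm
COPPER_LOSS_DB_PER_KM = 400.0  # ~40 dB per 100 m, twisted pair at a few hundred MHz

def reach_km(budget_db: float, loss_db_per_km: float) -> float:
    """Distance at which attenuation uses up the entire link budget."""
    return budget_db / loss_db_per_km

print(f"fiber reach:  {reach_km(LINK_BUDGET_DB, FIBER_LOSS_DB_PER_KM):.0f} km")       # ~100 km
print(f"copper reach: {reach_km(LINK_BUDGET_DB, COPPER_LOSS_DB_PER_KM) * 1000:.0f} m")  # ~50 m
```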
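And for the interference point, a minimal sketch of why differential pairs (the kind of multi-wire scheme twisted pair uses) help: interference couples onto both wires roughly equally, the signal is driven in opposite polarity on each, and the receiver subtracts. Everything here is made up for illustration:

```python
import numpy as np

# Idealized differential pair: the same interference appears on both
# wires (common mode), the data is driven with opposite polarity on
# each wire, so subtracting the two wires cancels the noise exactly.

rng = np.random.default_rng(0)
t = np.linspace(0, 1e-6, 1000)

signal = np.sign(np.sin(2 * np.pi * 5e6 * t))                 # 5 MHz square-ish data
noise = 2.0 * np.sin(2 * np.pi * 60 * t + 1.0) + rng.normal(0, 0.5, t.size)

wire_p = +signal + noise   # both wires pick up the *same* noise
wire_n = -signal + noise

received = (wire_p - wire_n) / 2   # differential receiver: noise cancels

print("max error after subtraction:", np.max(np.abs(received - signal)))
# -> 0.0 in this idealized model; real pairs cancel imperfectly
```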
Point of all that being: I think part of the reason you get the impression that photons carry more data than electrons is that more effort is put into photons carrying lots of data fast - purely from a practical "it's easier to engineer" standpoint. Why worry about all the hard practical electrical engineering when we can just get good at turning lasers on and off really fast, and have good optical sensors where the laser is pointing?
By the way, if you know: what kinds of detectors sit at the other end of the laser pulse to receive the signal? Are they like the pixels of a CCD but without the need for periodic / time-bound readout? Are they just silicon photocells that can react at femto(?)-second speeds and turn the light into pulses of electrons?
The simplest answer is that you can cram N light streams onto one fiber via WDM (wavelength-division multiplexing): you transmit N parallel streams on different wavelengths.
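A back-of-envelope sketch of what that buys you. The C-band width, the 50 GHz grid, and the 100 Gb/s per-channel rate below are typical DWDM figures I'm assuming for illustration, not any particular system's specs:

```python
# Back-of-envelope DWDM capacity: pack independent channels onto a
# fixed frequency grid within the fiber's C-band.

C_BAND_HZ = 4.4e12        # usable C-band width, roughly 1530-1565 nm
GRID_HZ = 50e9            # standard ITU 50 GHz channel spacing
PER_CHANNEL_BPS = 100e9   # assume 100 Gb/s per wavelength

channels = int(C_BAND_HZ // GRID_HZ)
aggregate = channels * PER_CHANNEL_BPS

print(f"{channels} channels -> {aggregate / 1e12:.1f} Tb/s on one fiber")
# 88 channels -> 8.8 Tb/s
```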
The highest frequency you can practically send long distances over coax is around 1-2 GHz. When I had a cable modem, the highest I saw was 900 MHz. Therefore, regardless of what encoding scheme you use, you will never get more than a couple GHz of bandwidth.
Visible light starts at around 400 THz, so right off the bat you get a couple hundred thousand times more headroom before physics becomes the limiting factor.
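You can make that concrete with the Shannon capacity formula, C = B·log2(1 + SNR): at a fixed SNR, capacity scales linearly with bandwidth, so the comparison reduces to the bandwidth ratio. (Treating the whole visible band as one channel is unphysical, and the 30 dB SNR is an arbitrary assumption; this sketch only shows the scaling.)

```python
from math import log2

# Shannon capacity C = B * log2(1 + SNR). With SNR held fixed,
# capacity is proportional to bandwidth B.

SNR = 10 ** (30 / 10)   # 30 dB -> linear power ratio of 1000 (assumed)

def shannon_bps(bandwidth_hz: float, snr: float) -> float:
    return bandwidth_hz * log2(1 + snr)

coax_hz = 2e9      # ~2 GHz of usable spectrum on coax
optical_hz = 4e14  # visible light starts around 400 THz

print(f"coax ceiling:    {shannon_bps(coax_hz, SNR) / 1e9:.0f} Gb/s")
print(f"optical ceiling: {shannon_bps(optical_hz, SNR) / 1e12:.0f} Tb/s")
print(f"ratio: {optical_hz / coax_hz:,.0f}x more raw bandwidth")
```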
I was somehow thinking that the response/recovery time of the silicon detectors reacting to the laser pulses would be a limiting factor; is that at all valid?
Like, you cannot blink faster than <x> nanoseconds and get a CCD(?) to see it properly. (I'm sure it's not like a CCD with readout, etc. but whatever mechanism is the correct one, is there some natural minimum read time?)
It's analogous to frequency multiplexing in radio (and to radio carried over a wire -- which high-speed encodings like modern Ethernet or PCIe are, in practice). For example, sending one channel on a carrier at 100 MHz and another at 102 MHz. These days potentially thousands of carriers may be used in parallel. Practical bandwidths in radio are limited to a few gigahertz at most, though. By comparison, the visible spectrum is several hundred thousand GHz wide, so there's much more bandwidth to work with at optical frequencies.
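A toy numpy version of exactly that two-carrier example: two messages ride on carriers at 100 MHz and 102 MHz, share one "wire", and the receiver pulls one channel back out by mixing and low-pass filtering. All parameters are arbitrary:

```python
import numpy as np

# Toy frequency-division multiplexing: two messages on carriers
# 2 MHz apart share one wire; the receiver recovers channel 1 by
# mixing it down to baseband and low-pass filtering.

fs = 1e9                          # 1 GS/s sample rate
t = np.arange(0, 20e-6, 1 / fs)   # 20 microseconds of signal

f1, f2 = 100e6, 102e6                 # carrier frequencies
m1 = np.sin(2 * np.pi * 200e3 * t)    # channel 1 message: 200 kHz tone
m2 = np.sin(2 * np.pi * 300e3 * t)    # channel 2 message: 300 kHz tone

# Both channels occupy the same wire at once (simple AM):
wire = (1 + 0.5 * m1) * np.cos(2 * np.pi * f1 * t) \
     + (1 + 0.5 * m2) * np.cos(2 * np.pi * f2 * t)

# Receiver for channel 1: mix with a matching local oscillator, which
# shifts channel 1 to baseband and channel 2 out to ~2 MHz...
mixed = wire * np.cos(2 * np.pi * f1 * t)

# ...then low-pass filter (two passes of a 1 us boxcar) to keep the
# baseband copy of channel 1 and reject channel 2's residue.
kernel = np.ones(1000) / 1000
recovered = np.convolve(np.convolve(mixed, kernel, mode="same"),
                        kernel, mode="same")

# Away from the edges, the recovered waveform tracks channel 1's
# message; this should print a correlation near 1.
corr = np.corrcoef(recovered[3000:-3000], m1[3000:-3000])[0, 1]
print(f"correlation with channel-1 message: {corr:.3f}")
```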