This is not too far off what experimental systems already do.
My lab has equipment that collects data from 288 electrodes in the brain, each sampled at 16 bits/sample × 30,000 samples/sec. That works out to roughly 140 Mbps, not counting overhead or the other types of data we collect, and it's not an unusually large setup. If we had a research question that required it, the vendor sells versions with up to 512 channels, and you can gang them together for even more.
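To make that arithmetic explicit, here is the raw-payload calculation for the numbers above. This is just a sanity check on the figures in this paragraph; it ignores packet overhead and the other data streams, as noted.

```python
# Raw data rate for the recording setup described above.
channels = 288           # electrodes
bits_per_sample = 16
sample_rate_hz = 30_000  # samples per second per channel

bits_per_sec = channels * bits_per_sample * sample_rate_hz
print(f"{bits_per_sec / 1e6:.1f} Mbps")   # ~138.2 Mbps, i.e. roughly 140 Mbps

# The 512-channel version of the same hardware (before any ganging):
print(f"{512 * bits_per_sample * sample_rate_hz / 1e6:.1f} Mbps")  # ~245.8 Mbps
```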
However, this probably isn't the direction BCI is headed. A lot of the signal is redundant: partly because nearby neurons tend to do very similar things, and partly because signals propagate well through the brain, so different electrodes pick up the same activity. As a result, most non-research applications don't need all of that data, and it's increasingly possible to do some of the preprocessing right at the brain: folks at Imperial have made all kinds of cool ASICs that extract spikes.
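To give a flavor of the kind of preprocessing such chips do, here is a minimal sketch of threshold-based spike detection on a single channel, roughly the simplest version of on-implant spike extraction. The threshold rule (a multiple of a median-absolute-deviation noise estimate), the refractory period, and all the numbers are illustrative assumptions on my part, not details of the Imperial designs.

```python
import numpy as np

def detect_spikes(signal, fs=30_000, n_sigmas=4.5, refractory_ms=1.0):
    """Return sample indices where the trace crosses a negative threshold."""
    # Robust noise estimate: median absolute deviation scaled to std-dev units,
    # so the spikes themselves don't inflate the threshold.
    sigma = np.median(np.abs(signal)) / 0.6745
    threshold = -n_sigmas * sigma

    # Indices where the trace first dips below the threshold.
    below = signal < threshold
    crossings = np.flatnonzero(below[1:] & ~below[:-1]) + 1

    # Enforce a refractory period so one spike isn't counted twice.
    refractory = int(refractory_ms * fs / 1000)
    spikes, last = [], -refractory
    for idx in crossings:
        if idx - last >= refractory:
            spikes.append(idx)
            last = idx
    return np.array(spikes)

# Toy usage: 3 seconds of synthetic 30 kHz noise with three injected "spikes".
rng = np.random.default_rng(0)
trace = rng.normal(0, 10, 90_000)
trace[[10_000, 40_000, 70_000]] -= 120    # large negative deflections
print(detect_spikes(trace))               # should recover the three injected times
```

The point of doing this on-chip is the data-rate savings: instead of streaming 480 kbps of raw samples per channel, the implant only has to send out spike times (or short waveform snippets), which for typical firing rates is orders of magnitude less.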