FCC votes unanimously in favor of using whitespace for 'super WiFi' (engadget.com)
77 points by lotusleaf1987 on Sept 23, 2010 | hide | past | favorite | 24 comments



From the title I thought that the FCC ruled on 'super WiFi' over 'superWiFi'


So what exactly are "whitespace airwaves"?


Areas of the spectrum normally reserved for television, but not intentionally used by television broadcasters.

Simplified (it used to work this way; now it's slightly more complicated): each television channel required a small range of frequencies (the tuner frequency +/- a certain number of MHz) to carry its signal. Ideally, the affected range of spectrum wouldn't exceed that specced amount, but electronics aren't perfect, and RF is a tricky matter.

In practice, you tend to see splatter from a particular channel outside its allotted frequency range. Furthermore, receivers have to filter out any signal not within the frequency range of the channel they want to watch, but the better the filtering circuitry (called a bandpass filter) is at rejecting frequencies outside that range, the more distortion it tends to cause within the intended range. There are also other complicating factors, like strong signals overloading receivers and mis-tuned transmitters.

This resulted in a margin space between channels, so that adjacent channels would interfere with each other less. As receiver electronics have gotten better, more selective, and more error tolerant, these margin spaces have become less necessary, and we now have space in the spectrum that's no longer needed for its original purpose of providing a margin.

That space is your 'whitespace'.
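
To make "unused channel" concrete, here's a rough Python sketch of the US 6 MHz TV channel plan; the list of locally occupied channels is invented for illustration (real whitespace devices are expected to consult a database of actual local broadcasters):

    # Rough sketch of the US 6 MHz TV channel plan (low VHF, high VHF, UHF).
    # The occupied-channel set below is hypothetical, for illustration only.

    def channel_range(ch):
        """Return (start_MHz, stop_MHz) for a US broadcast TV channel number."""
        if 2 <= ch <= 4:        # low VHF, 54-72 MHz
            start = 54 + (ch - 2) * 6
        elif 5 <= ch <= 6:      # low VHF resumes after a gap, 76-88 MHz
            start = 76 + (ch - 5) * 6
        elif 7 <= ch <= 13:     # high VHF, 174-216 MHz
            start = 174 + (ch - 7) * 6
        elif 14 <= ch <= 69:    # UHF, 470 MHz and up
            start = 470 + (ch - 14) * 6
        else:
            raise ValueError("not a broadcast TV channel")
        return start, start + 6

    occupied = {7, 9, 11, 13, 20, 26, 32, 44}   # hypothetical local stations
    whitespace = [ch for ch in range(14, 52) if ch not in occupied]
    print([channel_range(ch) for ch in whitespace[:5]])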


" As receiver electronics have gotten better, more selective and more error tolerant, these margin spaces became less necessary..."

More to the point, as the technology used to broadcast has completely changed from analog to digital, the margin spaces have become completely unnecessary. Additionally, digital television does not use the entire spectrum previously allocated to analog broadcasts.

Edit: additionally, in locations where there aren't local television broadcasts on certain channels, those channels can be used for wifi.


Digital signal transmission tends to include an increase in error tolerance, which I mentioned. The margin reduces the strain on the error tolerance, which is also needed to cope with signal loss issues.

So, no, the margin spaces aren't completely unnecessary with digital signaling.

Furthermore, I'll pick a couple of nits and point out that it's only television broadcasting that has completely switched to digital in the US. AM/FM radio is still analog by definition, and there are large swaths of spectrum that use those and other signaling methods.


Where does the rest of the world fit into this? Do other countries copy the FCC restrictions? If not, why haven't "super WiFi" devices already been built in countries that already had this spectrum open?


There is some degree of international coordination under the International Telecommunication Union (radio waves don't really recognize national borders), but individual countries still have leeway to make their own internal rules. There is also some international cooperation aimed at keeping people from stepping on each other's toes (e.g. I cannot use part of the 70cm amateur allocation because I am too close to Canada, which makes different use of those frequencies). That said, as far as I can tell, it seems that rules adopted by a significant number of countries (or a number of significant countries) are likely to become international.


Doesn't it mean that your laptop bought in the States won't work with WiFi stations in Europe?


Opening up the 2.4GHz spectrum to everyone didn't create ubiquitous free internet, nor even ubiquitous free-with-ads internet. Why are some postulating that this will mean the end of cellular carriers and that phones will somehow use this spectrum for free?

In terms of utilizing frequencies that can provide connections over longer range than 2.4GHz, that's exactly what WiMax already is. And WiMax isn't leading us towards any kind of free internet; it's the new high-priced 4G service you now buy from the same cell companies as always.


I thought the point was that this is UNLICENSED spectrum.

I can't legally set up a WiMax tower like I can set up a WiFi hot spot. Doesn't opening whitespaces mean that you could set up, say, a city-wide access point without asking permission? Or did I miss something? If that's the case, it's likely to spawn LOTS more competition than assigning frequencies to the highest bidder - which is really zero competition; it's a granted monopoly.

If I'm understanding right, there could be networks that are not for profit, or indirectly so. Like, maybe a hospital could have a Wi-Fi-like network that spans the whole city, which their ambulances could use to transmit vitals to the hospital while bringing in a patient.


I'm not sure if this answers your question, but any unlicensed communication device has limits on the amount of power it's allowed to radiate. Generally, that's the limiting factor in how far the signal reaches. Of course, you could transmit in a very narrow beam using directional antennas and get quite respectable distances, but then you lose omnidirectionality.
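
As a rough illustration of why the TV band helps with range at the same radiated power: free-space path loss grows with frequency, so (ignoring walls, antennas, and everything else) a ~600 MHz signal loses about 12 dB less than a 2.4 GHz one over the same distance. A minimal sketch:

    import math

    def fspl_db(distance_m, freq_hz):
        """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
        c = 3e8
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

    d = 1000  # 1 km, arbitrary
    print(fspl_db(d, 2.4e9) - fspl_db(d, 600e6))  # ~12 dB less loss at 600 MHz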


Opening up 2.4GHz was done because microwave ovens already leak into that band.


Simplistically, there are two parts - backbone and last mile. The last-mile part wasn't solved by 2.4GHz, yet the situation has significantly improved - Google's free wifi is an example of a [still limited] solution to the "last mile". Any significant improvement on WiFi will make the last mile solvable better and more cheaply in many more places.

Cell-phone-wise, let's say we get a 3000ft range with superWiFi instead of WiFi's 300ft. Would the overlaps be enough to imagine robust voice/data carriage by hopping from router to router (FidoNet style), with coverage similar to the early days of cell phones?

So while there wouldn't be an end to the carriers (who are we kidding - Ma Bell is practically still alive), there are big improvements to come.


I wish I had the expertise or the time to give a better rebuttal to this; instead I'll just point out a few things:

First, the 2.4GHz spectrum is not the best for wireless comms; there were many reasons for opening up that spectrum, and very few of them had to do with it being an optimal frequency.

Second, bandwidth. The 2.4GHz spectrum is confined to an 82MHz band. The available whitespace spectrum offers significantly more bandwidth.

Third, technology. 802.11b and g are based on technology that is now stale by more than a decade. Today's wireless broadband tech makes those protocols look like a 56k modem.

The combination of these factors allows for stunning possibilities. More efficient broadband technologies, better broadcast frequencies, more bandwidth. These allow for more users with higher data rates at farther distances from base stations. It also makes things like bridging access points a lot easier. There's little reason to believe that in 10 years whitespace broadband access points won't be utterly ubiquitous.

As for WiMax, that's not open spectrum; consumers can't buy a WiMax AP on Newegg. That makes it an entirely different beast.


In large cities there is less than 82 MHz of white space available AFAIK. Also, white space channels are 6 MHz wide while 802.11 channels are 20 MHz.


What does an 82MHz band really mean? I understand that devices are confined to communicate within the frequencies of 2.4GHz +/- 41MHz (come to think of it, that's quite narrow: a margin of only about 1.7%), but is band related to actual bps bandwidth? If so, how? And why is the available band so narrow?


Throughput is directly related to bandwidth. Modern radios have a spectral efficiency of 5-15 bps/Hz (depending on range, noise, and multipath), meaning that a 6 MHz white space channel is worth 30-90 Mbps, a 20 MHz 802.11 channel can give 100-300 Mbps, and the entire 2.4 GHz band theoretically has 410-1230 Mbps of capacity.
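
To make the arithmetic explicit, here's a small Python sketch; the 5-15 bps/Hz range is the assumption from the comment above, not a figure from any spec:

    # Throughput ~= channel bandwidth * spectral efficiency (bps/Hz).
    # The 5-15 bps/Hz range is the parent comment's assumption.
    for name, bw_hz in [("6 MHz whitespace channel", 6e6),
                        ("20 MHz 802.11 channel", 20e6),
                        ("entire 82 MHz 2.4 GHz band", 82e6)]:
        low, high = 5 * bw_hz, 15 * bw_hz
        print(f"{name}: {low / 1e6:.0f}-{high / 1e6:.0f} Mbps")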


Very interesting. What is currently limiting the number of bits per second per Hz? IANAP, but why can't you just lock in a very specific frequency range, blip Morse code at a very high clock speed (say, 100MHz), and get a huge bps/Hz?


The only way to transmit information over a wave of fixed frequency is to mess with that wave (if the wave is truly fixed, it doesn't carry any information). When you mess with the wave, you necessarily spread out its frequency content. So you have to send your signal within a range of frequencies.

Intuitively, the more information you want to transmit, the more you have to mess with the wave, and the more frequencies you end up occupying. Hence the bits-per-Hz limit.

(Disclaimer: I don't know much about the subject. Corrections, clarifications, and precisions are welcome.)


That's not the only way to transmit information with radio waves. What's wrong with this: To send binary data, you could either send the pure signal (1), or you could not send any signal at all (0). As long as the clock cycles of the sender and receiver are the same, it should work (eg: 100MHz).

Given that this isn't the way it works, there must be some major flaw with it.


When you take the Fourier transform of a signal of finite duration, you notice that there's more than just the fundamental frequency. Basically, when you cut a signal off, then resume it, over and over, that signal ceases to be of only one frequency.

That's why your method doesn't work.
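
A small numpy sketch of that effect: on-off keying a "pure" 100 MHz carrier at 10 Mbit/s (the specific rates here are arbitrary) smears its energy across a band around the carrier, so the information still costs bandwidth:

    import numpy as np

    fs = 1e9                                   # sample rate, 1 GHz
    n = 100_000
    t = np.arange(n) / fs                      # 100 microseconds of samples
    carrier = np.sin(2 * np.pi * 100e6 * t)    # "pure" 100 MHz tone
    bits = np.random.randint(0, 2, size=1000)  # on/off keying at 10 Mbit/s
    keying = np.repeat(bits, n // len(bits))
    spectrum = np.abs(np.fft.rfft(carrier * keying))
    freqs = np.fft.rfftfreq(n, 1 / fs)
    # The energy is no longer a single spike at 100 MHz: it smears out
    # around the carrier by roughly the keying rate (plus sidelobes).
    occupied = freqs[spectrum > 0.01 * spectrum.max()]
    print(occupied.min() / 1e6, occupied.max() / 1e6)  # span in MHz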


If Google runs Android and builds out an ad-supported "super wifi" network, what value do traditional carriers bring? Infrastructure for relays?


[deleted]


It's not new spectrum, it's using empty spectrum in the TV broadcast range. These waves are in the 500 MHz range, so unless part of your body is tuned to roughly 60 cm (about two feet), you won't absorb a significant proportion of the energy in the wave. There's really no reason to expect a human body to interact with any amount of power in this frequency range. Not that there's much to start with, since digital broadcasts use much less power than the old analog signals.
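
A quick sanity check on the wavelength figure (just c/f), using a few representative TV-band frequencies:

    c = 3e8                          # speed of light, m/s
    for f in (54e6, 500e6, 800e6):   # low VHF, mid UHF, top of the old UHF band
        print(f"{f / 1e6:.0f} MHz -> wavelength {c / f:.2f} m")
    # 500 MHz works out to about 0.6 m, i.e. roughly two feet.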


TV channels actually span a much wider range than just near 500MHz. Channel 2 starts at 54MHz, while the highest UHF station is near 800MHz. There are gaps in the middle to accommodate other uses, such as FM radio.



