For the smaller stuff, though, from maybe 10 kt up to 1 Gt (Tunguska was ~15 Mt), we can reliably predict the course and evacuate the affected area.
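As a rough sanity check on those yield numbers: impact energy is just kinetic energy, E = ½mv², with mass from an assumed size and density. A minimal sketch (the stony density and entry velocity here are illustrative assumptions, not measurements of any particular body):

```python
import math

def impact_energy_mt(diameter_m, velocity_kms, density_kgm3=3000):
    """Kinetic energy of a spherical impactor, in megatons of TNT.

    Assumes a stony density of ~3000 kg/m^3; 1 Mt TNT = 4.184e15 J.
    """
    radius = diameter_m / 2
    mass = density_kgm3 * (4 / 3) * math.pi * radius**3
    energy_j = 0.5 * mass * (velocity_kms * 1000) ** 2
    return energy_j / 4.184e15

# A ~50 m stony body at 15 km/s lands in the low-megaton range,
# broadly consistent with Tunguska-scale estimates.
print(f"{impact_energy_mt(50, 15):.1f} Mt")
```

Because energy scales with the cube of diameter, the 10 kt to 1 Gt span covers bodies from roughly ten meters to a few hundred meters across.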
To detect the small stuff at close range, of course, you need to be able to detect the big stuff a ways off. The problem is that massive sky surveys are essentially a brand-new thing now that we have Big Data capabilities (we still get a substantial amount of info from scanned photographic plates of the last century). The astronomical community has mostly focused on producing expensive one- or two-unit runs of a brand-new design every decade, each much bigger and better than the last. Survey telescopy, however, is perfectly amenable to economies of scale: large numbers of small scopes, with the data from a single-purpose survey reused for dozens of different scientific goals.
The ATLAS project ( http://www.fallingstar.com/ ) is a minimum viable project for this purpose, using small COTS telescopes & sensors to scan the sky rapidly & systematically for objects whose photometric characteristics suggest something headed our way, a few days or a few weeks out from Earth.
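The photometric signature involved can be sketched simply: an object on a near-collision course shows little angular motion but brightens steadily, since the standard asteroid magnitude relation m = H + 5·log10(r·Δ) makes reflected flux scale as ~1/Δ² with geocentric distance Δ at roughly fixed heliocentric distance r. A toy illustration (the H value and distances are made-up numbers, not anything from ATLAS's actual pipeline):

```python
import math

def apparent_mag(abs_mag_h, delta_au, r_au=1.0):
    """Asteroid apparent magnitude, ignoring phase-angle effects.

    Standard relation: m = H + 5*log10(r * delta), with r the heliocentric
    and delta the geocentric distance, both in AU.
    """
    return abs_mag_h + 5 * math.log10(r_au * delta_au)

# An H=26 object (~20 m class) closing from 0.1 AU:
for delta in (0.10, 0.05, 0.02, 0.01):
    print(f"delta={delta:.2f} AU  m={apparent_mag(26, delta):.1f}")
```

Under these assumed numbers the object only crosses a small survey telescope's limiting magnitude (around 19-20) in the last handful of days before closest approach, which is exactly why rapid, repeated full-sky sweeps matter more than depth for this job.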
I see no reason why we shouldn't build twenty of those, scattered around the world, just for immediate situational awareness. We could initiate the project and have it done in six months.
On a longer-term basis, I think that rather than constructing things like LSST under the old paradigm, we should focus on smaller 1-2 m telescopes (like Pan-STARRS, albeit perhaps without the extraordinary sensors) in very large, economical quantities. Mechanized production lines have only really been done for much smaller telescopes.
When you have a large, distributed array of automated telescopes, you can sweep the sky quickly while staying resilient to weather, or you can turn a portion of the array on a single target that needs a higher signal-to-noise ratio for confirming observations, matching the light-gathering ability of a much larger telescope.
Lastly, there are projects like the Gaia mission that are doing space photometry. While extremely data-starved (better space communications via radio relays & lasers really need to get here), Gaia would be in an ideal position to detect all the Earth-crossing NEOs that we can't realistically see from the ground due to the sun's glare... if it were at Venus L2 rather than Earth L2, and ideally if there were half a dozen of them instead of one.
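The geometry behind that vantage-point argument is easy to illustrate: an NEO sitting sunward of Earth has near-zero solar elongation for a terrestrial observer, but an observer at roughly Venus's distance sees the same object well away from the Sun. A 2D toy model (the specific positions are illustrative assumptions, with the Sun at the origin and distances in AU):

```python
import math

def elongation_deg(obs, neo):
    """Angle Sun-observer-NEO, with the Sun at the origin.

    Small elongation means the target is lost in the Sun's glare.
    """
    sx, sy = -obs[0], -obs[1]                    # observer -> Sun
    nx, ny = neo[0] - obs[0], neo[1] - obs[1]    # observer -> NEO
    cos_e = (sx * nx + sy * ny) / (math.hypot(sx, sy) * math.hypot(nx, ny))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_e))))

earth = (1.0, 0.0)
venus_l2 = (0.737, 0.0)   # roughly Venus's orbital distance, as an assumption
neo = (0.85, 0.0)         # an Earth-crosser currently sunward of Earth

print(elongation_deg(earth, neo))     # ~0 deg: unobservable from the ground
print(elongation_deg(venus_l2, neo))  # ~180 deg: anti-sunward, fully observable
```

An inward-orbiting observer looking outward keeps the Sun at its back for exactly the population that ground-based surveys are structurally blind to.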
The farther out we see it, the better we can deflect it - and our predictive abilities extend to hundreds of years if we actually bother to spend money on detection as if it were as important as space toilets on the ISS. The tech is here.
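The "farther out, easier to deflect" point can be put in rough numbers with a common rule of thumb: a small along-track velocity change applied t years before impact shifts the arrival position by roughly 3·Δv·t, because the altered orbital period compounds every revolution. So the Δv needed to move the impact point by one Earth radius falls off as 1/t. A back-of-envelope sketch, with that gain factor taken as an assumption:

```python
R_EARTH_M = 6.371e6
SECONDS_PER_YEAR = 3.156e7

def deflection_dv_mm_s(lead_time_years, miss_m=R_EARTH_M, gain=3.0):
    """Along-track delta-v (mm/s) to shift arrival by miss_m meters.

    Assumes displacement grows as ~gain * dv * t for small along-track burns.
    """
    dv_m_s = miss_m / (gain * lead_time_years * SECONDS_PER_YEAR)
    return dv_m_s * 1000

for years in (1, 10, 100):
    print(f"{years:>3} yr lead: {deflection_dv_mm_s(years):.2f} mm/s")
```

With decades of warning, deflection becomes a millimeters-per-second problem, which is the scale of velocity change the DART kinetic impactor demonstrated on Dimorphos.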