I really hope these make the hard disk and RAM obsolete. The hard disk is of course the current bottleneck, so removing it entirely from the picture with potentially smaller and cheaper devices alone will lead to an insane amount of innovation.
For example, the idea that we can move away from the von Neumann model and have in-memory processing (so to speak) could lead to a revolution in computer vision and other highly parallel, compute-intensive, pattern-matching applications.
I guess clunky hard disks, with all their moving parts, will be around for some time to come, simply because they have already been around for so long and have survived several other up-and-coming technologies.
Instead of just two states, on or off, as with transistors, memristors can represent many states. This means we can create new types of computing models. We could also build analog computers, which you don't program, but you let them learn, and then replicate that learning onto other memristor-based analog computers.
- We might be able to use memristors in a similar way to synapses in the human brain.
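If it helps to make the "many states" and synapse analogy concrete, here is a rough Python sketch of the linear ion-drift memristor model described in the 2008 Strukov et al. HP Labs paper. The parameter values and the little crossbar-style dot product at the end are my own illustrative assumptions, not anything HP has published.

```python
import math

# A minimal sketch of the linear ion-drift memristor model
# (Strukov et al., 2008). Parameter values are illustrative
# assumptions, not real device numbers.
R_ON = 100.0      # resistance of the fully doped region (ohms)
R_OFF = 16000.0   # resistance of the fully undoped region (ohms)
D = 10e-9         # device thickness (m)
MU_V = 1e-14      # dopant mobility (m^2 / (V*s))

def drive(voltage_fn, t_end=1.0, dt=1e-4, x0=0.5):
    """Integrate the internal state x in [0, 1]; the memristance
    M = R_ON*x + R_OFF*(1 - x) depends on the charge history."""
    x, t = x0, 0.0
    while t < t_end:
        m = R_ON * x + R_OFF * (1.0 - x)    # current resistance
        i = voltage_fn(t) / m               # Ohm's law
        x += (MU_V * R_ON / D**2) * i * dt  # linear ion drift
        x = min(max(x, 0.0), 1.0)           # keep the state physical
        t += dt
    return R_ON * x + R_OFF * (1.0 - x)

# The final resistance is a point on a continuum, set by the device's
# history, rather than just "on" or "off".
print(drive(lambda t: 1.0))                       # drifts toward R_ON
print(drive(lambda t: -1.0))                      # drifts toward R_OFF
print(drive(lambda t: math.sin(2 * math.pi * t)))

# Synapse-like use: conductances (1/M) act as analog weights in a
# dot product, the core operation of a memristive crossbar.
weights = [1.0 / drive(lambda t, v=v: v, t_end=0.2) for v in (0.2, 0.5, 1.0)]
inputs = [0.3, 0.7, 1.0]   # hypothetical input voltages
print(sum(w * u for w, u in zip(weights, inputs)))
```

On a real crossbar that dot product happens in the analog domain via Kirchhoff's current law, which is where the "in-memory processing" hope upthread comes from.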
I'm not claiming to be knowledgeable, but isn't it true that analog computers are easy and fast to use for computations (the slide rule, self-regulating hydraulic systems, ...), but notoriously hard to construct if you're looking for something multipurpose and programmable?
It's mostly the "you don't program, but you let them learn" part that's a bit weird/inaccurate. You do have to program analog computers, and you don't get any automatic learning built in; it's just a different kind of programming. Some things that would require difficult programming on a digital computer are amazing freebies, while other things that would be trivial on a digital computer are frustratingly difficult to implement. But it's still programming of a sort either way.
And it doesn't make a ton of difference these days anyway, because "continuous" vs. "discrete, but with trillions of gradations" are indistinguishable for a lot of problems, so you can simulate an analog computer on a digital one, or vice versa, with reasonable accuracy.
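As a concrete illustration of that last point (my own sketch, assuming nothing beyond basic numerical integration), here is a digital emulation of the classic analog-computer patch for y'' = -y, i.e. two integrators wired in a feedback loop, compared against the exact continuous solution:

```python
import math

def emulate_oscillator(dt=1e-5, t_end=10.0):
    """Digitally emulate the analog setup for y'' = -y:
    two integrators (plus a sign inversion) in a feedback loop."""
    y, v = 1.0, 0.0              # initial conditions: y(0)=1, y'(0)=0
    for _ in range(round(t_end / dt)):
        v -= y * dt              # first integrator: v accumulates -y
        y += v * dt              # second integrator: y accumulates v
    return y

t_end = 10.0
approx = emulate_oscillator(t_end=t_end)
exact = math.cos(t_end)          # the continuous ("analog") answer
print(approx, exact, abs(approx - exact))
```

With a step this small, the discrete emulation tracks the continuous system to within a tiny fraction of a percent, which is the sense in which "trillions of gradations" and "continuous" stop being meaningfully different for many problems.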
For now, HP will begin deploying these things in tiny, very sensitive sensors that could potentially lead to more efficient discovery of oil: http://www.hpl.hp.com/news/2009/oct-dec/cense.html