Machine Learning using Quantum Algorithms (googleresearch.blogspot.com)
72 points by vikram360 on Nov 26, 2011 | 23 comments



Could someone please give a layman's explanation for this bombshell?

> Assume I hide a ball in a cabinet with a million drawers. How many drawers do you have to open to find the ball? Sometimes you may get lucky and find the ball in the first few drawers but at other times you have to inspect almost all of them. So on average it will take you 500,000 peeks to find the ball. Now a quantum computer can perform such a search looking only into 1000 drawers. This mind boggling feat is known as Grover’s algorithm.

If 999,000 drawers are left unopened, how can the algorithm guarantee that the ball will be found?


Disclaimer: I do not know much about quantum computation and only got interested in it very recently. I'll give you my understanding; take it with a grain of salt.

AFAIU, Grover's algorithm is not really an algorithm for searching unstructured data, like a phone book, but rather a quantum algorithm for inverting a function. The way it relates to search is that you have to have a computable function f such that f(x)=1 if 'x' is the one of the million 'boxes' you are looking for, and f(x)=0 otherwise. Your goal is to find that one 'x' (the assumption being, of course, that 'f' is not easily invertible).

How would a traditional computer solve the problem? By going through all x's; once f(x)=1, you have your x.
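
In code, the classical approach is just a linear scan (a minimal sketch; the predicate and the drawer number here are made up for illustration):

    def classical_search(f, N):
        """Invert a black-box predicate by brute force: O(N) calls to f."""
        for x in range(N):
            if f(x) == 1:
                return x  # on average N/2 evaluations before the hit
        return None

    # Toy example: the "ball" is hidden in drawer 42.
    print(classical_search(lambda x: 1 if x == 42 else 0, 1_000_000))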

The power of a quantum computer is that its state is a quantum superposition of many classical states. As such, a quantum computer may perform a computation of 'f' on a superposition of many 'x's at once and get a superposition of function results (which can be measured with a certain degree of uncertainty).

The algorithm essentially goes through superpositions, narrowing the set of candidate x's pretty fast. It is inherently probabilistic, but the point is that you can converge on the correct x with a high degree of certainty in a (relatively) small number of steps.

If anyone has a more qualified explanation to offer, I will be happy to stand corrected.


"AFAIU, Grover's algorithm is not really an algorithm of search of unstructured data, like phone book, but rather a quantum algorithm for inverting a function."

Grover's search algorithm works both for inverting a function and for database search. But if you want to use it for database search, then your database needs to be accessible in coherent superposition, i.e., in some sort of "quantum RAM." For many problems, this probably does not make sense.


The 'database' is not really a database in the sense that it stores a list of things. Instead, the database is like a predicate p(x) where x ranges over some finite set (you can assume that this set is just 0..N). So if p(x) really has to search a database to determine whether x is the one you want, then the algorithm doesn't work; it is assumed that evaluating p(x) is fast. Classically you'd have to call the predicate on each of 0..N to find the x for which p(x) = 1. On a quantum computer you can start with a superposition of all states 0..N (think of a probability distribution over 0..N), and by cleverly applying a quantum analog of the predicate p, Grover's algorithm changes this probability distribution so that the x for which p(x) is true gets more and more likely (i.e., more and more probability mass). This probability mass increases fast enough that on average you need O(sqrt(N)) steps to find the x that satisfies p(x).
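
Here is a minimal classical simulation of that amplitude-amplification loop, using numpy (simulating the full state vector is of course exponential in qubits and only feasible for toy sizes; the target index is made up):

    import numpy as np

    N = 1024      # size of the search space (2^10)
    target = 42   # the single x with p(x) = 1 (illustrative)

    state = np.full(N, 1.0 / np.sqrt(N))   # uniform superposition over 0..N-1

    # Grover's optimal iteration count is about (pi/4) * sqrt(N).
    for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):
        state[target] *= -1                # oracle: flip the target's sign
        state = 2 * state.mean() - state   # diffusion: reflect about the mean

    # The squared amplitudes are the measurement probabilities.
    print(f"P(measure target) = {state[target] ** 2:.4f}")  # ~0.999 after 25 rounds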


Thanks for asking this question so I didn't have to. This sounds so interesting, but it's a bit hard to grasp without a science background in this field.


I feel compelled to point out (as is done here http://michaelnielsen.org/blog/quantum-computing-for-everyon...) that it is possible that no such explanation exists. My understanding is that most people doing theoretical research in quantum computing don't need to have intuitive explanations for their results, since the calculations work themselves out. This is a depressing but honest observation.

I found quantum mechanics courses very frustrating for exactly this reason, especially in grad school, when the calculations became particularly dense and the rationale impenetrable (I have since left physics). Occasionally there would be some little gem of insight into a particular phenomenon, where the result could be explained in a clear way, but the default approach was to just "do the calculation".

I can't explain exactly how a quantum computer works (because I don't understand it myself), but I will say that the reason the kind of analogy given above might make sense from a quantum mechanical perspective is that the state of the system, and the operations that mutate that state, have semantics different from those of classical computers. In particular, the state of the system is given by a bunch of complex numbers, which you can think of as a collection of arrows on the plane. The length and direction of these arrows determine the probability of a bit having a particular value. For example, if an arrow is aligned more east-to-west, the measured classical bit will more likely be a 1, whereas if it is aligned north-to-south, the bit will tend to be a 0.

The tricky part comes when one tries to consider how the state changes in time when subject to some operations. Fundamental particles like electrons or photons are "indistinguishable", which is to say that there are no properties intrinsic to a particular particle that allow one to distinguish it from another particle of the same type. The result is that the arrows describing our bits (which we can think of as fundamental particles) become coupled, or entangled.

Unlike classical operations on bits, which one can construct by composing operations on single bits separately, quantum operations mutate the state of the whole system, affecting all the directions of our arrows (and thus the probability of realizing a measured classical state) simultaneously, so long as this entanglement can be maintained. The logic of how these arrows mutate is the complicated part, but this is how the quantum computer achieves its properties. [deleted a long and ultimately unhelpful explanation]
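
To make the "arrows" concrete, here is a tiny numpy sketch (my own illustration, not from the post): two qubits are four complex amplitudes, squared magnitudes give measurement probabilities, and a gate rotates all of them at once:

    import numpy as np

    # Two qubits = four complex amplitudes, one per classical outcome
    # |00>, |01>, |10>, |11>. This particular state is entangled: it
    # cannot be factored into independent per-qubit "arrows".
    amps = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

    # Measurement probability of each outcome is the arrow's squared length.
    print(np.abs(amps) ** 2)   # [0.5, 0, 0, 0.5]: the two bits always agree

    # A gate (here a Hadamard on the first qubit) is a unitary matrix that
    # rotates the entire 4-dimensional state at once, not one bit at a time.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    amps = np.kron(H, np.eye(2)) @ amps
    print(np.abs(amps) ** 2)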

This is certainly not a layman's explanation, but this discussion of Deutsch's algorithm is quite nice http://physics.stackexchange.com/questions/3390/can-anybody-...


"My understanding is that most people doing theoretical research in quantum computing don't need to have intuitive explanations for their results, since the calculations work themselves out."

I don't believe this. Maybe your classical intuition is wrong, but that doesn't mean you can't develop a quantum intuition. Quantum theory is just a natural generalization of probability theory (with "list of probabilities adding to one" replaced by "list of amplitudes whose squares add to one"), so particularly for anyone who thinks much about randomized algorithms the intuition is very similar.
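
That generalization fits in a few lines (a toy sketch of my own, not from the article): replace a stochastic matrix acting on probabilities with a unitary acting on amplitudes, and interference appears:

    import numpy as np

    start = np.array([1.0, 0.0])            # definitely in state 0

    # Classical "fair coin" step: a stochastic matrix on a probability vector.
    coin = np.array([[0.5, 0.5],
                     [0.5, 0.5]])
    print(coin @ coin @ start)              # [0.5, 0.5]: randomness accumulates

    # Quantum "coin" step: a unitary on an amplitude vector whose squared
    # magnitudes sum to one. Negative amplitudes can cancel.
    H = np.array([[1,  1],
                  [1, -1]]) / np.sqrt(2)
    amps = H @ H @ start
    print(np.abs(amps) ** 2)                # [1.0, 0.0]: the second flip
                                            # interferes and undoes the first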


It's an interesting question: how those who work on quantum algorithms think about quantum algorithms. As someone who worked on quantum algorithms for many years, the interesting intuition I developed was not about what makes quantum algorithms tick, but more along the lines of "when is a method likely NOT to yield a quantum speedup." That is, I found I could pretty quickly tell when I was doing something that had a nice classical analogy and was therefore doomed to failure. Classic examples were when your intuition about why you might get a speedup involved exploring an exponentially large solution space without some reason that interference could be used to help dig out the interesting parts of the global superposition you created.

An interesting example of how one discovers a quantum algorithm is the story of the NAND tree algorithm (see http://www.scottaaronson.com/blog/?p=207 for a nice explanation). Quite literally, the quantum algorithm was discovered by thinking about the interference of particles hopping around, combined with physicists' uncanny ability to calculate scattering cross sections. Crazy stuff!


This is an old article (from 2009). Hartmut Neven provided an update at ICML 2011 in the latter part of his keynote talk - http://techtalks.tv/talks/54457/


You know, it's funny how ridiculous D-Wave sounded when they insinuated they were scaling qubits about as fast as Moore's law, but here they are in 2011 with a 128-qubit chip that clearly works. That's nuts.

It's hard to believe our technology has advanced this far so fast.

Combined with technologies like IBM Watson, at this rate in 10 years we'll have no idea what the fuck our computers are even doing anymore.


Though one should be careful, as the D-Wave qubits aren't of the kind that produce a universal quantum computer, and I don't think anyone knows whether their current machines scale better than classical analogues.


You are correct; their QC is more of a hardware solver for a very specific problem than a general-purpose computer. On scaling: I was at D-Wave a few weeks ago, and they seem prepared to scale up the chip aggressively in 2012.


I think one has to be careful to distinguish the engineering challenge of scaling (which I think D-Wave has heroically succeeded at so far) from the question of whether the adiabatic algorithms in their system (finite temperature, imperfections, etc.) will scale. I think they can pull off the first challenge (again, an amazing accomplishment), but the second is a question of whether the physics and computer science of their device will scale. The good news is that they will probably be able to answer this question in the next few years (thanks to the purchase of one of their devices by Lockheed Martin), but I do worry that the answer they get won't be the one they are hoping for.


Non js-heavy version, for those who need/want it:

http://googleresearch.blogspot.com/2009/12/machine-learning-...


Someone should modify the post to note that this is from Dec. 2009. Not sure what to think here. Google is endorsing, but IEEE is slamming. D-Wave actually has a reasonable-looking dev kit on the page linked below, but Scott Aaronson is quite critical.

http://spectrum.ieee.org/computing/hardware/loser-dwave-does...

http://www.dwavesys.com/en/dev-tutorials.html

http://www.scottaaronson.com/blog/?p=306

http://www.scottaaronson.com/blog/?p=291

Would be nice if a real expert weighed in on this thread.


D-Wave's computer is basically a hardware solver for Ising/QUBO models. It is not programmable in the traditional sense, and you need to find a way to express the problem you want to solve in a way that maps well onto the hardware.


To clarify: you express your problem as an Ising model, load the model into the D-Wave One QC, and it will solve it for you. The interesting thing is that solving Ising models is an NP-hard problem.

D-Wave's current chip has only 128 qubits and it is somewhat limited. As they scale it up (and they are prepared to do so) you will be able to use their computer to solve larger and larger models that could not be solved with standard silicon in a reasonable amount of time.
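
For concreteness, here is what "expressing your problem as an Ising model" amounts to (a made-up three-spin instance, solved by brute force; on the real hardware the couplings would also have to respect the chip's connectivity graph):

    import numpy as np
    from itertools import product

    # Ising model: minimize E(s) = sum_{i<j} J[i,j]*s_i*s_j + sum_i h[i]*s_i
    # over spin assignments s_i in {-1, +1}.
    J = np.array([[0.0,  1.0,  0.0],
                  [0.0,  0.0, -2.0],
                  [0.0,  0.0,  0.0]])   # upper-triangular couplings
    h = np.array([0.5, -1.0, 0.0])      # local fields

    def energy(s):
        s = np.asarray(s)
        return float(s @ J @ s + h @ s)

    # Exhaustive search over 2^n configurations: this exponential blow-up is
    # exactly what makes the general problem NP-hard and the annealer tempting.
    best = min(product([-1, 1], repeat=3), key=energy)
    print(best, energy(best))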


Indeed, that is why many in quantum computing cringe at some of their PR. A minor nit: D-Wave's machine is programmable, it's just not (thought to be) universal.


In my line of work I see RAM limitations rather than CPU clock-speed limitations. Will this technology also solve the big-data problem?


This isn't about RAM or CPU limitations, it's about entirely different genres of algorithms that aren't feasible with classical computers.


Quantum algorithms (run on quantum computers) are the future of machine learning. This article is from 2009, though, and I haven't seen a whole lot of progress in the field since. Artificial neural networks that could model every possible value of every node at the same time would be very powerful.


I don't buy it. Quantum computing is extremely important, even groundbreaking, for complexity theory and theoretical computer science in general. Quantum computing is to complexity theory what the real numbers are to number theory.

From a practical point of view, though, there are only two interesting families of algorithms: Shor's quantum Fourier transform and Grover's square-root speedup for searching an unstructured list (give or take). The former doesn't seem very applicable to machine learning (yet?). The latter is, as far as I am concerned, useless for machine learning: if we really want to speed up searching some unordered structure, then I think we will rely on sorting for a very long time to come; the advantage is much more impressive than Grover's square-root speedup.
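
The back-of-envelope numbers behind that comparison, for a million items (my arithmetic, not the article's):

    import math

    N = 1_000_000
    print(N // 2)                             # unstructured classical scan: ~500,000 probes
    print(round(math.pi / 4 * math.sqrt(N)))  # Grover: ~(pi/4)*sqrt(N) = 785 oracle calls
    print(math.ceil(math.log2(N)))            # binary search on sorted data: 20 comparisons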

May I remind you that in the past 20 years there hasn't been much progress on "practical quantum algorithms" (as opposed to quantum complexity results). I think until something brand new comes out of the quantum computing community, these quantum claims are great PR, but that's where it ends ...


That is not how it works, Viscanti. Gate-based QCs are still toys, and it remains to be seen whether they will become useful anytime soon. On the other hand, D-Wave's adiabatic chip is very promising, but it requires you to fit a "square peg in a round hole": expressing problems in a format their hardware can handle is tricky.





