Hacker News

Wouldn't the D-Wave kinda-but-not-really quantum computer that came out a decade ago be ideal for AI? Annealing sounds like exactly the kind of technique needed for ML training?



I can't find mention of it online, but back in 2013 Lockheed Martin purchased a D-Wave machine because they wanted to use it for "AI", which turned out to mean software verification (of fighter-jet software?), I believe by searching for the possibility of some kind of invalid program state in a large program, which IIRC they couldn't solve with standard solvers. But in that case the number of qubits in a D-Wave machine seems to me far too few for that to be possible, although I don't know the exact task.

If by "AI" you include operations research (as opposed to statistical machine learning), then yes, adiabatic quantum annealing makes sense for certain optimisation problems that you can naturally formulate as a QUBO problem. By 'naturally' I mean the formulation doesn't blow up the number of variables/qubits; otherwise a far simpler algorithm on a classical computer would be more efficient. I know someone who published QUBO algorithms and ran them on a NASA D-Wave machine. While I myself was using a lot of annealing for optimisation, I didn't want to get involved in that field.
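To make "formulate as a QUBO" concrete: a minimal sketch, using max-cut on a hypothetical toy graph (the edge list is made up for illustration). A QUBO asks you to minimise x^T Q x over binary vectors x; here brute force stands in for the annealer, which only makes sense at this toy size.

```python
import itertools

# Max-cut as a QUBO: cutting edge (i, j) contributes -1 when x_i != x_j.
# The objective -(x_i + x_j - 2*x_i*x_j) per edge expands into Q entries:
# Q[i][i] -= 1, Q[j][j] -= 1, Q[i][j] += 2.  (Toy instance, made up.)
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
n = 4
Q = [[0] * n for _ in range(n)]
for i, j in edges:
    Q[i][i] -= 1
    Q[j][j] -= 1
    Q[i][j] += 2

def energy(x):
    # x^T Q x for a binary assignment x
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# Brute force over all 2^n assignments stands in for the annealer here.
best = min(itertools.product((0, 1), repeat=n), key=energy)
print(best, energy(best))  # best cut has 4 edges, so energy -4
```

The "blow up" caveat above is exactly what this hides: a problem that maps to a QUBO with one binary variable per decision is fine, but if encoding your constraints multiplies the variable count, the qubit budget is gone immediately.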

But if you want to do machine learning on a large amount of data using quantum annealing, no, that's a terrible match, because the number of physical qubits needed grows in proportion to the amount of data you want to feed in.


Well, but to be useful, it would need to be better at annealing than a classical computer that just uses good old (pseudo)random numbers.
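That classical baseline is plain simulated annealing. A minimal sketch of it (the `toy` objective and all parameters are made up for illustration, not a tuned solver):

```python
import math
import random

def simulated_annealing(energy, n, steps=20_000, t0=2.0, t1=0.01, seed=0):
    """Minimise energy over x in {0,1}^n using only pseudorandom numbers."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    e = energy(x)
    best, best_e = x[:], e
    for step in range(steps):
        t = t0 * (t1 / t0) ** (step / steps)   # geometric cooling schedule
        i = rng.randrange(n)                   # propose a single bit flip
        x[i] ^= 1
        e_new = energy(x)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
            if e < best_e:
                best, best_e = x[:], e
        else:
            x[i] ^= 1                          # reject: undo the flip
    return best, best_e

# Hypothetical toy objective: reward bits 0 and 2, penalise 1 and 3 together.
def toy(x):
    return -(x[0] + x[2]) + 2 * x[1] * x[3]

print(simulated_annealing(toy, 4))
```

This is the bar the quantum annealer has to clear: the same QUBO-style objective, a few dozen lines, and a pseudorandom generator. For many practical instances, tuned classical heuristics like this remain competitive.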