MemComputing vs. Quantum Computing (memcpu.com)
29 points by velmu on May 11, 2023 | hide | past | favorite | 17 comments


These guys claimed that they could solve prime factorization and NP-complete problems in polynomial time [0], which would essentially mean P == NP. Scott Aaronson predictably had some thoughts on that [1].

[0] https://pubs.aip.org/aip/cha/article/27/2/023107/135054/Poly... [1] https://scottaaronson.blog/?p=2212
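For context on why a polynomial-time factoring claim draws so much scrutiny: the obvious classical approach, trial division, costs time exponential in the *bit length* of the input. A toy sketch (the numbers here are purely illustrative):

```python
import math

def factor_semiprime(n):
    """Factor n = p*q by trial division.

    Cost is O(sqrt(n)) divisions, i.e. O(2^(b/2)) for a b-bit input --
    exponential in the input size, which is why polynomial-time
    factoring claims are extraordinary.
    """
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return None  # n is prime

# 10403 = 101 * 103: trivial at this size, hopeless at RSA sizes.
print(factor_semiprime(10403))
```

Shor's algorithm sidesteps this on a quantum computer, which is exactly why a purely classical device making the same claim met skepticism.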


> “smuggle the exponentiality” somewhere that isn’t being explicitly considered, exactly like all proposals for perpetual-motion machines smuggle the entropy increase somewhere that isn’t being explicitly considered.

This is the gist of it. So he is saying that the scalability issues they will run into come from the entropy increasing with N, such that heat is the real problem? (Ah, so maybe this is also why quantum computers are usually kept so cold???)

also, how big of an N are we talking about?

I see a weird echo here: I keep thinking of the exporting of the waste (heat, byproducts, the specific 'how' by which, in a way, the entropy rules get enforced) as being the problem.

so then, he wrote:

> while the proponents wield the sharp steel of accepted physical law

I find this analogy quite cutting... with sharp enough steel and enough acceptance (regardless of how the acceptance comes about... ok), the heat/entropy can get forced around? Time to go get my tin foil shield.

Then again, I have only been trying to understand exponentiation ever since I found out it existed, and I had not even heard about the "extended" version of the Church-Turing thesis, hence I'm some kind of ignorant crank (sorry if this comes across as bitter, but I am doing a master's degree in theory of computer science and I had never heard of an extended version of that famous thesis; surely my own fault for not going to study in the USA or Europe).


> Somewhat surprisingly, just the emulation of this circuit in software (Virtual Memcomputing Machine) provides solutions to these problems orders of magnitude faster and more accurately than today’s state of the art technologies.

Looks like marketing speak to me.


Yea that can’t be true. Otherwise you would be able to systematically modify an existing algorithm to a new one to mirror the Memcomputing method and create new state of the art algorithms that are faster by orders of magnitude? And somehow that hasn’t already been discovered and published and used preferentially to the published state of the art? Nonsense.


> to mirror the Memcomputing method and create new state of the art algorithms that are faster by orders of magnitude?

i'm not so sure; memcomputing is not reliant on transistors like regular computing is... the differences are deeper.

not 100% nonsense. but also not 100% correct.


I didn't know that memcomputing had already been solved. That's fantastic news...here's hoping it doesn't remain ridiculously expensive.

I take issue with this line, however.

> Both MemComputing and quantum computers are physics-based approaches to computation. That is, they set aside many of the foundations of traditional computer science and rethink computation from the ground up. For example, they don’t follow the von Neumann architecture, the computing framework that separates the processing unit from the memory unit, which is employed by all of today’s computers.

From what I recall of von Neumann, his titular architecture was the 'first draft' of what he was going for, and he died before he could figure out version two, so we've been stuck with it ever since. He based his design on the notion of cellular automata, where of course the processing units are not separated from the memory units.

Rather, from what I recall, the Architecture we've been stuck with was just his best attempt at emulating what's here called memcomputing, but with the limited engineering resources at his disposal. If he hadn't died, I think he would have been trying to do proper memcomputing in the second draft. Only took us the better part of a century to catch up!


I view the von Neumann architecture as more of a reference. When you get down to it, there's so much hardware trickery at play that it isn't easy to say cleanly that any specific computer fits precisely into what von Neumann was describing. Consider multi-layer caches, CPUs that execute directly out of RAM, memory-mapped IO, or even microcoded designs where the instruction set of the CPU is really an emulated layer on top of a deeper processor; it's really hard to reconcile these modern machines with a von Neumann architecture. I'd say that computer design has taken all the pragmatic approaches available. We have had wholesale architecture reboots every ten years or so. Even modern x86 doesn't resemble the original all that much.

However, as a conceptual bundle to teach aspiring computer science students about hardware? It's great!

Requiring that our software be able to simulate a Turing machine has probably held us back more, although that basic abstraction did get us to this point so in no way am I besmirching the work of Dr. Turing.


Ever since my PhD work (where I worked with some pretty obscure computing architectures), the phrase “Von Neumann Architecture” makes me think immediately of D. H. Lehmer’s quote [1] on working with ENIAC:

> Can we use the high speed computer to do the sieve process? This was a highly parallel machine, before von Neumann spoiled it.

[1]: https://history.computer.org/pioneers/lehmer.html


Neat. It makes me wonder if there is work on mechanical computers at around the atomic scale.


yes, but they call that synthetic biology.


Quantum computers.


that's sub-atomic scale? because particles? which are subatomic?

hmmmmm....


Their tagline at the end:

> MemComputing is available today and is delivering the performance expected of quantum computers across a wide range of valuable optimization problems in industry.

Quantum computers unfortunately are not expected to do well against traditional OR algorithms in most (all?) OR problems (Grover search only helps in brute-force search; Shor is not quite an OR application, and relies on quantum effects), so not really a strong selling point in my view.
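To make the "Grover only gives a quadratic speedup" point concrete, here is a back-of-the-envelope comparison of oracle-query counts for unstructured search; these are the standard textbook figures (~N/2 expected classically, ~(π/4)√N for Grover), nothing specific to MemComputing or any vendor:

```python
import math

def classical_queries(n):
    # Expected oracle queries for unstructured search over n items.
    return n / 2

def grover_queries(n):
    # Grover's algorithm needs about (pi/4) * sqrt(n) oracle calls.
    return (math.pi / 4) * math.sqrt(n)

n = 2 ** 40
print(f"classical ~{classical_queries(n):.3g} queries, Grover ~{grover_queries(n):.3g}")
```

A quadratic speedup is real but modest: good problem-specific OR heuristics routinely prune far more of the search space than brute force ever visits, which is why Grover rarely beats them.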


> "SOG-based Circuits (SOC) can provide ultra low-power, real-time computation for many challenging computational tasks common in signal processing, graphics, physical simulations, AI, etc. These circuits are also very efficient when solving complex optimization problems. Somewhat surprisingly, just the emulation of this circuit in software (Virtual Memcomputing Machine) provides solutions to these problems orders of magnitude faster and more accurately than today’s state of the art technologies."

It smells like some BS to anyone who knows about this kind of thing, but I think their marketing is for the ones who don't know.


MemComputing mainly present themselves as optimizers: Their machine can solve real optimization problems, in practice, today. This is very good because, through that lens, we can skip the BS and go straight to the facts. The evaluation criteria for any optimization approach/algorithm are:

- the quality of the solutions found (in the best case: optimal solutions)

- the presence or absence of optimality guarantees (for example, some algorithms provide optimal solutions very often, but cannot guarantee it 100%)

- the time (or computational cost) needed to find such solutions
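A toy illustration of these three trade-offs, assuming nothing about MemComputing's internals: an exhaustive search (optimal, with a guarantee, but exponential cost) versus a greedy heuristic (fast, no guarantee) on a tiny made-up 0/1 knapsack instance.

```python
from itertools import combinations

# Tiny 0/1 knapsack instance; the numbers are made up for illustration.
items = [(12, 4), (10, 6), (8, 5), (11, 7)]   # (value, weight)
capacity = 11

def exhaustive(items, cap):
    # Optimal and provably so, but cost grows as 2^n subsets.
    best = 0
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            if sum(w for _, w in combo) <= cap:
                best = max(best, sum(v for v, _ in combo))
    return best

def greedy(items, cap):
    # Fast heuristic (take by value density); no optimality guarantee.
    total = 0
    for v, w in sorted(items, key=lambda x: x[0] / x[1], reverse=True):
        if w <= cap:
            cap -= w
            total += v
    return total

print(exhaustive(items, capacity), greedy(items, capacity))  # 23 vs 22
```

On this instance the greedy answer (22) is close to but below the guaranteed optimum (23); serious solvers are judged on exactly this gap, and on the time taken to close it.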

Furthermore, the state-of-the-art (SOTA) is well known for most types of optimization problems. In this article, they present a list of real and practical applications, so let us have a look at them one by one:

1.1 Traffic flow optimization

Simple flow problems can be solved in polynomial time (and quickly in practice), so there is no need for anything fancy. Once you introduce additional constraints or discrete variables, the SOTA is mixed-integer programming for offline problems. For online problems it's more complicated. In both cases, I am not aware of any application in which MemComputing can reach SOTA.
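"Solvable in polynomial time" here is concrete: a standard algorithm like Edmonds-Karp finds a provably maximum flow in O(V·E²). A compact sketch on a made-up toy network (node names and capacities are invented for illustration):

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: push flow along shortest augmenting paths (BFS).

    capacity: dict-of-dicts, capacity[u][v] = capacity of edge u -> v.
    Polynomial time, O(V * E^2), with an optimality guarantee.
    """
    # Build a residual graph that includes reverse edges.
    nodes = set(capacity) | {v for u in capacity for v in capacity[u]}
    residual = {u: {} for u in nodes}
    for u in capacity:
        for v, c in capacity[u].items():
            residual[u][v] = c
            residual[v].setdefault(u, 0)

    flow = 0
    while True:
        # BFS for a shortest augmenting path with spare capacity.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, c in residual[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow  # no augmenting path left: flow is maximum
        # Find the bottleneck along the path and update the residuals.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

# Toy "road network": s -> {a, b} -> t, capacities entirely made up.
net = {"s": {"a": 10, "b": 5}, "a": {"t": 8, "b": 4}, "b": {"t": 7}}
print(max_flow(net, "s", "t"))  # 15
```

This is the bar any new hardware has to clear on plain flow problems: a few dozen lines of classical code already give an exact answer with a proof of optimality.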

1.2 Vehicle routing & scheduling

Here, depending on your computing time constraints, needs for solution quality and/or optimality guarantees, the SOTA can be constraint programming (Gecode, OptaPlanner), local search heuristics (LocalSolver), or mixed-integer programming (CPLEX/Xpress/Gurobi) with column generation. MemComputing is nowhere to be seen again.

1.3 Supply chain optimization

This is a very broad field, but in general mixed-integer programming is king here.

2.1 Protein folding

Here there is quite an objective measure: the biennial CASP competition. This is where AlphaFold made a splash at CASP14 in 2020. MemComputing has never participated.

2.2 Genome sequencing

I am not knowledgeable enough to comment here.

2.3 Radiotherapy treatment

I am not very knowledgeable here either, but last I looked mixed-integer programming approaches were favored.

3.1 Portfolio risk optimization

Various types of branch-and-bound solvers. Mixed-integer linear/quadratic/convex programming. No MemComputing.

3.2 Detecting market instabilities

No idea.

3.3 Optimizing trading trajectories

No idea.

4.1 Training neural networks

Many people here know how this is done. Stochastic gradient descent on GPUs or TPUs. No MemComputing involved. How can they even claim to be active in this field?
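For anyone who doesn't know "how this is done": training is gradient steps on a loss over shuffled data. A minimal pure-Python caricature (no GPUs; the dataset and learning rate are made up) fitting y = 2x + 1 with per-sample SGD:

```python
import random

random.seed(0)
# Toy noiseless dataset drawn from y = 2x + 1 (made up for illustration).
data = [(x, 2 * x + 1) for x in [i / 10 for i in range(-20, 21)]]

w, b = 0.0, 0.0   # model: pred = w*x + b
lr = 0.05
for epoch in range(200):
    random.shuffle(data)           # the "stochastic" part
    for x, y in data:
        pred = w * x + b
        err = pred - y             # d(loss)/d(pred) for loss 0.5*err^2
        w -= lr * err * x          # gradient step on the weight
        b -= lr * err              # gradient step on the bias

print(w, b)  # converges toward w = 2, b = 1
```

Scaled up to billions of parameters and run on GPUs/TPUs, this loop is essentially the whole field; there is no obvious slot in it for a MemComputing device.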

4.2 Detecting statistical anomalies

Vague.

4.3 Classifying unstructured datasets

No idea.

The problem is that if you invent a new optimization algorithm, it is very easy to find one instance of one problem for which your algorithm works well. They did literally that in a paper [1]: They took a library of mixed-integer programming problem instances containing 270 benchmark problems, and published a whitepaper showing that they beat a SOTA solver on one of them. A single instance out of 270!

The really hard part is the opposite: given a class of problems, find an algorithm that beats the SOTA. MemComputing has never done that. Combined with their propensity for grand claims backed by misleading evidence, MemComputing has accumulated a lot of ill will from the academic community over the years. My suspicion is that, while on the surface this post seems to put their approach in contrast to quantum computing, what they are really trying to do here is ride the quantum computing hype wave.

[1] https://arxiv.org/abs/2003.10644


Oranges vs Apples.


classical Vs quantum?



