Evolutionary Algorithms and Analog Electronic Circuits (hforsten.com)
50 points by andars on Sept 14, 2015 | 20 comments


A related article with some more complicated circuits http://www.damninteresting.com/on-the-origin-of-circuits/

That one evolved an FPGA configuration to do things like detect tones or voice commands. Weirdly reminiscent of DNA: logic cells which were apparently useless would cause the circuit to break when they were removed. Nor did the layout work when transferred to an identical FPGA.


> Nor did the layout work when transferred to an identical FPGA.

Now that is quite fascinating. (The article goes into possible explanations, for those who care.)


"Not very good as a reference but considering that this was generated automatically I think it's a success"

Behold 80% of the appeal of evolutionary algorithms... people get so excited that these things do anything at all that they're willing to overlook that you burned CPU-hours to produce something that doesn't do the job very well. There's a weird disconnect between how people speak of them and the actual standards applied to them, one that applies to very few other things. (Neural nets, perhaps. Even if they recently got better, they were still grossly overrated for a long time.)


There is an annual award given to the creators of genetic algorithms that produce results competitive with humans: http://www.genetic-programming.org/combined.php

Some of the results are really impressive. There are domains where stochastic optimization works really well.

And even when it doesn't generate human competitive results, it's still cool. It's a computer program doing something previously only humans could do.


My criticism is targeted more at credulous blog posts, which do things like paint pictures with polygons more slowly than a human could do it in a paint program, with worse results. Professionals know what's what, and when to use it. I know it's not useless; it just doesn't particularly resemble what blog posts like this present it as.


Is this happening in 2015?



I studied EC a lot, and this is actually one of the strongest success areas for evolutionary algorithms. In general evolutionary computing does far better with real-world, analog, and physical systems than it does with brittle software systems. Other areas of success include mechanical component design, antennas, and materials.

The results must be post-validated of course, since in some cases the algorithm will over-optimize to quirks of the SPICE (or other) simulation that's used to implement the fitness function.
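
To make that concrete, a minimal sketch in Python (the fitness function is whatever wraps your simulator run; the names, the 5% spread, and the threshold are all made up for the example):

    import random

    def post_validate(netlist, nominal_models, fitness, trials=20, tol=2.0):
        # Re-score a winning candidate with randomly perturbed device
        # models. If its fitness (lower is better) collapses off-nominal,
        # it was probably exploiting a quirk of the simulator rather
        # than a real effect.
        base = fitness(netlist, nominal_models)
        for _ in range(trials):
            perturbed = {name: value * random.gauss(1.0, 0.05)
                         for name, value in nominal_models.items()}
            if fitness(netlist, perturbed) > tol * base:
                return False  # only works at exact nominal parameters
        return True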


Great point - they work well for anything where the optimization surface is relatively smooth, where small changes don't lead to highly suboptimal solutions. In programming, a single character can make a program go from working to not, so the likelihood of "discovering" a valid program is low.
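
A toy illustration of the difference, in Python (everything here is invented for the example):

    import random

    def mutate_component(value):
        # Smooth surface: nudging a resistor by a few percent changes
        # the circuit's behaviour gradually, so selection has a
        # gradient to follow.
        return value * random.gauss(1.0, 0.05)

    def mutate_source(src):
        # Brittle surface: flipping one character of source code
        # usually yields something that doesn't even parse, so fitness
        # falls off a cliff instead of degrading gracefully.
        i = random.randrange(len(src))
        return src[:i] + random.choice("abcxyz(){}+-*") + src[i + 1:]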


There's a whole subdomain of genetic programming research that centers around the design of evolvable computer languages and instruction encodings. There are some immensely interesting programming languages and instruction sets out there for which this is less true than for ordinary programming. But we have yet to devise an encoding that is truly analog and smooth, except maybe for really exotic encodings that are too slow to be useful in any practical system. These are sort of like homomorphic crypto: interesting, but too compute-intensive to run anything real.
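
The usual trick, for what it's worth, is to pick an encoding where every possible mutation still decodes to a runnable program, so the landscape at least has no "doesn't parse" cliffs. A toy sketch in Python (invented for the example, not any particular published encoding):

    import random

    OPS = ("add", "sub", "mul", "copy")

    def run(program, regs=(0.0, 1.0, 2.0, 3.0)):
        # Toy register machine: every byte decodes to a valid
        # instruction, so any random byte string is an executable
        # program and no mutation can break parsing.
        regs = list(regs)
        for byte in program:
            op = OPS[byte % 4]
            a, b = (byte >> 2) % 4, (byte >> 4) % 4
            if op == "add":
                regs[a] += regs[b]
            elif op == "sub":
                regs[a] -= regs[b]
            elif op == "mul":
                regs[a] *= regs[b]
            else:
                regs[a] = regs[b]
        return regs[0]

    # Any mutation of the byte string yields another runnable program:
    program = bytes(random.randrange(256) for _ in range(16))
    print(run(program))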


I agree whole-heartedly with the point about post-validation. I work in the field (EC), and the first dozen results you get are usually instances of the algorithm finding where your model or your assumptions are broken. For that reason the "On the Origin of Circuits" link posted in another comment is a really bad example. The results are total garbage because there was no model at all, just an actual physical FPGA. Models are great because they abstract away the effects that you really shouldn't rely on in your design. Without a model, the FPGA optimization gives us an answer that's only valid for one particular FPGA, at a particular point in its lifecycle. In fact, this might be why software optimization breaks: nobody's figured out how to abstract away the irrelevant effects.


Nice. You may be interested in Koza's work in this area if you're not already familiar with it:

http://www.eecs.harvard.edu/~rad/courses/cs266/papers/koza-s...


He needs to put resistors in series with both the signal source and the power supply. Also there should always be some kind of load.

With a 0-ohm signal source impedance, very strange things happen. For example, a common-emitter amplifier will seem to have much better bandwidth than it actually does (the ideal source hides the Miller and base capacitances).
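
If the harness were built in Python, the fix could look something like this (a sketch; node names and values are invented):

    def wrap_in_harness(evolved_netlist):
        # Wrap an evolved circuit in a realistic test fixture: a
        # nonzero source impedance, a resistor in series with the
        # supply, and a load, so the optimizer can't exploit ideal
        # 0-ohm sources that hide Miller and base capacitance.
        return "\n".join([
            "Vin in 0 AC 1",
            "Rsource in sig 50",    # 50-ohm source impedance
            "Vcc rail 0 DC 9",
            "Rsupply rail vcc 10",  # supply is not an ideal source
            evolved_netlist,        # assumed to connect sig, vcc, out
            "Rload out 0 10k",      # always drive some load
            ".end",
        ])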


This looks fun. One could conceive of a three-pronged approach to circuit design:

* Manual placement of components, based on the designer's skill and knowledge

* Insertion of preexisting blocks from a library

* Specification of requirements for a generated block to satisfy, triggering evolutionary search as described in the article

I'm not sure how likely it is that beginning with an existing design would impede the evolutionary algorithm from escaping a local optimum created by a bad early decision, but one could integrate the third strategy in the above list with the first two in varying degrees.

Loosely related by analogy in the software domain: superoptimization. [1]

[1] https://en.wikipedia.org/wiki/Superoptimization


Not exactly this, but there is already a router (a piece of software used to design a printed board from a schematic) employing a GA [1].

You mix automatic optimization with manual routing (using your skill and knowledge) and can switch between these modes at any point. Your manual edits can be used in new generations, or you can just tell the autorouter: “keep away from this area”. I had some experience with it and was quite satisfied.

[1] https://en.wikipedia.org/wiki/TopoR


> To make this program useful it's probably necessary to add checks that power used by components is reasonable.

Indeed. It would also be interesting to see power consumption addressed in the cost function.
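
Something as simple as a weighted term might do it (a sketch; response_error and supply_current are hypothetical helpers wrapping the simulation, and the constants are made up):

    VCC = 9.0            # supply voltage, made up for the example
    POWER_WEIGHT = 10.0  # fitness cost per watt of supply draw

    def cost(candidate):
        # Weighted sum: match the target response, but charge for
        # every watt drawn so the search doesn't buy accuracy with
        # brute-force bias currents.
        error = response_error(candidate)        # hypothetical helper
        power = supply_current(candidate) * VCC  # average supply draw
        return error + POWER_WEIGHT * power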


When I worked as a research associate I developed a genetic algorithm for constrained via minimisation, which helps design circuit boards with the minimum number of vias/holes between the layers of the board. Indeed, it worked well. Unfortunately the professor took all the credit; to me it was a good-paying job. Search for "constrained via minimisation using GA" for more on the topic.


How resilient is it to component tolerances? Have you tested the actual circuits to confirm that SPICE was accurate?


He should include a Monte Carlo tolerance test as part of the simulation.
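
Something like this, perhaps (a sketch; fitness is whatever wraps the SPICE run, and the netlist representation is invented):

    import random

    def monte_carlo_fitness(netlist, fitness, trials=50, tol=0.05):
        # Score a candidate by its worst run across random component
        # spreads (here a uniform +/-5%), so evolution can't rely on
        # exact nominal values that real parts won't have.
        worst = 0.0
        for _ in range(trials):
            perturbed = [(name, value * random.uniform(1 - tol, 1 + tol))
                         for name, value in netlist]
            worst = max(worst, fitness(perturbed))  # lower is better
        return worst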




