Hacker News

From what I understand, chips are now laid out by an optimizer rather than by humans: the humans design an adder, but software decides where to place it and how to arrange the gates.

I took a couple of undergrad classes in CE, which makes me just dangerous enough to ask: how far away can we be from a macro system that converts small bits of imperative code into FPGA logic?
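For what it's worth, this is roughly what high-level synthesis (HLS) tools already do. A sketch, in the C-with-pragmas style that Vitis/Vivado HLS accepts (the function and names are mine, not from any vendor example); a normal C compiler just ignores the unknown pragma, so the same code runs on a CPU:

```c
/* A loop an HLS tool can turn into pipelined FPGA logic.
   The pragma is in Vitis/Vivado HLS style; plain compilers
   ignore it, so this is also ordinary runnable C. */
void saxpy64(int a, const int x[64], const int y[64], int out[64]) {
    for (int i = 0; i < 64; i++) {
#pragma HLS PIPELINE II=1  /* aim for one result per clock once the pipeline fills */
        out[i] = a * x[i] + y[i];
    }
}
```

The catch is that only simple, loop-shaped code maps this cleanly; pointer chasing and data-dependent control flow are where these tools fall over.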



Naively translating imperative code to run on FPGAs won't help much in the general case. It takes time to move data to and from the FPGA, so whatever runs there has to be expensive enough to pay back the transfer cost. FPGAs excel at highly parallel work (e.g. routing) and streaming applications (e.g. audio/video codecs), so most applications have no obvious way to benefit from one.
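The streaming case is worth seeing concretely. A toy 4-tap FIR filter (coefficients and names are my own invention) is the classic example: on a CPU it's a sequential loop, but on an FPGA the delay line becomes a shift register and all four multiplies happen in parallel, producing one output sample per clock with no data round-trips mid-stream:

```c
#include <stddef.h>

/* Toy 4-tap FIR filter: the kind of streaming kernel that maps
   well to an FPGA. The inner loops below become parallel wires,
   registers, and a multiply-accumulate tree in hardware. */
#define TAPS 4
static const int coeff[TAPS] = {1, 2, 2, 1};

void fir(const int *in, int *out, size_t n) {
    int shift[TAPS] = {0};              /* delay line (shift register) */
    for (size_t i = 0; i < n; i++) {
        for (int t = TAPS - 1; t > 0; t--)
            shift[t] = shift[t - 1];    /* register-to-register moves in hardware */
        shift[0] = in[i];
        int acc = 0;
        for (int t = 0; t < TAPS; t++)
            acc += coeff[t] * shift[t]; /* all taps multiply in parallel on an FPGA */
        out[i] = acc;
    }
}
```

Feeding it an impulse (1 followed by zeros) just reads back the coefficients, which is a quick sanity check on any FIR implementation.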


I'd also add "things with precise timing requirements" to the list, like driving an LED display.



