Every time I see one of these pop up, the thought that software engineers are forever trying to avoid knowledge, understanding, and wisdom with another layer of abstraction comes to mind.
I’m all in favor of a better HDL. Verilog/SystemVerilog is loaded with completely non-obvious landmines. I’ve been doing this so long I forget they are there, but it’s pretty painful seeing someone new to the language step on them. But the alternative, VHDL, has largely fallen out of favor in the US.
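To give a flavor of the kind of landmine I mean (an illustrative sketch, not anyone’s real code), take the classic blocking vs. non-blocking assignment trap in a clocked block:

    // Intended: a two-stage pipeline. With blocking assignments the
    // stages collapse, because q2 reads the freshly updated q1.
    module pipe_bad (input logic clk, input logic d, output logic q2);
      logic q1;
      always_ff @(posedge clk) begin
        q1 = d;    // blocking: q1 updates immediately
        q2 = q1;   // ...so q2 gets the new value of q1, i.e. d
      end
    endmodule

    // Non-blocking assignments sample the old values, so this really
    // is two flops, with q2 lagging q1 by a cycle.
    module pipe_good (input logic clk, input logic d, output logic q2);
      logic q1;
      always_ff @(posedge clk) begin
        q1 <= d;
        q2 <= q1;
      end
    endmodule

The syntax difference is one character, the simulator happily runs both, and the hardware you get is not the hardware you pictured.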
You would be hard pressed to find a more strongly typed language than VHDL, but damn is it verbose. None of the footguns, but you might get RSI before you finish typing the code in. If you have ever given Ada a try, VHDL will look pretty familiar.
I know this may be a weird thing for software folks to think about, but writing HDL is a tiny part of digital design. In digital design, if done with discipline, writing the HDL is an almost mechanical process of translating the design. In a design that might take a year, writing the code might be 3 weeks.
Done without discipline, you will spend all your time debugging, wondering why it worked in the lab an hour ago but after lunch nothing works, and you won’t be able to make sense of it.
Understanding basic combinational logic, then sequential logic, followed by state machines (which are the bread and butter of digital design), followed by IO timing and timing constraints (a brain-damaged “language” unto itself) will take you far.
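The state machine part in particular is almost always the same skeleton (an illustrative sketch, names made up): a state register, combinational next-state logic with a default assignment so no latches get inferred, and outputs decoded from the state.

    module req_ack_fsm (
      input  logic clk, rst_n, req,
      output logic ack
    );
      typedef enum logic [1:0] {IDLE, BUSY, DONE} state_t;
      state_t state, next;

      // state register
      always_ff @(posedge clk or negedge rst_n)
        if (!rst_n) state <= IDLE;
        else        state <= next;

      // next-state logic; the default keeps it purely combinational
      always_comb begin
        next = state;
        case (state)
          IDLE: if (req)  next = BUSY;
          BUSY:           next = DONE;
          DONE: if (!req) next = IDLE;
        endcase
      end

      assign ack = (state == DONE);   // Moore output: a function of state only
    endmodule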
Clock domain crossing isn’t so bad if you have those fundamentals.
Then you can spend time learning algorithms and other more interesting things, like writing low-power accelerators for neural nets and signal processing.
You can go through all the gyrations of language design in the world, but the language isn’t the hard part. There is a huge amount of improvement to be done, no doubt. But digital design is not about the language.
If you want to make the world of digital design a better place, more open, easier to break into, work on tools, not languages. I’d give a kidney for an open source timing diagrammer that could do simple setup and hold checks, create derived signals through Boolean combinations of other signals, and emulate a flop.
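The raw checking half of that wish already exists in primitive form in the language itself; what’s missing is the interactive diagramming around it. A sketch of what the simulator-side checks look like (module name and timing numbers made up):

    // A behavioral flop with setup/hold checks attached. A simulator
    // flags any d transition inside the windows around posedge clk.
    module dff_checked (input clk, input d, output reg q);
      always @(posedge clk) q <= d;
      specify
        $setup(d, posedge clk, 1.2);  // d must be stable 1.2 time units before the edge
        $hold(posedge clk, d, 0.4);   // ...and 0.4 time units after it
      endspecify
    endmodule

The flop emulation and the setup/hold math are the easy part; the derived signals and the interactive diagram are what need a real front end.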
I’d do it, but I’ve tried, and programming a GUI is about the most painful thing I’ve done on a computer. So much work for so little payoff.
> I know this may be a weird thing for software folks to think about, but writing HDL is a tiny part of digital design. In digital design, if done with discipline, writing the HDL is an almost mechanical process of translating the design. In a design that might take a year, writing the code might be 3 weeks.
I say this as an FPGA engineer who started my career in software: this used to be the case in software too. But we now have such good tools, with REPLs, automated tests, IDE integrations, and the ability to run partial implementations, that it's simply not efficient to build software that way anymore.
I feel like that is precisely where the current hardware ecosystem stands. VHDL/Verilog/SV are simply not good enough to use as design exploration tools.
Instead of saying "We use HDLs just to put in the design, which we do beforehand," it should be "HDLs are the central tool in digital design, both during design exploration and implementation." I think this is what people want out of these "neo" HDLs. The way you are viewing it is only as a straightforward drop-in replacement. That is not what it's about.
Assuming you meant to also include 'verification', then I can agree somewhat.
As an ASIC designer, you usually get just one stab at a hardware implementation; there's no 0-day point release. It's got to work, so you spend the minimum time typing it in. Then you devote the maximum time to verifying it. I think some of the very poor quality software we are all subject to is partly due to a false sense of security that software tooling provides. I expect fully working and efficient hardware; I'd like that from my software too, but in reality I rarely get that. The idea of applying more software dev processes/principles to hardware is a bit frightening to me.
Not all circuit design is ASIC; there is a ton of FPGA work. But in general I do agree with you. The move-fast-and-break-things ethos is something you shouldn't really import ;).
> Assuming you meant to also include 'verification', then I can agree somewhat.
This project involves static verification of one part of hardware design, namely the timing properties of pipelines and sequential circuits.
This is not all of verification (which is a huge issue in modern semiconductor design) or even of high-level design verification, but it probably helps.
I'm also an FPGA person who started off in software, and this is exactly how I feel; it's the reason I'm building my own HDL too. Software languages have evolved so much since the 80s and are so much more productive. HDLs have missed out on that, but convincing hardware engineers of that is an uphill battle.
Hell, I'd even give my left nut for a vendor- and IDE-agnostic tool that would convert a schematic to your HDL of choice. I still do a lot of work on graph paper.
"They" (someone not working in the field) try to solve a problem (i.e. warts in old and mature languages) that is literally not a problem in the field at all. Then they get pissed at you when you point out that nobody needs this, like at all. Oh well.
It works the other way around too, hardware getting designed that no sofware person asked for or wants to develop for. Most hardware people are bad at software design, and vice versa, but the differences are necessarily papered over in software, leaving a lot of the theoretical gains on the table.
They could at least use a number that reflects the current transistor density: the nm figure at which planar 1980s-style transistors would have to be built to reach the same density.
TSMC now labels their process e.g. “N3”. So there has been a shift away from nm in some contexts.
But it really doesn’t matter. There’s not a single physical number you can extract from the process that accurately describes its performance. So just continuing to use “nm” and assigning some number that feels right is actually a reasonable approach.
Transistor density for just, like, a grid of unconnected transistors? Or some reference design or something like that?
IMO it is interesting to get a general idea of where the companies are, but in the end the element size doesn’t matter to end users. People should check the LINPACK benchmarks of the chips that come out, or whatever.
SRAM size has not been scaling at all in recent nodes, so these days the notion of uniform scaling is also breaking down quite a bit. This means that future designs will have less cache memory per core, unless they use chiplets to add more memory (either SRAM or eDRAM) close enough to the chip to become usable as some kind of bespoke cache.
Just for the record, the "not at all" part is incorrect for the nodes I'm aware of. Correct would be "way worse", i.e. it's still getting denser, but the improvement is way worse than that of random logic.
TSMC's N3E (their first 3nm that will actually see broad use) has the same SRAM cell size as N5. Their original 3nm had 5% smaller SRAM cell size than N5, but that turned out to be too aggressive and the process was too expensive for most of their customers. So for the time being, TSMC has indeed hit a wall for SRAM scaling. But it looks like N3P will be able to shrink SRAM again.
Attempting to goad a critical reviewer into engaging in some sort of comments-section "public debate" (assuming that somehow the public discussion would change the reviewer's mind) a _decade_ after the fact? That behavior is, in a word, insufferable.
EDIT: Also, "I know it was a challenge to review a book of its size..." comes off as insinuating that (1) the book is somehow "grand" and (2) maybe the reviewer didn't "get it".
I remember when it came out because a friend was excited about it. As I recall it’s a pretty large book.
Edit: just under 1200 pages on Amazon. I never got into it because I couldn’t figure out what the big revelation was supposed to be. It would take some serious dedication to go through such a large book for the sake of an unfavorable review.
I've been using vi variants for decades and didn't know ZZ or ZQ. This solves a major pain point for me (mis-typing :q, trying again, now I'm off in the weeds).
ZZ is particularly nice since it sorta "does the right thing" in that if you have an unwritten empty buffer, it will silently discard it and quit. I guess I haven't really run into this as I was taught ZZ from the beginning but I imagine hitting :wq<cr> on such a buffer is pretty annoying.