Why does Intel need to die? Sure they're not exactly the company they used to be, but would it be enough for them to just move away from x86? I'm just thinking I don't want only one or two or three companies making procs.
Intel stagnated and at the same time started implementing some rather anti-consumer practices. This allowed AMD to take the performance lead from them with their latest generation of products. It’s fantastic that the market for processors is so competitive. I’ve grown to not like Intel very much recently, but I’m glad they’re here. They’ll keep the pressure on for further innovation, so AMD will either need to keep up or be overtaken again. Either is a good outcome for consumers.
Absolutely not. We need more competition, because the #1 reason we got into this situation is monoculture and a single platform (x86). We need Apple to succeed in creating an alternative ARM-based desktop/laptop platform, and for more competition we could add MIPS64 from China to the mix. I am really hoping that by 2025 we will have 3 major platforms available to end customers, so that there is real competition.
And where are those chips going to be made? The issue with Intel's dominance is that it extends to the supply side as well.
You fail to realize that this isn't like 3D printing or other low-volume manufacturing. You can't just set up a 100nm Si lithography lab in your spare room and churn out RISC-V chips.
In 5 years, realistically, we will have a few high-performance (non-mobile) ARM chips manufactured at economic scale. Any other type of disruption would require Intel and AMD to fail and relinquish their supply-side capacity... or China investing billions into new chipmaking facilities now (it takes a few years to build that capacity).
China already has 14nm online and should have 7nm in a year or two, so we will probably see some real RISC-V chips from there soon if sanctions continue.
So I think we will end up with four-way competition between Intel, AMD, Apple, and Chinese RISC-V chips.
That being said, I don't see x86 dying; I think AMD, and eventually Intel once they wake up, will be competitive.
When you say they "should have 7nm in a year or two," are you just banking on them copying or stealing a European-made EUV machine?
China cannot be competitive at the razor's edge if its semiconductor companies depend on promptly copying/stealing technology that European, Taiwanese, and American companies bring to market.
7nm doesn't require EUV. Intel has 10nm, which is roughly equivalent in density to TSMC's 7nm, without EUV, although it's not that great.
SMIC has already produced some 7nm chips without EUV.
As for EUV further into the future, there has been quite a bit of research in that domain for many years in anticipation of this, and while I think they will be a node or a node and a half behind for a while, they will almost certainly have one ready eventually. Of course, that will be accelerated by stealing data on EUV machines, or maybe buying a used EUV machine from someone and reverse-engineering it.
I don't see x86 dying either; I think it will be dominant in the desktop/laptop segment for a long time. I am not sure why Loongson uses MIPS64 over RISC-V. Is RISC-V generally available and ready for prime time?
I think Loongson still uses MIPS64 because of institutional knowledge. It's more that Alibaba and HiSilicon strike me as promising, and they both seem to be getting on the RISC-V train.
That is a great question. I am not familiar with how much the production of these CPUs depends on ASML, TSMC, etc. I think China has been forced to build its own supply chain ever since the Obama-era ban on Intel chips in Chinese supercomputers.
How much innovation actually comes from China, versus just being stolen by China?
On their earnings calls, the lithography companies actually have to talk about the measures they take to stop China from stealing their IP.
China is a manufacturing hub, but its (often government-backed) chip companies run low-margin businesses that don't make enough money to invest heavily in R&D. Go look at Apple's or Qualcomm's gross margins and compare them to Huawei's or Xiaomi's.
This really depends. Once upon a time, at least in the UNIX (tm) world, there was a plethora of ISAs, and that was the environment where ideas like Java really made sense. Write once, run anywhere.
Most OSes are still pretty well situated to handle this. Java remains, and is easily cross-platform. I can run Java-based proprietary games like Minecraft on my POWER9 desktop, even though probably no one involved ever considered ppc64le a valid target.
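To make the contrast concrete, here's a trivial C sketch (the macros are standard GCC/Clang predefines, nothing project-specific): native code is "write once, compile everywhere," with each build baked for exactly one ISA, which is the gap JVM bytecode was designed to close.

    #include <stdio.h>

    int main(void) {
        /* Each compiled binary takes exactly one of these branches;
           the JVM's trick is deferring this choice to runtime. */
    #if defined(__x86_64__)
        puts("this binary only runs on x86-64");
    #elif defined(__aarch64__)
        puts("this binary only runs on AArch64");
    #elif defined(__powerpc64__)
        puts("this binary only runs on ppc64");
    #else
        puts("this binary targets some other ISA");
    #endif
        return 0;
    }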
The CLR on Windows is also pretty easily cross-platform, although it won't help legacy x86 PE executables. Apple has solved this on the tooling side for ages, encouraging production of "fat" binaries with multiple arches since OS X was NeXTSTEP and your .app packages needed to run on x86 + m68k + SPARC + PA-RISC.
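For the curious, a fat binary is a simple container: a big-endian header up front counting the per-architecture slices that follow. Here's a minimal C sketch of peeking at that header; the magic constant and layout come from Apple's <mach-o/fat.h>, inlined here so it compiles standalone, with error handling kept minimal.

    #include <stdio.h>
    #include <stdint.h>
    #include <arpa/inet.h>  /* ntohl: fat headers are stored big-endian */

    #define FAT_MAGIC 0xcafebabeU  /* marks a multi-architecture binary */

    int main(int argc, char **argv) {
        if (argc != 2) { fprintf(stderr, "usage: %s <binary>\n", argv[0]); return 1; }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        uint32_t magic, nfat;
        if (fread(&magic, 4, 1, f) != 1 || fread(&nfat, 4, 1, f) != 1) {
            fprintf(stderr, "short read\n"); fclose(f); return 1;
        }
        if (ntohl(magic) == FAT_MAGIC)
            printf("fat binary with %u architecture slices\n", (unsigned)ntohl(nfat));
        else
            printf("not a fat binary (magic 0x%08x)\n", (unsigned)ntohl(magic));
        fclose(f);
        return 0;
    }

(Apple's lipo tool does the same job properly; "lipo -info somebinary" lists the slices.)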
Emulators like Rosetta (and qemu's user-mode emulation) can fill the gap for legacy executables, while these other technologies keep the end-user experience good. Of course, that only works if a) someone writes your platform's equivalent of Rosetta, and b) developers write cross-platform apps.
So, the answer depends on how cynical or optimistic you are :-)
The experience in Linux distros is that extra arches surface bugs that other arches paper over, leading to higher-quality software. For example, unaligned memory access is merely slower on some arches but crashes outright on others.
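To make that concrete, here's a minimal C sketch of the hazard. Dereferencing a misaligned pointer is undefined behavior in C; x86 usually tolerates it silently, while stricter arches (SPARC, some older ARM cores) raise SIGBUS, so the same bug hides on one arch and crashes on another.

    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>
    #include <inttypes.h>

    int main(void) {
        unsigned char buf[8] = {1, 2, 3, 4, 5, 6, 7, 8};

        /* Hazardous pattern: a 4-byte load from an odd address.
           Works on x86, may SIGBUS on alignment-strict arches. */
        uint32_t *p = (uint32_t *)(buf + 1);  /* misaligned pointer: UB */
        printf("unaligned load: %" PRIu32 "\n", *p);

        /* Portable alternative: memcpy lets the compiler pick safe loads. */
        uint32_t v;
        memcpy(&v, buf + 1, sizeof v);
        printf("memcpy load:    %" PRIu32 "\n", v);
        return 0;
    }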
It's time for tick-tock to die and Moore's Law to stop being the guiding light of Intel management.
Instead, they should set up two groups: one to generate new architectures for desktop and server, and another to take the best features of those architectures and make them thermally efficient for use in laptops. The development of these two products should be unconstrained by time, because as we have seen, impossible deadlines delay the possible.
In the past 10 years, most of the chips that amaze me have simply done what was already possible, just with enough thermal efficiency that they can be placed in mobile devices.
Intel dying would be horrible for the world. They have so much institutional knowledge...
Is some of this because of those processor-level flaws/exploits, where the fixes meant disabling some processor features and made chips slower and less efficient?
With only a completely new/different architecture able to win those advances back?
But I would also be a little wary, because ARM systems are way more locked down than x86 systems today.