I don't think this is strictly true, though counterexamples are rare. The easiest example is the semiconductor industry. ASML's high-end lithography machines are basically alien technology and cannot be reproduced by anyone else; China has spent billions trying. I don't even think there's a way to make the IP public, because so much of it lives in people's heads and in the processes in place. I wonder how much money, time, and ASML resources it would take to stand up a completely separate company that can do what ASML does, even assuming ASML could dedicate 100% of its time to training that company's personnel.
The semiconductor industry is only tangentially or partially a tech industry. It produces physical goods that require complex physical manufacturing processes. The means of production are expensive, complex, and require significant expertise to operate once set up. The whole thing involves multiple layers of complex engineering challenges. Even if you wanted to make only a small handful of chips, you'd still have to go through all of that.
Most modern tech companies are software companies. To them, the means of production are a commodity server in a rack. It might be an expensive server, but that depends on scale. It might even be a personal computer on a desk, or a smartphone in a pocket. Further, while creating software is highly technical, duplicating it is probably the most trivial computing operation that exists. Not that distribution is trivial (although it certainly can be); just that if you have one copy of a piece of software or data, you have enough for 8 billion people.
No, I think it's very clear that upthread is talking about how software is difficult to build a moat around.
Chip fabs are literally among the most expensive facilities ever built. Arguing that because they don't need a special moat, nothing in tech ever needs one, is so willfully blind that it borders on disingenuousness.
That's the comment you should have responded with instead of the one that you did.
Upthread used the term "tech" when the thread is clearly talking about AI. AI is software, but because they used the term "tech" you cherry-picked non-software tech as a counterexample. It doesn't fit, because the kind of tech GPT-4 represents doesn't carry the manufacturing costs a chip fab does. The two are different in kind, regardless of the fact that both are called "tech".
Yeah, this is probably also true for TSMC, Intel, and ARM. Look at how slow progress has been on high-end RISC-V, despite RISC-V attracting the best academic talent.
Unfortunately, RISC-V, despite the "open source" marketing, is still basically dominated by one company (SiFive) that designs all the commercial cores. They also employ everyone who writes the spec, so the current "compiled" spec document is about 5 years behind the actual production ISA. Intel and others are trying to break this monopoly right now.
Compare this to the AI ecosystem and you get a huge difference. The architecture of these AI systems is pretty well-known despite not being "open," and there is a tremendous amount of competition.
Read the RISC-V foundation website. There are numerous "ratified" parts of the RISC-V instruction set that are not in the latest "compiled" spec document.
Saying a "compiled" spec is out of date may be technically accurate (or not, I don't have any idea) but if open, published documentation of the ratified extensions is on the web site, it's misleading to cite it as evidence that the spec is not open. And I know that the draft specifications are open for public comment prior to being ratified, so it's not a secret what's under development, either.
I never said that it wasn't actually open source. I just said that the openness hasn't actually created meaningful competition, because there is a single company in control of the specs that abuses that control to create a moat.
For a concrete example, the bitmanip extensions (which provide significant increases in MIPS/MHz) were used by SiFive in commercial cores before ratification and finalization. No other company could do that because SiFive employees could just change the spec if they did. They're doing the same thing with vector/SIMD instructions now to support their machine learning ambitions.
I would also add Samsung semi to that list. As I understand, for the small nodes, everyone is using ASML. That's a bit scary to me.
About RISC-V: what do you think is different about RISC-V vs. ARM? I can only think that ARM has been used in the wild for longer, so there is a meaningful feedback loop: designers can incorporate that feedback into future designs. Don't give up hope on RISC-V too soon! It might find a place in IoT, which needs more diverse compute.