For me at least, static typing is overall a strength. Yes, it's a bit more work to include types when declaring variables, but the benefit is that you avoid the type problems in expressions that you get with dynamically typed languages (JavaScript, for example, which is one of the reasons TypeScript was created).
... although, Java does support something that looks like dynamic typing: you no longer need to write the type when declaring a local variable. Using the `var` keyword, the compiler will infer the type from the expression being assigned to it (although this is just syntactic sugar).
`var` has nothing to do with dynamic typing. The variable is still statically (compile-time) typed, so its type cannot change at runtime. Compare that to JavaScript, where you can easily switch a variable from Number to String.
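A minimal sketch of the distinction (hypothetical snippet, not from the thread): the compiler pins the inferred type at the declaration, so reassigning a different type is a compile error.

```java
// Hypothetical snippet: `var` infers a fixed static type at compile time.
import java.util.ArrayList;

public class VarDemo {
    public static void main(String[] args) {
        var count = 42;       // compiler infers int
        count = 7;            // fine: still an int
        // count = "seven";   // compile error: incompatible types

        var names = new ArrayList<String>(); // inferred as ArrayList<String>
        names.add("Ada");
        System.out.println(count + " " + names); // prints: 7 [Ada]
    }
}
```

In JavaScript, by contrast, `let count = 42; count = "seven";` is perfectly legal, because the type lives on the value, not the variable.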
Agreed, it's not (as mentioned, it's just syntactic sugar). Still, how often do you actually need to change a variable's type? (Beyond minor casting issues.)
And I'm not saying that dynamic typing doesn't have a place; I really like working in Python. It's just that for more complicated code, I prefer static typing, as it leads to fewer problems in your expressions. To each their own.
In my opinion Helidon is the most refreshing of these frameworks. It supports virtual threads from the ground up and comes with clean, functional, mostly annotation-free APIs. It really looks like a Java framework should look today.
Windows' reputation is declining, so the operating system might be the actual crisis. Linux with modern desktops (e.g. GNOME 3) might fill the gap, but the market is far from broad adoption. Promoting and improving the Linux desktop and its apps would be a long endeavour, but betting only on Windows, which is degrading into a cloud and AI advertising surface, might be fatal.
While that's true, I also think these things tend to happen as a gradual build up to the tipping-point effect where the zeitgeist shifts so suddenly that a massive player is suddenly irrelevant.
Microsoft is structurally incapable of making Windows better. Intel is intrinsically incapable of making x86 better (enough to matter). x86 hardware manufacturers are in a price race to the bottom, and there's no way around that.
Apple doesn't have any of those problems. Instead, more and more young people can afford and aspire to get a Mac. They want to buy software that works on the mac, and they'll want to write software for the Mac. The network effect compounds.
I swear that I read this comment in 2019, and it's still wrong today. Young people want iPhones, go look at Apple's revenue breakdown. iPhones and iPhone accessories dwarf Mac sales, the only comparable product in terms of revenue is the iPad. There is no evidence that Apple Silicon has changed that B2C story.
In the broader B2B sense, Apple lost pole-position to Nvidia. They're not the ecosystem kingmaker they once were, and their ARM architecture is failing to subsume demand for their competitors. The "Private Compute" Mac-based servers are going terribly according to reports, and their contribution to the chip shortage has even driven them to collaborate with Intel Foundry Services: https://www.macrumors.com/2025/11/28/intel-rumored-to-supply...
The zeitgeist exists on forums like this. Outside where people touch grass now and then, they largely don't care.
x86 OEMs are in a race to the bottom because that's how the PC market has been for eons; PCs are a tool, not a status symbol. But how has x86 not 'gotten better'? It's significantly more battery-friendly than it has ever been, by a long margin, matching the M-series.
It's just momentum. You buy Windows because you have Windows. You buy Windows because there's not really much of a choice. You have Windows because you're not going out of your way to reconfigure hundreds of laptops at your company just for your employees to be less comfortable with Linux.
Now introduce a choice… and things might change.
With the vast majority of software nowadays living in the browser, your OS matters less and less, especially for a business that buys machines for its employees.
OTOH, client Windows is the smallest and least important building block in it. Microsoft is helpfully also setting all their native apps on fire too and replacing them with webslop that runs equally poorly on MacOS, ChromeOS and Linux as it does on Windows 11, so the biggest concern is (A)AD integration and centralized management… and all three are decently manageable these days. If Microsoft didn't throw in the Windows licenses for free, more orgs would already be looking at ditching Windows 11, and if it keeps getting worse, even that won't look like a good deal any more.
Gradle could be used as an AI benchmark on its own! The syntax of plugin DSLs changes all the time. Extra credit for handling both old (Groovy) and new (Kotlin) versions of Gradle itself.
However, the cause is not really Java as such, but massive frameworks and tooling (e.g. Maven). Maybe AI will bring some fresh air to Java, because old truths might no longer be valid. But it will be hard to get rid of old habits, because they are baked into the AI training data.
My takeaway is that Go almost always prefers simplicity over good software engineering. `nil` without compiler checks is one example; designing a new language without generics is another. Still, the overall simplicity has a value of its own.
I agree, its strength (beyond goroutines) is that anyone who knows one of the popular languages (Python, Java, etc) can easily translate their idioms and data structures to Go, and the code would remain easy to read even without much Go experience. That's probably one reason why the TypeScript compiler team chose Go.
But this makes the language feel like Python, in some ways. Besides nil, the limited expressivity makes it more idiomatic to write things imperatively, with for loops appending to slices, rather than mapping over a slice. Its structurally typed interfaces feel like an explicit form of duck typing.
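A hypothetical sketch of both points (names are mine, not from the thread): the idiomatic loop-and-append pattern where other languages would use map(), and an interface that any type satisfies just by having the right method.

```go
package main

import (
	"fmt"
	"strings"
)

// Idiomatic Go: an explicit loop appending to a slice, where Python
// or JavaScript code might use a map()/comprehension one-liner.
func upperAll(in []string) []string {
	out := make([]string, 0, len(in))
	for _, s := range in {
		out = append(out, strings.ToUpper(s))
	}
	return out
}

// Structural typing: Named is satisfied by any type with a Name() method.
// No "implements" declaration needed, i.e. compile-checked duck typing.
type Named interface{ Name() string }

type User struct{ name string }

func (u User) Name() string { return u.name }

func greet(n Named) string { return "hello, " + n.Name() }

func main() {
	fmt.Println(upperAll([]string{"a", "b"})) // prints: [A B]
	fmt.Println(greet(User{name: "go"}))      // prints: hello, go
}
```

Nothing in `User` mentions `Named`; it satisfies the interface purely by shape, which is exactly the "explicit duck typing" feel described above.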
From what I remember of a presentation on how and why they made Go, this is no coincidence. They had a lot of Python glue code at Google, but had issues running it in production due to mismatched library dependencies, typing bugs, etc. So they made Go easy to port that Python code to (and, especially, easy for the people writing that code to switch to), while addressing the specific production issues they faced.
> the popular languages (Python, Java, etc) can easily translate their idioms and data structures to Go, and the code would remain easy to read even without much Go experience
Disagree, they made many decisions that differ from the mainstream: OOP and syntax, for example.
Sure, the syntax is unique, but it's fairly easy to get over that. I guess I'm comparing to Rust, where not only is the syntax different, but data structures like a tree with parent references aren't as straightforward (nor idiomatic), and there are a lot more explicit methods, which requires knowing which are important and which are just noise (e.g. unwrap, as_ref).
I would argue that after a short tutorial on basic syntax, it's easier for a Python/JavaScript programmer to understand Go code than Rust.
To me, Rust syntax is less alienating: it adapted ML syntax, which is probably the second most popular family (Scala, TypeScript, Kotlin) after C-style syntax, while Go, for whatever reason, ended up with something totally new.
Which aspects of Rust syntax are adapted from ML? Semantics sure, but to me the syntax seems a lot more similar to C++ (e.g. semicolons, type parameters using <>, etc.)
For some uses, that's all you need, and having more features often detracts from the experience. But I'm doubtful about how often that's actually the case: in every stack I've tried, I have been able to carve out a simple sub-language that is easier to use than Go.
I think of it as a bit like Python with stronger types.
I'm not convinced that you couldn't have good software engineering support and simplicity in the same language. But for a variety of mostly non-technical reasons, no mainstream language provides that combination, forcing developers to make the tradeoff that they perceive as suiting them best.
I don't like Bitwarden's UI/UX. It doesn't look polished. The "folders" in particular are awkward: given how they implemented them, calling them labels and designing them like labels would make far more sense. The whole UI looks like it was built by software developers, not designers.
I think I tried using it maybe 4 years ago or so, and I had the same feeling. It just felt.. awkward to use, lots of friction. I was hoping it had changed by now, but I guess that hasn't happened.