Hacker News

Key takeaway for me was the note that it “definitely isn’t a community designed language”. This is telling: I’ve watched a lot of neat languages over the years, and absolutely none of the ones without community involvement have grown beyond the sphere of influence of their primary companies.

I like Swift enough that I’m learning it to build my own personal tech-nerd ideal podcast client, because I own Apple devices and want an app that works across macOS, iOS, iPadOS, tvOS, and watchOS. But I doubt I’ll ever use it for anything beyond this one personally motivated project. Even if I release it on the App Store for download or purchase, I don’t know if I’ll ever be motivated enough to build anything else with it, because the scope is too narrow. Business work is converging on open stacks like React and Angular, plus the dark horse of C#, whose latest release supports WASM web components backed by gRPC-Web and a C#-driven stack from front to back. Even without SQL Server costs, that’s a compelling ecosystem backed by PostgreSQL and other completely open source tools.

But Swift remains Apple’s language for Apple stuff and… while a profitable niche, it’s still a niche.

Edit: typo fix.




I’d much rather have a language specifically for creating apps than another generic language. It’s actually my opinion that Swift has gotten worse since they started wanting it to be used for more things, like the server side.


The weird stuff came out of TensorFlow and SwiftUI, IMO. Distributed actors don’t look like they’re hurting the design.


Which sucks, because Apple had the chance not to repeat their mistake with Objective-C .. but history repeats itself.

No matter how much Apple’s market share grows in whatever market, developers want to be able to switch platforms and not have to think about the language.


> apple had the chance not to repeat their mistake with Objective-C .. but history repeats itself.

Objective-C has served them for decades, building several different OSes across a multitude of CPUs and platforms.

If anything, Swift would welcome a repeat of this history.


You can build OSes in many different languages, and compile software to many different CPUs and architectures.

That alone does not make it good.


APIs designed for and built in Objective-C have been the mainstay of Apple application development ever since the Cocoa framework, stretching all the way back to NeXTSTEP in the late 1980s. Even with the advent of Swift, most of today's iOS and macOS APIs carry the legacy of Objective-C. Much of development with Swift has involved attempting to use the new language with old APIs. Perhaps legacy alone does not make it good, but it certainly demonstrates that there has been no need to rewrite all of it in newer languages; Objective-C has been reliable and venerable.


Objective-C is such a small and concise language. It doesn't get in the way of development in quite the same way Swift can delay building out a feature due to bikeshedding.


Vendor lock-in will always inevitably backfire, because developers and users don't want to be trapped. It might work to maximize profits in the short to medium term, but there are very strong financial incentives not to be stuck with a single platform. Vendor lock-in could mean the death of your product, or even your whole business.


Most new products are positioned to grow into a form of locked-in market that you can extract value from, drawing a box around their customers for the sole purpose of squeezing them in the future. This business strategy is a mind virus and has even overtaken the aspirations of bright-eyed entrepreneurs. The aim has fallen from the simple "get rich by selling a $1 item to 10 million people" down to "create a product where customers are trapped in a dependency relationship with the product by design, give it away below cost to push out alternatives, then flip it and squeeze them for as much as possible" (where the last part is omitted from the initial business plan but still implied, and enabled by outsourcing the dirty work to new management via buyout).

The primary goal should be to maximize value, and within that a balanced tension between maximizing delivered value vs maximizing captured value. It's reasonable to be compensated for the value you add, but it needs to be in service of maximizing value in general. If the correct hierarchy of goals is inverted and capturing value becomes the primary aim then it inevitably devolves into this antisocial, monopolistic, lock-in behavior.


I'd say at least 95% of Swift users don't mind at all that it's mostly an Apple language.


Well, and 95% of heroin users don't mind needles and constipation. It's trivially true that the people who end up using a product are mostly the people who are content with its tradeoffs. That doesn't say much at all about whether those tradeoffs are ultimately optimal.


I don't know about optimal, but I've been a Swift developer since day 1 and yet I've never heard anyone I know complain about Swift being too Apple-centric. It's typically armchair philosophers online who have purity concerns, not practical developers.


I don't have a problem with it being Apple controlled. I have a problem with it being presented as an open project while simultaneously Apple completely controls its trajectory and maintains private forks of everything, which are where the developer tools are actually built from.

If it was a closed process where neat stuff just dropped out of the sky each year, that would be fine. When features drop out of nowhere each year and then get laundered through ex post facto public "review", then I take issue.

If it was a closed process then we would expect that only features coming from Apple would exist. In a truly open process there would be facilitation for contributions from people outside the organization. In the Swift project as it is currently run, those contributions have withered on the vine; the core team doesn't particularly welcome or support anything that didn't originate internally.


This is a textbook example of what I meant by philosophers obsessed with purity.

And it doesn't sound like you're actually following Swift Evolution. A) Most of what happens is done in public; only rarely do they hide stuff until the last minute, like result builders for SwiftUI. B) As far as I know, they have never claimed that it's going to be completely open and 100% community-controlled. The core team is mostly Apple employees; that is not a secret.


> it doesn't sound like you're actually following Swift Evolution

Nope, you're completely wrong about that.


Then why would you say something like this? It obviously isn't true.

> In the Swift project as it is currently run, those contributions have withered on the vine; the core team doesn't particularly welcome or support anything that didn't originate internally.

It's also a very niche objection to complain that it's neither fully open nor completely closed. Most people are totally fine that the development is mostly open, with some new features kept hidden for business purposes. The vast majority of Swift users see it as a tool, a tool mostly to write Apple software, and they are more or less pragmatic. Almost all additions to Swift have been very positive for people that use it in their day job.


I also have yet to meet anyone writing Swift for any reason besides "Apple made it and it works on Apple things", so we may be at an impasse here.


Hi, nice meeting you. Now you know one.

My current concern with Swift is that it's impossible to use any code on Android in an officially supported way.


But is that different from using Kotlin on iOS?


Kotlin has Kotlin Multiplatform, a project to run non-UI components on both platforms. It's been there for a few years; Apple hasn't even started.


You can run non-UI Swift on Android too if you want. I don't know who made that possible, but I also can't see why Apple would sponsor Android app development, seems completely counter to their interests.


You can in theory, but the bindings to the JNI world will be atrocious.

You're also pretty much on your own with the library ecosystem.

I agree that it's not in Apple's interest. But that's part of the problem: this language didn't start with the goal of being just an Apple language.


Objective-C was not originally designed by Apple (or NeXT).


I think Swift was a strategic mistake mainly due to the niche effect, even if it's a better language.

It would have been much more productive to team up with the C# team, as C# is a mature language, and a combined Apple + Microsoft would have been able to compete against Java (esp. with Oracle as the owner of Java).


An Apple C# future is a neat idea, but I'm not sure how you square that with the (IMO correct) observation that allocation control really matters for quality of experience on constrained platforms.

Obviously iOS platforms are much less constrained today, but not having to run a GC is a pretty nice thing.

Maybe C# could've been extended into that universe (I know Midori had their own variant but don't know much about it) but it seems daunting to then make that compatible.


I agree with you, and I've long speculated that lack of tracing GC is the main reason why iOS devices don't need as much RAM. It's too bad that the options for implementing a GC-free GUI that can target Android are quite limited, and as a consequence, the low-end devices used by people who can't afford an iPhone are inevitably saddled with GC.


Apple tried to do automatic GC but did so very badly and with a language not designed for it, so they had a bad experience and were scared away.

The modern GC in java is amazingly fast and with a few tricks likely to be good enough.


I'm fairly convinced that GC is the reason that Android phones need much more RAM and much more CPU to generally be worse than Apple phones in terms of performance.


You may both be right: Android doesn't actually use the Java garbage collectors that are quite good; it has its own runtime.


I'm pretty familiar with the modern Java GCs, and they're very impressive, but at the same time having to do manifestly less work with ARC is probably good for responsiveness and battery life.
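A minimal sketch of what that "less work" looks like in practice: under ARC, an object is freed at a predictable point as soon as its last strong reference goes away, with no collector threads or pauses. (The class and names below are made up for illustration.)

```swift
// Illustrative only: ARC frees the object when its last strong
// reference is released, so `deinit` runs deterministically
// rather than at some future GC cycle.
final class ImageBuffer {
    let name: String
    init(name: String) { self.name = name }
    deinit { print("freed \(name)") }  // runs as the reference count hits zero
}

func render() {
    let buffer = ImageBuffer(name: "frame-1")
    print("using \(buffer.name)")
}  // buffer's last strong reference ends here; deinit fires before the next line runs

render()
print("after render")
```

The trade-off the thread is circling: ARC pays a small, evenly spread refcounting cost (plus the programmer's burden of breaking reference cycles with `weak`/`unowned`), while a tracing GC batches that work into collector activity and retains more memory between cycles.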


I'm not saying you're wrong about ZGC, but I will say everyone heard exactly that line also at the time of Android 1. And that was not a language "not designed for it." (And Android is still not using it.)


So why have Android developers been complaining for years about their apps running more poorly than iOS apps, especially contributing to battery drain?

Sounds like a lot of armchair generals debating in hindsight about any problem that fits their dislike bias, rather than foreseeing the specifics before they happened.


You’re confusing Android with OpenJDK. The parent was talking about Java/OpenJDK’s GC implementations, not Android’s.


From the perspective of the user or app developer, they do not care. When they see an app perform more slowly on another platform, for whatever reason, they will complain. Maybe the problem was Java in Android (alongside its sheer fragmentation) all along. There are hardly any complaints about any of that for iOS.

No wonder they are preferring to move everyone to Dart / Flutter in their future operating system instead of continuing to use Java.

It is like as if Android was designed to be a disaster.


Again, this isn’t Java’s fault. This is the Android Runtime’s fault. The Android Runtime is not Java or OpenJDK.


It's the fault of both, including the entire Android system design, and Google knows it.

It's evident that Google and many developers have had enough of the runtime, Java, the JDKs, and the whole system. Otherwise, why on earth would they be planning to move away from it all in the first place?

Once again, from the perspective of the user or the app developer, they don't care, Android still just doesn't cut it against iOS. No wonder Android is always second place.


Java is fast but takes 5x-10x the memory of an app written in a proper AOT-compiled language.


Swift needed perfect interoperability with all the existing Objective-C code, and that probably would have been difficult to accomplish with C#.


It's arguably difficult with Objective-C today, let alone C#!


Rust would be a more realistic choice, as it's built on LLVM.


> Key takeaway for me was the note that it “definitely isn’t a community designed language”.

If you scroll to the top of the page with Chris's comment, this point is addressed:

>In the coming weeks, we hope to introduce a new language workgroup that will focus on the core of the language evolution itself, splitting off this responsibility from the core steering of the project. The intent is to free the core team to invest more in overall project stewardship and create a larger language workgroup that can incorporate more community members in language decisions.


Another committee is sure to fix the problem with the existing committees.


> and absolutely none of the ones without community involvement have grown beyond the sphere of influence of their primary companies

C touches much more than telephony



