>Apple doesn't care about using their machines for general purpose development

That's a bold claim that I've heard repeated for years and have never actually seen any real evidence for, considering the Mac is the only place you can develop applications for the majority of Apple devices. Not a week goes by on HN without someone saying "Apple doesn't care about developers and/or will stop making general-purpose Macs," and yet there isn't an Xcode for iOS or Android or Windows or Linux.




General-purpose development includes more than developing applications for Apple devices.

Xcode is a good example of what I mean. It's a terrible developer experience for anything but targeting Apple's platforms in exactly the way Apple wants you to: writing Swift, using their frameworks and their devtools, keeping your codebase small enough that those devtools stay functional, and never caring whether that code runs on a different OS.

Compare that to Visual Studio, which also only runs on Windows, yet lately has been a pretty damn good developer experience and not a nightmare to support for cross-platform projects.


Yes, but until Xcode runs somewhere else, Apple has to support C compilers and CLI tools and allow general-purpose code to run on their machines; Xcode relies on all of these things. The fact that Apple doesn't make a general-purpose IDE for their platform doesn't mean anything as long as you can install VSCode or vim or JetBrains or whatever your preferred IDE is.

Again, this has been repeated here at least once a week for however many years I've been visiting this site, and it's no closer to being true now than it was back then. If it were even remotely true, Apple would have left the Macs on Intel and just phased them out in favor of the iPad Pro. Instead, they spent billions making their iPad chips run macOS, desktop applications, and code compiled for Intel processors. Real talk: what about that gives anyone any indication that they're planning to throw all of that away?

They're actively doing the opposite of what you're claiming and spending billions of dollars to do it, and one blog post that says "this isn't even a big deal" is all it takes to convince you otherwise?


I was replying to the comment you made about Xcode not running on other platforms, which is neither here nor there. My point is that it is not a good developer experience for anything but client apps on Apple's operating systems: things like iOS apps and consumer programs, not web design, backend engineering, high-performance computing, or the slew of other kinds of software that Macs are getting worse for.

This goes deeper than developer tools. The documentation for their core frameworks has been purged from official sources and is relegated to deprecated sites and comments in header files tucked away in /Library. Kernel extensions are being deprecated. There's no alternative to IPP or MKL on ARM for the M1 chip. Docs for plugin architectures are more and more hidden away, and Apple's developer conference consistently focuses only on consumer-facing applications, while support for professional applications and advanced computing is only available if you work for a partner organization, making it less accessible.

I say that Apple is making their platform harder to use for general-purpose computing based on my experience shipping code on macOS for the last decade. It's fine if you disagree, but that hasn't been my experience: every year it costs me more time and money to target Macs than the year before.


>Xcode is a good example of what I mean. It's a terrible developer experience for anything but targeting Apple's platforms

It was and is built as an IDE for Apple's platforms/languages, so this point is moot, and doesn't prove anything general about macOS as a platform.

It's like saying Emacs is bad for developers because SLIME doesn't really play well with JavaScript coding...


Development for Apple devices is high in volume but narrow in scope.

You pretty much don't develop anything else on macOS: no web, no embedded, no Linux, no Windows, nothing. Only macOS software, iPhone apps, etc. I am exaggerating slightly, but you get the idea.

Or when you do, you use your macOS laptop as a terminal, or at best as something to run VMs. In both cases you aren't developing with Apple's OS and tools; the Mac is just hosting, or giving you non-integrated access to, completely different systems.

Apple's stuff is pretty much a distinct and closed ecosystem, and one which is quite limited to (some) endpoints.


None of what you said makes any sense at all. I absolutely develop web applications on my Mac, using a graphical IDE. I've also developed for Arduino boards using a graphical IDE on my Mac. I've written software natively on my Mac that I put into production on Linux and Windows machines. And all of that is supported on the M1 as well; they even built a complete, complex, and powerful subsystem to allow you to run Intel code on their new processor.

Apple has never stopped me from installing VSCode or Atom or Sublime or JetBrains, and they've never given me any reason to think they would in the future, especially given Rosetta 2. I've never run a VM on my Mac.

I have no idea what point you're trying to make here, so I'm trying my hardest to argue against the best possible interpretation, and that interpretation is still so unbelievably wrong that I feel like I must be missing the point.


Thanks for giving some counter-examples :)

I already wrote that I was exaggerating slightly, and I still believe your case falls under that. For embedded dev, for example, I had in mind things that are less individual-tinkerer oriented and more productized: set-top boxes, base stations, software for trains, software for washing machines, smartcards, etc., or even big equipment controlled by an off-the-shelf desktop/laptop-like computer or a PLC. I'm sure in a few exotic cases a Mac will be involved here and there, but let's be honest, Windows as a host dev workstation is far more probable. Maybe some Linux too, though probably far from the majority. And macOS would probably be very far behind.

Now, about running Intel code: I know about the excellent x86 emulation layer that macOS + M1 has, and it's great, but it doesn't really matter for what I was thinking about; I think that broadly applies to x86 Macs as well. I'm thinking more about the software ecosystem; the exact hardware CPU is only really interesting for devs writing SIMD code or, well, running VMs.

About running VSCode & co., that's great too, but where are the toolchains with macOS as the host for the targets I talked about? That's why I described macOS in this case as merely being used as a "terminal"; I meant it in the broad sense of the term, a graphical terminal, not just a VT100-like terminal. The actual toolchains are elsewhere.

About web dev, I admit that's probably where you can do the most non-Apple-only development while staying truly native, although probably not if you need a complex server-side setup. Arguably I went way too far when I wrote "no web".

Well, nothing is absolute, and I know macOS remains a general-purpose OS that can even host some serious development. I just think it is not really the most used one outside of, let's say, client-related consumer tech and some pro-desktop tasks, mainly on Apple technologies. Claiming embedded in the general case would really be stretching it.


For completeness’s sake, the arm-none-eabi-gcc and multiarch GDB I have on my Mac let me do embedded development ;)
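
A minimal sketch of what that looks like in practice. Everything target-specific here is a placeholder I made up for illustration (the register address, the board.ld linker script, the Cortex-M4 flags); the point is only that the host-side toolchain runs fine on macOS:

    /* blink.c -- bare-metal C that a macOS-hosted arm-none-eabi-gcc can build.
       The GPIO address below is a hypothetical placeholder, not a real register
       on any particular MCU; vector table / startup code omitted for brevity. */
    #include <stdint.h>

    #define GPIO_OUT (*(volatile uint32_t *)0x40000000u)  /* placeholder LED register */

    void _start(void)
    {
        for (;;) {
            GPIO_OUT ^= 1u;                                    /* toggle the LED bit */
            for (volatile uint32_t i = 0; i < 100000; ++i) {}  /* crude busy-wait delay */
        }
    }

    /* Example host-side invocation (assumes a Cortex-M4 and a board-specific
       linker script named board.ld):
         arm-none-eabi-gcc -mcpu=cortex-m4 -mthumb -nostartfiles -O2 \
             -T board.ld blink.c -o blink.elf
         arm-none-eabi-gdb blink.elf    # or gdb-multiarch, attached to a debug probe */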


I imagine there might be a lot of embedded toolchains that only run on Windows, so I can see Windows being used there. But in my experience the vast majority of web development (here in the UK at least, though I get the impression it's similar in the US and the rest of Europe) is done on Macs, and you can see that reflected in the software support: languages/ecosystems like Python and Ruby still don't have Windows support on par with their Mac/Linux support.


Yeah, I take back what I wrote about web dev. Although I find Linux broader than macOS in this case, because the server side is likely to be Linux in prod and never macOS, and in some complex cases it will also be Linux in dev. But even with that, yes, big parts of web dev take place on macOS.


Unless they use Docker, in which case it takes place in a Linux VM on macOS.


> I am exaggerating slightly, but you get the idea.

You are exaggerating wildly, and I am not sure what your idea is. Why is web or embedded dev impossible on a Mac?


Web dev: I mostly take that back (well, it's subtle: for complex server-side dev scenarios you will use Linux even in dev -- but even so, enough work remains possible purely on macOS that one can't say web dev broadly can't be done).

But embedded: where are the most-used FPGA tools? Where are the compilers for microcontrollers? Of course you can do some of it, in a limited capacity, with a subset of targets and a subset of toolchains, though not the most widely used ones. But broadly, in that area, no, you don't use macOS.


There aren’t that many tools that are on Linux but not macOS. You’re probably right about FPGAs (which I tend to consider a small subset of embedded, though of course there are different points of view), but the standard tooling for embedded platforms is there on macOS, including some of the IDEs (I am not going to say most, because I am sure there are counter-examples).



