You can actually go one step further with your "Reason 2". Many may not realize this, but everything that you can do with the "bracket" part of Objective-C you can also do with straight-up C and the Objective-C Runtime methods.
In terms of size, they're almost the same (Objective-C classes are implemented with structs). In terms of speed, the difference is absolutely huge. You can generate a random number faster than you can make an Objective-C call.
But, it's fast enough for 95% of things people do with it.
Actually, that's on an iPhone. Here's the article for OSX from the start of 2008 (two months before your link), where the cached message call comes out faster than the virtual calls, and the non-cached is ~ 4.5x slower instead of ~ 2x on the iPhone:
Exactly. Many of the benefits of Obj C's dynamism come into play with the UI, where the main speed constraint is the response time of the user. And whenever you need to write compute bound code it is easy to drop down into C.
One of these days I will write a detailed account of my experience learning Obj-C, but for now I'll say this:
The fact that Obj-C takes C and adds objects on top is its biggest flaw, just like with C++. Worse than C++ though, Obj-C doesn't even attempt to fix at least some of C's more glaring flaws... you get the whole language, quirks and all.
I think it depends on what you want from the language. If you want to write C and gain some safety through the use of the message-passing abstraction, it's perfect. If you want a real high-level language, it's lousy.
With clang, ObjC is getting very interesting features that take it closer and closer to "real" high level languages, like GCD, ARC and blocks.
I'd still like more type safety, but the evolution there is slower and would need more fundamental changes to the language and abandoning C compatibility.
I dunno--in an age where we have actual, halfway decent garbage collection, referring to ARC as something that might compete favorably with those "real" high level languages makes my eyes go a bit crossed.
People in 2011 lose weeks to tuning allocators and collectors in real high level languages. C programmers take a memory profile, deploy the appropriate pool/arena allocator, and get on with their job, comfortable with the fact that their former allocation bottleneck is now in the average case register-fast.
ARC fits better with the C world of quickly tuning bottlenecks to register speed. Have you ever debugged C code with custom object lifetimes under a garbage collector? It's a nightmare.
People in 2011 lose weeks to tuning allocators and collectors in real high level languages.
But that's, statistically, almost no people out of 'people who use GC'. There aren't that many allocators in common use that even have much in the way of tunable parameters (I can only think of the JVM ones, offhand), the people who spend weeks tuning them have very particular needs - high throughput, low latency, high-enough complexity to warrant a high-level language. These are the sort of people who also do things like start at a PHP page and end up chasing performance improvements in the bowels of a network kernel driver. It's done, but it's unrepresentative.
ARC fits better with the C world
That seems like a much more central reason why ARC suits Objective-C better than a GC would, as opposed to 'GC tuning is occasionally a dark art'.
What can I tell you here? You're right, of course: most apps don't care about performance at all, either in space efficiency or time efficiency. Those apps shouldn't use C, and should certainly use a GC'd language.
I'm only reacting to the idea that (pp) "because it's 2011, nobody should be using ARC". Well, a statistically tiny number of apps may require the bare-metal performance that hand allocation provides, but they're also disproportionately important apps.
Ah, I read it more as 'It's 2011, where's my flying car/why aren't just about all apps running in managed environments'. Sort of like one says, 'It's 2011, why did syslogd just brick my server'. And it's not a completely lisp-machiny, neckbeardy sentiment, 10 odd years ago everyone was telling us the flying car was just around the corner - Apple was busily trying to bridge Java into Rhapsody, Microsoft was working on .NET/CLR. And yet, and yet...
Or you could pick a well thought out OO language like Java or C#, maybe even Python. Unfortunately you're trapped using the abomination for iPhone development unless you pick MonoTouch.
Not even Java's fans would defend it as a "well thought out OO" language.
Since this is Hacker News, it may be appropriate to point out that nothing prevents you from building whatever specific flavour of OO you want on top of the Objective C runtime, if you want it badly enough.
I would not call Python a well-thought out OO language. Its OO is essentially a cranked up dict; the abstraction over the dict is just barely papered over. It can get pretty awkward.
Would anyone care to comment on the state of Obj-C outside the Mac/iOS platforms? The impression I get from previous posts on this is that the utility of Obj-C is closely tied to the functionality provided by the (proprietary) Cocoa platform.
Since there are so many developers from outside the Apple ecosystem learning Obj-C in order to develop for iOS, are there any signs of a movement in the other direction? Are developers who don't think of themselves as purely Mac/iOS developers trying to take Obj-C back with them to other platforms?
Posts about how great or poor Obj-C is as a language are kind of irrelevant really as long as it's (effectively) the mandatory language for development on Apple platforms, and isn't much used beyond that. In that situation there are no choices to be made based on opinions about the language's merits or flaws, so they are largely moot.
As far as I am concerned, Obj-C and Cocoa are so tightly integrated into each other that using one without the other is a rather pointless exercise.
There are compilers available to use Obj-C (the language) on just about any platform. I guess you could use Obj-C to write GTK or Qt code (I doubt that the Qt preprocessor would like that, though).
If you read that Obj-C is great, that usually means that Obj-C is great for Cocoa. Cocoa and Obj-C make for a great team! Probably comparable to Android and Java or .NET and C#. It ultimately does not matter whether Obj-C would be of much use without Cocoa, since almost all people will be using it as part of Cocoa.
Great tires are of little import if you don't own a car.
I get the impression most developers use Objective-C because they have to, rather than because they want to. Take a look at the Tiobe graph for Objective-C:
A while back I tried learning Objective-C on Linux. My end goal was to do iOS development, but I wanted to try it out before I spent the money on a Mac. Maybe I'd hate Obj-C and know immediately it wasn't for me...
Using the Objective-C language on Debian was as easy as doing "apt-get install gobjc". But, as you stated, most of the utility comes from Cocoa.
Even though it's technically possible to use Objective-C without Cocoa, in practice it's almost not worth it. It's not like C++ where you can choose between Qt and Gtk+ or some other big library. There's Cocoa/Foundation or there's nothing. It would be hard to find learning material that didn't use Cocoa, even if you wanted to try it.
There are a few alternative Cocoa implementations that run on Linux and even Windows.
Cocotron lets you create Windows apps using Cocoa, but it's a cross-compiler that runs in Xcode. Didn't fit my needs, so I didn't try it.
libNuFound was pretty good for the Foundation part of Cocoa, but it took a bit of effort to get working. Objective-C with only Foundation has a lot more functionality than plain Objective-C, but still nowhere near the usefulness of Cocoa.
The most complete and most popular option is GNUStep. It implements Foundation and a good chunk of Cocoa. They've done good work, but the documentation was lacking, and I never knew whether code I saw on the web and in Cocoa tutorials would work under GNUStep, without trying it. Apple's developer documentation is really good, and I used it as much as possible, but it was pretty common to run into stuff that didn't work under GNUStep. My other big complaint was that it's a really heavyweight "library." I don't know if it's their fault, but I vaguely remember having to install a bunch of seemingly unrelated dependencies just to get their *-dev packages installed on Debian.
Perhaps not the best comparison, but for me, using GNUStep libraries to develop Cocoa always felt like using Wine to run Windows programs.
I had a few other links about setting everything up, but these two were helpful:
Objective-C is a nice UI glue language but I wouldn't want to write anything of any algorithmic complexity in it. The data structures are just way too awkward and manual boxing/unboxing of primitives puts it way over the top.
I write my iOS app UIs in Obj-C but the brains are in a C++ core. The new C++11 support in LLVM 3.0 has tipped the balance that much further.
I'm kind of stuck with Obj-C on iOS but I can totally believe you could build an abstraction layer like the one you describe over vanilla C++ that would give you most of the benefits of Obj-C. Clang's Obj-C++ support makes it pretty painless to mix Obj-C and C++ as long as you clearly delineate responsibilities.
So far I haven't had to do any serious string handling in my C++ engines (mostly audio apps). The new unicode stuff in C++11 ought to make this manageable but I haven't actually tried it yet.
Is anyone using Objective-C as a systems language? That is, without Cocoa and other Apple libraries and APIs? Otherwise, why is it worth comparing it to C? Is anyone suggesting that Linus could write the Linux kernel or git in Objective-C? Objective-C is more fairly comparable to Java, C#, Qt or GTK rather than plain C.
Most people who say they like Objective-C actually mean they like programming for Macs and iPhones.
Yes, I am. And it is fine, either using ObjFW, GNUstep (which is essentially Cocoa) or just writing it raw with my own framework. It is a nice language, and the C APIs to the runtimes are pretty nice.
>Objective-C was Apple’s response to object-oriented programming
Apple's response? At its inception it had nothing to do with Apple and for a long time was the language of NeXT. It's only since OSX that Apple has adopted Objective-C.
To add: if you haven't read Brad Cox's books, then you really should. They are not heavily technical, but they give time to some serious ideas. They are a little hard to get hold of these days.
Am I the only one who thinks Objective-C is complete crap compared to Scala, Haskell and the likes?
Hell, even C++ is way faster to code in and provides a better experience.
Obj C is just immensely verbose. I'd love to be able to use Qt + C++ to program iOS instead..
You're not the only one. I too think speed of development and maintenance is key. It's just that lots of people live in (and on) the Apple bubble. I think Obj-C was one of Jobs' obsessions, or else I can't explain why anyone would want to force people to write code like that in 2011. I have a feeling that Apple may open up their APIs now that he's gone.
Does submitting to the App Store require you to provide source? If so, it seems like a compelling reason to enforce a single language to avoid the explosion in skills required for the app reviewers.
You aren't forced to use Objective-C, at least not any more than you're forced to use Java on Android. It's just the native language of the system frameworks, which you are required to use. You can literally just have a glue layer to talk to the Objective-C underpinnings and write your whole app in another language (as long as that language can compile for iPhone, obviously). The reason people don't do this is because for most people it turns out to be less productive, not more.
I was personally a little put off by Go. Some things just felt a little odd to me, after an admittedly small amount of time with the language (I implemented a trivial app with it to play around).
That said, I did quite like some aspects of it: goroutines, channels, functions can be added to structs.
I am hoping that Rust makes it out of the lab at some point.
My biggest annoyance with Go is that Google, a search company, gave their language a completely un-Google-able name. I'm pretty sure the term "Go" invokes special handling.
I don't have a lot of experience in Go either (I'm writing my bot for the AI challenge in Go), but so far I've been very happy with it. What did you feel was odd?
I'm not the OP, but it felt like Go was coming from a culture a little bit different from the mainstream programming culture. The guys who made Go worked together at Bell Labs for years on stuff like Plan 9, and to me it seems that they lost touch with the rest of the world.
> The guys who made Go worked together at Bell Labs for years on stuff like Plan 9, and to me it seems that they lost touch with the rest of the world.
Before they worked on Plan 9 they created Unix; that the *nix world lost its way from its simple, small and beautiful roots is clearly something that still pains them greatly.
I would not say that they lost touch with the rest of the world, but that the rest of the world lost touch with them, which has been a great loss.
From piecing together the history I think it's clear that the real magic came from Ritchie. I doubt that Pike and Thompson in particular were ever in touch with the rest of the world.
Things like variable declarations matching the look of how variables are used ('inside out' parsing) are not simple and are responsible for the certain something that makes C work. And Unix was never beautiful, it was always a huge hack.
* The "exception handling" (or lack thereof). Defer, panic, and recover do provide an interesting means to achieve some similar behavior, but I ended up testing return values _a lot_, and all over the place. It felt very boilerplate-heavy and messy. Also, I found it odd that they claimed the 'try-catch-finally idiom' was convoluted, then came up with defer, panic, and recover, which call code on function exit. I liked defer (a great way to clean up allocations in reverse on the way out), but I did not find panic/recover a convenient replacement for exception handling (nor does that seem to be the proper use case for it anyway).
* new vs make. I don't see why they didn't just unify these in some way. Purity over pragmatism, perhaps.
* goto
* static compilation only (no shared libraries). I know static libraries are safer and sure do make deployment dreamy, but it would be nice to benefit from shared libraries if desired (easier to deploy security fixes, memory savings, etc).
Maybe I'm weird, but I actually really like Objective-C. My mind likes how the language works. I just wish there was more of a community around it so that it could more easily be used for things other than iOS/Mac development.
Having learned object orientation with Smalltalk/V, I never quite liked C++ (to the point of procrastinating for 8 years before really touching it). It felt just wrong.
Objective-C was always attractive (I like C), but its close association with NeXT and Apple (and correspondingly little support from other platforms) always put me off.
When I first started coding in Objective-C, I hated it because it just felt "wrong". The more I learned the language, the more I enjoyed programming in it. I still think some things are a bit awkward, but like anything new, it just takes time to learn to love it.
Example: I wrote a GA-based solver in Objective-C. It worked, but it was painfully slow. I re-coded it in nice, clean C. The plain C version was incredibly fast and efficient in every way. Best of all, it is highly portable too.
The only thing I really don't like about Objective C is that message passing is considerably slower than function calling. You just can not manipulate an image using the `setPixelX:Y:` method in anything close to real time. Using functions in C, this is no problem.
So, there you go: Write your algorithms in C, write your program logic in Objective C. Oh, and I love how the OpenGL C API interacts so nicely with Objective C and Cocoa!
That said, I have a fair bit of experience with PyQt. I would ditch Objective C for something like MacRuby any day, but then I would (sorta) lose the ability to drop down to C if need be. (Still, does anyone have any solid experience in MacRuby? I would love to hear some!)
Also, if you're calling an Objective-C method and dynamic dispatch is causing measurable performance issues, you can cache a pointer to the C function that backs the method (called the IMP) via the class_getMethodImplementation() runtime API.
I tried MacRuby once a few months ago (https://github.com/tilltheis/fullscreenclock) and I have to tell you the Cocoa parts lacked some essential features (like correct syncing of Ruby and Cocoa setters).
But the nice thing about it is that you can intermix Obj-C and Ruby code freely to write the low level parts of your app in C/Obj-C if you want to. There is even a bridge between C/Obj-C and Ruby which enables you to call native functions from your Ruby code.
I think the nature of OpenGL's API (internal state, very simple C-style functions - contrast with DX) means it is very portable. Java also interacts nicely with it, as does C++, and I've not worked with WebGL but the very fact it exists says something about it.
OpenGL is a good API, but its design is fairly terrible, and its implementations range from quirky to terrible (consider Intel's support for anything above 1.4).
The notion of internal state is fine and well for only the most trivial of programs--once you start doing something requiring multithreading nothing but sadness awaits.
I wanted to play with Objective-C for some numeric computing, but until Intel's C++ compiler supports it there really is no point in even getting started with it for me (sorry, gcc).
Now, I've coded a lot of stuff in Objective C for iOS, and it works just fine, and I actually think it's a pretty good language. I'm glad that Apple has helped bring it more into the spotlight.
But ramitos is right; it really is a Frankensteinian bolt-on addition of Smalltalk, syntax-wise. The message passing code simply does not look like C.
That said, I'd encourage everyone to give it a chance. I, too, had a negative reaction to it when I first encountered it ("It's so ugly!") but you get used to it, and then it's fine.
> The message passing code simply does not look like C.
Arguably that's a very good thing, because passing a message has very different semantics from calling a function. Making the syntaxes of these operations identical would encourage a lot of confusion between two very different operations.
So, where in this article did the author actually state why Objective-C fixed C for them? Where was a listing of C's failings or Objective-C's strengths for their projects?
This is not really a good submission--it's got the rhetorical content of a tweet.
Reason 1: When you need what C provides, it is right there for you (being a superset of C)
Reason 2: When you need what C does not provide (basic reflection, generics, polymorphism, whatever) it is in the "bracket" part of Objective-C.
Reason 3: The two above do work quite well together - the impedance mismatch is minimal, compared to e.g. Python with extensions written in C.