
Speaking from personal experience, when I transitioned from C (my first language, which I was only three months into) to Objective-C, I couldn't STAND the number of cutesy metaphors used to describe what objects were and how they functioned: "Like, you can send the duck a message to swim, or to fly." "Ducks inherit from birds."

Coming from a highly plausible programming world of data and operations on data, this came off as nonsense. I wanted to tear my hair out trying to find a sober explanation of what an object actually WAS, in practice, instead of hearing them referred to as things that could be told to do stuff. Apple's Object-Oriented Programming guide was by far the most enlightening document in my early days:

http://developer.apple.com/library/mac/documentation/cocoa/c... (chapter 2)



> Coming from a highly plausible programming world of data and operations on data, this came off as nonsense.

It is nonsense.

On IRC the other day, I said this:

    19:19 < xentrac> I propose a new rule for discussions of object-oriented 
                     programming
    19:20 < xentrac> which is that anyone who brings up examples of Dog, Bike, Car, 
                     Person, or other real-world objects, unless they are talking 
                     about writing a clone of The Sims or something,
    19:20 < xentrac> is immediately shot.


You should propose alternatives that are just as universally familiar.

Let's not forget that a Dog is not a dog; it's a word for a dog. A Bike is not something you can ride around on; it's a name for a concept you have in your head, by which you can classify real-world instances. Ceci n'est pas une pipe. I've noticed a lot of people who dislike OO seem to have a very distorted idea of what OO is and isn't...


Discussing real-world objects makes it easier to grok what polymorphism is about - subtype polymorphism is nearly synonymous with OOP, and it's both its most important property and the hardest one to understand.

Using a Duck that's also a kind of Bird as an example is good - much of the time you don't give a fuck that an object is a Duck; all you care about is whether it can fly or not.

And to replace these examples, however awful they may be, you've got to come up with alternatives.

So shoot me.


The trouble is that, in good OO programming, we don't make class hierarchies in order to satisfy our inner Linnaeus. We make class hierarchies in order to simplify the code by allowing different parts of it to be changed independently of each other, and to eliminate duplication (which comes to the same thing). Without any context as to what the code needs to accomplish, you can't make a judgment about whether a particular design decision is good or bad.

Here are some real-world examples (drawn from my own code, sorry):

• A Timer is a horizontal strip on the screen with a stripe racing across it. A NumericHalo is a spreading ripple on the screen that fades. A Sound is a thing on the screen that makes a sound when a Timer's stripe hits it. A Trash is a thing that deletes Sounds when you drop them on it. They all inherit from Visible, which represents things that can be drawn on the screen and perhaps respond to mouse clicks or drop events, but they do different things in those three cases. In addition, the Trash and Sounds are subclasses of ImageDisplay, because the way they handle being drawn is simply to display a static image, so that code is factored into a superclass. http://www.canonical.org/~kragen/sw/pygmusic/pygmusic.html#V...
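
A stripped-down sketch of that hierarchy (not the actual pygmusic code; the method names draw, handle_click, and handle_drop are my guesses, only the class names come from the description above):

    class Visible:
        def draw(self, screen): raise NotImplementedError
        def handle_click(self, pos): pass       # default: ignore clicks
        def handle_drop(self, item, pos): pass  # default: ignore drops

    class ImageDisplay(Visible):
        """Factors out 'drawing means blitting one static image'."""
        def __init__(self, image, pos):
            self.image, self.pos = image, pos
        def draw(self, screen):
            screen.blit(self.image, self.pos)

    class Timer(Visible):
        def draw(self, screen):
            ...  # draw the horizontal strip and the racing stripe

    class Sound(ImageDisplay):
        def handle_hit(self):
            ...  # make a sound when the Timer's stripe reaches it

    class Trash(ImageDisplay):
        def handle_drop(self, item, pos):
            ...  # delete the dropped Sound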

The next two are only reasonable examples if you know the λ-calculus:

• Var, App, and Ind are three kinds of Nodes representing an expression graph: variables, applications, and indirection nodes. But in evaluating an expression using combinator graph reduction, sometimes you need to mutate one kind of Node into another. So instead of making them subclasses of Node, there are Strategy objects called Var_type, App_type, and Ind_type, and Node objects delegate almost all of their behavior to that Strategy object: http://lists.canonical.org/pipermail/kragen-hacks/2006-Novem... Note that this allows the code to later extend the combinator graph type with λ abstractions by adding another Strategy object.
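
Roughly what that delegation looks like (a sketch, not the original kragen-hacks code: the show/overwrite_with_indirection methods and the node fields are invented for illustration; only Node and the *_type names come from the description above):

    class VarType:
        def show(self, node): return node.name
    class AppType:
        def show(self, node):
            return '(%s %s)' % (node.func.show(), node.arg.show())
    class IndType:
        def show(self, node): return node.target.show()

    # one shared Strategy object per node kind, as described above
    Var_type, App_type, Ind_type = VarType(), AppType(), IndType()

    class Node:
        def __init__(self, type_, **fields):
            self.type = type_
            self.__dict__.update(fields)
        def show(self):
            return self.type.show(self)   # delegate to the Strategy object
        def overwrite_with_indirection(self, target):
            # graph reduction sometimes needs to turn one kind of node into
            # another in place; with Strategies that's a field assignment,
            # not a change of the object's class
            self.type, self.target = Ind_type, target

    x = Node(Var_type, name='x')
    app = Node(App_type, func=x, arg=Node(Var_type, name='y'))
    print(app.show())                     # (x y)
    app.overwrite_with_indirection(x)
    print(app.show())                     # x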

• Doing combinator-graph reduction directly on λ-expressions, you have three classes of graph nodes — variable references, λ abstractions, and function applications. All of these types support a common "graph node protocol", which recursively generates printable representations of them in various formats, such as Scheme, TeX, plain ASCII text, or Perl. In a language like Java, this would be an interface, but in Perl, protocols don't have an explicit representation, so these classes don't inherit from anything in common. These classes implement an additional protocol used for β-reduction, which consists of two methods, "occurs_free" and "substitute". However, β-reduction to normal form requires identifying β-redexes, which is not something that a single expression graph node can do on its own, so this is done by functions that violate encapsulation by making explicit is-a checks. http://lists.canonical.org/pipermail/kragen-hacks/1999-Septe...
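
The original is Perl, but the shape translates to Python. Here is an illustrative (and deliberately naive, capture-ignoring) sketch of three unrelated classes that happen to implement the same occurs_free/substitute protocol, plus an outside function that has to make explicit is-a checks to find a β-redex; all the names here are mine, not the original code's:

    class VarRef:
        def __init__(self, name): self.name = name
        def occurs_free(self, name): return name == self.name
        def substitute(self, name, value):
            return value if name == self.name else self

    class Lambda:
        def __init__(self, param, body): self.param, self.body = param, body
        def occurs_free(self, name):
            return name != self.param and self.body.occurs_free(name)
        def substitute(self, name, value):
            if name == self.param: return self  # shadowed
            return Lambda(self.param, self.body.substitute(name, value))

    class Apply:
        def __init__(self, func, arg): self.func, self.arg = func, arg
        def occurs_free(self, name):
            return self.func.occurs_free(name) or self.arg.occurs_free(name)
        def substitute(self, name, value):
            return Apply(self.func.substitute(name, value),
                         self.arg.substitute(name, value))

    def beta_reduce(node):
        # Identifying a redex needs to know the concrete node types,
        # so it breaks encapsulation with isinstance checks:
        if isinstance(node, Apply) and isinstance(node.func, Lambda):
            return node.func.body.substitute(node.func.param, node.arg)
        return node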

The next example is only good if you already know basic algebra:

• The Formula class overloads a bunch of operators to make it convenient to combine formulas into other formulas, and it has subclasses Binop, Unop, Variable, and Constant that represent particular kinds of formulas. These subclasses implement methods derivative (you know, taking the derivative, calculus), identical, simplified, and eval, as well as coercion to strings. Derivatives and simplification in particular are wildly different for different kinds of binary operations, so you could imagine making subclasses of Binop for the five binary operations supported, and that would probably be better than what I in fact did, which was to use dictionaries of lambdas for evaluation, derivatives, and simplification. This would involve replacing instantiations of the Binop class with calls to a factory function. http://lists.canonical.org/pipermail/kragen-hacks/2001-Janua...
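
Something like this is the shape I mean (a cut-down sketch rather than the original code: only + and *, only eval and derivative, and the names are approximate):

    class Formula:
        def __add__(self, other): return Binop('+', self, to_formula(other))
        def __mul__(self, other): return Binop('*', self, to_formula(other))

    def to_formula(x):
        return x if isinstance(x, Formula) else Constant(x)

    class Constant(Formula):
        def __init__(self, value): self.value = value
        def eval(self, env): return self.value
        def derivative(self, var): return Constant(0)

    class Variable(Formula):
        def __init__(self, name): self.name = name
        def eval(self, env): return env[self.name]
        def derivative(self, var):
            return Constant(1 if var == self.name else 0)

    class Binop(Formula):
        # dictionaries of lambdas keyed by operator, instead of one
        # Binop subclass per operator
        evals = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}
        derivs = {
            '+': lambda f, g, var: f.derivative(var) + g.derivative(var),
            '*': lambda f, g, var: (f.derivative(var) * g
                                    + f * g.derivative(var)),
        }
        def __init__(self, op, left, right):
            self.op, self.left, self.right = op, left, right
        def eval(self, env):
            return self.evals[self.op](self.left.eval(env),
                                       self.right.eval(env))
        def derivative(self, var):
            return self.derivs[self.op](self.left, self.right, var)

    # x = Variable('x')
    # (x * x + x).derivative('x').eval({'x': 3})  ==> 7, i.e. 2*3 + 1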

• The Subfile class implements most of the same interface as Python's built-in file type, but provides read-only access to only a small byte-range of a file. This means you can pass a Subfile object to a function that expects to be passed a file object, and it will work on that small part of the file. This allows the `_words` function (search for `def _words`) to be written as if it were reading an entire file, separating out the tasks of checking for the end of the byte range and breaking a file into words into two separate objects. http://lists.canonical.org/pipermail/kragen-hacks/2006-Augus...
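
The idea, sketched with hypothetical names (the real Subfile supports more of the file protocol than this):

    class Subfile:
        def __init__(self, f, start, length):
            self.f, self.remaining = f, length
            self.f.seek(start)
        def read(self, size=-1):
            if size < 0 or size > self.remaining:
                size = self.remaining
            data = self.f.read(size)
            self.remaining -= len(data)
            return data

    def count_words(f):
        # Written as if it reads a whole file; it never needs to know
        # whether f is a real file or a Subfile.
        return len(f.read().split())

    # count_words(Subfile(open('corpus.txt', 'rb'), 0, 4096)) counts the
    # words in just the first 4 KB.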

• UI interacts with the user of a chat client; icb interacts with a chat server; httpreq interacts with a web server. All three of them implement the protocol that asyncore expects of "channel" objects, which involves methods like readable(), handle_read(), writable(), handle_write(), and so on — basically things that manage event-loop I/O on a file descriptor. They also all inherit from asyncore's `dispatcher_with_send` class, which provides default implementations of some of these methods. Two of them also inherit from an `upgradable` class, which provides an implementation of dynamic code upgrade — replacing their current code with code newly loaded from the filesystem. (This allows you to hack the code of the client without closing and reopening your chat connections.) http://lists.canonical.org/pipermail/kragen-hacks/2002-Augus...
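
For concreteness, a minimal sketch of such a channel (hypothetical, not the actual chat client): asyncore's event loop calls these methods on every registered object, and dispatcher_with_send supplies buffered writing, so the subclass mostly just fills in handle_* hooks:

    import asyncore, socket

    class EchoChannel(asyncore.dispatcher_with_send):
        def __init__(self, host, port):
            asyncore.dispatcher_with_send.__init__(self)
            self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
            self.connect((host, port))
        def handle_connect(self):
            self.send(b'hello\n')     # buffered by dispatcher_with_send
        def handle_read(self):
            data = self.recv(4096)    # called when the socket is readable
            if data:
                print('received', data)
        def handle_close(self):
            self.close()

    # EchoChannel('chat.example.org', 7326); asyncore.loop() then drives
    # readable()/writable()/handle_*() on every open channel.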

(Relevant drive-by dismissal: http://www.netalive.org/swsu/archives/2005/10/in_defense_of_... "this article [by Fowler] finally made me understand what OOP was all about. This is not true for many other articles and yes, I'm looking at you, shitty Car extends Vehicle OOP tutorial.")

The problem with the "Duck extends Bird" kind of example is that it gives you no understanding of the kind of considerations you need to think about in order to decide whether the design decisions discussed above are good or bad. In fact, it actively sabotages that understanding. You can't add code to ducks; you can't refactor ducks; ducks don't implement protocols; you can't create a new species in order to separate some concerns (e.g. file I/O and word splitting); you can't fake the ability to turn a duck into a penguin by moving its duckness into an animal of some other species that can be replaced at runtime; you can't indirect the creation of ducks through a factory that produces birds of several species, and even if you could, the analogy doesn't help at all in understanding why the analogous thing might be a good idea in the Binop case; penguins don't implement the "fly" method that can be found in birds; whether you consider ducks to be birds or simply chordates does not affect the internal complexity of ducks; and you don't go around causing things to fly without knowing what kind of bird they are. (Ducks themselves decide when they want to fly, and they certainly seem to know they're ducks and not vultures.)

So I disagree that the analogy "makes it easier to grok what polymorphism is about". It's misleading; it obscures the relevant while confusing people with the irrelevant.


Yeah, I'm sorry, but I'm not sure any of those examples are pitched at the level of a typical beginner who is just learning what polymorphism is.


Even the first one? You can actually see the objects on the screen and move them with your mouse.


Oh, wait. That works. I just think that the lambda-calculus example may be a bit advanced for a programmer who's only starting to learn what polymorphism is. Not every coder starts off with Scheme.


Person is shot.


Person has a GunShotWound, which is a subclass of Wound. (cf. AbrasionWound, LacerationWound, etc.) Also note the new properties such as bulletType, entryPoint, and fragmentationPattern.


Sorry. It's Java. The GunShotWound has to come from a GunShotWoundFactory.


Unfortunately, GunShotWoundFactory now needs to implement AbstractGunshotWoundFactory so that we can support multiple types of gunshot wounds.

Of course, we'll need an AbstractGunshotWound as well and a subclass for each type of gun used.

Needless abstractions make my head HURT!


DON'T WORRY GUYS! Dependency injection to the rescue!


The WoundFactoryFactory will take care of it.


Actually, I'll be honest: I quite like dependency injection. There are a lot of really good things that can be done with Guice. I've written one for C++ here: https://bitbucket.org/cheez/dicpp/wiki/Home


Or Python.


When I taught myself to program, I learned a functional language, in no small part because it was not object oriented. After a while my code started to turn into spaghetti, so I enforced a rule of putting functions that operated on like data into separate files, and created abstractions so that in my primary program flow I could work at a high level and keep things clean and understandable. When I was taught about objects they weren't very easy to understand, but when I connected the dots and saw that they were what I was already doing, plus state, it all came together. I don't think most educators do a good job of explaining how OOP can really help you; they just start rambling about re-usability and polymorphism. I knew how to program, so I could understand why you would want these things, but to a newcomer they must seem meaningless and unapproachable.

If I were teaching, I wouldn't start new programmers with OOP, but I would bring them there quickly. Imagine an assignment where the students model a car, a stop sign, and a traffic light, and their interactions on a closed course. For the first assignment I'd have them model the interactions using Ruby hashes (e.g. car = {:color => 'blue', :state => 'stopped', :stop_time_required => 1}, traffic_light = {:color => 'red', :stop_time_required => 30}) and a series of functions, with the program printing what happens as it encounters a pre-defined course. This should result in some pretty crappy spaghetti code. For the next assignment, I'd have them clean up the structure of the program, separating the code into logical units based on what it operates on: a car, a red light, a stop sign. Lastly, I'd introduce OOP and have them update the code to use it. It should take a lot less code to do the same thing, and it should be a lot cleaner. Polymorphism can be addressed by pointing out that the red light and the stop sign are both traffic control devices, and having the students implement to_s on each class to clean up their string-handling code.
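
The final stage of that exercise might look roughly like this (a Python analogue, since the hashes above are Ruby; all of the names are illustrative, and Python's __str__ plays the role of Ruby's to_s):

    class TrafficControlDevice:
        def __init__(self, stop_time_required):
            self.stop_time_required = stop_time_required

    class StopSign(TrafficControlDevice):
        def __str__(self): return 'stop sign'

    class TrafficLight(TrafficControlDevice):
        def __init__(self, color, stop_time_required):
            TrafficControlDevice.__init__(self, stop_time_required)
            self.color = color
        def __str__(self): return '%s light' % self.color

    class Car:
        def __init__(self, color):
            self.color, self.state = color, 'stopped'
        def encounter(self, device):
            # the car neither knows nor cares which kind of device it hit
            print('%s car waits %ds at the %s'
                  % (self.color, device.stop_time_required, device))

    course = [StopSign(1), TrafficLight('red', 30)]
    car = Car('blue')
    for device in course:
        car.encounter(device)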

OOP can make lots of sense when it's presented as a time-saving measure. However, especially in Java classes, it tends to start with a wall of code thirty lines long just to print "Hello World", with the direction to ignore everything else. It would probably be better to start simple, with a one-line program (e.g. puts 'Hello World'), and work up from there. That's not to say Ruby is the right language; it's probably a good starter language, but a statically typed, lower-level language would probably be better once they get the hang of things. The beauty of Ruby is that you can create useful code quickly, but you still need to be able to understand what's going on under the hood.

The two classes I learned the most from were a digital hardware class and an assembly course. In the hardware course, the professor gave me an FPGA to work with after he saw I understood digital logic very well, and on that FPGA I implemented a basic instruction set, including jump, with programs entered using toggle switches, executed using a push button, and results output to an LCD. In the assembly course, the professor spent a great deal of effort manually compiling C to assembly and explaining why. He was on the ANSI C committee and worked full-time writing assembly for spacecraft at NASA, so he offered a unique perspective and quite a bit of insight.



