Hacker News
Software Design Patterns Are Not Goals, They Are Tools (exceptionnotfound.net)
305 points by kiyanwang on May 19, 2016 | 165 comments



It probably seems like an obvious statement to a lot of HN, but I have a feeling it isn't to the majority of developers, who for some reason appear to love immense complexity and solving simple problems with complex solutions. I think a lot of them started with OO, which immediately raises their perception of what "normal" complexity is --- at that point, they're already creating more abstraction than is really necessary. Then they learn about design patterns and all the accompanying "hype" around them, think "awesome, something new and shiny to use in my code!", and start putting them in wherever they can, I guess because it feels productive to be creating lots of classes and methods and hooking everything together. It's easier to dogmatically apply design patterns and generate code mindlessly than to think about what the problem actually needs. The result is code that they think fulfills all the buzzwordy traits of "good software engineering practice" (robustness, maintainability, extensibility, scalability, understandability, etc.), but that in reality is an overengineered, brittle monstrosity, extensible only in the specific ways anticipated when it was first designed. The anticipated changes almost never turn out to be the ones needed, so on the next change even more abstractions (including more design patterns) are added, in the belief that they will help with the change after that, while the existing useless ones are left in, and the system's complexity grows massively.

I did not start with OO, I never read the GoF book, and don't really get the obsession with design patterns and everything surrounding them. I've surprised a lot of others who likely have, by showing them how simple the solutions to some problems can be. Perhaps it's the education of programmers that is to blame for this.

The statement could be generalised to "software is not a goal, it is a tool".

Related article: https://blog.codinghorror.com/head-first-design-patterns/


> I did not start with OO, I never read the GoF book, and don't really get the obsession with design patterns and everything surrounding them. I've surprised a lot of others who likely have, by showing them how simple the solutions to some problems can be. Perhaps it's the education of programmers that is to blame for this.

Design patterns are mostly crutches for languages that have pretty weak expressive power. In a powerful language, you don't need most of them.

See also: http://norvig.com/design-patterns/design-patterns.pdf, slides 9 and 10.


It's true, as Norvig points out, that newer languages are making old design patterns disappear (slide 10), but new ones are appearing at the same time, and he mentions that too. Simple design patterns like iterators die when iterators become intrinsic to the language, but then we get higher-level patterns built out of the small patterns that have receded into the background.

I wonder if it's more about how much code you have to write. Patterns are patterns because you have to write some code to make them, i.e., they're not already there in the language, and as soon as you have a generalized class that actually solves a structural problem, you have a pattern.

I suspect that powerful and expressive languages won't make patterns go away, we'll just get powerful and expressive patterns.
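For instance, the GoF Iterator pattern has been absorbed into most modern languages as a built-in protocol; a minimal Python sketch (the `Countdown` class is invented for illustration):

```python
# The Iterator pattern absorbed into a language protocol: any object
# implementing __iter__ plugs straight into for-loops and list(),
# with no explicit Iterator class or hasNext()/next() boilerplate.
class Countdown:
    def __init__(self, start):
        self.start = start

    def __iter__(self):
        # A generator provides the "iterator object" implicitly.
        n = self.start
        while n > 0:
            yield n
            n -= 1

collected = list(Countdown(3))  # [3, 2, 1]
```

The pattern hasn't vanished so much as moved into the language, which is exactly the point: the next layer of patterns gets built on top of it.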


Lisp macros give you the ability to abstract away patterns ad infinitum. There's no hard ceiling there. So you can make patterns go away with enough expressive power.


Can you give us a couple of examples of how lisp macros "abstract away" patterns?


Take the example from [0]:

> Also - there are such "nonexciting patterns" in OOP languages as well, and they are much less often abused. For example "a switch inside a while" pattern. Which I think I will need to name "StateMachinePattern" to make it cool and stop people refactoring it into strategies and stuff.

> There is value in one-screen definition of the whole machine, instead of separating it into 6 files.

In Lisp you could create a macro, say, define-state-machine, and use it like that:

  (define-state-machine :my-state-machine
    ((:state-1
      (do-something)
      (do-something-else)
      (if (condition)
          (transition :state-2)
        (transition :state-3)))
  
     (:state-2
      (do-something)
      (transition :state-3))
  
     (:state-3
      (do-something)
      (when (should-quit)
        (quit-state-machine)))))
This macro could easily expand to a "switch inside a while" (or, possibly, to a "let over lambda over cond", or maybe even into a series of low-level tagbody and go constructs). The resulting abstraction is clean and communicates its meaning well.
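For comparison, here is a hedged sketch of the "switch inside a while" shape such a macro could expand to, transliterated into Python. The state names follow the example above; the `events` dict is an invented stand-in for the `condition`/`should-quit` predicates:

```python
def run_state_machine(events):
    """Drive the three-state machine from the example above.

    `events` is a hypothetical stand-in for the condition()/should-quit()
    calls; real code would compute these in each state.
    """
    trace = []
    state = "state-1"
    while True:  # the "while" of the pattern
        trace.append(state)
        # the "switch" of the pattern
        if state == "state-1":
            state = "state-2" if events["condition"] else "state-3"
        elif state == "state-2":
            state = "state-3"
        elif state == "state-3":
            if events["should_quit"]:  # quit-state-machine
                return trace
            # otherwise stay in state-3 and loop again
```

For example, `run_state_machine({"condition": True, "should_quit": True})` visits state-1, state-2, state-3 and quits; the whole machine fits on one screen, as the quoted comment argues.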

--

Different example - when writing macros in Common Lisp, there are two things one usually has to be wary of - unwanted variable capture, and unwanted re-evaluation. So you might end up manually assigning gensyms[1] to avoid variable capture, and manually creating lexical environments to evaluate passed forms only once. But you can also abstract it away! For instance, with with-gensyms and once-only macros from Alexandria[2]. Or you could build something like defmacro! described in Let Over Lambda[3], i.e. a macro that automatically code-walks your macro definition and extracts variables named g!foo and o!bar to apply gensyms and once-only to them, respectively.

--

Those are only two examples, but the general idea is - whenever you see yourself repeating the same boilerplate in similar places to express a concept, you can wrap that boilerplate inside a macro and make the compiler generate it for you. Since Lisp makes all of its capabilities available for you during macroexpansion, you can continue this until you're satisfied that your code is readable, boilerplate-free and clearly expresses your intentions.

--

[0] - https://news.ycombinator.com/item?id=11730248

[1] - gensyms are symbols generated on demand, that are guaranteed to have no possible collision with anything ever

[2] - https://common-lisp.net/project/alexandria/draft/alexandria....

[3] - http://letoverlambda.com/index.cl/guest/chap3.html#sec_6


I use this sort of thing in Tcl (for path-dependent reasons) A Whole Lot. You can have arrays indexed by state, and use the "eval" operator: "eval $fsm($state)".

fsm is an array of strings, indexed by whatever.

I have made dynamic state machine generators this way, although it gets kinda disorienting.

The tactical advantage to Tcl (for me) is that it has serial ports and sockets as first-class objects with essentially identical semantics. I work in the embedded space, and this seems 1) unusual and 2) a very nice thing to have to write comprehensive test rigs.

I rather like it because it's all in the "string" & "list" domain rather than the lambda domain. I really should try this in Lisp just to see how weird it gets.


Eh, lisp macros are nothing more than a way to avoid quoting your symbols. Functions are more clear, and do not try to hide what is going on. Honestly wish it was like this: (defun 'my-adder '(x y) (+ x y)) ;; emacs lisp.

I'm a lisper and clojure user. Those are my main languages. Don't get the hype for macros, but lisp is the best.


The main difference is that, unless you are using an interpreter, you can apply macros at compile time. Also, without macros you build forms which you must then evaluate, and eval cannot access the lexical environment. You would need to systematically pass an environment around, either at runtime in an interpreter or at compile time: more boilerplate.

To summarize, meta-programming is itself a recurrent pattern that was abstracted with macros. The defmacro macro produces functions like the ones you want to use, except that it integrates them with the macroexpansion facility offered by the environment.


I'm happy to be part of a community, where even when I spew out something ignorant there is a clear explanation that makes sense. Thanks for your answer.


I think this perfectly shows that design patterns are just that: patterns. They're not goals, but they aren't really tools either. They're just what you see when you compare different code bases, because similar problems will provoke similar solutions even from different people.

Design patterns only become problems when people learn about them and try to apply them without learning or understanding how they came to be.


Or, apply them even though they likely will never need them...

One pattern implementation that I always hated, and still do, was MS Enterprise Library's data abstraction. I worked on one project that targeted three different databases for enterprise customers, for which it was a great fit. That's the only time I worked with EntLib where using it was better than just using other tooling for the specific database directly.

In the end, sometimes people get used to a given framework and don't stop to think whether they really need it for what they're working on. It's one of the things I really like about the node ecosystem, though sometimes that goes too far the other way. That said, I tend to rail against certain patterns when they don't have much place in a given language/platform.

It tends to be something that comes with age/experience. But not always; I've known plenty of older devs stuck in a framework/pattern rut.


Design patterns are fundamental; you cannot avoid using them. The key is to understand this, learn to tell good design patterns from bad ones (or "anti-patterns"), and use them appropriately. The book "Design Patterns" by the GoF does include many patterns which are specifically targeted at working around the shortcomings of "weak" languages such as C++ and Java. However, that book is not the end-all-be-all of design patterns, and using different, more straightforward "idioms" in more expressive (typically more "functional") languages doesn't mean you're avoiding design patterns; you're just using different ones.

Unfortunately, a lot of people make the mistake of seeing the GoF list of design patterns as being the holy scripture of patterns, when it's merely an enumeration of some patterns that proved useful for a particular language in particular contexts. The concept itself (of looking at design at a level above mere algorithms and data structures) is hugely important, it's unfortunate that it suffers from a bit of the "kleenex" effect, instead of being seen as a general purpose concept with wide ranging applicability.


That presentation makes a funny assumption that it's dynamic languages that do that. Haskell is nearly as far from dynamic as possible, with all the same advantages w.r.t. patterns.


The presentation is about dynamic languages, but the observation about design patterns being crutches for offsetting language deficiencies is in fact a general one.


Haskell needs "zip3" when there are three arguments to zip.

That's a design pattern for optional arguments straight out of ISO C.


I think it would be possible to create a generalised `zipN` function, using the Haskell printf trick[0], but it would be a lot of complication for little benefit. It's rare that you need to merge that many streams at once, and adding a single digit is not a huge hassle in those cases. Also I'd probably consider splitting the operation up into stages if I was using anything higher than `zipWith3`.

[0] http://stackoverflow.com/questions/7828072/how-does-haskell-...
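For contrast, in a language with a variadic zip the whole zipN family collapses into one function; a Python sketch:

```python
# Python's built-in zip is variadic, so zip3/zip4/... are one call.
triples = list(zip([1, 2], ["a", "b"], [True, False]))
# triples == [(1, "a", True), (2, "b", False)]

# A generic zipWithN falls out for free: apply f across the zipped streams.
def zip_with(f, *streams):
    return [f(*args) for args in zip(*streams)]

sums = zip_with(lambda x, y, z: x + y + z, [1, 2], [10, 20], [100, 200])
# sums == [111, 222]
```

Haskell's typed arity-by-number approach trades this flexibility for static guarantees, which is the parent's point about the pattern resembling ISO C's optional-argument workarounds.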


You also have an equivalent on a family of isomorphic types, namely the types

    forall z. (a -> b -> c -> ... -> z) -> ZipList z
This is gotten by taking your `list_a :: [a]` etc. and writing:

    \f -> f <$> ZipList list_a <*> ZipList list_b <*> ZipList list_c <*> ...
You can of course also use Haskell pair-stacks to do all of this, storing (a, b, c, ...) as the type (a, (b, (c, ... ())))...


I've done some examples of transposing Java code explaining various design patterns from the GoF book to Haskell, then seeing how it gets simpler when you have a couple of higher-level ideas from Haskell to work with. It could make a fun talk to present at a Java-centric conference, "today I'm going to teach you some Haskell." As a bonus, you get to see how the Java code deals with a tremendous amount of needless complexity.

For example, here's what I've written so far for the Visitor pattern:

https://gist.github.com/drostie/818c54ca5c8182143699a72da986...


Or rather, they can become either invisible or first class citizens that you can assign to a variable and pass around.


No, they disappear completely. You stop thinking in terms of "factories" or "strategies" and start to think in terms of simply passing methods around, etc.
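A minimal Python sketch of that shift: what would be a Strategy hierarchy or a Factory class is just a function value (all names here are invented for illustration):

```python
# Instead of a SortStrategy interface with ByPrice/ByName subclasses,
# the "strategy" is simply the key function you pass in.
products = [("mug", 9), ("lamp", 30), ("pen", 2)]

by_price = sorted(products, key=lambda p: p[1])
by_name = sorted(products, key=lambda p: p[0])

# A "factory" likewise degenerates to an ordinary function returning
# a closure (or the class itself, since classes are first-class values).
def make_counter(start=0):
    n = start
    def counter():
        nonlocal n
        n += 1
        return n
    return counter

tick = make_counter()  # tick() -> 1, tick() -> 2, ...
```

Nothing here is named after a pattern; the behavior-parameterization just rides on first-class functions.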


I still think "oh, a generator function" when I program in Clojure. I'm just not excited every time I get to use it. That's all the difference, IMHO.

Also - there are such "nonexciting patterns" in OOP languages as well, and they are much less often abused. For example "a switch inside a while" pattern. Which I think I will need to name "StateMachinePattern" to make it cool and stop people refactoring it into strategies and stuff.

There is value in one-screen definition of the whole machine, instead of separating it into 6 files.


Agreed.

I never really saw design patterns as a reference manual for how to solve problems (as I think some people do/did). Instead, I saw it as a shared vocabulary.

For example, if I say that X is a factory for Y or an iterator over Z, you know exactly what I mean.


We still have monads and functors in Haskell. People in other languages have picked up, e.g., the `monad' pattern. (Look for `flatMap' to spot it; that's monadic bind.)

In Haskell, even the compiler knows that there are common elements to all instances of the monad pattern: they can reference a common typeclass. (Even if you don't implement an instance of the typeclass, you might still have implemented the monad pattern without realizing it.) In, e.g., JavaScript, it's all by convention only.

But even in Haskell, the compiler does not enforce all monad laws, yet.
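The `flatMap' convention can be sketched in a few lines; here is a toy Python Maybe (names invented; real libraries differ in detail):

```python
# A minimal Maybe with monadic bind, mirroring the flatMap convention.
class Maybe:
    def __init__(self, value, present=True):
        self.value, self.present = value, present

    def flat_map(self, f):
        # bind: short-circuit on "nothing", otherwise feed the value to f
        return f(self.value) if self.present else self

def nothing():
    return Maybe(None, present=False)

def parse_int(s):
    # A fallible computation returning Maybe instead of raising.
    return Maybe(int(s)) if s.isdigit() else nothing()

doubled = parse_int("21").flat_map(lambda n: Maybe(n * 2))   # Just 42
missing = parse_int("x").flat_map(lambda n: Maybe(n * 2))    # Nothing
```

As the parent notes, nothing in a language like this checks the monad laws; the pattern exists purely by convention.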

A more common example, but perhaps invisible today, is the `function' pattern. In most assembly languages, functions are a convention only, and you have to manipulate your call stack somewhat manually.

In almost all languages anyone is using these days, that's done automatically. Functions are not `invisible', but they are citizens with more rights, i.e. the compiler/interpreter knows about them, you can give them a name, and in some more modern languages you can even pass them around, or _not_ give them a name.

Some patterns do become invisible as you say.

`Factories' are one that only exists because of weird restrictions in eg Java. They are invisible in Python.

In Haskell, for different reasons, we have a similar solution where we define extra functions to create our objects instead of using constructors directly. Instead of factories, people call these smart constructors (https://wiki.haskell.org/Smart_constructors).

Smart constructors in Haskell could go away with a stronger type system like in Agda. (Yes, there are languages with stronger types than what Haskell has.)
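A smart constructor is just a function that validates before building the value; a rough Python analogue (names invented --- Haskell enforces this by hiding the real constructor at the module boundary, whereas Python relies on convention):

```python
class NonEmpty:
    """A list intended to be non-empty; build it via make_non_empty."""
    def __init__(self, items):
        self._items = list(items)

def make_non_empty(items):
    # The smart constructor: reject invalid input up front so every
    # NonEmpty in circulation upholds the invariant. In Haskell the
    # module would simply not export the raw constructor.
    items = list(items)
    if not items:
        raise ValueError("NonEmpty requires at least one element")
    return NonEmpty(items)
```

With dependent types (as in Agda), the invariant moves into the type itself and the runtime check disappears, which is the stronger-type-system point above.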


Totally agree.

Functions are a design pattern in assembler, but a language feature checked by the compiler in more advanced languages.

Classes are a design pattern in C (called ADTs), but a language feature in all OOP languages.

Generalizing the concept: a design pattern is something the programmer has to check for correctness themselves, instead of it being automatically checked by the compiler or interpreter.


Like C#? It may be just the opposite: What if design patterns are popular with complicated, highly expressive toolchains because they make users think they have control of, and are being productive with those toolchains?

You don't hear design patterns mentioned a lot with simple toolchains like the one supporting Go.


No, unlike C#. Design patterns are pieces of boilerplate code you should, but are unable to, abstract away. Which means that by definition the language lacks the expressive power to abstract them away.


Yup. Visitor is a dirty hack for pattern matching, for example.


Unfortunately, power was too much for his readers, so he dumbed down his work into Python.


> the majority of developers, who for some reason appear to love immense complexity and solving simple problems with complex solutions.

It took me a while to break out of what I learned in school about how to write software. My single most important takeaway has been to not seek generalization and abstraction of the problem I'm working on, but solve it as directly as possible, and only generalize or abstract my code after I've completely finished solving the same problem at least twice. In my experience, the single biggest software time waster I've witnessed (and participated in) is people solving problems they don't have.


I tend to unconsciously follow the rule "write for the problem at hand, aggressively abstract as soon as you have to solve a variation of that problem."

Sometimes that means I'm writing a function, and I find I've written a few lines to summarize and output info about a data structure the same way a few times in that function for debugging purposes. I'll pull that out into a function.

Sometimes that means I have a fairly complex processing function for a data source, and a few months later I need to process a new data source of similar, but not the same, info. I'll try to (somewhat) normalize the data sources to a format that can be processed or almost be processed by that function, and make the changes to the function so it can work on the mostly normalized data, and handle the specific differences as needed.

It's almost never a good idea to make your algorithm very abstract in the beginning, as it generally makes it more error prone, and usually you don't know the specifics about how things need to work anyway. By the time you're feeling the need to abstract the function, not only do you know more about the problem and can implement it cleaner, but you can use the prior implementation as a test harness to confirm the rewritten version is functionally the same (or better). I can't stress how useful that is and how much time and pain it can save.


I agree 100%. I think you have to experience the problems of dealing with a system that was over-engineered from the very start using all kinds of unneeded abstractions. Once you run into that, you hopefully appreciate simpler solutions, which most of the time tend to be far more adaptable to changes than the OO monster you accidentally built.

I once had the joy of witnessing a program I wrote being rewritten. Before the rewrite it was ~400 lines of Python written in the most obvious way possible without any abstraction overhead. You could easily scan the complete code and understand what happens. The rewrite was totally over-engineered, split up into a dozen files, organized into some inheritance hierarchy. Even the database access was abstracted out, so it could potentially use another DB at some point. It didn't improve anything at all but bloated the code by at least a factor of 2. But it was more "correct" according to some metric...


> Even the database access was abstracted out, so it could potentially use another DB at some point.

Did they use a library?


In school, are students not always taught to solve with the most naive implementation possible, and then improve?


The first time I was told "Make it work, make it right, make it fast" was in my first programming course --- and only in passing, by another professor who was substituting. After that first programming course, we had OOP Design: how to better structure OOP software with abstractions and design patterns. That was our "this is how good programmers do it."

That same professor who had subbed then beat it into our heads - in our senior level programming languages course - that making the damn thing work takes priority over all. I guess better late than never.


My classmates and I were taught to use the most naive code possible, and then improve it immediately using patterns that abstract and generalize the problem being solved. We were taught to look for improvements that solve problems of safety and scale, we were taught to look for ways to use existing design patterns.

Those are important basic software skills to learn, but in my schooling they were never offset by a philosophy that dealt with how and when to use them, and more importantly, when not to use them.

I've personally seen in my ~20 year professional software career so far a lot of software engineering that is overly expensive by a long way because people decide to add unwarranted complexity.


Ok but under-engineering can also be a problem.

Say you and your team are going to build a REST server that will serve hundreds of endpoints.

Nothing complicated, just a bunch of CRUD code with some business logic.

So here is the simple way to write each endpoint function, or method.

  RESTResponse getUserPrefs(Userid id)
  {
     return DB.getUser(id).prefs;
  }
except you need security, plus code to set up and tear down the DB transaction.

  RESTResponse getUserPrefs(Userid id)
  {
     if( !Security.checkAllowed(id) )
       return REST.403;

      RESTResult result;

     DB.getConnection();
     DB.beginTransaction(true /*Readonly*/);

     result = DB.getUser(id).prefs;

     DB.endTransaction();
     DB.closeConnection();

     return result;
  }
Now imagine giving this example to a team of devs to develop those hundreds of bitty REST endpoints.

Also as you can imagine, the boilerplate introduced here may have to change per endpoint. There might also be other boilerplate that may need to be added for object level security, authentication, error logging etc ...

So when you throw this code at your developers, it means that you'll end up with hundreds or thousands of little methods all similar but all different.

They are going to cut and paste this example and modify as they go. They may come up with better ways to structure the boilerplate, leading to some old code mixed in with code written in a newer style.

Good luck working with that after a few years.

All these encomia for simple code usually consider only a single small program, with one person working on it.

Once code is loosed in the wild for a team of maintenance programmers to work on, you have a whole new problem, and the lack of structure from the simple approach will let weeds and vines choke your codebase.
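One common way to hoist that boilerplate out of every endpoint is a wrapper; a hedged Python sketch, where `check_allowed` and the `AUDIT` list are invented stand-ins for the Security/DB calls in the pseudocode above:

```python
import functools

AUDIT = []  # stands in for transaction/logging side effects

def check_allowed(user_id):
    # stands in for Security.checkAllowed
    return user_id > 0

def endpoint(func):
    """Wrap a handler with the security check and transaction
    setup/teardown from the example, so each handler stays one line."""
    @functools.wraps(func)
    def wrapper(user_id):
        if not check_allowed(user_id):
            return ("403", None)
        AUDIT.append("begin")      # stands in for beginTransaction
        try:
            return ("200", func(user_id))
        finally:
            AUDIT.append("end")    # stands in for endTransaction
    return wrapper

@endpoint
def get_user_prefs(user_id):
    return {"theme": "dark"}       # stands in for DB.getUser(id).prefs
```

Hundreds of endpoints then share one audited, tested copy of the boilerplate instead of hundreds of slightly divergent ones --- essentially what the AOP and Lisp :around-method replies below/above achieve by other means.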


I couldn't resist making a Common Lisp version.

DB, REST and SECURITY are imaginary packages. CLSQL is an existing library for interfacing with databases. You define the meat of your code with a primary method:

     (defmethod get-user-preferences (id)
       (user-preferences
        (db:get-user id)))
Then, you add security checks with an AROUND qualifier; CALL-NEXT-METHOD is a little like super() in Java, but with multiple-inheritance and method qualifiers (around, before, after, ...):

     (defmethod get-user-preferences :around ((id integer))
       (if (security:allowedp id)
           (call-next-method)
           rest:+FORBIDDEN+))
You can add logging, executed before the primary method, after security checks.

    (defmethod get-user-preferences :before (id)
      (log "Getting user preferences for id ~S." id))
Likewise, you could log the outcome of the security check:

    (defmethod security:allowedp :around (id)
      (let ((result (call-next-method)))
        (log "Access ~:[denied~;granted~] for id ~S." result id)
        result))
 
Also, you can wrap the call to db:get-user with the appropriate dynamic context where a database connection is open (it could be from a pool, or something else) within a transaction.

     (defmethod db:get-user :around ((id integer))
       (clsql:with-database (db *database*)
         (clsql:with-transaction (:database db)
           (call-next-method))))
I did not find an equivalent for the read-only flag.

If more methods need to follow this pattern, it might be preferable to define them directly with a macro which always opens the connection and the transaction. The WITH-X macros ensure that transactions and connections are properly handled when unwinding the stack.


AOP huh? Nice.

The scenario I described above was from a real project I was on a few years ago.

The way I handled it (partially) was via the use of AOP. I also made use of dependency injection.

These two approaches combined managed to reduce the boilerplate to near zero.

Of course, in Java AOP means you have to take care to enable it in each part of the toolchain (build system, CDP, IDE ...) but it wasn't too bad.

I'm not sure which gang of 4 patterns I may have been using, and I don't care, but I do know that leaving it simple would have led to chaos, because I'd seen it happen before.


Generally, it's much easier to add complexity than remove it once it has been added.

> So when you throw this code at your developers, it means that you'll end up with hundreds or thousands of little methods all similar but all different.

If that's what happens with your developers, you need new developers or new management, clearly the ones you've hired are either incompetent or their management is forcing them to suck.


> If that's what happens with your developers, you need new developers or new management

I don't think so; it's just human nature and other pressures besides the purely technical ones, like the pressure to just get a task done, or to get the job done as easily as possible, and so on...[1]

As programmers we tend to think that the biggest challenges are technical, but they aren't: the social problems are a way bigger factor.

Also the problem I'm describing here applies to large low-complexity codebases, to be worked on by large teams.

These are common in 'enterprise' settings, which is why I think design patterns have so much appeal there.

[1] https://en.wikipedia.org/wiki/Tragedy_of_the_commons


I'm always curious about whether or not people are lumping in lower-level "coding best-practice" concepts like DRY in with this idea of "complexity."

It's fine to complain about people who "over-engineer" things, but I've also come across cases where someone uses "I'm Agile" as an excuse to completely throw out any and all coding structure or best practices.

And every time I try to say that maybe we should spend 3 seconds to remove the comments that are in Chinese and maybe adapt the code to our naming conventions after literally copy&pasting it in from StackOverflow, I get pointed to arguments like this post for why none of that stuff matters.


The problem isn't "best practice" per se, it's "best practice" in practice. :)

I can see good use cases for OOP, for instance, but OOP's definitely a leading contender for the "over-engineering" trap. I've certainly seen my share of "OOP / pattern spaghetti" to the point where just coding a simple select call involves writing several classes in random locations in the project. Like goto spaghetti, the result is way harder to debug and maintain.

I personally think lower-level advice like DRY is much more important overall than patterns (many OOP projects, I feel, violate the KISS principle for no real benefit).

Even naming conventions can fall into this trap. A consistent naming convention is absolutely a great idea. And then you run into the database that has several naming conventions, all inconsistently applied depending on the state of the project at the time. I've found inconsistent naming conventions not much more useful than no naming convention at all.


>I've found inconsistent naming conventions not much more useful than no naming convention at all.

Try working with someone whose idea of expanding the app is literally taking a controller class for another page that does something vaguely related to the new feature, copy-pasting it into a new file, changing the class name (say, putting an "N" at the end, like "SearchResultsView" --> "SearchResultsViewN"), and then changing some of the internal business logic without changing the names of any methods or variables.

What you're left with is something that is much, much worse than no naming conventions at all. It's actually naming conventions that actively mislead you as to their scope and function. Like a "clickedEditButton()" method that is actually wired up to a "back" button on the page and takes you back to the previous screen.


No one is against coding structure, or even "best practices", if you interpret it as meaning the thing someone fluent in English but with no background in software development would understand it to mean. The problem is that "best practices" in our industry has become a specific term meaning "overly complicated by a factor of about seven, but with enough meaningless jargon invented to describe it that no one feels comfortable saying the emperor is in fact naked".


I would agree that that definition has become the new interpretation of "best practices". However, in my 5-year career I have seen the majority of solutions under-engineered. That is much more of an issue these days than over-engineering.


>majority of solutions under-engineered. That is much more of an issue these days than over-engineering.

Couldn't agree more.


I totally agree with you, I just think that the ideas being discussed in this thread have a ton of nuance and they can be dangerous to introduce to a junior developer who doesn't quite realize the difference between the idea of "complexity" we're talking about here and the fact that he shouldn't be copy-pasting boilerplate code all over the app and copy-pasting directly from StackOverflow without any editing or integration of the code.


I sometimes describe it as “framework envy”. I've worked with a fair number of developers who learned library design by looking at the standard library or popular frameworks and thought it was a model for everything, unaware that in many cases those were either poor examples (e.g. AWT / Swing) or complicated by offering a general solution usable for a wide range of projects and thus overkill for an internal app which would never be used elsewhere, especially when the core structure was still being determined.


I think this hits the nail on the head. I would say that more generally, a not insignificant number of OO design concepts were developed in an environment of coding modules meant to be used by other people (either in the form of frameworks, or multi million line monoliths).

For a large number of developers, neither of these environments apply, but they are still abiding by the same rules.


Yeah, it's like reading about some cool thing Google or Facebook is doing and then recognizing that your problem is much easier because the data fits in RAM.


It's something to grow out of. I had that "framework envy" towards game engines in my early years, and was designing overcomplicated architectures for a while.


> I did not start with OO, I never read the GoF book, and don't really get the obsession with design patterns and everything surrounding them.

Unrelated to your original point, you should consider reading the book. Pay special attention to the section introducing the motivation for each pattern and the problem it solves. Over-engineering programs to deliberately include patterns is to be avoided; but there is no harm is knowing the patterns and when they apply by reading the original material.


Often is about developers trying to make their systems flexible.

In software. You can make any part of a system flexible. You can make several parts of a system flexible. But the more flexibility you add the less maintainable the code becomes, and there for the least flexible overall.

So you really have to choose where you want that flexibility. Often it's better to keep your design simple and clean, and to add the flexibility when you actually need it.


Yes. The most general problem cannot be solved. So only generalize where you need to.


> the majority of developers, who for some reason appear to love immense complexity and solving simple problems with complex solutions.

This is so true. Of all the projects which I have seen that failed, not a single one failed because its design was too simple. In fact, almost all of them failed because they were over engineered and because of that it was too hard to quickly adjust to changing requirements. And still people keep making the same mistake over and over again.

It's not just related to design patterns; it's everywhere in the IT world. Take, for instance, protocols. SMTP: before the 80's there was a whole bunch of mail transfer protocols, many of them quite sophisticated. But which one won? The Simple Mail Transfer Protocol. Or HTML, which was derived from SGML, a more complicated and much more generic language; the simple one won again. Or XML (also derived from the more generic SGML), which is now being replaced by the even simpler JSON. The list goes on.


> I did not start with OO, I never read the GoF book, and don't really get the obsession with design patterns and everything surrounding them.

I did some OO, but like functional programming better. In Haskell, there's a certain draw to doing everything in the type system. Or a quest to find the single abstraction that unifies all your code into a coherent whole.

(For a configuration language pretty printer that I worked on at Google, I recently managed to throw out a couple hundred lines of code, by finding a unifying abstraction for most of the recursive functions on the abstract syntax tree we were doing.)


> so they think "awesome, something new and shiny to use in my code!" and start putting them in whenever they can

Yeah. You do not truly know hell until you're trying to maintain a basic CRUD app used as a tool for someone to learn domain-driven design, CQRS, and event sourcing simultaneously.


Does anyone know how the Design Patterns authors became known as the "Gang of Four"?

I'm guessing it originated from an unflattering comparison, although my knowledge of the Cultural Revolution is pretty weak, maybe it was a compliment.


Not sure, but 4 co-authors is a likely partial reason. Also I guess partly due to imitating various precedents, and not just in tech. E.g. The Three (Musketeers, ...), The (..., ...) Duo, The Famous Five (kids novel series), etc.

A separate HN thread on that might be a fun read :) A similar one is eponymous Laws (Wikipedia has a list).



Yeah, I don't feel any enlightenment kicking in. I'm aware that the Gang of Four name originated with a cabal of power at the centre of Maoist China, but I'm curious who drew the connection and why.


Well, going by further events related to the use of design patterns (as discussed in the article and in, e.g., userbinator's comment, which was at the top of this thread), maybe the connection is "comply, or you will be assimilated" ...


You must use the fancy words: it's not just a pointer to a common constant struct, it's the flyweight pattern. If you just use the pointers and not the words, you are not a professional.
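To make the joke concrete, the "fancy word" really does boil down to a pointer to a shared constant. A minimal flyweight sketch in Java (the `Glyph` class and its names are illustrative, not from any real library):

```java
import java.util.HashMap;
import java.util.Map;

// Minimal flyweight sketch: hand out one shared immutable instance per
// distinct value instead of allocating a new object each time.
final class Glyph {
    final char symbol; // intrinsic, shared state

    private Glyph(char symbol) {
        this.symbol = symbol;
    }

    private static final Map<Character, Glyph> CACHE = new HashMap<>();

    // The factory returns the cached "pointer to the common constant struct"
    static Glyph of(char symbol) {
        return CACHE.computeIfAbsent(symbol, Glyph::new);
    }
}
```

Calling `Glyph.of('a')` twice yields the same object, so identity comparison (`==`) holds; that sharing is the whole pattern.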


YES.

Design patterns are super useful as tools.

As "goals" they are idiotic. I think lots of people that think they are idiotic have been exposed to them as "goals", or don't realize that's not the point.

I think there is a larger issue here, which is that many kinds of software development, including web dev, have become enormously more complex in many ways than they were when many of us came up.

People coming up now are looking for magic bullets and shortcuts and things they can just follow by rote -- because they are overwhelmed and don't know how to get to competence, let alone expertise, without these things.

It's easy for us to look down on people doing this as just not very good developers -- and the idea of 'software bootcamp' doesn't help; I think it's probably not _possible_ to get to competence through such a process -- but too easy to forget that if we were starting from scratch now we ourselves would likely find it a lot more challenging than we did when we started. There's way more stuff to deal with now.

"Design patterns" are never going to serve as such a magic bullet or thing you can follow by rote, and will often make things worse when used that way -- but so will any other potential magic bullet or thing you can follow by rote. Software doesn't work that way. It's still a craft.


"People coming up now are looking for magic bullets and shortcuts and things they can just follow by rote"

This is the normal progression of learning. As a beginner, following the rules by rote is the goal. So, for the newbie that is first being exposed to design patterns, implementing some of those patterns should be the goal so they can learn where they fit in the toolbelt.


For another somewhat funny take on design patterns, watch the first two minutes of this talk from last year's DEVOXX:

https://www.youtube.com/watch?v=e4MT_OguDKg

The speaker echoes similar sentiments: design patterns were written down by highly experienced developers and should be used to communicate about code, etc.


Patterns come from software archaeology: they named things that were commonly seen and described what they were for; they helped build a vocabulary for talking about larger constructs.

They are useful if you have a problem and one of them fits: it can help you start thinking about it -- though it might turn out not to be a good fit after all.

In general we should be keeping software as simple as possible, with the understanding that it can be changed and adapted as needed. Often large "pattern" based projects devolve into a morass of unneeded complexity to support a level of flexibility that was never required.


They are rather from the interest of certain software developers in the work of the architect and philosopher Christopher Alexander, who saw patterns as the basic elements of a humane and sustainable style of building and communicating knowledge about building.

The concept, like so many other concepts, drifted from its origin and became a more rigid and doctrinaire "system", especially after the misunderstood book by the "Gang of Four".

In reality, patterns are everywhere in software, whether we call them that or not.

Game loops, event servers, compilers, map/filter, microkernels, stdin transformers, for loops over arrays, etc etc etc.
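One item from that list, map/filter, shows how everyday an unnamed "pattern" can be. A short sketch in Java streams (class and method names are illustrative):

```java
import java.util.List;
import java.util.stream.Collectors;

// map/filter as an everyday "pattern" that rarely gets called one:
// select the elements you want, then transform each of them.
public class MapFilterDemo {
    static List<Integer> squaresOfEvens(List<Integer> xs) {
        return xs.stream()
                 .filter(x -> x % 2 == 0) // keep the evens
                 .map(x -> x * x)         // square each survivor
                 .collect(Collectors.toList());
    }
}
```

Nobody reaches for a pattern catalogue to write this, yet it is exactly the kind of recurring structure the pattern vocabulary was meant to name.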

Richard P. Gabriel's book Patterns of Software is freely available and goes into the history and meaning of patterns. Alexander himself wrote a preface that's worth reading on its own.


This is an aside, but it's significant that Alexander's own architecture is fairly unsuccessful. His emphasis on loose formal structure and codification of practices (e.g. built-in window seats, vestibules, or whatever it is) is prescriptive and doesn't help with the real problems of architectural design, which is something profoundly intuitive, circumstantial and very subtle. Architectural education is a fascination of mine, partly because it seems to succeed so rarely.

On a tangent, anyone in London is recommended to visit the end of year shows of student work at the architecture schools, coming up at the end of June. A lot of very intriguing and complex activity. (UCL Bartlett, Architectural Association, London Metropolitan (CASS) etc.) I can't speak for other cities.


This seems opposite to me. Most architecture is prescriptive and there is little tolerance for Alexander's circumstantial and use-based design. If the story he recounts in The Battle for the Life and Beauty of the Earth is true, his preference to use simple materials and let the budget constrain the design makes him an enemy of building firms which use standards to justify cost overruns. What did you think about that book?


I agree that much building is standardized and conventional. Architecture, for me, connotes a more ambitious attempt to build creatively and appropriately.

When I say "prescriptive" I suppose I'm referring to the propositional nature of patterns, like an engineer's rules of thumb. The contrast is not so much with individual genius and variation as with the maintenance and passing on of tacit knowledge, which is a mysterious process involving multiple senses and observational learning.

http://www.lse.ac.uk/economicHistory/Research/facts/tacit.pd...

That attitude might sound very conservative, but it's based on my own experience and not any preference for a particular technology or style. Alexander is too reductive and abstract; he's not properly in touch with actual sensory knowledge or craft skills. Or with economical industrial tech, e.g. the work of Lacaton & Vassal which uses cheap industrial building systems to create generous spaces.

I haven't read the book you mention. Informing myself about it now.


Let me know how you like it! (I get emailed when someone replies to an HN comment of mine.)

I think Alexander's patterns rely on tacit knowledge. Many of them involve designing in a way that real usage can shape the building or land. For example, the way many colleges will not pave paths until they see the tracks real students leave between buildings is the sort of thing Alexander has written about. Thank you for the paper; looking forward to reading it.


I've read a brief summary of the World System A and B concept. It seems like what I have come to expect from Alexander: some reductive rhetoric and claims that he has rediscovered a true, virtuous way of being. I'm not keen on encouraging "them and us" thinking, or cultivating the idea of life as a state of war (the battle for x). I agree in principle that architecture usually gets imposed on users, when it should be made in a participatory way.

Building "according to the dictates of the human heart" is all very well in theory, in practice it sounds like it would translate to an ideal of each family building their own home on a small lot. So where are we in relation to e.g. the left-right political axis? (It seems like Alexander is proposing a bottom-up approach to building and planning, in contrast to something imposed by the state or by corporations.)

If there's one major point I'd make in response to his dualistic picture of the world, it's that there are potentially varying degrees of citizen participation in the design process. A pragmatic approach would be to work on improving participation where possible rather than demonizing the supposedly pure form of "World System B" in categorical terms.

Here's a theoretical tool which presents a spectrum of participation: http://lithgow-schmidt.dk/sherry-arnstein/ladder-of-citizen-...

If there's more, architecturally speaking, to what Alexander is proposing than just a theoretical opposition between "how things are", and a utopian and backward-looking idea of "how things should be", I think it's probably a potential revalorization of the art-historical categories of "linear" and "painterly".

(For details of "linear" and "painterly" see https://en.wikipedia.org/wiki/Heinrich_W%C3%B6lfflin )

In architecture these really correspond to a "structuralist" emphasis on modularity and interchangeable components on the "linear" side, and an expressive emphasis on overall form and image-making on the "painterly" side.

To give an example of each in its pure form, Hermann Hertzberger is a linear, modular, structuralist guy, while Zaha Hadid would be on the painterly end of things.

Alexander is not proposing Hertzberger-style buildings -- he wants to bring back traditional architecture -- but essentially I think his background is structuralist, emerging as he did from mathematics in the 60s. It's this flavour to his work that makes me feel strongly that he doesn't really care about tacit knowledge and the intangible, non-propositional aspects of architecture, the aspects that just cannot be put into words.

Oppositions are hard to avoid in thinking about this kind of thing, but of course they need to be treated with care to avoid falling into simplistic thinking.

Approaches to architecture are awash with irrational aesthetic decisions, even when they claim to be fully rational. There's a famous book from the 60s called "Architecture without Architects" by Bernard Rudofsky that might be of interest, as it documents the kind of qualities of traditional anonymous architecture (not the elaborate, royal or religious kind).

A final thing that comes to mind is the philosophy of Deleuze and Guattari. They are French post-structuralists. Although, coming from that theoretical milieu, they come out against the binary "arborescent" nature of simplistic thinking based on oppositions, they do allow themselves to introduce a significant contrast: they compare the mode of existence of nomads with that of the state.

They talk about the smooth space of nomads as the origin of the (improvised) war machine (essentially, of science), while the striated space of the state is the space of complex social structures and institutions (the university, for example).

Their ideas are very rich and thought-provoking, popular with architects but potentially still strange and radical at the same time. Their work is evidently a distillation of a huge amount of reading and contemplation. In my opinion Christopher Alexander's ideas look very tame and unimaginative in comparison, and the scope of his thinking appears just disappointingly limited.

Here's a link to a summary which might whet your appetite for Deleuze and Guattari (their writing is also interesting and compelling, but this PDF document just gives the bones of one of their main ideas): http://www.protevi.com/john/DG/PDF/ATP14.pdf


Thanks; really appreciate the reply.

I will go through the links. As you recounted your impression of the A vs. B system, I remembered that I didn't care so much for the introduction as I did for the actual story of building Eishin campus. It seems like his most important principles are to let real usage finish the design and to choose materials that support that as well as completing the work within budget. My gut says: how can Alexander be reductive and non-tacit when he is trying to allow space to be shaped by the people who use it, i.e. directly incorporating tacit knowledge? There is an "us-vs-them" to believing users know something designers don't, I suppose, but it's tempered by the humility of believing the users will do better than the architect advocating for the users. But there certainly are multiple ways to obtain tacit knowledge and involve it in a project.


I'm currently stuck in Riga's airport, and from this perspective, the definition of success in architecture is an interesting question.

There is a clear political vision in Alexander's work, one whose basis is participation and non-alienation. This is at odds with modern capitalism, which prevents its success.

The free software movement, too, is in a sense unsuccessful, despite a loyal and tenacious following. Still, it's worth fighting for and preserving.

Software has been in an acknowledged crisis for decades. Ordinary folks I know aren't thrilled about contemporary architecture.

Also, Alexander emphasizes that his "prescriptive" works, such as A Pattern Language, are made as examples of how grassroots human knowledge can be conceptualized and presented in a certain composable form.


> There is a clear political vision in Alexander's work, one whose basis is participation and non-alienation. This is at odds with modern capitalism, which prevents its success.

In what way? I mean, in practice?


For just a very basic example, you would never see an Alexandrian pattern language that recommends the horrendous environmental practices of modern capitalism, which are literally destroying the world.


And lots of these patterns can even become first class entities in the right languages.

E.g. almost all languages these days provide functions as a first-class abstraction. You don't have to manage a call stack yourself.


Many, but not all, and the idea that nonreified patterns indicate a flawed language is pernicious. Gabriel's book talks about this.


Yes. We can go further: the ability to reify patterns can be seen as a synonym for the ability to abstract.

Funnily enough, Java at least originally deliberately tried to limit people's ability to abstract common patterns in the name of "readability by average programmers". At least that's the folklore.


Somehow I find Christopher Alexander's foreword very emotional. I cannot express why.

To me the key is joy. The joy of writing, maintaining and using software.


He is skeptical about the value of the "software patterns" movement and in a deeper sense critical of software in general.

He asks what standards we should have for software. In architecture there is Chartres, and myriad other truly beautiful structures—what if we judged software by such standards?

I agree that it's a beautiful and powerful foreword.

To me, another key idea is software as a way of building. It implies that a computer system can be an inhabitable, ownable, sustainable thing made for supporting human life... like a shed, a tractor, a garden, a town...


> They are useful if you have a problem and one fits it perfectly

Patterns need not be a perfect fit. You don't build your living room elliptical because you have a round rug. There are pragmatic considerations that are much more important than a "perfect fit".


>, if you ever find yourself thinking, "I know, I'll use a design pattern" before writing any code, you're doing it wrong.

Unless I'm misunderstanding him, I would disagree with this. You're doing it wrong when you use a design pattern without understanding what problem it's solving, and whether you have that specific problem.

To use his tool analogy - if you're a joiner who turns up to a job thinking "we always need to use a hammer" and start randomly hitting everything, then you've gone wrong. But equally, if you're halfway through knocking a nail in with your shoe and think "Oh look, I'm using the hammer pattern now", you're doing it just as wrong.

If you're looking at two things you need to attach together and you've considered whether glue, a screw, a nail or something else is the most appropriate for this specific job, decide it's the nail and then think - "I need to use my hammer now", then you're doing it right.


I would say that using your shoe to emulate the hammer pattern is a good sign of flexibility and understanding the heart of the pattern.

The point of the hammer pattern isn't the hammer, but the movements and the forces applied. In Europe hammers are an axiomatic part of their nails already so they don't even have a name for the hammer pattern!

On the other hand, if you're with your screw and trying to spin it with the hammer because someone said use the hammer pattern...


It's a good sign of flexibility, but not a good sign of being a professional on a job.

And it's something I see depressingly often with programmers - tackling problems that there's a fairly well known existing pattern for, but not being aware of the pattern and either spending days/weeks rediscovering the pattern for themselves, or even worse inventing a new and less effective way of solving it.


Design patterns aren't the problem. All a design pattern is, is a well-known way of doing something.

When you build a house, do you re-invent how to frame, plumb, wire, and roof it? No. That's all a design pattern is. Choosing the right design pattern is akin to making sure that your basement is made out of cement and your walls framed with wood. (You don't want to put shingles on your countertops!)

The problem is that some developers think they are some kind of magical panacea without really understanding why the pattern was established and what it tries to achieve. These are the over-complicated projects that everyone is complaining about in this thread. (These are the projects where the basement is made with wood or the concrete walls too thick; or the projects where someone decided to put shingles on the countertop.)

I try to pick, establish, and follow design patterns in my code. It helps ensure that I don't spend a lot of time re-learning why some other technique is flawed; and it helps achieve a consistent style that everyone else on the team can work with.


I found both his definition of the adapter pattern and his example to be a bit off. In his example, the adapter extends the external interface instead of the client interface. By definition the adapter must implement the client interface. It's even in the UML diagram displayed on the website he quotes (http://www.dofactory.com/net/adapter-design-pattern)
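For reference, the correct shape looks something like the following sketch, where the adapter implements the client-facing interface and wraps the incompatible class (all names here are hypothetical, invented for illustration):

```java
// The interface the client code is written against ("Target" in GoF terms).
interface Logger {
    void log(String message);
}

// A third-party class with an incompatible interface (the "Adaptee").
class VendorLogSink {
    String lastEntry; // recorded so the example is easy to inspect

    void writeEntry(int severity, String text) {
        lastEntry = "[" + severity + "] " + text;
    }
}

// The adapter implements the CLIENT interface and wraps the adaptee;
// it does not extend the external class's interface.
class VendorLogAdapter implements Logger {
    private final VendorLogSink sink;

    VendorLogAdapter(VendorLogSink sink) {
        this.sink = sink;
    }

    @Override
    public void log(String message) {
        sink.writeEntry(1, message); // translate the client call
    }
}
```

Client code only ever sees `Logger`; that is the whole point the article's example gets backwards.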

  > The fact was that I just didn't understand them the way I thought I did.

  > To be clear, I've never read the Gang of Four book these patterns are defined in
After admitting he has a less than desired understanding of design patterns (proven by his poor example), he makes bold claims like:

  > if you ever find yourself thinking, "I know, I'll use a design pattern" before writing any code, you're doing it wrong.
I'm having problems taking this article seriously.


"Software Design Patterns Are Not Goals, They Are Tools" - I do not understand why this needs to be said in the first place.


I suppose you haven't seen much of the world of Enterprise Java Applications?

https://docs.spring.io/spring/docs/2.5.x/javadoc-api/org/spr...

Many years ago, I briefly worked in that industry and thoroughly hated the rigid, dogmatic, extreme overengineering culture and the resulting code it produced. I'm glad to be away from it all.


A few years back I worked on a project (by myself!) where I inherited an epic codebase that was basically the Enterprise FizzBuzz application scaled up to ~30 person-years of effort. It had tens of thousands of classes and a seeming desire to use every known feature of Enterprise Java and every design pattern; finding where stuff actually happened in the haystack of abstractions was quite entertaining.

https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris...

NB I have a strange definition of "entertaining" in a work context - I was contracting at the time.


Much of the Java+Patterns=<3 stuff came from Java lacking syntax for functions: anything of the sort involved creating an entire anonymous inner class with an apply method. The Strategy pattern is an excellent example of a pattern that just goes away when you can pass functions around. Without that syntax, there are just no good ways to express certain solutions in any sufficiently large piece of software without using at least one or two patterns. And if your language is making what you're doing awkward, it's probably not how most people are going to code. Guava had an entire disclaimer in its functional documentation that essentially said "whenever you reach into this toolbox, think about whether it's really what you want". Now that Java 8 has syntax for them, we should see people reaching more for Streams and their functional methods.
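A side-by-side sketch of that point, using the standard `Comparator` as the strategy (method names here are illustrative):

```java
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class StrategyDemo {
    // Pre-Java-8 style: the comparison "strategy" needs a whole
    // anonymous inner class just to wrap one method.
    static void sortByLengthOldStyle(List<String> words) {
        Collections.sort(words, new Comparator<String>() {
            @Override
            public int compare(String a, String b) {
                return Integer.compare(a.length(), b.length());
            }
        });
    }

    // Java 8+: the pattern collapses into passing a function.
    static void sortByLengthWithLambda(List<String> words) {
        words.sort(Comparator.comparingInt(String::length));
    }
}
```

Both methods do exactly the same thing; only the second makes it obvious that the "pattern" was always just a function being passed in.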

Now, that said, it's often more tempting and easier to just add complexity, which results in crazy codebases. It's much harder, especially under pressure of deadlines, to REMOVE patterns as they don't make sense anymore, which gives us what people think of as enterprise Java. Not all Java is like that, I promise.


Sadly, it seems that most of the java devs who created these monstrosities learned the wrong lessons.

As far as I can tell, they seem to think total lines of code was the real issue (it couldn't be the abstractions/patterns, those are good!) and are now on a kick to move as many lines of code as possible into annotations. I call this "annotation-based programming", and any new problem starts with a Google search, since how any toolkit handles something is pure guesswork.


If you're going to denounce annotations, you must also denounce decorators in both Python and JavaScript. Annotations are a super powerful tool that gets used too much. To borrow from Lisp: data > functions > macros. They should be the last tool you reach for, in order to remove a bunch of repetition with the intention of simplifying your code. Questions like "Could you do what you're trying to do with a library of functions that you simply call into? Or some sort of composition? Or maybe a parent class?" should be asked before you reach for any sort of metaprogramming.

Take a look at the Retrofit library from Square for a good example of usage of annotations. To the original Spring link posted, while there's lots of annotations, they're replacing the XML that used to plague Spring projects, which I find much more magical. At least I can jump to an annotation's definition in my IDE.


Software design patterns as a goal are perfectly appropriate for APIs. The point is that if you use a well-known pattern, then programmers working against the API will be familiar with what's going on -- if they're familiar with the pattern. If they're not, it gives them an opportunity to learn about it, rather than facing some messed-up emergent rat's nest of functions just pulled out of the air.

Even if the functions you provide only loosely fit an established pattern, it's still worth "adapting" it, with verbose notes about how it doesn't conform to said pattern.

Actually I would go so far as to say there is value in developing within a pattern even if you're not building an API, if you expect that a number of third parties may have to maintain or develop your code for years to come.

However, if you're working on a problem domain that is well understood in your organisation (technically specialised, low level), it doesn't make sense to force a pattern upon it. It is reasonable to expect that third parties should already know what's going on.


I think the problem, however, is when your stack trace is too long for your scroll buffer, you've gone a little overboard.


Of course! Your software must be usable and maintainable. These should be your top priorities.


Oh, please give it a rest already. That Spring release is from 2007. Things have moved on.

On the other hand a lot of people judge C++ or Javascript by 90s standards. So first impressions stick for a long time :)


Let me just say thank you for picking that particular example.

Now please excuse me. I just discovered a new hammer and I'm sure as hell gonna hit something with it.


They still believe that they are tools. The goal is working software, they're just overusing design patterns.


The goal is "the one true architecture" that can handle any future change. It's BS, and soooo many self-proclaimed "architects" fall into its trap.


Except that people make a lot of uninformed assumptions about what future changes might actually be (I have done this earlier in my career), and then when an actual future change comes along it often turns out to be different from what was expected, and everyone gets a deeply unpleasant surprise.


I once looked at the C++ Boost package that was supposed to work with dates and times (maybe it's in a different state now). It had an immense number of classes (or methods or whatever) and was, for my needs, completely useless. The way I remember it, most of the classes (or methods) were there to provide "abstractions", but whoever designed the abstractions designed them without ever having done any non-trivial work with time, so there were tons of abstractions but nothing I could use. I had a similar impression of a lot of Boost packages: an immense amount of red tape all around, but no "meat." But hey, every imaginable C++ feature is used. Then I discovered that Google at that time forbade developers from using the whole of Boost, and I felt a bit better.


That honestly sounds like FUD. The Boost community has rejected OO design and abstractions from the very beginning, embracing the STL design instead.

Dates and times are a complex problem; you won't find a simple solution for them. For the record, boost.date_time only has concrete classes and no abstractions. It barely uses templates.

"Then I've discovered that Google at that time forbade developers using the whole boost"

Their loss.


I haven't checked recently; at the time, the classes didn't support historical time zone calculations. I looked at a lot of code (yes, they were not hierarchical classes, but there was a lot of code), but only the most trivial things were supported.

And I admit, I never liked Boost; I am biased. But that doesn't mean I'm misrepresenting how little of what it had matched my needs, or the "bloat" feeling I (and obviously Google too, back then) had.


Not sure what you mean by historical time zones. If you are referring to DST rules: boost.date_time as released in Boost 1.34, which came out almost 10 years ago, has a member function dst_local_start_time(year) on a time_zone object which returns 'The date and time daylight savings time begins in given year'.

The doc links for older releases seem to be broken on boost.org, so I can't check if the functionality is even older.


I agree. It should be obvious. I don't know anyone who believes design patterns are the goals.


Umm... he believed it, hence why he wrote about the insight he gained regarding this belief. And if he believed it, then there will no doubt be others like him who believed it as well and will benefit from the insight he has shared. Any insight like this, shared with the intent to help others, is valuable. It doesn't matter how simplistic or obvious we may think it is, as there will be others out there who will learn from it. When we make statements like this, we discourage others from sharing things they have learnt, for fear of being judged as not knowing something that should 'supposedly' be obvious.


Well, the author says that he had been an advocate and even educator on design patterns, yet also admitted that he has never read the definitive (GoF) work on the subject. IIRC, the book's preface declares the very thing the author "discovered" for himself and thought to share.

If you started reading the book, and gave up after 2 minutes, you would still have learned this lesson. So, yes, it shouldn't have to be said.


I'm inclined to believe that there are many who have not read the book and yet have tried to use the patterns described in it. They would benefit from hearing what he has to say, and why shouldn't they? Should they be punished for not reading the book by having all insights from someone like them silenced?

Have you read every book that defines your craft? I don't think so. So wouldn't you then benefit from shared insight that you missed because you did not read a book yourself? If the rule for sharing content were that you had to read every book on the matter first, then no one would be able to share anything.

The point I'm simply making is that as long as there are those who will benefit from what you have to share, it's beneficial to share it, regardless of how supposedly 'simplistic' or 'obvious' it is. Mocking others for sharing things we deem obvious doesn't help anyone; rather, it prevents others from learning. On the other hand, sharing something that is obvious to some but not to others at least helps those it wasn't obvious to. The guy did a good thing by taking the time to collect his learnings and share them in the hope of helping someone else out there, only to be told that what he shared was obvious and so he shouldn't have said it. The price you pay for trying to help.


I have been interviewing a lot of developers recently, and one of the best questions I've found is to ask them _why_ they used the MVC pattern in the test assignment (most do). Most developers misunderstand the question at first and either start to explain how MVC works or explain how they would have implemented it without MVC (when you ask people why they did something, they often take it as "you shouldn't have done it"). But even when I clarify the question, a surprising number just can't begin to answer it; instead they stumble and, at best, just say that that's how they have always been taught to do it.


I noticed the same thing when interviewing people. Answering why a decision was made would always trip people up, but it's far more revealing than any demo project can be. It doesn't matter, to me, whether the solution was right or wrong, but more the "why" that got you there and how it will get you to where you need to go. You can't problem solve effectively without understanding the reasons you're making the decisions you are.

The same follows for patterns, I think. If you haven't thought about it critically and really done the work to understand the "why" (being told the "why" is not enough), you're going to struggle and mostly fail to implement it properly or in the appropriate situations.


That you haven't been up-voted more affirms my belief that you are on the right track. If they can't explain why, they don't know the value of it. If they don't know the value of it, they can't weigh that value against any others.


Is "it's the first thing that came to mind" an acceptable answer?


That falls somewhere between "because it's the way we do things around these parts" and "it's what we used on my last couple of projects".

Perfectly fine for a certain degree of utility (maintenance, finishing, junior dev?), not quite so much for a lead, system designer or architect.


As far as I can tell design patterns are mostly about taking something simple and obvious and using terms to describe it that make it obscure and difficult to understand.


Patterns are about giving developers a common vocabulary so they can discuss what they all do without having to go into insufferable detail each time. We all do lots of the same shit and it's extremely helpful to all use the same names for those things.


Why do I have to call it "the strategy pattern?" What was wrong with "passing in a function?" Giving it a name like that obscures the idea. Yes, I realize that if you don't have first-class functions you have to work around that with interfaces or similar, but now we're describing a workaround for a language limitation, which seems a bit less inspiring than what "design patterns" are sold as.
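To make the comparison concrete, here's a minimal sketch (in Python, with illustrative cipher names of my own choosing) of how "the strategy pattern" and "passing in a function" collapse into the same thing in a language with first-class functions — the caller selects the algorithm at run time:

```python
# Two interchangeable "strategies" with the same signature.
def rot13(text: str) -> str:
    return text.translate(str.maketrans(
        "abcdefghijklmnopqrstuvwxyz", "nopqrstuvwxyzabcdefghijklm"))

def reverse(text: str) -> str:
    return text[::-1]

# The "context": it defers the choice of algorithm to its caller.
def encode(text: str, strategy) -> str:
    return strategy(text)

print(encode("hello", rot13))    # uryyb
print(encode("hello", reverse))  # olleh
```

In a language without first-class functions, each strategy would instead be a class implementing a one-method interface, which is where the extra ceremony the parent comment objects to comes from.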


I think the word "strategy" encapsulates more about the semantics than "passing in a function".

For example, the "visitor" pattern could also be described as "passing in a function", but there's a significant difference: a visitor is intended to be called on each element of a data structure in turn (i.e. a bit like an iterator callback), while a strategy defines an algorithm that you're expecting the caller to select at run-time (e.g. different cipher implementations).

Syntactically, they may well be the same ("passing in a function") but semantically, they're very different.

It can be useful to be able to have this shared vocabulary when talking about designs.


"Visitor" has always bothered me too. If you look it up it's tons and tons of text to describe "pass in a function and descend a graph calling it on each node."


That's actually not what the visitor pattern is about, though the way it's typically presented and the poor name tends to confuse the fact. Visitor isn't about traversing trees. It's about letting you define class-specific behavior outside of the definition of the class itself.

It gives you a way to "add a method" to a set of classes without actually stuffing them in the class definitions themselves.

In practice, most visitor patterns are used with AST classes or other types that are stored in a tree-like fashion, but that's coincidence. The visitor pattern itself is really about using virtual dispatch to avoid an ugly and non-type-safe type switch.
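A hedged sketch of that point (Python, with made-up shape classes): the visitor "adds a method" to a set of classes via double dispatch, without touching their definitions and without a type switch:

```python
class Circle:
    def __init__(self, radius):
        self.radius = radius
    def accept(self, visitor):
        # First dispatch: the object picks the visitor method for its type.
        return visitor.visit_circle(self)

class Square:
    def __init__(self, side):
        self.side = side
    def accept(self, visitor):
        return visitor.visit_square(self)

class PerimeterVisitor:
    # Behaves like a new "perimeter" method defined outside both classes.
    def visit_circle(self, c):
        return 2 * 3.14159 * c.radius
    def visit_square(self, s):
        return 4 * s.side

shapes = [Circle(1.0), Square(2.0)]
print([round(s.accept(PerimeterVisitor()), 2) for s in shapes])  # [6.28, 8.0]
```

Note there's no tree traversal here at all; adding a second visitor (say, an area calculator) requires no change to Circle or Square.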


I'm not sure I'm following what you describe. Like an extension method or mix-in?


So pattern matching?


Sort of. It's a way to emulate a simple subset of pattern matching in languages that lack it.


> If you look it up it's tons and tons of text to describe "pass in a function and descend a graph calling it on each node."

No, it's one word: visitor. That's the point, we can now avoid saying "pass in a function and descend a graph calling it on each node" and instead simply say visitor and everyone knows what we mean without the big long explanation.


Sure, so some documentation isn't great (and, by having lots of documentation, you might be a bit disappointed at how simple it ends up being).

...but going back to my original point, isn't it more valuable to have a common vocabulary, meaning that I can say "visitor" rather than "pass in a function and descend a graph calling it on each node"?


Well, in my opinion, no, because the fancy terminology obscures the simple idea.

Also, the naming of the pattern, along with implementation, may push people to copy a design that would be the best way to do it in C++ or Java ca. 2000 but unnatural in whatever they're working in.


It's a simple word, visitor; that's not fancy terminology. Giving things names is not fancy, it's practical and it aids in communicating with other people who do the same thing. You're basically objecting to the fact that people like to give names to their idioms so they communicate better.


No, I'm objecting to the use of terms which I feel make communication worse, rather than better. You're free to disagree.


You don't have to call it anything, language doesn't work that way, but using the same words other people use makes communication more efficient. Passing in a function is vague and unspecific and doesn't fully encapsulate the idea of a strategy.


The thing is that "obviousness" is a relative quality. Your own code is always obvious to you. Yet it's often surprising for many devs (even veteran ones) how unobvious their work is to their peers. Using design patterns helps to mitigate that. These conventions help me to quickly grasp what is supposed to be what.


I just mean that I feel like the terminology obfuscates something that could be more simply explained without using it.


Completely accurate, in most cases. I think that's inevitable when you describe an abstraction with an analogy.


This article makes way more sense when he says he never read the Design Patterns book. If he had, he would know that before he started. They explain that the book is a collection of patterns that they have compiled from a bunch of people and from years of experience. The patterns did come about organically, and they were never meant to be the way to design software. They were only trying to come up with a common lexicon for something that they were all already doing.


I'm reminded of a set of tweets from Harry Roberts about whatever new hot CSS naming convention was popular for the week:

> Modularity, DRY, SRP, etc. is never a goal, it’s a trait. Don’t let the pursuit of theory get in the way of actual productivity.

> That’s not to say Modularity, DRY, SRP, etc. aren’t great ideas—they are!—but understand that they’re approaches and not achievements.

There's nothing super revolutionary about these thoughts, but they've stuck in the back of my mind for a while now.

https://twitter.com/csswizardry/status/539726989159301121?re...


design patterns are guru thinking. they're bad ways to describe self-descriptive tricks like callbacks. don't let a person who talks this way write docs ever; they'll focus on 'what's being used' rather than what's happening.

design patterns are like when a consultant creates a stupid name for something that already exists -- the name isn't about expressive power, it's about declaring ownership so the consultant can sell the 'Consulting Method' to solve your problem.

when a phenomenon or trick has an easily understood one-word name, don't let a nonexpert rename it to something nobody understands.


Callbacks only become self-descriptive or obvious once you've seen them a bunch of times.

To a new programmer, say in javascript, who has never seen them before, they aren't at all self-descriptive or obvious. It is super helpful to have the term 'callback' to talk about them, and to understand what they are -- say, just a function you pass in that's then called back or whatever.

Then you'll be able to recognize what it is when you see it in existing code/APIs, how to use it, and when to use it when writing your own. If you really just tossed a new programmer at Javascript and they had no idea what a 'callback' was, most would have a lot more trouble figuring it out themselves from code examples alone, or would never think of using it themselves when writing code/APIs, never having seen it before.

That's all a "design pattern" is. "Callback function" is a common and successful design pattern in Javascript. (Less common or useful in some other languages, depending on the affordances of the platform). That's it.

Yes, focusing on "what's being used" instead of "what's happening" is a problem, yes, thinking you can construct software just by robotically putting together "design patterns" is a problem.

Design patterns are still a useful and inevitable part of software development, and recognizing that can make you much more efficient at learning how to write software, at designing comprehensible and maintainable software, and at reading other people's software.
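For a reader who hasn't met the idiom yet, here's the whole of it in a few lines (a Python sketch; the function names are illustrative) — a callback is just a function handed to other code to be invoked later:

```python
def shout(message):
    # The callback: ordinary function, called by code we don't control.
    print(message.upper())

def fetch_greeting(on_done):
    # Simulates an API that reports its result by "calling back".
    greeting = "hello from the server"
    on_done(greeting)

fetch_greeting(shout)  # prints: HELLO FROM THE SERVER
```

As the comment above says, the value of the name is that "pass `shout` as the callback" is instantly understood once you know the term.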


Yes, I meant 'callback' is a better title than 'visitor pattern', not that callbacks are obvious without explanation.


A difference of opinion about what a pattern should be called is not an argument against the utility of the concept of design patterns.

It's definitely called 'callback' in Javascript, I don't know if that name would be as obvious or useful in other environments.


> Here's the problem I have with design patterns like these [Adapter Pattern]: they seem to be something that should occur organically rather than intentionally. We shouldn't directly target having any of these patterns in our code, but we should know what they are so that if we accidentally create one, we can better describe it to others.

It's not clear what the author would have done differently in this example. It's one thing to raise concerns about pattern-first thinking in general, but quite another to spell out what exactly is wrong with reaching for the Adapter Pattern to solve a very specific problem under a given set of constraints. I can imagine a number of situations in which going straight for an Adapter is the only sane choice.

I've come to view with great suspicion any general discussion of programming divorced from its context. Architecture Astronauts and Cowboy Coders can each do a lot of damage if left to their own devices.
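For context on why "going straight for an Adapter" can be the only sane choice, here's a minimal sketch (Python; all class and method names are hypothetical): a third-party class you cannot change gets wrapped behind the interface the rest of the codebase expects:

```python
class LegacyLogger:
    # Third-party class we cannot modify.
    def write_line(self, severity, text):
        return f"[{severity}] {text}"

class Logger:
    # The interface our codebase is written against.
    def log(self, text):
        raise NotImplementedError

class LegacyLoggerAdapter(Logger):
    def __init__(self, legacy):
        self.legacy = legacy
    def log(self, text):
        # Translate the expected call into the legacy one.
        return self.legacy.write_line("INFO", text)

print(LegacyLoggerAdapter(LegacyLogger()).log("ready"))  # [INFO] ready
```

Under that constraint — an incompatible interface you don't own — the adapter isn't something you stumble into organically; it's the deliberate, obvious move.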


Design patterns, OOP, to a large degree programming languages are just tools. You don't hear of craftsmen saying things like "The only thing you really need is a hammer. It's been around longer than the other tools and you can use it on every project". Replace "hammer" with C or Java and you have a legitimate comment on a score of threads.

> What patterns don't help with is the initial design of a system. In this phase, the only thing you should be worried about is how to faithfully and correctly implement the business rules and procedures.

I submit that should be your overriding concern at all times, not just the design phase. If you have to refactor some code in order to extend it, tie it back to the changed requirement. This forces you to make the fewest changes, refactoring the least code, breaking the fewest unit tests, and introducing the fewest bugs into production.


Unfortunately looking at just the business rules and procedures leaves the oft-hidden 'soft' requirements (the -ities) in the dust. Unless you explicitly state the corresponding requirements, having the business rules and procedures as your overriding concern will lead to pain. And explicitly stating a testable requirement for e.g. maintainability or security tends to be quite hard in practice.


While we're here, SOLID is a nice acronym that is helpful as a checklist of generally good ideas to consider. It's not a law of physics, it's not compulsory, following it blindly can lead to worse outcomes and if transgressing it leads to a better outcome (with all things considered) then it should be transgressed.


Just now realizing there is ambiguity around the terms “design patterns.” Say it in a different crowd, they'll think you are talking about the kind of design patterns Brad Frost is writing about. http://atomicdesign.bradfrost.com/


Maybe there's a common (patterned?) ambiguity to the idea of design patterns?


Goals should include:

  1) Solve the problem
  2) Make it maintainable
  3) Make it extensible
  4) Make it scalable (server)
  5) Optimize it for memory, speed
So the reason to use an existing paradigm and a well-tested framework is that it makes the above easier, especially #2. And over time, #2 winds up saving you a lot of resources and probably saves your project from tanking.

Finally, using an existing well-known platform also lets you hire developers who know what they're doing from the beginning, leading to more productivity and less dependence on any one particular developer. We leverage the knowledge that's already out there.


His problem may be learning about those concepts from snake-oil sellers - he mentions he didn't bother to read GoF and gets his knowledge from things like http://www.dofactory.com/products/net-design-pattern-framewo... .

My advice is to learn from people like Martin Fowler or Kent Beck, and if you want to look at companies, look at something like ThoughtWorks.


As pointed out (arguably a bit harshly) in comments under the original article, this is really a strawman argument. That's because that ol' classical GoF book on design patterns — which the author admits he has not even read — addresses this concern already. It's still a valid argument, but not exactly a fresh one. And speaking on the subject without even bothering to read the piece widely considered canonical is a bit arrogant.


YES

The point of design patterns is a way to describe what you've made succinctly.

NO

When you set out to do something that you don't yet know how to do, having a crank you can turn to get out functioning code is a good thing.

I think what you mean is "Design Patterns are Tools, not Dogma".

Plus, a lot of design patterns only make sense in typed and/or OOP languages, so under those circumstances, they can't be applied as goals.


> In other words, if you ever find yourself thinking, "I know, I'll use a design pattern" before writing any code, you're doing it wrong.

I completely disagree... if I'm working with a team. I've spent far too many hours trying to fix fragile code that comes about as a result of different devs with different methodologies trying to tie their code together.


I, too, disagree, but not strongly. The author used the adapter pattern in one of his examples. I think this pattern comes up more often when refactoring than when doing initial design. Not always (there are never any certainties), but usually. So I think the warning should be more: be careful in what patterns you choose, and it's OK, for now, to choose none.


What you're describing sounds more like a teamwork problem - you're likely either missing someone with the "big picture" who is reviewing code, or a collaborative code review process.


I would agree that lacking either of those things would cause the problem I described, but in this case we have both of those.

Most recently, I was the one joining a project after its incipient stage, so I was not the big picture guy, but the big picture guy chose not to establish any kind of patterns or standards so when others started joining the project--even with an understanding of how it should work--suddenly the app had a lot more functionality but it all performed terribly because of a lack of uniformity in design. Code review helps, but ultimately because there are no established patterns there is no justification for telling someone a task should be implemented differently, right? The only solution I can see is to establish that justification by trying to get the big picture guy on board with a massive refactor to establish some standards.


Isn't that what code conventions are for? Additionally, code review could fix that situation better.


Kind of relevant to the discussions in this thread: https://en.wikipedia.org/wiki/Rule_of_three_(computer_progra...

When I do hit the magic 3 and can justify restructuring code, I consider my options in terms of design patterns (which are very much tools!)


Stated in other terms, patterns are a means to an end. Not the end goal.

Patterns will organically emerge as the result of ongoing refactoring.


I can say the same thing about programming.

That's why when I read HN I try to understand what you are trying to achieve. Something that goes beyond sitting in front of the computer 10 hours a day.


"I didn't read the article or the comments but I think you're all wrong, maybe it's bad upbringing or maybe something else but whatever". ok thanks for sharing.


How can someone doing research on these patterns not have read the most basic/important piece of literature on the subject?


When I was in college, I assumed (like most) that patterns were received wisdom in how to construct software. Then I actually attended a talk with John Vlissides and realized that patterns were an entirely different thing, closer to the "archaeological" sense dantheman mentioned. In this way, the study of design patterns correspond better to the study of rhetoric or poetics in human language. "Homeric Simile" could be a design pattern in poetry.

In software, some rigidity of expression might be preferred, and so the design patterns also help us avoid creating new terminology for things that have been appropriately described.

There are places where each pattern might have utility, and I suppose if there is any sense to the term "software architecture" it is in the ability to make sense of what the system should look like in a way that can be explained to the relevant parts of the team.

There is a tendency, as well, among software developers to think that a complicated architecture must be the result of countless stupid decisions, probably made by junior technicians, who were doing things without understanding what's going on. Thus you find people exhorting others for simplicity, and acting like they've done their job at that point. But instead, complicated architecture is the result of compromises and rewrites throughout the software's life, and attempts to discard those old architectures and start afresh with similar tools usually result in an initially simplistic, but ultimately inflexible, design that will eventually evolve into a different complex architecture.

The Linux kernel is an example of a complicated architecture that was designed from a standpoint of simplicity initially, and developed its own object-oriented layer on top of C, with pluggable elements all over, loadable modules, etc., and millions of lines of code. BSD is smaller and more coherent, but also much more limited in scope.

There are also examples like Windows NT, which suffered from being the second system to 3 systems: Windows, OS/2 and VMS. In this kernel, there are so many design features that were included before implementation, that it seems incredible it was ever built. But they persisted and made it happen, and even eventually made it fast, in some cases by working around its design with new compromises and approaches. Still, it lacks the simplicity of a Plan9 or an Oberon, but what it doesn't lack is users.

Anyhow, I digress. What is important to me about patterns is the language that we get from them, and the ability to recognize what's going on in code. They can provide useful hints about implementation gotchas, and they can also help people stop reinventing the wheel.


Read the book. Then read the books that inspired the book.


THIS


A design pattern is a reusable solution to a recurring problem. Too many inexperienced devs forget that part, and use a pattern where the problem it's designed to solve doesn't exist. Had the author read the GoF book (he admits he still hasn't) he might have avoided that pitfall.


Yeah, I can't believe someone would write a blog post about design patterns without reading the GoF book.

The authors are very explicit in every chapter about what problems are a good fit for that design pattern.



