Hacker News

Well, Martin's book came out after we were doing these patterns in the 90s. My teams had that motivation: the data team worked with the logic team; the logic team worked with the UI team. Separation of concerns and division of labour are, generally, good ideas.

ETA: one of the groups that was motivated was MS: use SQL Server + SP ; then COM in the Logic layer and then ASP in the UI.
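To make the shape of that stack concrete: here is a rough, illustrative sketch of the same 3-tier split, with Python and sqlite3 standing in for SQL Server, COM, and ASP. All table, procedure, and function names are made up for the example.

```python
# Sketch of the 3-tier split: data tier (stored procedures on SQL Server),
# logic tier (COM components), presentation tier (ASP pages).
# sqlite3 and plain functions are stand-ins; names are hypothetical.
import sqlite3

# --- Data tier: in the original stack, a stored procedure,
#     e.g. EXEC usp_GetOrdersForCustomer @CustomerId = 42 ---
def get_orders_for_customer(conn, customer_id):
    return conn.execute(
        "SELECT id, total FROM orders WHERE customer_id = ?", (customer_id,)
    ).fetchall()

# --- Logic tier: in the original stack, a COM component ---
def customer_order_summary(conn, customer_id):
    orders = get_orders_for_customer(conn, customer_id)
    return {"count": len(orders), "total": sum(t for _, t in orders)}

# --- Presentation tier: in the original stack, an ASP page
#     interpolating values into HTML ---
def render_summary(summary):
    return f"<p>{summary['count']} orders, total {summary['total']:.2f}</p>"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 42, 20.0), (2, 42, 5.0), (3, 7, 100.0)],
)

print(render_summary(customer_order_summary(conn, 42)))
# -> <p>2 orders, total 25.00</p>
```

The point of the split is that each tier can be owned by a different team and swapped independently: the SQL behind the stored procedure can change without touching the logic or UI layers, which was exactly the division-of-labour argument above.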



Yes. I was happy when Fowler came out because we could all start using the same terminologies for the same things, and work from common concepts when solving the same problem.

(It didn't work out that way, though. It seemed like most people used Fowler as some kind of bible or end point, when it should have been a starting point / source of inspiration. Somehow it seemed to turn people's brains off, making them dumber and less insightful about the systems they were building.)


This is a good perspective - the books that were written in the early 2000s were documenting a lot of the practices that had evolved through the 90s, and giving them a nomenclature. There was a period from roughly 1995-2005 where it felt like we were evolving a real discipline of software engineering, with patterns of how to build things and a language to communicate with each other.


"ETA: one of the groups that was motivated was MS: use SQL Server + SP ; then COM in the Logic layer and then ASP in the UI."

I remember SQL Server and its Stored Procedures having a brief period of popularity because Visual Basic had been popular, in no small part due to non-university-trained people starting their careers by graduating from MS Excel VBA macros to VB UIs in enterprise client-server apps, with MSSQL SP providing the logic layer. As the Dot-com boom took off, ASP made it possible to leverage that experience outside of the enterprise, building internet applications, which was where the money and excitement were. I remember MS pushing COM, but never getting very far with it as it quickly lost ground to Java. Java was exploding in popularity and was a rich and fertile valley where the "kingdom of the application layer" could be built re-using paving stones from the "3-tier and n-tier" era that somewhat preceded it but was somewhat coincident with it.


> one of the groups that was motivated was MS

I remember Microsoft being a huge marketing proponent of the 3-tier architecture in the late 90's, particularly after the release of ASP. The model was promoted everywhere - MSDN, books, blogs, conferences. At this point COM was out of the picture and ASP served as both front-end (serving HTML) and back-end (handling server responses).


If the claim is that Martin reported on, summarized, and put a name to a pattern that had been in use for some time, I would grant that, though 2002 seems very late to the party to me. I vividly remember "3-tier architecture" and "n-tier architecture" being quite current concepts by no later than 1999. Three years is a long time in tech as in life, and 2002 felt to me like a different era: post "Dot-com bubble", post EJB-excesses, post "GoF patterns", post-9/11, post Y2K, post "Bush v. Gore". By 2002, the number of tiers in your architecture was boring old news. "REST", "SOA", and "AJAX" were hot topics then, just as they in turn would give way to "Big Data", "NoSQL", "microservices", and so on.

The reason this is important to me is that it raises within me the questions, "What was the '3-tier architecture' a reaction to?" and "Why was it so important circa 1997-1999 that there be 3 or more tiers?" I think the answer to the first question is, "The '3-tier architecture' was a reaction to the 'client-server (2-tier) architecture'." I think the answer to the second question is, "At least one of the reasons it was so important circa 1997-1999 to replace client-server architectures with 3-tier or n-tier architectures is that, for sociological and demographic reasons, there was an important new cohort of developers who wanted to use general-purpose programming languages like Visual Basic and Java rather than the SQL of the previous generation: young men who first learned BASIC when they were adolescent boys during the home computer revolution of the late 1970s and early 1980s." To the extent that's true, it casts some doubt on the proposition that an "application tier" outside of the database was adopted on merit. It raises the possibility that the motivation was less technological and more psychological than is usually acknowledged: an attempt to hold the database and its alien programming model (SQL) at arm's length by people who started out with BASIC and never strayed very far from its familiar imperative model, eventually hiding it behind an ORM layer.

To the extent that's true, it also casts some doubt on the claim in the article that "The motivation for this separation is as relevant today as it was then: to improve modularity and allow different components of the system to be developed relatively independently." I'm sure some people had that motivation, and that's laudable, but it's not the whole story. There were other factors as well, some of them sociological and demographic. But demographics change. The "Gen-Xers" who were 10 in 1980 and 27 in 1997 are now 55 and approaching retirement, and are being replaced by subsequent generations who aren't hidebound by a formative experience that occurred decades before they were born.

Tying this back to the DBOS article: in general I liked it and consider it interesting technology. I just want to push back gently on a familiar tone I perceive, which tends to present whatever product or technology is being offered as somehow "standard", "accepted", "optimal", and "the natural and logical current end state of a tidy process of innovation which relegates earlier technologies as historical but no longer relevant, if it mentions them at all." The world and technology aren't that tidy, and a lot of old ideas are still relevant.



