Having now read it, I've come to the conclusion that the blog gives the wrong impression about the implications of having a custom language. Readers of the blog post come away with the idea that Wasabi was so full of "o.O" that someone was moved to write a book about that. In reality, the book is simply documentation of the language features, with callouts for weird interactions between VB-isms, ASP.NET-isms, and their language.
You should definitely read the "Brief And Highly Inaccurate History Of Wasabi" that leads the document off. It's actually very easy now to see how they ended up with Wasabi:
1. The ASP->PHP conversion was extremely low-hanging fruit (the conversion involved almost no logic).
2. Postprocessing ASP meant PHP always lagged, so they started generating ASP from the same processor.
3. Now that all their FogBugz code hits the preprocessor, it makes sense to add convenience functions to it.
4. Microsoft deprecates ASP. FogBugz needs to target ASP.NET. They can manually port, or upgrade the preprocessor to do that for them. They choose the latter option: now they have their own language.
It's step (3) where they irrevocably commit themselves to a new language. They want things like type inference and nicer loops and some of the kinds of things every Lisp programmer automatically reaches for macros to get. They have this preprocessor. So it's easy to add those things. Now they're not an ASP application anymore.
Quick rant: if this had been a Lisp project, and they'd accomplished this stuff by writing macros, we'd be talking this up as a case study for why Lisp is awesome. But instead because they started from unequivocally terrible languages and added the features with parsers and syntax trees and codegen, the whole project is heresy. Respectfully, I call bullshit.
If it had been a Lisp project, then Lisp programmers (and programming aficionados that have never used Lisp, but are attracted to its elegance) would be saying how awesome it is, and then most other developers would be talking about how they'd never work at Fog Creek because you have to use Lisp.
It's a good case study in market segmentation, though. Lisp works because all of the programmers who would like to do language design as part of their day job gravitate to it. As a result, it has the most advanced language features of any language on the planet. All of the programmers who just want their programming language to be a stable, dependable tool they can use gravitate to other languages (tops among them: Java and Go), and they build some pretty cool products with them because they aren't distracted by improving the language. Wasabi's big failing is that it tried to introduce Lisp-like concepts to programmers who have to work in ASP and PHP in their daily jobs. It's kind of a no-man's land there.
Kind of a weird, cliché-informed rant here about Lisp. To clarify, Lisp simply offers functional programming by default and metalinguistic abstraction - most languages don't. So it tends to be used by people/organizations seeking higher-level organizational tools than functions/modules/classes, or organizations looking to leverage functional paradigms. It's plenty stable and plenty dependable - just look at some of the names JUXT has helped adopt Clojure [1]. Or consider Netflix [2].
Wow, something clicked while I was reading that. Wasabi is the result of following "Things You Should Never Do [1]" as a motto. Not that it's a bad thing (it's not), but it just explains so much.
Yes, I noticed that this story was a parable of what happens when a charismatic leader declares something taboo, because of all the unforeseen problems it always causes, so the acolytes resort to a workaround that has the same problems in a different form but does not violate the religious taboo.
There is inevitably a Law of Conservation of Coding Cost that will get you one way or another. If you won't rewrite your code from scratch (because of all the unrecognized wisdom in the old code), you'll build a compiler from scratch instead (and discover all the unrecognized wisdom in proven compilers). Or you'll switch to some proven cross-platform solution that covers most of your needs, and the remaining pieces needed to complete the job, added together, will introduce you to all the unrecognized wisdom embodied in the standard, platform-specific toolchains. Or some other approach will preserve the cost in some other way.
This "never rewrite from scratch" dogma is an example of incomplete wisdom.
I disagree. Joel's "Don't rewrite from scratch" mantra seems like pretty solid risk-avoidance, given his first-hand experience with that path.
The cost to build and maintain Wasabi was surely higher than rewriting as X, but the risk sounds substantially less, and effectively spreads the risk part of technical debt over future years.
Thistle came out TEN YEARS ago, and Wasabi isn't much younger. Choosing the path they did arguably contributed to them being able to survive this whole time - and, having read Joel's creed on rewrites, I'm not sure they'd have done better if they rewrote as Python or Javascript.
I'm not sure the 'there's no such thing as bad publicity' line works when you're aiming to come across as a voice of reason in the software community.
I think the discussion could have been nipped in the bud if it had been described similarly to your summation - a necessary business decision rather than good software practice.
The problem with rewrites is the terrible gap that opens up between the old stuff, which goes stale quickly, and the new stuff, which isn't ready for prime time. Cash-flow then dries up right as the amount of work needed is maximal. Death follows quickly.
Here they did not rewrite from scratch; they developed a bridge that allowed them to keep moving forward as the world the old software was based on disappeared. And now they are on the other side with a code-base that works and has been providing cash-flow the whole time.
I think it's related to the Waterbed Theory[1]. Or, put another way, there's no free lunch. The bill will be paid, either now or later, by you or someone else. You're probably better off shifting responsibility than time...
Wasabi is a compiler. The output of Wasabi was C# code. One day, I deleted Wasabi and checked in the C# code. I did not have to rewrite FogBugz in the slightest.
Most of them are NUnit tests, which don't care what source language the assembly under test is written in.
A few of the oldest FogBugz tests are in `unitTest.asp` and were written in Wasabi. They got transpiled over to C# like everything else. Some are confirming Wasabi language features (e.g. `testLambda`, `testDictionary`, `testErrorHandling`), so I could remove them.
All of Wasabi's unit tests were deleted with Wasabi.
In my (brief, haphazard) research for this article, I somehow missed that one. Thanks for sharing it. I really like the analogy with a bridge loan.
Given both tedunangst's article and yours, I wonder what happened to the JavaScript output feature? I don't see it mentioned in your article. But I don't really know about Roslyn, so I'm not sure what it provides; does it have anything helpful? (I assume it has some kind of C# generator; probably a C# parser too?) Either you don't need JS output anymore, or you've written a C#->JS translator? (Or used some 3rd-party one?)
I'm very curious, would be grateful for an answer! tx
See https://news.ycombinator.com/item?id=9780798 - I took the output of Wasabi's JavaScript generator and checked it in to source control. If someone wants to change the JavaScript, they have to do it manually, as there is no longer any program maintaining that code for them.
Roslyn is an open-source implementation of the entire C# compiler, with some fantastic design decisions that allow you to use it in pieces or all together. https://github.com/dotnet/roslyn I used the C# generator portion of the platform.
Respectfully, their example code is the following:
<%
<WebEntryPoint("example.asp")> _
Sub Example()
%>
<!DOCTYPE html>
<html>
  <head>
    <title>An Example Wasabi Program</title>
    <meta charset="utf-8">
  </head>
  <body>
    <h1>An Example Wasabi Program</h1>
    <p>Arithmetic: 1 + 1 = <%= 1 + 1 %></p>
    <p>Dynamic Content: x + y = <%= IntRequest("x") + IntRequest("y") %></p>
  </body>
</html>
<%
End Sub
%>
I mean, they basically went to the trouble of building their own compiler to be able to keep writing Frankenstein ASP. If they'd built a language like Elixir I think people would be a lot more sympathetic.
The thing is, "frankenstein asp" (and PHP) are so damn easy. Yeah it mixes presentation with logic and after a point it all gets really messy but on the other hand everything about a page is right there and you don't need to do any builds and you can code in notepad and deploy with xcopy/rsync.
Before rich client/ajax took off, it was more than adequate for most e-commerce, web publishing, and internal administrative applications. Truth be told, it still is, but it's just soooo 2001.
If you go to the trouble to write your own language, make it not terrible?
Public Abstract Class Animal
    Abstract Function Speak() As String

    Overridable Function Eat() As String
        Return "Yum!"
    End Function
End Class

Public Class Dog Inherits Animal
    Override Function Speak()
        Return "Woof!"
    End Function

    Override Function Eat()
        Return Base.Eat() & " Woof!"
    End Function
End Class
ASP++ was the path of least resistance given where they started, which was classic ASP and VBScript.
I suspect that many of the armchair critics can't relate to being constrained by decisions made years ago at the start of a project -- decisions that one might now regret but nonetheless constrain the way the codebase can evolve.
That would, at a minimum, have created substantial turbulence on the Windows side. Continuing to ship the same Windows installer with the same system requirements was less resistance.
I can relate, for sure, but it's not like they'd written a 50 million line operating system in ASP.
At the point you've decided to build a terrible new internal-only language just to maintain a single bug-tracking app codebase ... perhaps you haven't been doing an accurate cost/benefit analysis?
Literally every single person on this thread agrees: if you're going to rewrite code by hand, you wouldn't deploy something like Wasabi. Wasabi exists entirely to facilitate not rewriting.
You've also got several current and former FC'ers on this thread, all(?) of whom dislike Wasabi as a language, saying that from a pure cost/benefit perspective, Wasabi paid off.
> Quick rant: if this had been a Lisp project, and they'd accomplished this stuff by writing macros, we'd be talking this up as a case study for why Lisp is awesome.
Agreed! But I would consider that an argument against using macros, not an argument for Wasabi.
> Quick rant: if this had been a Lisp project, and they'd accomplished this stuff by writing macros, we'd be talking this up as a case study for why Lisp is awesome. But instead because they started from unequivocally terrible languages and added the features with parsers and syntax trees and codegen, the whole project is heresy. Respectfully, I call bullshit.
Interestingly, since new hires were supposed to be pretty smart and had to pay the cost of learning Wasabi anyway, they could have just taken a page from Paul Graham and gone with Lisp. The learning curve should be the same (or easier, given the amount of material available) and they would have crazy good compilers and libraries from day one.
I think we are missing a piece of the puzzle somewhere.
The piece you're missing is that Wasabi was a much smaller jump from where they started, which was VBScript. Sure, criticize Joel for using VBScript back in the 90s when he started writing a bug tracker at Juno, but in 2003, when Fog Creek decided to port to Unix, they had to make a decision based on where they were, not based on where they should have been.
I'm not sure when I bought it, but it came out in 2000. Around that time, I'd seen some pretty nasty PHP, not much ASP (but enough Visual Basic to start looking for a pitchfork whenever I saw it mentioned) -- and shortly after, I worked as a sysadmin at a place with a significant investment in ColdFusion (arguably the first PHP/ASP-style language).
I've yet to write any classical ASP, or to do anything more than glance at newer .NET and whatnot -- but I still got a lot from that book.
It, along with some of the posts I found on the ColdFusion Fusebox framework/pattern[1], did a lot to help me keep the difference between "bad language" and "bad programmer" straight.
Just because almost all the PHP code I've seen is crap doesn't mean all PHP code is crap. And it didn't have to be in the early days, either.
Another fun book that's somewhat related (in my mind anyway) is:
"Program Generators with XML and Java"
by J. Craig Cleaveland (Prentice-Hall, 2001)
http://craigc.com/pg/
(Sometimes one finds great technical books on sale, too!)
> Quick rant: if this had been a Lisp project, and they'd accomplished this stuff by writing macros, we'd be talking this up as a case study for why Lisp is awesome. But instead because they started from unequivocally terrible languages and added the features with parsers and syntax trees and codegen, the whole project is heresy. Respectfully, I call bullshit.
I was so sad, thinking about how using a cross-platform Lisp (e.g., LispWorks) would have solved this problem for them by design. Fog Creek has some pretty smart folks; it should have been within their reach to use LW and go...
Oh, it totally makes sense after the first steps. It's a clear progression - obvious, even. But if they had started with a better technology stack up front, it wouldn't have been (such?) an issue.
FogBugz was originally an internal tool meant to facilitate writing Fog Creek's first product (CityDesk, a CMS). It seems pretty reasonable to go with whatever's most comfortable for an internal tool that you don't expect to sell, let alone to have different people running on different OSes.
That is to say, it seems like they did choose the better technology stack for the job at hand. If you're going that direction, the real problems started when Joel got a job at Microsoft.
> The ASP->PHP conversion was extremely low-hanging fruit (the conversion involved almost no logic).
This is where I'm curious that they didn't just start writing PHP from that point forward. It was already a cross-platform language. And, if I recall, it had a lot of hype and was gaining traction fast in the late '90s and early 2000s.
Certainly hindsight is 20-20, but I remember a lot of folks betting big on PHP at the time.
PHP on Windows was comically bad at that time (~2000) - performance and bugs - which is especially bad for shared hosting environments. It wasn't until Microsoft's FastCGI for Windows 2003 and PHP 5.2 landed that we decided to roll it out on our shared platform. I speak from experience as an engineer/dev for a shared webhoster who was also on out of hours pager duty.
> Building an in-house compiler rarely makes sense.
I disagree. First of all, Wasabi solved a real problem which doesn't exist anymore: customers had limited platform support available and Fog Creek needed to increase the surface area of their product to cover as many of the disparate platforms as possible. Today, if all else fails, people can just fire up an arbitrarily configured VM or container. There is much less pressure to, for example, make something that runs on both ASP.NET and PHP. We are now in the fortunate position to pick just one and go for it.
Second, experimenting with language design should not be reserved for theoreticians and gurus. It should be a viable option for normal CS people in normal companies. And for what it's worth, Wasabi might have become a noteworthy language outside Fog Creek. There was no way to know at the time. In hindsight, it didn't, but very few people have the luxury of designing a language which they know upfront will be huge. For example, Erlang started out being an internal tool at just one company, designed to solve a specific set of problems. Had they decided that doing their own platform was doomed to fail, the world would be poorer for it today.
>> Second, experimenting with language design should not be reserved for theoreticians and gurus. It should be a viable option for normal CS people in normal companies.
One might think so, if one is charitable toward the typical developer and their abilities. I worked at a place that had implemented their own query language and integrated it into their core product.
The query language was interpreted, and walked structured text files stored on disk. It worked great for small installations, but as the company grew and acquired larger clients, the clients' data corpora grew and performance fell off a cliff.
The syntax of the language was all over the map; users would cut and paste to combine little recipes to do things, and there was a never-ending stream of support calls asking why this didn't work like that, and why couldn't this part do what that other part does.
Edit: wanted to say that consistent, logical syntax is where someone with experience in language theory and implementation would have made an impact. The language, as it was, was the result of feature accretion. I think someone with knowledge would have attempted to craft a kernel of functionality, and back it with a more robust datastore (SQL).
Edit2: also want to clarify, the developers who did the implementation were actually quite talented; the project was simply not something they were equipped to deal with, and they really weren't given the opportunity to refine or iterate on it.
Edit3: I offer this as an example of one company that implemented their own language. Of course, the purpose of this example differs from Wasabi's, and it is not intended to condemn or denigrate Wasabi or Fog Creek.
Those might all be true, and it's understandable why nobody wanted to support all that long term, but that doesn't mean people should abstain from developing their own languages and platforms, just because it led to nasty code that one time at Fog Creek. And it's a "failure" that - from what I can tell - was a pretty good business decision at the time. But even if you discard that, even if you assert this was nothing but badness with no upside whatsoever, it would still be a learning experience.
When a rider falls off a horse, they have to make a decision: abandon the idea of horse riding ("riding your own horse rarely makes sense"), or apply that as a lesson to your skill set and re-mount. While there is nothing wrong in deciding that, upon reflection, horse riding was not for you, I think it's harmful to go out and announce to the community that riding your own horse rarely makes sense and should be left to the pros. Because what you're left with then is a world where the only riding is done by dressage performers.
(Sorry, that analogy got a bit out of hand, and admittedly I know nothing about horses, but I hope the point is still discernible)
Upon further reflection, I'd say that for the company in question, for that particular sector, having the query language available was a positive differentiator in the marketplace. I think it's clear Fog Creek's use of Wasabi was necessitated by the business and tech climate of the time. So clearly there are compelling reasons to go down this path.
I'd also say that languages are hard, runtimes are hard, and languages and runtimes together are really hard. A decision to go down this path should be carefully considered, and not made because a developer on staff read Parr's ANTLR book and wants to try it out.
I work for a company that has the same thing. They have, basically, their own version of SQL. Clients can write scripts and import them. Lots of issues supporting it. I wonder how small the world is.
It seems to make sense to write a language that maps at a high level to a specific business domain. Query is firmly within the technical domain, not a business domain. It's not surprising that a custom query language would have trouble scaling relative to a commercial query stack from a team of people who live and breathe queries for a living and make that living in a commodified market.
On the other hand, there really wasn't anything approaching an off-the-shelf solution to Fog Creek's business problem, and even with massively popular legacy languages there rarely is (it takes an IBM to first productize a COBOL-to-Java compiler). Fog Creek's strategy worked well enough that customers were writing checks that didn't bounce, and that's pretty good for a software product.
"Embedding a programming language into a system has an almost erotic fascination to a programmer. It is one of the most creative acts that can be performed. It makes the system tremendously powerful. It allows you to exercise her most creative and Promethean skills. It makes the system into your friend.
The best text editors in the world all have embedded languages. This can be used to the extent that the intended audience can master the language. Of course, use of the language can be made optional, as it is in text editors, so that initiates can use it and no one else has to.
I and many other programmers have fallen into the trap of creating special purpose embedded languages. I fell into it twice. There already exist many languages designed specifically to be embedded languages. You should think twice before creating a new one."
> And for what it's worth, Wasabi might have become a noteworthy language outside Fog Creek. There was no way to know at the time.
Nobody could know most things at the time. But we can estimate. What are the odds that small, random companies creating languages as side projects will have success? Well, looking at the languages everybody uses, I'd say: not so good.
> For example, Erlang started out being an internal tool at just one company, designed to solve a specific set of problems.
It was just one company, but it was one very large company working on something that they expected to be in operation for decades. So I think that's a bit different than a small company working on a product.
Hobbies are very different than company projects like this. With hobbies, we don't really care about ROI; we do it for the fun of it. In companies, ROI is very important.
And naming the successes doesn't tell you much about the odds; you also have to count the failures. One can't justify the purchase of a lottery ticket by looking only at the winners.
Sure, but that strikes me as the exception that proves the rule. JS succeeded not because of language merits, but because it had strong distribution. JS was basically held in contempt for a decade until the platforms and the language matured enough to make it useful for something more than button rollovers and form validation.
I've been holding it in contempt for two decades and doubt I will ever stop. Spending some time on a static analysis project for JS recently only reinforced that opinion.
It's always struck me as extremely bizarre that a company that regularly advertises that it's at the bleeding edge of software engineering practices (see Spolsky's numerous blog posts on the topic) made such a colossal error as writing their own language, and that it took them a decade to realize this mistake.
I also find this kind of phrasing weird:
> The people who wrote the original Wasabi compiler moved on for one reason or another. Some married partners who lived elsewhere; others went over to work on other products from Fog Creek.
It's like the author of this article goes out of their way to avoid saying that some people left the company, period. It also wouldn't surprise me if some of these defections were caused by Wasabi itself. As a software engineer, you quickly start wondering how wise it is to spend years learning a language that will be of no use once you leave your current company (yet another reason why rolling your own language as a critical part of your product is a terrible idea).
It has always struck me as extremely bizarre that computer science graduates would recoil from someone solving a business problem using what appears to be very basic compiler theory.
The second half of your comment transitions from weird to mean-spirited, as you begin speculating about people you don't know and their reasons for changing jobs. I'm a little confused as to why you've been voted up so high on the page.
I still think the core problem is that most people, even with a 4-year degree, haven't done a compilers course. Both other times we've had this discussion, I haven't noticed anyone popping up to say "Yeah, I've written like 3 compilers and Wasabi was just an insane idea." (Of course, the Internet being what it is, someone will probably say that now. But the point is I haven't seen it before I asked for it.) A lot of people are doing the cost/benefit analysis with an order of magnitude or two too much in the "cost" column. Yeah, of course it looks insane then... but the problem is the analysis, not the reality.
Compilers just aren't that magically hard and difficult. I'll cop to not having written a true compiler yet but I've written a number of interpreters, and I've written all the pieces several times (compile to AST, interpret, serialize back out, just never had the whole shebang needed at once).
If you're reading this, and you're still in a position where you can take a compilers course, take it! It's one of the most brutally pragmatic courses in the whole of computer science and it's a shame how it's withered. (Even if, like me, you'll probably write more interpreters than compilers. And nowadays you really ought to have a good reason not to pick an existing serialization off-the-shelf. But it's still useful stuff.) It's one of those things that is the difference between a wizard and a code monkey.
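To make those pieces concrete, here's a minimal toy sketch in Python (my own illustration, nothing to do with Wasabi), leaning on Python's own ast module so no hand-written parser is needed:

# Toy illustration of the pieces above: compile (parse) to an AST,
# interpret it, and serialize it back out.
import ast

def interpret(node, env):
    """Evaluate a tiny arithmetic subset of Python's AST."""
    if isinstance(node, ast.Expression):
        return interpret(node.body, env)
    if isinstance(node, ast.BinOp):
        left = interpret(node.left, env)
        right = interpret(node.right, env)
        if isinstance(node.op, ast.Add):
            return left + right
        if isinstance(node.op, ast.Mult):
            return left * right
    if isinstance(node, ast.Constant):
        return node.value
    if isinstance(node, ast.Name):
        return env[node.id]
    raise ValueError(f"unsupported node: {ast.dump(node)}")

tree = ast.parse("price * qty + 5", mode="eval")  # "compile to AST"
print(interpret(tree, {"price": 3, "qty": 7}))    # "interpret" -> 26
print(ast.unparse(tree))                          # "serialize back out"

Each piece is small on its own; the "whole shebang" of a compiler is mostly gluing pieces like these together and then hardening them.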
I've written like 3 compilers, and while I don't think Wasabi was quite insane (they had an interesting set of constraints, so I could at least follow the logic), it's not the choice I would've made. Or rather, it's totally the choice I would've made as a fresh college grad in 2005 having written my first compiler for work (which was ripped out in about 2 months... it didn't take me that long to realize my mistake), but it's not what I would've done with the hindsight experience of that and other compiler projects.
The cost of an in-house programming language isn't in writing the compiler. It's training all your new team members in the language. It's documenting the language constructs, including corner cases. It's in not being able to go to Stack Overflow when you have problems. It's in every bug potentially being in either your application code, your compiler, or your runtime libraries, and needing to trace problems across this boundary. It's in integrating with 3rd-party libraries, and in not being able to use tooling developed for an existing mainstream language, and having to add another backend to every other DSL that compiles to a mainstream language.
All that said, I agree that if you're ever in a position to take a compiler course, do it. It's one of the most valuable courses I ever took, and really peels back the mystery on why programming languages are the way they are. It's just that the difference between wisdom and intelligence is in knowing when not to use that brilliant technique you know.
"It's just that the difference between wisdom and intelligence is in knowing when not to use that brilliant technique you know."
Which is precisely why I've never written a full compiler, even though I've written all the pieces many times.
For instance, instead of writing a parser, could you perhaps get away with just a direct JSON serialization of some AST? Do you really need to emit something, or will an interpreter do? So far I've never been so backed against the wall that I've actually needed a full compiler.
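As a hypothetical sketch of that idea (the node shapes below are invented, not from any real project), the "language" can be nothing more than an interpreter walking a JSON document:

# "JSON as the AST": skip the parser entirely and interpret a data
# structure directly.
import json

def evaluate(node, env):
    """Walk a JSON-shaped expression tree and compute its value."""
    kind = node["kind"]
    if kind == "lit":
        return node["value"]
    if kind == "var":
        return env[node["name"]]
    if kind == "add":
        return evaluate(node["left"], env) + evaluate(node["right"], env)
    if kind == "mul":
        return evaluate(node["left"], env) * evaluate(node["right"], env)
    raise ValueError(f"unknown node kind: {kind}")

# The "program" arrives as plain JSON -- produced by a GUI, a config
# file, or another tool -- so no hand-written parser is ever needed.
program = json.loads("""
{"kind": "add",
 "left": {"kind": "lit", "value": 1},
 "right": {"kind": "mul",
           "left": {"kind": "var", "name": "x"},
           "right": {"kind": "lit", "value": 10}}}
""")
print(evaluate(program, {"x": 4}))  # 41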
Yeah, one of the compilers I wrote just used JSON as the AST, with it being generated by a GUI interface. Another used HTML with annotations (although go figure, I wrote an HTML parser [1] for it, because there weren't any C++ options at the time that didn't bring along a browser engine). A third had a custom front-end but then emitted Java source code as the back-end.
The interesting thing is that the more experience you get, the more alternatives you find to writing your own language. Could you use Ruby or Python as the front-end, much like Rails [2], Rake [3], or Bazel [4]? Could you build up a data-structure to express the computation, and then walk that data-structure with the Interpreter pattern? [5] Could you get away with a class library or framework, much like how Sawzall has been replaced by Flume [6] and Go libraries within Google?
In general, you want to use the tool with the least power that actually accomplishes your goals, because every increase in power is usually accompanied by an increase in complexity. There are a bunch of solutions with less power than a full programming language that can still get you most of the way there.
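For instance (a hypothetical sketch in the spirit of Rake/Bazel; all names invented), the "host language as front-end" option can be as small as a decorator that registers ordinary functions as rules:

# Plain Python as a DSL front-end: "programs" are ordinary function
# definitions registered by a decorator.
RULES = {}

def rule(name, deps=()):
    """Register a named build rule; the decorated function is its action."""
    def register(fn):
        RULES[name] = (tuple(deps), fn)
        return fn
    return register

def build(name, done=None):
    """Run a rule's dependencies first, then the rule itself."""
    done = set() if done is None else done
    if name in done:
        return
    deps, action = RULES[name]
    for dep in deps:
        build(dep, done)
    action()
    done.add(name)

# The "DSL program" is just Python: no parser, no grammar, no new language.
@rule("compile")
def compile_step():
    print("compiling...")

@rule("test", deps=["compile"])
def test_step():
    print("running tests...")

build("test")  # compiling... then running tests...

You get loops, conditionals, editor support, and a debugger for free, because the front-end is a language someone else already maintains.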
I'm doing this now, for a crappy language and a crappy processor. It's been a nightmarish hellscape of a project, but also very expanding. Highly recommend.
(If you're interested in goofing around with Starfighter, you're going to get an opportunity to get handheld through a lot of this stuff.)
Assuming I could transfer the benefit of hindsight back to Joel's position in 2005, including all the knowledge of how the market has evolved over the past 10 years? I would've jumped on the SaaS bandwagon, hard, and converted the existing VBScript codebase to a hosted solution, discontinuing support for the PHP/Linux version and freeing the company up to migrate code as it wished on its own servers.
I recognize that this would've been a huge leap for anyone in 2005, when 37signals was basically the only company doing small-business SaaS and the vast majority of companies insisted that with any software they buy, they actually buy it and the source code and data sit within the company firewall. Heck, when Heroku came out in 2007 I was like "Who the hell would use this, turning over all of their source code to some unnamed startup?"
But looking at how the industry's evolved, that's pretty much the only way they could've stayed relevant. Many companies don't even have physical servers anymore. That's the way FogBugz did evolve, eventually, but they were late getting there and had to back out all the existing Wasabi code and fixes they made for it to be easily deployable (which was one of their core differentiators, IIRC; they were much easier to set up than Bugzilla or other competitors).
It makes me appreciate how tough the job is for CEOs like Larry Page or Steve Jobs, who have managed to stay at the leading edge of the industry for years. Larry was pretty insane for buying a small mobile phone startup called Android in 2005, but it turned out to be worth billions eventually.
Tangent: Your description of how people resisted SaaS a decade ago makes me wonder if the only reason the industry did eventually move toward SaaS was that most on-premises apps were such a nightmare to deploy. After all, some of the disadvantages of SaaS, such as lack of control over one's own data, are real. If Sandstorm.io had existed back in 2004, might we have avoided SaaS altogether? (Of course, if Sandstorm.io had existed back then, Fog Creek would still have needed to port FogBugz to Linux.)
I think the move to SaaS was a combination of factors:
1. The primary product of many companies got too large to deploy on their own server farms, and so they started moving toward AWS etc. for scalable hosting. Once your product is in the cloud, it makes sense to deploy your supporting infrastructure & tooling there as well, because otherwise you're paying the support, hosting, & sysadmin costs for just your non-critical corporate infrastructure.
2. Bandwidth became a non-issue. In the 1990s there was a very measurable difference between 10BaseT internally vs. an ISDN line to your hosting provider. In the 2010s, there's little practical difference between gigabit Ethernet vs. 10M broadband.
3. HTTPS became ubiquitous, taking care of many security risks.
5. Employees started to blur the line between work and home, leading to demand for work services that could be used, encrypted, from a user's home network. VPNs were a huge PITA to set up. This was a big issue for much of the early 2000s; one of my employers made some clever network software to punch through corporate firewalls with a minimum of configuration.
6. Development speed increased. SaaS companies could push new versions of their product faster, react to customer feedback quicker, and generally deliver better service. Because all customer interactions go through the company's servers (where they can be logged), they have much better information about how people are using their products. Deployed services were left in the dust.
tl;dr: #1-4 made lots of businesses go "Why not?", while #5 and #6 made them go "Yessss."
It's interesting that many of the arguments about why you should not use SaaS businesses now (like privacy and security, and lack of ownership) were relatively minor reasons then. I do kinda wish (in an abstract way) that something like Sandstorm would catch on, but I think they may be early: SaaS just isn't that painful, and until we have a major shake-out where a lot of businesses get taken out because their dependencies go down, it seems unlikely that it will become so. Or the other way this could play out is that a new powerful computing platform comes out that lets you do things that aren't possible with thin clients, and you see a rush back to the client for functionality.
All very good reasons. I'll add another - accounting.
The monthly bills for small purchases of SaaS fit on what could be expensed on a corporate card. By the time IT gets wind, the product has already infiltrated the organization. If there's a very large up-front cost, then IT is involved, you need a formal RFP process, lots of people weigh in, and those opposed to the purchase can try to block it... As soon as "Put it on the corporate card" became viable, power moved back to the business units.
With Sandstorm, we could actually get that effect on-prem. Since no technical expertise is needed for deployment, and since the security model is so strong, and the IT department will be able to manage resource quotas on a user basis rather than an application basis, it's actually entirely reasonable that people outside of IT could be permitted to install software without IT approval.
Granted, it may take a while to convince IT people that this is OK, but fundamentally they have every reason to prefer this over people cheating with SaaS.
Actually, not that late. I think their main problem was that the environment changed around them. Besides SaaS, the whole developer ecosystem changed as well: when I look at who really won the bugtracking market, it's GitHub, who added it as a feature on code hosting.
If winning the bugtracking market was the goal, they probably would've taken VC money. You may notice that everyone who's in a position to make that claim has done so (GitHub, Atlassian, etc.).
They did learn from this, as you can see by the very different paths StackExchange and Trello are on.
Joel wrote an essay about this. [1] His basic thesis is that organic growth wins over VC when there are entrenched competitors, few network effects, and little customer lock-in. VC wins when there are wide-open markets, strong network effects, and strong customer lock-in. Stack Exchange's investment was consistent with this thesis [2].
The developer tools market changed from one with very few network effects to one with a large network effect around 2010. The drivers for these were GitHub, meetups, forums like Hacker News, and just its general growth - they made coding social. When I started programming professionally in 2000, each company basically decided on a bugtracker and version control system independently, and it didn't matter what every other company did. By 2015, most new companies just use git, they host on GitHub, and if they don't do this, they're at a strong disadvantage when recruiting & training up developers, because that's what much of the workforce uses.
Interestingly, both GitHub and Atlassian resisted taking investment for many years - GitHub was founded in 2007 and took its first investment in 2012, while Atlassian was founded in 2002 and took its first investment in 2010.
Right! And this isn't even a compiler, the way most people think of "compilers". It's a transpiler to three target languages each of which has an extraordinarily full-featured runtime. Two of which are (a) widely available and (b) as source languages, awful.
And it's not their own language; it's an extension of VBScript. And now that the tools around C# are better and Linux support for .NET is official, they have used those tools to transition to C#. Like you, I don't get the outrage.
Do you think that the name Wasabi contributes to the outrage?
Coffeescript has a similar name to Javascript, so you can quickly draw an association between the two.
The name Wasabi doesn't have an obvious connection to the VBScript that it's based on, which seems to be the cause of people talking about writing a whole new language, etc.
I've written some toy compilers, and I can at least say:
1. compilers have bugs
2. it really sucks not knowing if a bug is in your code or in your compiler
3. it sucks not having a source-level debugger
Anyone can write a simple compiler, just like anyone can make a simple database. The hard part (at least for a non-optimizing compiler) isn't the comp-sci theory, it's making the tooling around, and the extensive amount of testing needed to be sure you don't have subtle data corrupting bugs lying around to bite you.
I won't categorically reject the idea, for instance I think Facebook writing their HipHop compiler was completely defensible. But you need people with compiler experience, and people who know the pain of working with crappy, undocumented, buggy toolchains to make that decision, not people who once took a compiler course.
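One cheap guard against those subtle bugs - sketched here in Python against its own ast module, purely as an illustration of the technique, not anything Facebook or Fog Creek actually did - is round-trip testing: re-parse the backend's pretty-printed output and check the trees match.

# Round-trip test for a code-emitting backend: parse -> unparse ->
# parse again, and compare the two trees structurally.
import ast

def round_trips(source: str) -> bool:
    """True if pretty-printing the AST preserves its structure."""
    tree = ast.parse(source)
    reprinted = ast.unparse(tree)  # what a source-emitting backend does
    return ast.dump(ast.parse(reprinted)) == ast.dump(tree)

# A tiny corpus; a real project would push thousands of real (and
# fuzzed) source files through this, since the bugs hide in corners.
corpus = ["x = 1 + 2 * y", "def f(a, b=3):\n    return a or b"]
assert all(round_trips(src) for src in corpus)
print("round-trip OK on", len(corpus), "samples")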
I've written like 3 compilers* and Wasabi seems like it was probably a reasonable solution for the problem they had at the time. Compilers just aren't that magically hard and difficult.
There are very few situations where writing your own langue and toolchain is a good idea. I used to work on a proprietary company language that was actually a compiler generator for language-to-language translation, plus a bunch of other stuff, and it was a horrible pain.
Documentation? None
Online community? None
Transferability of skillset? None, apart from knowing how compilers work. Makes for good nerd conversation, but that's it.
Writing your own toolchain is almost as bad. I've seen multiple talented people leave companies I've worked at when they were forced to build and maintain horrible tools for the in-house ecosystem. Some too-big-for-his-britches second-system-as-a-first-system ass had written them, and everybody else got stuck with it.
As the other commenter noted, this seems like epitome of bad software engineering and I'm surprised employees put up with it if they were any good.
EDIT: I learned to program in assembly, so compilers didn't seem super mysterious to me as they are for someone who learns Java first perhaps.
Can't you say the same things about a proprietary database, or a proprietary template language? What are the kinds of computer science that we can safely deploy without taking extra precautions to document and maintain it?
Both of those should be looked upon with suspicion. I can't say "never do it", given that every employer I've ever worked at has had its own proprietary database, and one of the projects I worked on at Google was a proprietary template language. But all of them were a large maintenance burden, much larger than originally anticipated.
I think the old business adage about "In-source your core competencies, outsource everything else" applies here. If you derive a big competitive advantage from having a proprietary database or proprietary template, and it generates enough revenue to afford a dedicated team of experts to maintain it, build it. But if you have a bunch of smart & motivated developers who can build a proprietary database, but your product isn't databases or templates and your core differentiator isn't the performance or query patterns you get from building it yourself? Put them to work improving the product, and work with the infrastructure that other firms have built already.
I'd actually be way more suspicious of a proprietary database, unless there was a very compelling reason why none of the existing ones worked. Maybe this is just my inexperience in the field, but a database engine seems orders of magnitude harder to get right and maintain than a compiler (and a transpiler at that, so you can even human-inspect the output!).
Yes. Any proprietary system will require you to document/educate the users, and you will not have the benefit of an online community to get help from, or bug fixes, or security analyses. There are very few problems where rolling your own solution is the right solution. Maybe if you are Google and your database isn't big enough or something.
If you have great people building the software, or at least competent ones, and you have competent users, you might succeed, maybe. But that's assuming you have a maintenance plan and a roadmap, which most software companies do not. Maintain software? YOLO! What happens when you have a bunch of morons using and maintaining the software?
In short, computer science in industry is largely practiced as shamanism by people who cannot engineer their way out of a crackerjack box.
"There are very few situations where writing your own langue"
Well, I can see how you might struggle there ;-)
Good natured snarks about spelling aside, part of the issue is that writing, documenting and maintaining your own language is only hard if your toolchain sucks.
If you're interested in writing a specialized language to solve a particular problem, take a look at PEG for JS, and either Racket or Common Lisp (the latter if you need native compilation).
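To give a flavor of the PEG style (a hypothetical sketch; the grammar here is invented, and it's hand-rolled Python rather than a generated PEG.js parser), the core idea is recursive descent with ordered choice:

# A PEG-flavored recursive-descent sketch: each rule returns
# (value, next_index) on success or None on failure.
def parse_expr(s, i=0):
    """expr <- term ('+' term)*"""
    r = parse_term(s, i)
    if r is None:
        return None
    value, i = r
    while i < len(s) and s[i] == "+":
        r = parse_term(s, i + 1)
        if r is None:
            return None
        rhs, i = r
        value += rhs
    return value, i

def parse_term(s, i):
    """term <- number / '(' expr ')'  (PEG's ordered choice)"""
    if i < len(s) and s[i].isdigit():
        j = i
        while j < len(s) and s[j].isdigit():
            j += 1
        return int(s[i:j]), j
    if i < len(s) and s[i] == "(":
        r = parse_expr(s, i + 1)
        if r is not None and r[1] < len(s) and s[r[1]] == ")":
            return r[0], r[1] + 1
    return None

print(parse_expr("2+(3+4)"))  # (9, 7)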
I've recently been involved in the design and implementation of an English-like language for the expression of business domain concepts in web apps. It's a great approach if done thoughtfully and professionally.
That's probably the key, actually. The horror stories we hear are of the bad examples. And we all know that shitty tools, weak languages and bad documentation can come out of large software companies as commercial products as well.
I didn't take a course on compiler construction, and now I don't remember if my university's CS department had one (it was a fairly mediocre CS department at a state university). Now I wish I had.
Do you think a good compiler course would prepare the student to do a project with the scope and complexity of Wasabi? For one project, I wrote an interpreter for a little domain-specific language, then later reworked that interpreter into an on-the-fly compiler (to Lua, to avoid double interpretation). But that's a long way from writing a compiler for a general-purpose language, that can do global type inference and produce human-readable output in a target language that's fairly different from the original VBScript (if not Wasabi itself).
The trickiest bit of Wasabi is the type inference, which I admit is not "production-ready" (or "good code") because we basically invented it from scratch. If I were to do it now, I would know just enough to realize that I need to read about Hindley-Milner rather than reinvent the wheel.
Producing human-readable output is an exercise in tedium and bookkeeping, not any particular amount of skill or brilliance.
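For anyone curious what's at the bottom of Hindley-Milner, the core is a small unification routine. A toy Python sketch (illustrative only - emphatically not Wasabi's actual algorithm, and the occurs check is omitted for brevity):

# Toy unification, the engine under Hindley-Milner type inference.
class TVar:
    """A type variable: an as-yet-unknown type."""
    count = 0
    def __init__(self):
        TVar.count += 1
        self.name = f"t{TVar.count}"
    def __repr__(self):
        return self.name

class TCon:
    """A concrete type constructor, e.g. Int or Func(arg, result)."""
    def __init__(self, name, args=()):
        self.name, self.args = name, tuple(args)
    def __repr__(self):
        if not self.args:
            return self.name
        return f"{self.name}({', '.join(map(repr, self.args))})"

def resolve(t, subst):
    """Chase substitutions to a concrete type or a free variable."""
    while isinstance(t, TVar) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    """Force a and b to be equal, extending subst, or fail loudly."""
    a, b = resolve(a, subst), resolve(b, subst)
    if a is b:
        return
    if isinstance(a, TVar):
        subst[a] = b
    elif isinstance(b, TVar):
        subst[b] = a
    elif a.name == b.name and len(a.args) == len(b.args):
        for x, y in zip(a.args, b.args):
            unify(x, y, subst)
    else:
        raise TypeError(f"cannot unify {a} with {b}")

# Applying a function of type Int -> Int to x forces x : Int.
INT = TCon("Int")
x, result = TVar(), TVar()
subst = {}
unify(TCon("Func", [INT, INT]), TCon("Func", [x, result]), subst)
print(resolve(x, subst), resolve(result, subst))  # Int Int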
Thanks for confirming my guess that the type inference was the trickiest part. These days, I guess Flow (http://flowtype.org/) would also be worth studying. Edit: Or PyPy's RPython.
I imagine most people upvoted because it sounded smart.
It's fascinating how easily cruelty can be popularized by using the right, nice-sounding words. Coat/mask your bile in rhetoric popular with a community, indirectly imply some terrible things, perhaps obfuscate anything that could raise uncomfortable, thoughtful questions, and presto! You'll have the right set-up to manufacture consensus.
It's funny because what got me is how dumb it sounded: the words "such a colossal error as writing their own language" made me think "that's the kind of thing a mid-level IT manager at an insurance company would say".
It is usually a "colossal error" to write your own in-house closed-source/proprietary language, no matter how small or large the language is.
The main reason is exactly as the article states, maintainability.
> As time wore on, our technical debt finally began to come due. Compilers like Wasabi and their associated runtime libraries are highly complex pieces of software. We hadn’t open-sourced it, so this meant any investment had to be done by us at the expense of our main revenue-generating products. While we were busy working on exciting new things, Wasabi stagnated. It was a huge dependency that required a full-time developer — not cheap for a company of our size. It occasionally barfed on a piece of code that was completely reasonable to humans. It was slow to compile. Visual Studio wasn’t able to easily edit or attach a debugger to FogBugz. Just documenting it was a chore
What am I missing? This is an internal language designed as an incremental improvement over VB that gave them a cross-platform common codebase. It lasted 10 years: that's 19,932 in SaaS years. When they transitioned off of it, they did it not with a rewrite, but with mechanical translation.
Colossal error vs spectacular success story. Yay hyperbole.
I'd say the actual experience is somewhere in between. Sure it enabled them to support client requests to be cross platform and proved useful for a very long time, but what was the broader opportunity cost? Did supporting this proprietary infrastructure eat up resources and prevent them from exploring other ideas? Probably.
Big projects are routinely ported once or twice to new languages. Normally, moving a project from one language to another entails a full rewrite. Because the process of building their own language had the side effect of instrumenting all their code (the transpiler was custom-designed for their application), they were able to write a program to do the port for them. That's a win I don't see captured in the original blog post, and that's all I'm pointing out.
I don't think it ate up any more resources than, say, MySQL support. Fog Creek employees have produced tons of innovation since Wasabi was introduced, including brand-new companies like Trello (2011/2014) and Stack Overflow (2008), as well as in-house products like Kiln (2008), WebPutty (2011), and Make Better Software: The Training Series (2009-ish). None of these projects were particularly resource-constrained by having to do some compiler and runtime maintenance in the process of building FogBugz.
Thank you for the most insightful comment on this story. I find it bizarre that so many are unable to understand that making the decision to 'kill' Wasabi today does not necessarily mean that it was a mistake.
This whole story is a fabulous insight into software development for business over the long-term.
I mean, adding an esoteric tool into your development process -- one where an employee has to write a book about its quirks -- sounds like a failure for the guys in the trenches.
On practically every software project in the history of software projects that have lasted more than 3 release cycles, there is a person on the team who could write a _${Some Component}, The ??? Parts_ book. Nobody ever thinks to do that, because _${Our Report Generation Library}, The ??? Parts_ is super boring.
The reality though is that line-for-line, a transpiler is probably not much harder to write than a serious report generation tool. I agree with the commenter upthread, who thinks this is a result of people simply never having tried to write a compiler before.
It might be, but for what many would argue are the wrong reasons - maybe it was even unnecessary.
> This is an internal language designed as an incremental improvement over VB that gave them cross-platform common codebase.
The problem is that at the time, there were already several cross-platform technologies in existence, many of which were being developed in the open. Utilizing one of these technologies would have allowed FogCreek to focus on what they do best: making software. Instead, they took a proprietary single-platform language and attempted all on their own to make it cross-platform capable - which led to years of maintainability issues.
> It lasted 10 years
They gained an early advantage of not having to throw out the codebase and start over, yet they bought themselves 10 years of technical debt which continued to pose a burden on the small company. Many would argue that biting the bullet early on and switching to an open, community-driven cross-platform language/environment would have yielded much more return on the initial investment.
> When they transitioned off of it, they did it not with a rewrite, but with mechanical translation
Yes, that is an achievement, but again, for the wrong reasons.
I feel like you could take this comment, make very minimal tweaks, and deploy it in any language-war debate. "They succeeded with Golang, but for all the wrong reasons. They gained an early advantage but bought themselves 10 years of technical debt that a more modern cross-platform language would have spared them".
And I feel like when you get to the point where the best arguments you can make against something are isomorphic to the arguments you'd make against mainstream languages in language-war debates, that's a win condition.
Sure, FogCreek thought it was a good idea at the time, but over the years it became a significant burden, to the point they had dedicated staff working on just keeping Wasabi alive. Time was even spent writing an internally distributed book about the caveats of the language!
I know you will dismiss this as "routine", but it's not...
For a small company, this is an enormous waste of time, money, and energy.
A big company like Google or Microsoft can afford to throw developers by the dozen at internal proprietary languages and not even blink -- but according to the article, FogCreek did blink every time they had to dedicate time to fixing it. It took time, money, and energy away from their core business - making software.
That's a lose condition.
FogCreek should have bitten the bullet and rewritten their application in an open, standardized cross-platform system. They would have been able to spend zero time worrying about the language, and 100% of their time worrying about their application. They could have hired engineers off the street and had them producing in days-to-weeks instead of weeks-to-months. They would have saved themselves an enormous amount of time, money, and energy invested in a language that is now dead anyway.
It may have seemed like a good choice back when the decision was made, but in hindsight it appears to have been a very poor, short-sighted choice.
> For a small company, this is an enormous waste of time, money, and energy.
I think you have this backwards. A small company that writes a compiler and loses a few weeks of dev time per year survives for a decade, while spinning up various new products.
In another world, a small company rewrites its only source of revenue. 18 months later, they release the rewrite with zero new features and a chunk of new bugs and promptly die, because who's going to buy a product that spends a year and a half going backwards?
> FogCreek should have bitten the bullet and rewritten their application in an open, standardized cross-platform system.
Ah, so you happen to know better than Joel how many resources they had available at the time, how long the rewrite would have taken, and how much it would have affected their ability to ship new features?
Fog Creek was a much smaller company back when they wrote Wasabi. Postponing the rewrite until they had more resources to spare was probably a good decision.
I think his point can be summarized as "it's better to have to maintain your software than to have to maintain your software and the compiler for it", which is hard to argue against.
Then again, given that they had the codebase already, writing their own transpiler sounds like it was the best option at the time.
"It is usually a "colossal error" to write your own in-house closed-source/proprietary language"
I'm curious if anybody on this thread who has written more than three or four compilers/parsers would agree with you.
Depending on the task, the only solution to some problems is to write a custom/proprietary language (whether it's closed source, of course, is up to the company).
But "bug tracker" is not the problem that was being solved.
The problem was taking a big pile of legacy code and translating it to more than one platform vs rewriting the entire app from scratch in a new language that was cross platform.
It just happened that the legacy code was for a bug tracker, but it could have been for anything.
> It is usually a "colossal error" to write your own in-house closed-source/proprietary language, no matter how small or large the language is.
Really?
I don't think so, having done this once, to the great success of the company. They also wrote their own database. The compiler was maintained by a team long after I left the company.
Software is hard. There are more interesting, and hard, problems than pushing the value of a field from one subsystem to the other.
Whereas, complex systems built on popular OO frameworks never have issues with maintainability.
/me deactivates snark mode
I see the issue exactly the other way around.
If you can build a domain specific language that lets you express concepts in a clear way free of boilerplate or conceptual hackery (ORMs, for example) you will wind up with a much lighter maintenance load than the equivalent functionality built on an off the shelf framework.
Of course, there's nothing stopping you from using the two as appropriate. Simple CRUD app? Rails. Need to express a very complex domain in a readable, easily maintained form? Custom language time.
Most things that have lots of reverse-dependencies require a significant amount of maintenance. Compilers are not much different from "common" libraries, or special-purpose frameworks, in that respect. Also, writing a direct-to-assembly compiler is probably not a wise idea.
I'm not saying you are wrong. But the era of building custom tools just for your company, only to do specific jobs, is long gone. That's because of a lot of risk factors in a business.
It would work if management didn't treat programmers as replaceable cogs in a wheel. But the day you seek to make the craft of programming a commodity that could be practiced by anyone, you need an ecosystem whose knowledge is available to everyone. Only then do you get reasonable expertise at affordable prices to finish your projects.
The opposite is to make programmers so special that the knowledge of specific tools is available only to them. This definitely puts programmers in a much stronger position to negotiate pay and other things at will, because the very existence of the business depends on them.
This is like saying "the era of computer science is gone, we all just wire form fields to database columns now".
The fact that a team building the canonical wire- form- fields- to- database- columns application (if there weren't such a thing as "blogs", bug trackers would be the "hello world" of database-backed web apps) found a reason to deploy computer science is, to me, a beacon of hope that we're not all doing something that's going to be automated away in 10 years.
> the era of building custom tools just for your company, only to do specific jobs, is long gone
Isn't that the biography of 99% of the open source projects in the big data and distributed processing world? I understand they are open now, but didn't they start as custom tools just for a single company?
It seems like the "error" that Fog Creek made was to not open source Wasabi, though even that seems more like a hindsight-is-20/20 kind of thing, as open sourcing a project is no small feat, especially for a small software company.
Sorry, but nine-hundred and ninety-nine times out of a thousand, it's a complete and total waste of resources to write a proprietary language or runtime environment to solve a non-core business problem.
First: they didn't write a runtime. They used .NET/Mono as their runtime.
Second, you wrote your comment in a text box rendered by an application notorious for being written in a custom language (in fact, it seems like arc:lisp::wasabi:.net).
Third, do you have evidence to support "nine hundred ninety-nine times out of a thousand", or is that hyperbole? Have you worked on projects where people used custom languages? Your argument would be much more interesting if we could read more about your experience with it.
You could view their core business problem as easily creating products that as many people as possible can pay for (since that's how they make their money).
Because it's bizarre and stands out. The clarifying sentence about marrying people who live elsewhere, etc., screams of "the lady doth protest too much".
There's nothing wrong with saying "Over time, through the natural turnover that happens at all companies, none of the original Wasabi designers still work at Fog Creek".
Sure, some snarky people will make the comment "Oh yeah, I bet they left BECAUSE of Wasabi", but most will ignore them.
By completely negating the possibility that any of those people left for any reasons not involving family, it actually seems to INCREASE the probability that Wasabi was more unpopular within FogCreek than Joel would prefer to admit.
Do you know anyone at Fog Creek, or even anyone who has ever worked there? Did they tell you something that would lead you to believe that a blogger for Fog Creek would, completely unprompted and with no real need, make up stories about why people left?
Or could I just as easily argue, with the same total lack of grounding, that you're a secret shill for Atlassian trying to poison the well? (You aren't, of course, but you take my meaning.)
Good news, I actually do know everyone who was part of the original build of Wasabi! None of them left because of the language. I think this accounts for everyone who was there at the time:
1. Original author left because his wife was going to medical school out-of-country and Fog Creek didn't allow remote work at the time.
2. Second author left because his wife was going to medical school out-of-state and Fog Creek didn't allow remote work at the time (see a pattern?). Later came back because Fog Creek offered remote work. Went on to author the blog post we're talking about.
3. Developer left to go work on Stack Exchange (me!)
4. Developer left to go make the world a better place at Khan Academy
5. 2x developer left to go work on Trello
I think that was all of us. People move on in the course of 5+ years. Turns out most of those reasons have nothing to do with the programming language.
FWIW, I think Wasabi was a bad decision and I'm not going to defend it. But I really don't like these massive assumptions about people's motivations for leaving.
Can I guess at why you think it was a bad decision?
(a) Too incremental to be worth it, given where the .NET ecosystem was heading
(b) FC couldn't commit the resources required to adequately support a whole language, and it's better to commit to a lower common denominator than limp with a poorly supported language
(c) If you're going to create an additional obstacle to on-ramping employees, it had better be something every project in the company takes advantage of --- like, even if you had built FogBugz in OCaml, that would be a problem since the company is not designed to take advantage of OCaml.
(d) Unless you're getting a truly transformative advantage from a custom language, it's not worth it to be out of a "Google your way out of most problems" mainstream sweet spot
(e) No matter how good the language is, using a different language makes you incompatible with the standard toolchain, so edit/test/debug cycles are needlessly painful
I obviously have no idea if Wasabi was a good decision or not, but a workplace where people are allowed to deploy basic computer science to solve problems is (sadly) an attractive stand-out to me.
So, I'm not David, so I'm not going to pretend to know what his thoughts are, but I'll say that I've always had really mixed feelings about Wasabi.
Let me start by saying that Wasabi as a strategic move was brilliant. If David disagrees there, I'm a bit surprised: FogBugz represented an awful lot of battle-tested low-bug code, and finding a way to preserve it, instead of rewriting it, made one hell of a lot of sense. I'm with you that the general thoughts in this forum that we'd have to be insane to write a compiler are misguided. Wasabi let us cleanly move from VBScript and ASP 3 to .NET without doing a full rewrite, and I'd be proud to work at a place that would make the same decision in the same context with full hindsight today.
That said, I think Wasabi made two technical decisions that I disagreed with at the time and still disagree with in retrospect. First, Wasabi was designed to be cross-platform, but targeted .NET prior to Microsoft open-sourcing everything and Mono actually being a sane server target. At the time, I thought Wasabi should've targeted the JVM, and I still think in retrospect that would've been a much better business decision. I really prefer .NET over Java in general, but I know that it caused us an unbelievable amount of pain back in the day on Unix systems, and I think we could've avoided most of that by targeting the JVM instead. Instead, a significant portion of "Wasabi" work was actually spent maintaining our own fork of Mono that was customized to run FogBugz.
Second, Wasabi worked by compiling to C# as an intermediary language. There was actually an attempt to go straight to IL early on, but it was rejected by most of the team as being a more dangerous option, in the sense that maybe three people on staff spoke IL, whereas pretty much everyone could read C#. I also think this was a mistake: the C# code was not human-readable, made debugging more complicated (VS.NET had something similar to source maps at the time, so it wasn't impossible, but it was very indirect and quirky for reasons I can get into if people are curious), and that decision meant that Wasabi had all of the limitations both of its own compiler, and of Microsoft's C# compiler. IMHO, these limitations are a big part of why the ultimate move away from Wasabi was even necessary in the first place, since they increased both the maintenance and developer burden.
So from my own perspective, I think that Wasabi was a mistake in that, if we were going to go to C#, we should've just got the translation good enough to really go to C# and then ditch Wasabi; and if we weren't, we should've actually owned what we were doing and written a genuine direct-to-IL compiler so we'd have more control over the experience, instead of going through C#. But I still really do genuinely believe that our going to Wasabi was a brilliant strategic decision, and I think Fog Creek would have suffered immeasurably had we not done it.
I'm particularly interested in your thoughts on Wasabi compiling to C# rather than CIL. What characteristics of Wasabi led to the C# output being suboptimal for human reading and editing? If a compiler is going to output human-readable code, are there any general design pitfalls to avoid?
To add to Ted's comment, the main mistake we made in generating readable C# from the start was using `System.CodeDom` as our code generator, which explicitly does NOT care how readable your output is.
A better idea would have been to hand-code the generator, though of course that would have been a lot of string manipulation as well as a little extra effort.
Roslyn solves both of those issues for us, but it didn't exist until very recently.
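For anyone who hasn't used it: with `System.CodeDom` you describe the program as an object graph, and the provider decides all the layout when it prints. A minimal sketch (the `Generated.Greeter` class here is an invented example, not Wasabi code):

```csharp
using System;
using System.CodeDom;
using System.CodeDom.Compiler;
using Microsoft.CSharp;

class CodeDomSketch
{
    static void Main()
    {
        // Describe a namespace, a class, and one method as an object graph.
        var unit = new CodeCompileUnit();
        var ns = new CodeNamespace("Generated");
        unit.Namespaces.Add(ns);

        var cls = new CodeTypeDeclaration("Greeter");
        ns.Types.Add(cls);

        var greet = new CodeMemberMethod
        {
            Name = "Greet",
            Attributes = MemberAttributes.Public | MemberAttributes.Final
        };
        greet.Statements.Add(new CodeMethodInvokeExpression(
            new CodeTypeReferenceExpression("System.Console"),
            "WriteLine",
            new CodePrimitiveExpression("Hello")));
        cls.Members.Add(greet);

        // The provider, not the caller, decides bracing, qualification,
        // and the auto-generated header comments; readability of the
        // output is explicitly not its concern.
        new CSharpCodeProvider().GenerateCodeFromCompileUnit(
            unit, Console.Out, new CodeGeneratorOptions());
    }
}
```

You get correct C# out, but every stylistic decision belongs to the generator, which is exactly the wrong trade when humans are supposed to read the result.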
Beyond what tedu and krallja pointed out, the debugging required inserting tons of #line markers in the C# output. But a single line of Wasabi could map to multiple lines of C#, making the definition of stepping ridiculous. Throw in that Wasabi necessarily grandfathered ASP globals that C# lacked and you also had fun variable display.
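For readers who haven't met them: `#line` directives are how generated C# points the debugger back at the original source file. A sketch of the shape of the problem (file name, line number, and identifiers are all invented here):

```csharp
// One hypothetical Wasabi statement expands into several C# lines, each
// pinned to the same source line, so "step" keeps landing on line 42.
class BugList
{
    static void LogAll(int[] rgBugs)
    {
#line 42 "BugList.was"
        foreach (int ix in rgBugs)
#line 42 "BugList.was"
        {
            System.Console.WriteLine(ix);
        }
#line default
    }
}
```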
The semantics of Wasabi (VB) and C# are slightly different. A fair amount of the code was actually the result of various code generators. It dumped everything in one giant file (delimited by #file markers, though). Nothing intractable, but nothing high priority.
Having done something similar, but entirely different, several times, I'm surprised you didn't choose to slowly refactor the code to be more and more native C# over time. You start with 100% Wasabi / 0% C# and slowly work up the native C# parts, in code units, until you reach a level sufficiently high that you feel confident to do a final push to switch entirely to C#.
(In my experience, you need to build up an inter-op layer first to make working in C# somewhat sane, but it's usually not hard to identify the necessary helper modules needed. Having the .NET runtime actually is a boon here since the IL is designed for inter-language inter-op.)
Why did you find yourselves maintaining a fork of Mono (versus fixing upstream)? Was it that forking, while problematic, had lower friction than going through the necessary rituals for getting your changes accepted upstream?
You can't exactly tell customers to go check out the latest subversion head and compile it themselves. Changes were pushed upstream, but that doesn't push them to customers. Neither could we ship nightly builds, because who knows what changes get introduced? So we had a fixed release with a pile of patches on top of it.
I think there's a useful lesson here about precision in writing. _Why_ the Fog Creek developers left isn't central to the essay's main point, and it's not interesting in itself. But the author did include it, so clearly he thought it was relevant somehow. Why? One salient hypothesis, in this context, is definitely "they left because of Wasabi".
Well, I don't think they actually did leave because of Wasabi; the chain of reasoning I described above isn't very sound. But it's easy and obvious, and the author could have avoided inviting it by saying a little less.
Sure. As I said, I don't think skepticism is _correct_ here. But a critical reader will always be asking themselves, "why did they write it that particular way?", so as an author you have to continually ask yourself the same question.
I do not know anyone who has (or does) work at FogCreek.
But I can imagine a scenario where an employee leaves due to tech stack woes. The employee may not want to burn bridges during their exit interview by saying, "I'm leaving because this tech stack sucks: it's affecting my long-term career progression (no other companies use Wasabi); management is too slow to adapt despite our repeated kvetching; the 1 full-time guy who knows the language is overworked and doesn't have time for proper training." Instead, the employee just says, "I'm leaving for personal reasons" and everything is wrapped up nicely.
Edit: Glad to hear from another commenter that this wasn't the case at FogCreek. I have known people to leave jobs due to tech stack woes; they didn't tell management the real reasons why they left.
"The people who wrote the original Wasabi compiler moved on for one reason or another. Some married partners who lived elsewhere; others went over to work on other products from Fog Creek."
Unless I'm mistaken, this was all that was written about the reasons people left, which to me does not seem very "bizarre". Sure, he could have included "... and some left for other companies" but there's a difference between omitting the obvious and saying something like:
"Let it be known that all the ones who were truly good programmers, and by extension, real human beings! Only the best stayed at Wasabi. No one who was a good employee left for anything other than a personal reason. No one!"
It's not about the compiler, it's about the practical aspects of the language. The compiler isn't the hard part, the compiler is the easy part. You can write a compiler in a weekend if you want. Using it is a much different matter.
As GP notes, why would a dev bother inflicting upon themselves the brain damage of learning this language which will never be useful anywhere else? It's like requiring the devs at your company to use keyboards with a unique layout or something.
It's not necessarily true that it was an error. Their customers wanted Linux support, and they avoided the greater evil of having to rewrite their code or maintain two codebases (remember - this is ~2000 and there's no .Net or Mono, and all their code is written in old [edit:] VBScript).
This was started in 2005, not 2000; it says so in the post. .Net had been out for years. They're bleeding edge guys, they would have been looking at .Net 2. They wrote the compiler in C#, with Joel's defence in 2006 being:
The choice was between rewriting their code in PHP or Java, dealing with Mono, or sticking with ASP (which was already old-fashioned). Or writing your own language.
From Joel's defence post:
We could use .NET, but then I'd have to pay engineers to install Mono for all our Unix customers, and the .NET runtime isn't quite ubiquitous on Windows servers.
The greater evil was definitely writing their own language.
Obviously, hindsight is wonderful, but they had a lot of people immediately point out it was a bad decision and, as they say in their blog post, they ended up having to employ a full-time language developer. Installing Mono doesn't look so expensive now!
I was running FogBugz on Linux at a particular start-up in either 2001 or 2002, so what you say isn't true. The original project must have been started much earlier than 2005. Edit: See reply below, must be getting old :-(
So Thistle, the transpiler, was written either summer 2003 or summer 2004 (it's not entirely clear if he employed the intern that summer or the next) and Wasabi came later (so 2005 is probably correct).
> I fixed a crazy number of silly bugs in mono’s class libraries. To be fair, implementing all of .NET is a herculean task, and most of the fixes were easy enough to make. The result of these fixes meant that we had to ship a patched, custom version of mono
This was 2007. Using Mono in 2005 does not sound like it would have gone particularly well.
So in a way, the original mistake was made even before FogBasic/Wasabi entered the picture: by buying into the wonderful MS ecosystem, FogCreek condemned themselves to 10 years of hacks for cross-platform support. And they've learnt their lesson so well that they're now consolidating on C#, another de-facto MS-only technology which only benefits from the fact that someone else (Mono) is doing cross-platform hacks for everyone.
Joel is great, but this choice baffled me in the past and baffles me today. For the sort of software FogBugz is, they would have had a much simpler life with Java, Python, Ruby, even Perl. Despite all of Joel's insight into "making html sing", he behaved like an accountant building humongous Excel macros "because that's what we know".
Remember the age of Fogbugz. It was initially released in 2000.
MS Windows was by far the dominant operating system. Virtualization was still in its early stages, and mostly at the desktop level. Linux was still growing in the server market but not dominant as it is today.
And what exactly is wrong with the MS ecosystem if you're targeting enterprise? There are still a lot of businesses that work exclusively with Windows servers, with IT managers that don't want the headache of having Linux servers.
Enterprise software tends to be a notch or two below consumer software in the "it just works" department, and my experience with deploying Java-based enterprise software was pretty negative. In 2000, not a lot of people were using Ruby, Python or Perl for enterprise web apps. It was mostly ASP and JSP back then.
> Remember the age of Fogbugz. It was initially released in 2000.
God, don't I half remember it. I was a junior ASP dev at the time, for my sins. Java was hot like the sun and PHP was the default choice for the young and penniless. Perl was mainstream. Python and Ruby were new and rough (they were crap for webdev on shared hosts, with zero support by ISPs, but alpha geeks were already flocking to their ecosystems, Python in particular).
I'm sure part of the reasoning was that FogBugz did not start as a product -- the product back then was CityDesk, which was even more tied into the MS world -- but still, the "server scene" back then was already unix-y, which is why they were pretty soon forced to consider Linux support. I still think it was a shortsighted approach but hey, FogCreek is still alive 15 years later, so I guess it wasn't all that bad.
One of my first jobs ages ago was to convert a large Perl codebase to ASP 2.0 because my new boss, a 22 year old CTO, was replacing a guy more than twice his age and Perl was "for old folks" as he put it. This one person turned the whole company into a Microsoft dev shop with one decision simply because he didn't feel comfortable around Perl code.
COO at a previous company did this, went from Java to .Net shop. This was two years ago. Laid off most of the Java developers, brought in consultants. I had left, but there were some really intricate business processes in that code base, running on a 40-node jboss cluster. They embarked on a rewrite, which of course is taking longer than promised. All the Java developers who could, got jobs and left; now there are only two guys left who know how to deploy to the cluster. And they use scripts without understanding anything they're doing. The competent people left long ago.
"In particular, we didn't want to have to tell them to get a Java Virtual Machine up and running, because that's not easy, and they're not really a hundred percent compatible."
He seems like an OK guy, but comments like that make me think he made a decision first and then made up reasons later. How hard can it be to make a simple installer that checks and installs Java, many applications do that.
>> How hard can it be to make a simple installer that checks and installs Java, many applications do that.
You'd be surprised how many applications don't do that too. There's a reason why a lot of people say "enterprise software sucks" -- it's usually because the software makers value new features over improving how things work.
It is more difficult than it looks, but then again supporting multiple Linux distributions with a simple Apache installer is more difficult than it looks as well.
I suspect that this is a function of overestimating the effort on the Java side and underestimating both the demand and the work on the non-Windows side.
They used to be an extremely Windows-centric company.
To a point. Headless stuff tends to work way better than the GUI stuff.
It's kind of the same tune with mobile software -- "Native feels and runs better than everything else". In the case of enterprise software GUIs, it's particularly true.
The difference this time is that C# is now open source. Sure, C# is still essentially Microsoft-only, but there are enough users that if Microsoft abandoned it, it's fairly certain that others would be there to pick up the pieces.
IIRC, Joel once wrote that he started coding FogBugz to learn some VBScript. Once you start a project in a language, there is a mighty amount of inertia to overcome to move to a different language...
I came away with a different impression to you: in the past there were good reasons to develop their own language; they recognised that those reasons no longer exist; they used modern tools to dig their way out from underneath the accumulated debt.
Nowadays we have open-source runtimes, .net running on multiple platforms, and componentised tools like Roslyn. It is easy to forget that the .net tooling from 10+ years ago was much more limited.
Writing your own language is an unusual approach, so descriptions of dealing with that kind of technical debt are rare. I thought this article was valuable and interesting.
> As a software engineer, you quickly start wondering how wise it is to spend years learning a language that will be of no use once you leave your current company
I wouldn't worry about this at all. The choice of language on project n has never once negatively affected my work on project n + 1. As a programmer, my job has a lot more to do with solving problems (in the generic sense) than it does the tools I'm using to solve them.
Business software that provides business value is good software. FogCreek has been shipping Fogbugz to paying customers for fifteen years. If that's not success, it will do until success shows up. If that's a colossal error, where do I get one?
> It's like the author of this article goes out of their way to avoid saying that some people left the company, period. It also wouldn't surprise me if some of these defections were caused by Wasabi itself. As a software engineer, you quickly start wondering how wise it is to spend years learning a language that will be of no use once you leave your current company (yet another reason why rolling your own language as a critical part of your product is a terrible idea).
I also found that passage oddly worded. We get it, people don't stick around forever, you don't have to try and hide it like it's some dirty little secret. Also, as a developer, I doubt I would have wanted anything to do with a closed-source internal-only poorly-documented language. You may learn some concepts that transfer but by and large you will have to start from scratch when you leave and you won't have skills people are looking for. Also if you do dive headfirst into Wasabi and love it and then leave you probably will be that annoying fuck at your new company that says shit like "Well in Wasabi this was easy...." or "Wasabi makes this problem non-existent because...." Shut up, no one cares. It's crazy to me to think that a company as small as Fog Creek would attempt something like this but to be fair I was born and learned to develop in a different environment than they did so maybe the tools and languages available back then really just couldn't cut it.
Spending time working on Wasabi is only damaging to your career because of the current fad in hiring software developers that says "5+ years of Go development required."
I work with at least 10 languages a week. There is zero chance of us hiring people with years of experience in all those. We want people who have used multiple languages, and someone who worked on compiling one language into another would completely satisfy that itch.
> It's always struck me as extremely bizarre that a company that regularly advertises that it's at the bleeding edge of software engineering practices (...) made such a colossal error as writing their own language
There are no other languages that could replace Rust for Mozilla due to their combination of performance and safety requirements. It is also the product of Mozilla's research arm, which is not directly focused on immediate productization and employs several people with PLT backgrounds full-time. Finally, Rust has always been an open-source project and has developed a robust non-Mozilla community that contributes heavily to the compiler. None of these is true of Wasabi.
In any case, Spolsky still firmly believes that Netscape would have been better off today if they had continued work on the (completely unsalvageable) Netscape 4.x. Having extensively reviewed the history, Joel was dead wrong about this. There's no reason to assume his advice is more trustworthy than that of any other "thought leader," or that it is consistently followed within his own company.
From the guy who founded an entire development ecosystem out of the web framework he built on what was, at the time, an obscure Japanese programming language, that he originally built for a project management application, it seems kind of strange that he'd be surprised that someone might build a complex tool in similar circumstances.
In a parallel world where the cool developers were all attending Wasabiconf, Joel Spolsky Prime wrote a sneering blogpost assuming that DHH Prime was joking about the crazy 'scaffolding' and 'proprietary database mapping' framework he was building BaseCamp on
I would like to remind you that we are talking about Ruby vs essentially VBScript. I mean, I pretty much hated the Ruby hype and especially the attempts to make it seem better than it really was, but we are indeed talking here about web software written in VBScript that ran only on Windows. For many people working with web tech today, that is inconceivable. Not that it didn't have its fair share of detractors back then...
Hats off to FogCreek for building a successful product, but that doesn't mean that people have to admire or even accept their technical choices without any criticism. I for one consider DHH's criticism to be on the mark.
Well, 2006 was a -long- time ago, and Ruby and Ruby on Rails have come a long way.
I don't think Joel's conservative approach was all that bad.
He made a judgment call for --his-- company's products at the time. He could afford to, and it appeared to have worked well for his product line. And believe it or not, people wanted to work at Fog Creek.
In 2006, I wouldn't have imagined the explosion of language acceptability that we have today. We see everything from Ruby, Erlang, Scala and other languages widely used -- a great thing that you can use the best language for software development instead of being limited to the "safe enterprise" choices of Java or Microsoft .NET.
But, as the linked post points out, all those arguments fall by the wayside when the alternative is an in-house language used by no other production project on Earth. Developing your own language is not really "conservative" at all.
But the in-house language was basically a variant of Visual Basic. They basically created their own DSL because they didn't feel like any other tool satisfied their needs. Joel's not an idiot -- I'm sure he weighed the cost of doing his own language vs. the alternatives and found that the direction he took was the better way for the time.
It would have been more of a mistake if Fogbugz turned out to be a load of crap, but it's pretty good. I prefer it to JIRA.
>Well, 2006 was a -long- time ago, and Ruby and Ruby on Rails have come a long way.
Right. I recently listened to a lot of the archive of the StackOverflow podcast. Joel & Jeff periodically discuss Rails, and Joel eventually recognizes Rails as a viable option once it improves. This blog post must have been before that.
Even at the time, Atwood (who obviously can hardly be described as a toady for Rails -- at least not until his more recent projects) suggested his rationale for rejecting Rails was pretty questionable. And his rejection of .NET was also pretty zany.
One of the best parts is that he pretty obviously has Spolsky dead to rights here, but typically DHH is thought of as an asshole and Spolsky as the software nice guy. But it's possible to spread FUD with nice-sounding words.
The killer lines from the blog post are that they are killing off Wasabi for exactly the reasons everyone said would make it a bad idea:
1) Maintenance Nightmare
2) No one likes programming in a proprietary language, as it dead-ends your career
3) Company leavers take vast amounts of knowledge away with them, and it's impossible to hire in replacements for that knowledge
Those predictions were still off by ten years, however. Wasabi served them well until other technologies improved enough to allow them to ditch Wasabi.
We have no idea how much friction they've experienced over the years due to the decision to roll their own language.
Just because they've been successful doesn't mean they haven't expended unnecessary effort to get there.
Climbing a mountain carrying an extra unnecessary 15 kilos is still successfully climbing a mountain - but carrying an extra 15 kilos whilst doing so is still a bad idea.
Wasabi was technical debt. Like all forms of debt, there is productive and unproductive technical debt. Productive technical debt actually gives some return on that debt. Unproductive technical debt doesn't. Given the fact that Wasabi lasted 10 years (and the company seems to have made a bunch of money because of it) and because Wasabi gave them the ability to adjust to market factors, I'd say the payoff from this technical debt was highly justified.
All technical debt decisions should be made based on what the business hopes to get from the debt. Considerations of the alternatives, when the debt will be retired, etc. should all play a part. To globally say "this is a terrible idea", like so many in this thread are doing, totally ignores these factors in favor of "it's bad computer science", and thus misses the point of technical debt in the first place.
It's not even remotely a bad CS idea, however. It's taking long established praxis with a sound theoretical background and applying it to a problem that that practice was intended to solve.
It's strange that the HN crowd, consisting mostly of developers who work for companies that exist on average for 16 months, has the audacity of calling Wasabi, a tool that has been in use for close to 15 years, a failure.
Probably because it isn't some hipster badass ninja rockstar web-scale pipe-to-dev-null Wangular.js Ruby-on-Fails Web-2.0 SaaS app, and therefore automatically a failure.
What would you have done, given the starting point of a VBScript codebase, to bring it to Unix and later to .NET? Remember, it does no good to talk about what should have been, years before. You have to make the best decision with what you already have. In that light, writing a custom compiler for a variant of VBScript doesn't seem so bad.
If your point is that it was strategically necessary given the constraints at the time, that's fine with me. There are lots of terrible things that became strategically necessary due to the past decisions of business people.
I still wouldn't want to have to learn this language, would you?
I would. I might not like it, but I can't say for sure now, not having used it. And you never know, there could be some good ideas lurking deep inside.
I recently played around with Atari 2600 BASIC (which quite possibly behaves worse emulated than on a real system) but found that it had an incredible tracing mechanism I've never seen elsewhere. Also, you could control the speed of execution as the program runs (imagine: running a program at full speed, then slowing down to watch exactly what happens). You won't know until you try.
The author of this post started at Fog Creek around a year after Wasabi was invented, if I'm getting my timeline right. He was also the one to finally kill Wasabi. Hardly sounds like jumping ship.
We hadn’t open-sourced it, so this meant any investment had to be done by us at the expense of our main revenue-generating products. While we were busy working on exciting new things, Wasabi stagnated. It was a huge dependency that required a full-time developer — not cheap for a company of our size.
The way I read this article, creating Wasabi a decade ago was not a mistake, given what they were doing and what was available at the time. Not open-sourcing Wasabi was a mistake, though.
I agree. Especially when we first had Wasabi, there were a lot of other legacy ASP code bases lying around, and I think we could've both helped ourselves and gotten more community support by going that route. But it's also worth remembering this was before GitHub; this would've meant a SourceForge account, mailing lists, etc., which is a considerable extra use of resources. It wasn't as trivial a decision then.
When I learn a new, perhaps hyped-up computer language, I soon run into difficulties, no matter what the merits of the language. The difficulties are a lack of tooling: e.g., no debugger (or only a rudimentary one), no static analysis, no refactoring, no advanced code browser.
If the language is successful, these things come with time.
When you develop an in-house language, you'll never get the advanced tooling that makes for a great software development experience. This, for me, was why I was surprised by Joel Spolsky's announcement of an in-house language.
(Although, to be fair, these things didn't really exist for VBScript nor for PHP at the time Wasabi came to be.)
I think your bracketed comment says it all; there was no obvious alternative at that time which would satisfy the market conditions. The VBScript version was super easy to install for most companies with the servers they already had available. They had a selling feature which was entirely dependent on this terrible technology. So they mitigated it. It's all pretty reasonable.
I do wonder what the difference would have been had it been open sourced early on? I would think a tool that had a VBScript-like syntax that could deploy to PHP would have been a popular item with enterprise developers for the same reason it appealed to Fog Creek.
I seem to recall some comments at the time that it wasn't worth opening or selling because it was tailored specifically to their code. For example, it relied on their naming conventions to determine whether a string was safe to display or not, so non-Fog Creek code wouldn't really work with it.
It's not mentioned in the article, but I suspect another factor was the growth of their SaaS (aka "Fogbugz on Demand") offering - which obviously severely undercuts the value of Wasabi.
It's listed from a Google search, but from just clicking around the Fogbugz site I can't even find the page/pricing for on premise installation.
A lot of people criticize Fog Creek for writing their own compiler, but I think it's a good example of working smarter rather than harder to solve a problem they had at the time. I think that companies which apply sheer brute force to rewrite the same app in Objective-C and Java to target the popular mobile platforms could learn from this.
I wonder to what extent the generated C# code depends on C#'s dynamic typing option. I ask because the original VBScript was certainly dynamically typed. So by the end, to what extent was Wasabi statically typed, through explicit type declarations or type inference? And how much did the compiler have to rely on naming conventions such as Hungarian notation?
Wasabi is statically typed — `dynamic` wasn't added to C# until C# 4, in 2010, so the implementation choice was between static types and a LOT of reflection.
The type inferencer was an attempt to make it as easy to declare variables and functions as it was in VBScript. Of course, you could explicitly declare types as well.
The one place Wasabi depends on Hungarian notation as a type heuristic is when reading from an ADO RecordSet (which happens quite frequently in FogBugz). The column name determines which type the returned object will be cast into.
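To make that concrete, here is a toy version of that kind of prefix heuristic. The real Wasabi prefix table isn't public, so the mappings below are guesses based on common Apps Hungarian conventions:

```csharp
using System;

// Toy prefix-to-type heuristic in the spirit of the one described above.
// These prefixes ("ix" = index/int, "s" = string, "f" = flag/bool,
// "dt" = date/time) are assumptions, not Wasabi's actual table.
static class RecordSetTyping
{
    // Longer prefixes listed first so "ix" wins over a hypothetical "i".
    static readonly (string Prefix, Type Type)[] PrefixTypes =
    {
        ("ix", typeof(int)),
        ("dt", typeof(DateTime)),
        ("s",  typeof(string)),
        ("f",  typeof(bool)),
    };

    public static Type InferColumnType(string columnName)
    {
        foreach (var (prefix, type) in PrefixTypes)
            if (columnName.StartsWith(prefix, StringComparison.Ordinal) &&
                columnName.Length > prefix.Length &&
                char.IsUpper(columnName[prefix.Length]))
                return type;
        return typeof(object); // unrecognized prefix: no cast
    }
}
```

Under these assumptions, a column named `ixBug` would be read back as an `int` and `sTitle` as a `string`.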
How much of the FogBugz code had to be modified to accommodate static typing, even with global type inference? In other words, was there much code that truly depended on dynamic typing? My guess is that Joel and his team were more disciplined about this than I've been. (Yes, I'm entertaining the idea of writing a custom compiler from a dynamically typed language, in my case Lua, to a statically typed one, probably C++ with heavy use of reference counting.)
Very little, if I remember correctly. The FogBugz coding conventions meticulously follow Apps Hungarian (http://www.joelonsoftware.com/articles/Wrong.html), which means that the type of a variable should not change.
> I think that companies which apply sheer brute force to rewrite the same app in Objective-C and Java to target the popular mobile platforms could learn from this.
I've been on that end and it really sucks, especially if you don't have the manpower to keep all platforms and versions in sync feature-wise. And there even was a C++ core that was identical across three platforms and yet there still was feature disparity and neglect.
I'm like 2 paragraphs in and already publishing this is a total win:
Thistle was originally an intern project to produce a source-to-source translator that translated ASP into PHP by adding dollar signs to variable names and implementing LBound and UBound in PHP
Thank you so much for ruining my productivity for a couple hours. :)
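The quoted trick is easy to picture. Here's a toy of the same idea (the regex and keyword list are invented; the real Thistle presumably handled much more than this):

```csharp
using System;
using System.Text.RegularExpressions;

// Toy ASP/VBScript-to-PHP "translation" in the spirit of the quote:
// prefix every identifier that isn't a keyword with a dollar sign.
class ThistleToy
{
    static readonly string[] Keywords = { "If", "Then", "Else", "End", "Dim", "Not" };

    static string ToPhp(string vbLine) =>
        Regex.Replace(vbLine, @"[A-Za-z_]\w*", m =>
            Array.IndexOf(Keywords, m.Value) >= 0 ? m.Value : "$" + m.Value);

    static void Main()
    {
        Console.WriteLine(ToPhp("count = count + 1")); // $count = $count + 1
    }
}
```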
Replacing Wasabi code with the prettified output of Wasabi seems like a terrible idea to me. Is the result similar enough to the original source that it will still make sense? Do comments get preserved?
Programmers just love to change good working code into the new style or new language. One has to always view the impulse to change skeptically.
I'll make sure to include some example output in my next article. It's pretty readable - not quite idiomatic C#, but you can almost always see what's going on. It's very conservative C#, and it still uses the same style of coding that the original source used - ADO and `Response.Write`, rather than WebForms or ASP.NET MVC.
This project had the full support of the company, from our newest hires all the way to Joel and Michael. As many people have noted, using this wacky one-off language was a source of a lot of friction in our business, and we felt that spending a few months to get rid of that friction would be a good use of resources (me).
- SyntaxTrivia is a great idea, if a little hard to grok. I wish I'd have thought of it for Wasabi's comments. (CComment inherits from CStatement, which is CLEARLY wrong.)
- The SyntaxKind enum is a giant blob of magic numbers. I would have appreciated it being implemented as sum types.
I hear you about the enum -- it's not the best programming experience. Unfortunately, for performance it's much faster for SyntaxTokens to be struct types, which means no inheritance and thus no sum types. In addition, the SyntaxKind allows a faster check on the SyntaxToken Kind since you can simply pull an int field, rather than having to do a runtime type check.
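As a small illustration of that design: checking a token's kind through the public Roslyn API (this needs the Microsoft.CodeAnalysis.CSharp NuGet package) boils down to an integer comparison on a struct field, not a runtime type test:

```csharp
using Microsoft.CodeAnalysis.CSharp;

class KindDemo
{
    static void Main()
    {
        var tree = CSharpSyntaxTree.ParseText("int x = 42;");
        foreach (var token in tree.GetRoot().DescendantTokens())
            // SyntaxToken is a struct; IsKind compares its int kind field,
            // which is why there's no class hierarchy to pattern-match on.
            if (token.IsKind(SyntaxKind.NumericLiteralToken))
                System.Console.WriteLine(token.Text); // prints 42
    }
}
```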
One thing I've always wondered (and I don't intend this as a criticism per se), but why didn't they open source it? If it had a following outside of Fog Creek it might not have been an inevitable dead end.
While I hesitate to endorse a language based on VBScript, it seems like the extensions they added to it were pretty nice. I mean, if you're inclined to use a VBScript style language, Wasabi wasn't horrible, and given Spolsky's following and Fog Creek's mindshare, it seems at least possible it could have become a useful thing rather than a legacy thing to be replaced. I mean, at the very least, it's probably not worse than PHP. (Granted: my opinion of PHP is very low). Maybe it's for the best though, the world is probably better off without new wasabi projects.
Looking back, I regret not open sourcing it as well. Pre-.NET, we didn't want to spoil our cross-platform competitive advantage. Post-.NET, we didn't think there was much of a market for a VBScript-to-C# transpiler (wouldn't those people prefer VB.NET?), and besides, open sourcing it seemed like a lot of work for basically no benefit.
I get that, running an open source project can be very time consuming and expensive, and once it starts it becomes a part of your brand that cannot necessarily easily be dismissed. So it may have been wise not to open source it. I do think though, that wasabi was not obviously stupid or anything, and it probably could have had some amount of success in the wild.
They tried something interesting, it didn't work and they replaced it later.
When this has been mentioned previously there is a strong "they did a crazy thing with Wasabi" but progress depends on doing crazy things that just might work sometimes.
You could have just started on Java. I've worked on very large 15 year old Java enterprise code bases that are doing just fine. That language is so pathetically maintainable and the backward compatibility between JVM releases is very good. Microsoft seems like a moving target with how often they deprecate things.
As someone working on a compiler which is not a primary business of this company, I disagree. Our main product is a large library with quite a lot of man-years of work put into it that we offer on lots of different platforms and frameworks – for the most part with near-identical API and capabilities. Compiling the source into different languages rather than rewriting everything is a very valid approach. In our case almost all bugfixes and new features can be merged directly into the other products and work (the exception being the glue parts to the respective platform/framework).
Being able to precisely control the output and thus guarantee API stability is a major factor in developing this in-house, especially as most converters only concern themselves with replicating the visible behaviour of a complete application.
If you license the library, why not directly cross-compile it (using existing compilers) and then just write wrappers to expose it in the target languages?
I disagree, because there are SO many examples of companies and products that do this. Basically every data storage company, every networking company, every database company (meaning they wrote the OS)..
Not to mention Wasabi is a transpiler! It took VB-ish stuff and turned it into a few target languages. This is senior-year undergrad complexity that they used for 10 years.
Writing a compiler sounds scary to some people. It's not scary. The only compilers training I have is a senior-year undergraduate course, and quite a lot of that was learning all the different ways to write parsers. Wasabi uses an LL(k) recursive descent parser, and you don't have to know what that means to be able to work on it. I've never even read the Dragon book; we used Cooper & Torczon "Engineering a Compiler" instead.
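To back that up, here's roughly what recursive descent looks like, shrunk down to a four-method arithmetic parser/evaluator. This is a generic LL(1) illustration, nothing to do with Wasabi's actual grammar; the point is just that each grammar rule becomes one ordinary method:

```csharp
using System;

// Tiny recursive-descent parser: one method per grammar rule, one
// character of lookahead. A real grammar is far bigger, but the
// technique scales up in exactly this shape.
class ExprParser
{
    private readonly string _src;
    private int _pos;

    public ExprParser(string src) { _src = src; }

    private char Peek => _pos < _src.Length ? _src[_pos] : '\0';

    // Expr := Term (('+' | '-') Term)*
    public int ParseExpr()
    {
        int value = ParseTerm();
        while (Peek == '+' || Peek == '-')
            value = _src[_pos++] == '+' ? value + ParseTerm()
                                        : value - ParseTerm();
        return value;
    }

    // Term := Number (('*' | '/') Number)*
    private int ParseTerm()
    {
        int value = ParseNumber();
        while (Peek == '*' || Peek == '/')
            value = _src[_pos++] == '*' ? value * ParseNumber()
                                        : value / ParseNumber();
        return value;
    }

    // Number := [0-9]+
    private int ParseNumber()
    {
        int start = _pos;
        while (char.IsDigit(Peek)) _pos++;
        return int.Parse(_src.Substring(start, _pos - start));
    }

    static void Main() =>
        Console.WriteLine(new ExprParser("1+2*3").ParseExpr()); // 7
}
```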
I think it's perfectly legitimate for a company to build those things when it's not their core business, provided that A. They hire people who are legitimate experts and give them free rein to work on it, and B. They open-source it.
A lot of quality compilers and databases have come from software companies whose primary business is neither a compiler nor a database.
A company should never do secondary business X if their proprietary take on X does not contribute to their value proposition in a way that external solutions cannot provide.
At one time, for example, ice companies did their own deliveries because they needed special horse-drawn carriages to keep the ice cold.
This is silly. If you make a reasoned decision, taking into account your specific circumstances and the projected changes to come over the midterm, and you decide to do this, well, why not? Should Google not have built Go? Should Jane Street not have built Core? Should Facebook not have built their suite of tools? I'm comfortable saying that the super smart people who advocated for those projects (among numerous others) had a pretty good case to make.
Why not? Because it creates a maintenance nightmare and if it's not your core business that means it will never be a priority. Using Google and Facebook as a counter-example is silly. These are $250B+ companies and these tools are a significant part of their business.
Or chat tool, or bug tracker, or... The list goes on. Basically, focus on your core business. If you spend more than ~10% of your time building infrastructure tools, someone else probably does it better, for cheaper.
I wonder how much time and money they would save if they had just bitten the bullet and rewritten the entire thing in PHP in the first place instead of doing what they did.
The big advantage of the ASP/VBScript version is that there were almost no dependencies and it would install on the most basic IIS setup. Many companies already had IIS installed and configured but not doing anything.
Getting companies to install PHP on Windows (a dicey position those days) was a non-starter.
This is off-topic, but how is it that this is the top story on HN right now with zero discussion (assuming that by the time I click "add comment" I am the first to comment).
Is it dumb to assume that it's normal for upvotes and comments to increase at a similar rate?
Writing a good comment takes effort. Interesting posts are upvoted just so that they can be part of a discussion even if you are not the one starting the discussion. I do this all the time. If I feel that I can add to the discussion, I will write up a comment later.
People won't comment if they feel they don't have anything of substance to add - sometimes submissions can stand on their own pretty well and thus don't solicit a lot of comments. It's about maintaining a high signal-to-noise ratio. If anything I've found that you can use the karma/comment ratio as a moderately reliable indicator of link quality.
As for your original question, you should probably "lurk moar" before starting off-topic discussions. This really isn't that uncommon.
I've always assumed that some users (closely related to the pg-sphere) have heavier upvotes than others. How else would they maintain a sane frontpage?
It's fascinating, because writing your own custom language is one of those things that everyone knows is a bad idea.
It's one of those things that people do because it seems like the path of least resistance (and in the short run, is), but it inevitably snowballs into a pit of technical debt. Spolsky knew this quite well (he'd written eloquently on the subject).
...and yet he still did it. His defence was that it was the easiest option in the short term, and he was probably right, but it doesn't matter. People only do stupid stuff that seems smart; saying "this stupid thing seems smart!" is only a defence if you have no idea that it's actually fundamentally stupid. Of all the people in the world, Spolsky is one of the least able to mount this defence.
Contemporaneously with his decision to go all in on Wasabi, he wrote a scathing condemnation of Ruby for being slow, unserious, obscure; he suggested that a serious company shouldn't opt for Ruby because it was risky, and that choosing it would put you at risk of getting fired.
Was he right? In 2006, maybe? I mean, he turned out to be wrong, but I don't think it was entirely obvious that Ruby was a serious choice 15 years ago. Of course, he wasn't writing 15 years ago, but even nine years ago, a very conservative, safe approach to choosing a technical stack very possibly did militate against selecting Ruby, for all the reasons he outlined. But those arguments applied twice as hard to Wasabi. You don't get to argue that there "just isn't a lot of experience in the world building big mission critical web systems in Ruby" (and hence you shouldn't use Ruby), and then turn around and use Wasabi for your big mission critical web system.
Of all the people in the world, Spolsky probably had the best understanding of why Wasabi was a stupid, short sighted decision. He did it anyway. And it was stupid and short sighted. Rarely is someone so right and so wrong about the same thing at once.
(And yes, Fogcreek is still around, and so is FogBugz. But I don't buy for a moment that Wasabi was actually a good choice. They survived it, but they didn't benefit from it.)
Edit: Spolsky has written too much about why writing something like Wasabi is a terrible idea to link it all. Besides, a lot of it has been linked in other comments. But I don't think I can express strongly enough that my anti-Wasabi position is simply repeating the things the guy who signed off on developing it and using it in production wrote. ...then he decided to write a new language because apparently Ruby was too slow to possibly use to generate a graph, and there was literally no alternative to using Ruby for generating graphs other than writing your own compile-to-VBScript/PHP language. Words fail.
Wow, rabbit hole is right... The first mistake was writing a web-app in VBScript (not VB) ?? I'm not even sure that's possible, it's such an awful, limited language. Probably required components written in C++? The developer who started that should have been fired as incompetent. These are the "B-players in a hurry" you should get rid of, that can cost your company millions.
Then instead of cutting their losses, they doubled and tripled down on it until they had their own language and sophisticated tools around it. Around this time, Django and Rails had been started already. And several decent cross-platform web frameworks were years old, such as CherryPy. Even PHP would have been a better choice. One of these could have been phased in, parts at a time, to minimize disruption.
Did I get this right? Because there are so many WTFs that I must have missed something.
VBScript is the language of Classic ASP. Many, many sites were written with that technology ~15 years ago and some are still around. An unpleasant language, yes, but certainly not an incompetent decision at the time.
Second, the "developer who started that" were the founders. One had worked at Microsoft, and so was very familiar with ASP.
Third, FogBugz was a complex, mature application by the time Django or Rails were even considerations for production systems. Phasing in in parts is not really a viable solution either. There was a lot of business logic written and shared that those new parts would need access to, which would create more work to implement.
In short, no, you did not. You're looking at decisions made 10 years ago through today's lens, without considering the state of the art at the time.
(In case you're wondering, I worked at Fog Creek from 2005-2012, though I was never a part of the FogBugz team.)
As I mentioned, PHP was pushing 10 years old at that point. They had already built a translator to it ("Thistle"), so they could have gone to PHP at that time, just like they mention going to C# now.
I'm not a huge PHP fan, but it would have saved 10 years of development and maintenance over their NIH solution. See the "Jumped the Shark" piece from Coding Horror... not a minority opinion.
VBScript was literally the prescribed way of building web applications for IIS. This was a long time ago and things were way different than they are now. ASP was a revelation compared to how web applications were built prior to its introduction.
It seems archaic now but those were archaic times.
http://jacob.jkrall.net/wasabi-the-parts/index.html