It has always struck me as extremely bizarre that computer science graduates would recoil from someone solving a business problem using what appears to be very basic compiler theory.
The second half of your comment transitions from weird to mean-spirited, as you begin speculating about people you don't know and their reasons for changing jobs. I'm a little confused as to why you've been voted up so high on the page.
I still think the fact that even most people with a 4-year degree still haven't done a compilers course is the core problem. Both other times we've had this discussion I haven't noticed anyone popping up to say "Yeah, I've written like 3 compilers and Wasabi was just an insane idea." (Of course, the Internet being what it is, someone will probably say that now. But the point is I haven't seen it before I asked for it.) A lot of people are doing the cost/benefit analysis with an order of magnitude or two too much in the "cost" column. Yeah, of course it looks insane then... but the problem is the analysis, not the reality.
Compilers just aren't that magically hard and difficult. I'll cop to not having written a true compiler yet but I've written a number of interpreters, and I've written all the pieces several times (compile to AST, interpret, serialize back out, just never had the whole shebang needed at once).
If you're reading this, and you're still in a position where you can take a compilers course, take it! It's one of the most brutally pragmatic courses in the whole of computer science and it's a shame how it's withered. (Even if, like me, you'll probably write more interpreters than compilers. And nowadays you really ought to have a good reason not to pick an existing serialization off-the-shelf. But it's still useful stuff.) It's one of those things that is the difference between a wizard and a code monkey.
I've written like 3 compilers, and while I don't think Wasabi was quite insane (they had an interesting set of constraints, so I could at least follow the logic), it's not the choice I would've made. Or rather, it's totally the choice I would've made as a fresh college grad in 2005 having written my first compiler for work (which was ripped out in about 2 months...it didn't take me that long to realize my mistake), but it's not what I would've done with the hindsight experience of that and other compiler projects.
The cost of an in-house programming language isn't in writing the compiler. It's training all your new team members in the language. It's documenting the language constructs, including corner cases. It's in not being able to go to Stack Overflow when you have problems. It's in every bug potentially being in either your application code, your compiler, or your runtime libraries, and needing to trace problems across this boundary. It's in integrating with 3rd-party libraries, and in not being able to use tooling developed for an existing mainstream language, and having to add another backend to every other DSL that compiles to a mainstream language.
All that said, I agree that if you're ever in a position to take a compiler course, do it. It's one of the most valuable courses I ever took, and really peels back the mystery on why programming languages are the way they are. It's just that the difference between wisdom and intelligence is in knowing when not to use that brilliant technique you know.
"It's just that the difference between wisdom and intelligence is in knowing when not to use that brilliant technique you know."
Which is precisely why I've never written a full compiler, even though I've written all the pieces many times.
For instance, instead of writing a parser, could you perhaps get away with just a direct JSON serialization of some AST? Do you really need to emit something, or will an interpreter do? So far I've never been so backed against the wall that I've actually needed a full compiler.
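To make that concrete, here's a minimal sketch of the parser-less approach: the front-end (a GUI, another tool, whatever) emits the AST directly as JSON, and a small recursive interpreter evaluates it. The node shapes here are invented for illustration, not from any particular project.

```python
import json

def evaluate(node, env):
    """Recursively walk a JSON-decoded AST node and return its value."""
    op = node["op"]
    if op == "num":
        return node["value"]
    if op == "var":
        return env[node["name"]]
    if op == "add":
        return evaluate(node["left"], env) + evaluate(node["right"], env)
    if op == "mul":
        return evaluate(node["left"], env) * evaluate(node["right"], env)
    raise ValueError(f"unknown op: {op}")

# No parser, no grammar: whatever produced this JSON *is* the front-end.
ast = json.loads("""
{"op": "add",
 "left": {"op": "num", "value": 2},
 "right": {"op": "mul",
           "left": {"op": "var", "name": "x"},
           "right": {"op": "num", "value": 10}}}
""")
print(evaluate(ast, {"x": 4}))  # 2 + 4*10 = 42
```

The whole "language" is a couple dozen lines, and the serialization problem disappears because JSON round-trips for free.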
Yeah, one of the compilers I wrote just used JSON as the AST, with it being generated by a GUI interface. Another used HTML with annotations (although go figure, I wrote an HTML parser [1] for it, because there weren't any C++ options at the time that didn't bring along a browser engine). A third had a custom front-end but then emitted Java source code as the back-end.
The interesting thing is that the more experience you get, the more alternatives you find to writing your own language. Could you use Ruby or Python as the front-end, much like Rails [2], Rake [3], or Bazel [4]? Could you build up a data-structure to express the computation, and then walk that data-structure with the Interpreter pattern? [5] Could you get away with a class library or framework, much like how Sawzall has been replaced by Flume [6] and Go libraries within Google?
In general, you want to use the tool with the least power that actually accomplishes your goals, because every increase in power is usually accompanied by an increase in complexity. There are a bunch of solutions with less power than a full programming language that can still get you most of the way there.
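As a concrete example of that "least power" ladder, here's a hedged sketch of using the host language itself as the front-end, the way Rake and Bazel do: plain function calls build a task graph, and a tiny runner walks it. All the names are invented for illustration; real tools add cycle detection, parallelism, and much more.

```python
# Plain Python calls build a dependency graph; no custom parser required.
tasks = {}

def task(name, deps=(), action=None):
    """Register a task with its dependencies and an optional action."""
    tasks[name] = (list(deps), action)

def run(name, done=None):
    """Run a task's dependencies (depth-first), then the task itself."""
    done = set() if done is None else done
    if name in done:
        return
    deps, action = tasks[name]
    for dep in deps:
        run(dep, done)
    if action:
        action()
    done.add(name)

log = []
task("compile", action=lambda: log.append("compile"))
task("test", deps=["compile"], action=lambda: log.append("test"))
task("release", deps=["compile", "test"], action=lambda: log.append("release"))

run("release")
print(log)  # ['compile', 'test', 'release'] -- each task runs exactly once
```

You get conditionals, loops, and functions from the host language for free, which is most of what people end up bolting onto homegrown languages anyway.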
I'm doing this now, for a crappy language and a crappy processor. It's been a nightmarish hellscape of a project, but also mind-expanding. Highly recommend.
(If you're interested in goofing around with Starfighter, you're going to get an opportunity to get handheld through a lot of this stuff.)
Assuming I could transfer the benefit of hindsight back to Joel's position in 2005, including all the knowledge of how the market has evolved over the past 10 years? I would've jumped on the SaaS bandwagon, hard, and converted the existing VBScript codebase to a hosted solution, discontinuing support for the PHP/Linux version and freeing the company up to migrate code as it wished on its own servers.
I recognize that this would've been a huge leap for anyone in 2005, when 37signals was basically the only company doing small-business SaaS and the vast majority of companies insisted that with any software they buy, they actually buy it and the source code and data sit within the company firewall. Heck, when Heroku came out in 2007 I was like "Who the hell would use this, turning over all of their source code to some unnamed startup?"
But looking at how the industry's evolved, that's pretty much the only way they could've stayed relevant. Many companies don't even have physical servers anymore. That's the way FogBugz did evolve, eventually, but they were late getting there and had to back out all the existing Wasabi code and fixes they made for it to be easily deployable (which was one of their core differentiators, IIRC; they were much easier to set up than Bugzilla or other competitors).
It makes me appreciate how tough the job is for CEOs like Larry Page or Steve Jobs, who have managed to stay at the leading edge of the industry for years. Larry was pretty insane for buying a small mobile phone startup called Android in 2005, but it turned out to be worth billions eventually.
Tangent: Your description of how people resisted SaaS a decade ago makes me wonder if the only reason the industry did eventually move toward SaaS was that most on-premises apps were such a nightmare to deploy. After all, some of the disadvantages of SaaS, such as lack of control over one's own data, are real. If Sandstorm.io had existed back in 2004, might we have avoided SaaS altogether? (Of course, if Sandstorm.io had existed back then, Fog Creek would still have needed to port FogBugz to Linux.)
I think the move to SaaS was a combination of factors:
1. The primary product of many companies got too large to deploy on their own server farms, and so they started moving toward AWS etc. for scalable hosting. Once your product is in the cloud, it makes sense to deploy your supporting infrastructure & tooling there as well, because otherwise you're paying the support, hosting, & sysadmin costs for just your non-critical corporate infrastructure.
2. Bandwidth became a non-issue. In the 1990s there was a very measurable difference between 10BaseT internally vs. an ISDN line to your hosting provider. In the 2010s, there's little practical difference between gigabit Ethernet vs. 10M broadband.
3. HTTPS became ubiquitous, taking care of many security risks.
4. Employees started to blur the line between work and home, leading to demand for work services that could be reached securely from a user's home network. VPNs were a huge PITA to set up. This was a big issue for much of the early 2000s; one of my employers made some clever network software to punch through corporate firewalls with a minimum of configuration.
5. Development speed increased. SaaS companies could push new versions of their product faster, react to customer feedback quicker, and generally deliver better service. Because all customer interactions go through the company's servers (where they can be logged), they have much better information about how people are using their products. Deployed services were left in the dust.
tl;dr: #1-3 made lots of businesses go "Why not?", while #4 and #5 made them go "Yessss."
It's interesting that many of the arguments about why you should not use SaaS businesses now (like privacy and security, and lack of ownership) were relatively minor reasons then. I do kinda wish (in an abstract way) that something like Sandstorm would catch on, but I think they may be early: SaaS just isn't that painful, and until we have a major shake-out where a lot of businesses get taken out because their dependencies go down, it seems unlikely that it will become so. Or the other way this could play out is that a new powerful computing platform comes out that lets you do things that aren't possible with thin clients, and you see a rush back to the client for functionality.
All very good reasons. I'll add another - accounting.
The monthly bills for small SaaS purchases fit within what could be expensed on a corporate card. By the time IT gets wind of it, the product has already infiltrated the organization. If there's a very large up-front cost, then IT is involved, you need a formal RFP process, lots of people weigh in, those opposed to the purchase can try to block it... As soon as "Put it on the corporate card" became viable, power moved back to the business units.
With Sandstorm, we could actually get that effect on-prem. Since no technical expertise is needed for deployment, and since the security model is so strong, and the IT department will be able to manage resource quotas on a user basis rather than an application basis, it's actually entirely reasonable that people outside of IT could be permitted to install software without IT approval.
Granted, it may take a while to convince IT people that this is OK, but fundamentally they have every reason to prefer this over people cheating with SaaS.
Actually, not that late. I think their main problem was that the environment changed around them. Besides SaaS, the whole developer ecosystem changed as well: when I look at who really won the bugtracking market, it's GitHub, who added it as a feature on code hosting.
If winning the bugtracking market was the goal, they probably would've taken VC money. You may notice that everyone who's in a position to claim that has done so (GitHub, Atlassian, etc.).
They did learn from this, as you can see by the very different paths StackExchange and Trello are on.
Joel wrote an essay about this. [1] His basic thesis is that organic growth wins over VC when there are entrenched competitors, few network effects, and little customer lock-in. VC wins when there are wide-open markets, strong network effects, and strong customer lock-in. Stack Exchange's investment was consistent with this thesis [2].
The developer tools market changed from one with very few network effects to one with a large network effect around 2010. The drivers for these were GitHub, meetups, forums like Hacker News, and just its general growth - they made coding social. When I started programming professionally in 2000, each company basically decided on a bugtracker and version control system independently, and it didn't matter what every other company did. By 2015, most new companies just use git, they host on GitHub, and if they don't do this, they're at a strong disadvantage when recruiting & training up developers, because that's what much of the workforce uses.
Interestingly, both GitHub and Atlassian resisted taking investment for many years - GitHub was founded in 2007 and took its first investment in 2012, while Atlassian was founded in 2002 and took its first investment in 2010.
Right! And this isn't even a compiler, the way most people think of "compilers". It's a transpiler to three target languages each of which has an extraordinarily full-featured runtime. Two of which are (a) widely available and (b) as source languages, awful.
And it's not their own language, it's an extension of VBScript. And now that the tools around C# are better and Linux support for .NET is official, they have used these tools to transition to C#. Like you, I don't get the outrage.
Do you think that the name Wasabi contributes to the outrage?
Coffeescript has a similar name to Javascript, so you can quickly draw an association between the two.
The name Wasabi doesn't have an obvious connection to the VBScript that it's based on, which seems to be the cause of people talking about writing a whole new language, etc.
I've written some toy compilers, and I can at least say:
1. compilers have bugs
2. it really sucks not knowing if a bug is in your code or in your compiler
3. it sucks not having a source-level debugger
Anyone can write a simple compiler, just like anyone can make a simple database. The hard part (at least for a non-optimizing compiler) isn't the comp-sci theory, it's building the tooling around it, and the extensive amount of testing needed to be sure you don't have subtle data-corrupting bugs lying around to bite you.
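One cheap way to hunt those subtle bugs is differential testing: run the same randomly generated programs through a trivially-correct reference interpreter and through the compiler's output, and flag any disagreement. This is a generic sketch over a toy expression language (not anyone's actual test suite):

```python
import random

random.seed(0)  # make the run reproducible

def random_expr(depth=0):
    """Generate a random arithmetic AST as nested tuples."""
    if depth > 3 or random.random() < 0.3:
        return ("num", random.randint(-5, 5))
    op = random.choice(["add", "mul"])
    return (op, random_expr(depth + 1), random_expr(depth + 1))

def interpret(e):
    """Reference semantics: simple and obviously correct."""
    if e[0] == "num":
        return e[1]
    left, right = interpret(e[1]), interpret(e[2])
    return left + right if e[0] == "add" else left * right

def compile_to_python(e):
    """The 'compiler' under test: emit Python source for the expression."""
    if e[0] == "num":
        return str(e[1])
    op = "+" if e[0] == "add" else "*"
    return f"({compile_to_python(e[1])} {op} {compile_to_python(e[2])})"

for _ in range(1000):
    e = random_expr()
    assert interpret(e) == eval(compile_to_python(e)), e
print("interpreter and compiled code agree on 1000 random programs")
```

The same shape scales up: a slow, dumb reference implementation plus a firehose of random inputs catches a surprising fraction of codegen bugs.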
I won't categorically reject the idea, for instance I think Facebook writing their HipHop compiler was completely defensible. But you need people with compiler experience, and people who know the pain of working with crappy, undocumented, buggy toolchains to make that decision, not people who once took a compiler course.
I've written like 3 compilers* and Wasabi seems like it was probably a reasonable solution for the problem they had at the time. Compilers just aren't that magically hard and difficult.
There are very few situations where writing your own langue and toolchain is a good idea. I used to work on a proprietary company language that was actually a compiler generator for language-to-language translation, plus a bunch of other stuff, and it was a horrible pain.
Documentation? None
Online community? None
Transferability of skillset? None, apart from knowing how compilers work. Makes for good nerd conversation, but that's it.
Writing your own toolchain is almost as bad. I've seen multiple talented people leave companies I've worked at when they were forced to build and maintain horrible tools for the in-house ecosystem. Some too-big-for-his-britches second-system-as-a-first-system ass had written them, and everybody else got stuck with it.
As the other commenter noted, this seems like the epitome of bad software engineering, and I'm surprised employees put up with it if they were any good.
EDIT: I learned to program in assembly, so compilers didn't seem super mysterious to me as they are for someone who learns Java first perhaps.
Can't you say the same things about a proprietary database, or a proprietary template language? What are the kinds of computer science that we can safely deploy without taking extra precautions to document and maintain it?
Both of those should be looked upon with suspicion. I can't say "never do it", given that every employer I've ever worked at has had its own proprietary database, and one of the projects I worked on at Google was a proprietary template language. But all of them were a large maintenance burden, much larger than originally anticipated.
I think the old business adage about "In-source your core competencies, outsource everything else" applies here. If you derive a big competitive advantage from having a proprietary database or proprietary template, and it generates enough revenue to afford a dedicated team of experts to maintain it, build it. But if you have a bunch of smart & motivated developers who can build a proprietary database, but your product isn't databases or templates and your core differentiator isn't the performance or query patterns you get from building it yourself? Put them to work improving the product, and work with the infrastructure that other firms have built already.
i'd actually be way more suspicious of a proprietary database, unless there was a very compelling reason why none of the existing ones worked. maybe this is just my inexperience in the field, but a database engine seems orders of magnitude harder to get right and maintain than a compiler (that too a transpiler, so you can even human-inspect the output!) does.
Yes. Any proprietary system will require you to document/educate the users, and you will not have the benefit of an online community to get help from, or bug fixes, or security analyses. There are very few problems where rolling your own solution is the right solution. Maybe if you are Google and existing databases aren't big enough for your data, or something.
If you have great people building the software, or at least competent ones, and you have competent users, you might succeed, maybe. But that's assuming you have a maintenance plan and a roadmap, which most software companies do not. Maintain software? YOLO! What happens when you have a bunch of morons using and maintaining the software?
In short, computer science in industry is largely practiced as shamanism by people who cannot engineer their way out of a crackerjack box.
"There are very few situations where writing your own langue"
Well, I can see how you might struggle there ;-)
Good natured snarks about spelling aside, part of the issue is that writing, documenting and maintaining your own language is only hard if your toolchain sucks.
If you're interested in writing a specialized language to solve a particular problem, take a look at PEG for JS, and either Racket or Common Lisp (the latter if you need native compilation).
I've recently been involved in the design and implementation of an English-like language for the expression of business domain concepts in web apps. It's a great approach if done thoughtfully and professionally.
That's probably the key, actually. The horror stories we hear are of the bad examples. And we all know that shitty tools, weak languages and bad documentation can come out of large software companies as commercial products as well.
I didn't take a course on compiler construction, and now I don't remember if my university's CS department had one (it was a fairly mediocre CS department at a state university). Now I wish I had.
Do you think a good compiler course would prepare the student to do a project with the scope and complexity of Wasabi? For one project, I wrote an interpreter for a little domain-specific language, then later reworked that interpreter into an on-the-fly compiler (to Lua, to avoid double interpretation). But that's a long way from writing a compiler for a general-purpose language, that can do global type inference and produce human-readable output in a target language that's fairly different from the original VBScript (if not Wasabi itself).
The trickiest bit of Wasabi is the type inference, which I admit is not "production-ready" (or "good code") because we basically invented it from scratch. If I were to do it now, I would know just enough to realize that I need to read about Hindley-Milner rather than reinvent the wheel.
Producing human-readable output is an exercise in tedium and bookkeeping, not any particular amount of skill or brilliance.
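For anyone wondering what "read about Hindley-Milner" actually buys you: the core mechanism is unification over type terms. Here's a toy sketch of just that step — my illustration, not Wasabi's code, and it omits the occurs check and let-polymorphism a real implementation needs:

```python
# Types as tuples: ("var", name), ("con", name), ("fun", arg_type, ret_type).

def resolve(t, subst):
    """Follow substitution chains until t is no longer a bound variable."""
    while t[0] == "var" and t[1] in subst:
        t = subst[t[1]]
    return t

def unify(a, b, subst):
    """Return an extended substitution making a and b equal, or raise."""
    a, b = resolve(a, subst), resolve(b, subst)
    if a == b:
        return subst
    if a[0] == "var":
        return {**subst, a[1]: b}
    if b[0] == "var":
        return {**subst, b[1]: a}
    if a[0] == "fun" and b[0] == "fun":
        subst = unify(a[1], b[1], subst)
        return unify(a[2], b[2], subst)
    raise TypeError(f"cannot unify {a} with {b}")

# If f : t0 -> t1 is used where an Int -> Int is expected,
# unification pins down t0 = Int and t1 = Int.
s = unify(("fun", ("var", "t0"), ("var", "t1")),
          ("fun", ("con", "Int"), ("con", "Int")), {})
print(resolve(("var", "t0"), s))  # ('con', 'Int')
print(resolve(("var", "t1"), s))  # ('con', 'Int')
```

The win over reinventing the wheel is exactly this: the hard cases (polymorphism, recursion, error reporting) all have known answers in the literature.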
Thanks for confirming my guess that the type inference was the trickiest part. These days, I guess Flow (http://flowtype.org/) would also be worth studying. Edit: Or PyPy's RPython.
I imagine most people upvoted because it sounded smart.
It's fascinating how easily cruelty can be popularized by using the right, nice-sounding words. Coat your bile in rhetoric popular with a community, indirectly imply some terrible things, perhaps obfuscate anything that could raise uncomfortable, thoughtful questions, and presto! You'll have the right set-up to manufacture consensus.
It's funny because what got me is how dumb it sounded: the words "such a colossal error as writing their own language" made me think "that's the kind of thing a mid-level IT manager at an insurance company would say".
It is usually a "colossal error" to write your own in-house closed-source/proprietary language, no matter how small or large the language is.
The main reason is exactly as the article states, maintainability.
> As time wore on, our technical debt finally began to come due. Compilers like Wasabi and their associated runtime libraries are highly complex pieces of software. We hadn’t open-sourced it, so this meant any investment had to be done by us at the expense of our main revenue-generating products. While we were busy working on exciting new things, Wasabi stagnated. It was a huge dependency that required a full-time developer — not cheap for a company of our size. It occasionally barfed on a piece of code that was completely reasonable to humans. It was slow to compile. Visual Studio wasn’t able to easily edit or attach a debugger to FogBugz. Just documenting it was a chore
What am I missing? This is an internal language designed as an incremental improvement over VB that gave them a cross-platform common codebase. It lasted 10 years: that's 19,932 in SaaS years. When they transitioned off of it, they did it not with a rewrite, but with mechanical translation.
Colossal error vs spectacular success story. Yay hyperbole.
I'd say the actual experience is somewhere in between. Sure it enabled them to support client requests to be cross platform and proved useful for a very long time, but what was the broader opportunity cost? Did supporting this proprietary infrastructure eat up resources and prevent them from exploring other ideas? Probably.
Big projects are routinely ported once or twice to new languages. Normally, moving a project from one language to another entails a full rewrite. Because the process of building their own language had the side effect of instrumenting all their code (the transpiler was custom-designed for their application), they were able to write a program to do the port for them. That's a win I don't see captured in the original blog post, and that's all I'm pointing out.
I don't think it ate up any more resources than, say, MySQL support. Fog Creek employees have produced tons of innovation since Wasabi was introduced, including brand-new companies like Trello (2011/2014) and Stack Overflow (2008), as well as in-house products like Kiln (2008), WebPutty (2011), and Make Better Software: The Training Series (2009-ish). None of these projects were particularly resource-constrained by having to do some compiler and runtime maintenance in the process of building FogBugz.
Thank you for the most insightful comment on this story. I find it bizarre that so many are unable to understand that making the decision to 'kill' wasabi today does not necessarily mean that it was a mistake.
This whole story is a fabulous insight into software development for business over the long-term.
I mean, adding an esoteric tool into your development process -- one where an employee has to write a book about its quirks -- sounds like a failure for the guys in the trenches.
On practically every software project in the history of software projects that have lasted more than 3 release cycles, there is a person on the team who could write a _${Some Component}, The ??? Parts_ book. Nobody ever thinks to do that, because _${Our Report Generation Library}, The ??? Parts_ is super boring.
The reality though is that line-for-line, a transpiler is probably not much harder to write than a serious report generation tool. I agree with the commenter upthread, who thinks this is a result of people simply never having tried to write a compiler before.
It might be, but for what many would argue are the wrong reasons, maybe even unnecessary.
> This is an internal language designed as an incremental improvement over VB that gave them a cross-platform common codebase.
The problem is that at the time, there were already several cross-platform technologies in existence, many of which were being developed in the open. Utilizing one of these technologies would have allowed FogCreek to focus on what they do best: making software. Instead, they took a proprietary uni-platform language and attempted all on their own to make it cross-platform capable - which led to years of maintainability issues.
> It lasted 10 years
They gained an early advantage of not having to throw out the codebase and start over, yet they bought themselves 10 years of technical debt which continued to pose a burden on the small company. Many would argue that biting the bullet early on and switching to an open, community-driven cross-platform language/environment would have yielded much more return on the initial investment.
> When they transitioned off of it, they did it not with a rewrite, but with mechanical translation
Yes, that is an achievement, but again, for the wrong reasons.
I feel like you could take this comment, make very minimal tweaks, and deploy it in any language-war debate. "They succeeded with Golang, but for all the wrong reasons. They gained an early advantage but bought themselves 10 years of technical debt that a more modern cross-platform language would have spared them".
And I feel like when you get to the point where the best arguments you can make against something are isomorphic to the arguments you'd make against mainstream languages in language-war debates, that's a win condition.
Sure, FogCreek thought it was a good idea at the time, but over the years it became a significant burden, to the point they had dedicated staff working on just keeping Wasabi alive. Time was even spent writing an internally distributed book about the caveats of the language!
I know you will dismiss this as "routine", but it's not...
For a small company, this is an enormous waste of time, money, and energy.
A big company like Google or Microsoft can afford to throw developers by the dozen at internal proprietary languages and not even blink -- but according to the article, FogCreek did blink every time they had to dedicate time to fixing it. It took time, money, and energy away from their core business - making software.
That's a lose condition.
FogCreek should have bitten the bullet and rewritten their application in an open, standardized cross-platform system. They would have been able to spend zero time worrying about the language, and 100% of their time worrying about their application. They could hire engineers off the street and have them produce in days-to-weeks instead of weeks-to-months. They would have saved themselves an enormous amount of time, money, and energy invested in a language that is now dead anyway.
It may have seemed like a good choice back when the decision was made, but in hindsight it appears to have been a very poor, short-sighted choice.
> For a small company, this is an enormous waste of time, money, and energy.
I think you have this backwards. A small company that writes a compiler and loses a few weeks of dev time per year survives for a decade, while spinning up various new products.
In another world, a small company rewrites its only source of revenue. 18 months later, they release the rewrite with zero new features and a chunk of new bugs and promptly die, because who's going to buy a product that spends a year and a half going backwards?
> FogCreek should have bitten the bullet and rewritten their application in an open, standardized cross-platform system.
Ah, so you happen to know better than Joel how much resources they had available at the time, how long the rewrite would have taken, how much it would have affected their ability to ship new features?
Fog Creek was a much smaller company back when they wrote Wasabi. Postponing the rewrite until they had more resources to spare was probably a good decision.
I think his point can be summarized as "it's better to have to maintain your software than to have to maintain your software and the compiler for it", which is hard to argue against.
Then again, given that they had the codebase already, writing their own transpiler sounds like it was the best option at the time.
"It is usually a "colossal error" to write your own in-house closed-source/proprietary language"
I'm curious if anybody on this thread who has written more than three or four compilers/parsers would agree with you.
Depending on the task, the only solution to some problems is to write a custom/proprietary language (whether it's closed source, of course, is up to the company).
But "bug tracker" is not the problem that was being solved.
The problem was taking a big pile of legacy code and translating it to more than one platform vs rewriting the entire app from scratch in a new language that was cross platform.
It just happened that the legacy code was for a bug tracker, but it could have been for anything.
It is usually a "colossal error" to write your own in-house closed-source/proprietary language, no matter how small or large the language is.
Really?
I don't think so, having done this once, to the great success of the company. They also wrote their own database. The compiler was maintained by a team long after I left the company.
Software is hard. There are more interesting, and hard, problems than pushing the value of a field from one subsystem to the other.
Whereas, complex systems built on popular OO frameworks never have issues with maintainability.
/me deactivates snark mode
I see the issue exactly the other way around.
If you can build a domain specific language that lets you express concepts in a clear way free of boilerplate or conceptual hackery (ORMs, for example) you will wind up with a much lighter maintenance load than the equivalent functionality built on an off the shelf framework.
Of course, there's nothing stopping you from using the two as appropriate. Simple CRUD app? Rails. Need to express a very complex domain in a readable, easily maintained form? Custom language time.
Most things that have lots of reverse-dependencies require a significant amount of maintenance. Compilers are not much different from "common" libraries, or special-purpose frameworks, in that respect. Also, writing a direct-to-assembly compiler is probably not a wise idea.
I'm not saying you are wrong. The era of building custom tools just for your company, only to do specific jobs, is long gone. That's due to a lot of business risk factors.
It would work, if management didn't treat programmers as replaceable cogs in a machine. But the day you seek to make the craft of programming a commodity that could be practiced by anyone, you need an ecosystem whose knowledge is available to everyone. Only then do you get reasonable expertise at affordable prices to finish your projects.
The opposite is to make programmers so special that knowledge of the specific tools is available only to them. This definitely puts programmers in a much stronger position to negotiate pay and other things at will, because the very existence of the business depends on them.
This is like saying "the era of computer science is gone, we all just wire form fields to database columns now".
The fact that a team building the canonical wire- form- fields- to- database- columns application (if there weren't such a thing as "blogs", bug trackers would be the "hello world" of database-backed web apps) found a reason to deploy computer science is, to me, a beacon of hope that we're not all doing something that's going to be automated away in 10 years.
> Building the era of custom tools just for your company, only to do specific jobs is long gone.
Isn't that the biography of 99% of the open source projects in the big data and distributed processing world? I understand they are open now, but didn't they start as custom tools just for a single company?
It seems like the "error" Fog Creek made was not open-sourcing Wasabi, though even that seems like a hindsight-is-20/20 kind of thing, as open sourcing a project is no small feat, especially for a small software company.
Sorry, but nine hundred and ninety-nine times out of a thousand, it's a complete and total waste of resources to write a proprietary language or runtime environment to solve a non-core business problem.
First: they didn't write a runtime. They used .NET/Mono as their runtime.
Second, you wrote your comment in a text box rendered by an application notorious for being written in a custom language (in fact, it seems like arc:lisp::wasabi:.net).
Third, do you have evidence to support "nine hundred ninety-nine times out of a thousand", or is that hyperbole? Have you worked on projects where people used custom languages? Your argument would be much more interesting if we could read more about your experience with it.
You could view their core business problem as easily creating products that as many people as possible can pay for (since that's how they make their money).
Because it's bizarre and stands out. The clarifying sentence of marrying people that live elsewhere etc screams of "the lady doth protest too much".
There's nothing wrong with saying "Over time, through the natural turnover that happens at all companies, none of the original Wasabi designers still work on FogBugz".
Sure, some snarky people will make the comment "Oh yeah, I bet they left BECAUSE of Wasabi", but most will ignore them.
By completely negating the possibility that any of those people left for any reasons not involving family, it actually seems to INCREASE the probability that Wasabi was more unpopular within FogCreek than Joel would prefer to admit.
Do you know anyone at Fog Creek, or even anyone who has ever worked there? Did they tell you something that would lead you to believe that a blogger for Fog Creek would, completely unprompted and with no real need, make up stories about why people left?
Or could I just as easily argue, with the same total lack of grounding, that you're a secret shill for Atlassian trying to poison the well? (You aren't, of course, but you take my meaning.)
Good news, I actually do know everyone who was part of the original build of Wasabi! None of them left because of the language. I think this accounts for everyone who was there at the time:
1. Original author left because his wife was going to medical school out-of-country and Fog Creek didn't allow remote work at the time.
2. Second author left because his wife was going to medical school out-of-state and Fog Creek didn't allow remote work at the time (see a pattern?). Later came back because Fog Creek offered remote work. Went on to author the blog post we're talking about.
3. Developer left to go work on Stack Exchange (me!)
4. Developer left to go make the world a better place at Khan Academy
5. 2x developer left to go work on Trello
I think that was all of us. People move on in the course of 5+ years. Turns out most of those reasons don't have to do with programming language.
FWIW, I think Wasabi was a bad decision and I'm not going to defend it. But I really don't like these massive assumptions about people's motivations for leaving.
Can I guess at why you think it was a bad decision?
(a) Too incremental to be worth it, given where the .NET ecosystem was heading
(b) FC couldn't commit the resources required to adequately support a whole language, and it's better to commit to a lower common denominator than limp with a poorly supported language
(c) If you're going to create an additional obstacle to on-ramping employees, it had better be something every project in the company takes advantage of --- like, even if you had built FogBugz in OCaml, that would be a problem since the company is not designed to take advantage of OCaml.
(d) Unless you're getting a truly transformative advantage from a custom language, it's not worth it to be out of a "Google your way out of most problems" mainstream sweet spot
(e) No matter how good the language is, using a different language makes you incompatible with the standard toolchain, so edit/test/debug cycles are needlessly painful
I obviously have no idea if Wasabi was a good decision or not, but a workplace where people are allowed to deploy basic computer science to solve problems is (sadly) an attractive stand-out to me.
So, I'm not David, so I'm not going to pretend to know what his thoughts are, but I'll say that I've always had really mixed feelings about Wasabi.
Let me start by saying that Wasabi as a strategic move was brilliant. If David disagrees there, I'm a bit surprised: FogBugz represented an awful lot of battle-tested low-bug code, and finding a way to preserve it, instead of rewriting it, made one hell of a lot of sense. I'm with you that the general thoughts in this forum that we'd have to be insane to write a compiler are misguided. Wasabi let us cleanly move from VBScript and ASP 3 to .NET without doing a full rewrite, and I'd be proud to work at a place that would make the same decision in the same context with full hindsight today.
That said, I think Wasabi made two technical decisions that I disagreed with at the time and still disagree with in retrospect. First, Wasabi was designed to be cross-platform, but targeted .NET prior to Microsoft open-sourcing everything and Mono actually being a sane server target. At the time, I thought Wasabi should've targeted the JVM, and I still think in retrospect that would've been a much better business decision. I really prefer .NET over Java in general, but I know that it caused us an unbelievable amount of pain back in the day on Unix systems, and I think we could've avoided most of that by targeting the JVM instead. Instead, a significant portion of "Wasabi" work was actually spent maintaining our own fork of Mono that was customized to run FogBugz.
Second, Wasabi worked by compiling to C# as an intermediary language. There was actually an attempt to go straight to IL early on, but it was rejected by most of the team as being a more dangerous option, in the sense that maybe three people on staff spoke IL, whereas pretty much everyone could read C#. I also think this was a mistake: the C# code was not human-readable, made debugging more complicated (VS.NET had something similar to source maps at the time, so it wasn't impossible, but it was very indirect and quirky for reasons I can get into if people are curious), and that decision meant that Wasabi had all of the limitations both of its own compiler and of Microsoft's C# compiler. IMHO, these limitations are a big part of why the ultimate move away from Wasabi was even necessary in the first place, since they increased both the maintenance and developer burden.
So from my own perspective, I think that Wasabi was a mistake in that, if we were going to go to C#, we should've just got the translation good enough to really go to C# and then ditch Wasabi; and if we weren't, we should've actually owned what we were doing and written a genuine direct-to-IL compiler so we'd have more control over the experience, instead of going through C#. But I still really do genuinely believe that our going to Wasabi was a brilliant strategic decision, and I think Fog Creek would have suffered immeasurably had we not done it.
I'm particularly interested in your thoughts on Wasabi compiling to C# rather than CIL. What characteristics of Wasabi led to the C# output being suboptimal for human reading and editing? If a compiler is going to output human-readable code, are there any general design pitfalls to avoid?
To add to Ted's comment, the main mistake we made in generating readable C# from the start was using `System.CodeDom` as our code generator, which explicitly does NOT care how readable your output is.
A better idea would have been to hand-code the generator, though of course that would have been a lot of string manipulation as well as a little extra effort.
Roslyn solves both of those issues for us, but it didn't exist until very recently.
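For what it's worth, here's a sketch (in Python, with invented names, not anything from the actual Wasabi codebase) of what "hand-coding the generator" can look like: unlike `System.CodeDom`, every character of the output is under your control, so readability is just a matter of discipline.

```python
from contextlib import contextmanager

class CSharpEmitter:
    """Minimal hand-rolled C# source emitter. Every output line is
    explicit, so indentation and layout stay readable by construction."""
    def __init__(self):
        self.lines = []
        self.depth = 0

    def emit(self, text):
        # Indent by the current nesting depth (4 spaces per level).
        self.lines.append("    " * self.depth + text)

    @contextmanager
    def block(self, header):
        # Emit a braced block: header, "{", nested body, "}".
        self.emit(header)
        self.emit("{")
        self.depth += 1
        yield
        self.depth -= 1
        self.emit("}")

    def source(self):
        return "\n".join(self.lines)

gen = CSharpEmitter()
with gen.block("namespace FogBugz"):
    with gen.block("public static class Greeter"):
        with gen.block("public static void Main()"):
            gen.emit('System.Console.WriteLine("hello");')
print(gen.source())
```

It's a lot of string manipulation, as noted above, but the output reads like code a person wrote, which is exactly the property CodeDom gives up.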
Beyond what tedu and krallja pointed out, the debugging required inserting tons of #line markers in the C# output. But a single line of Wasabi could map to multiple lines of C#, making the definition of stepping ridiculous. Throw in that Wasabi necessarily grandfathered ASP globals that C# lacked and you also had fun variable display.
The semantics of Wasabi (VB) and C# are slightly different. A fair amount of the code was actually the result of various code generators. It dumped everything into one giant file (delimited by #file markers, though). Nothing intractable, but nothing high priority.
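To illustrate the stepping problem described above (a hypothetical sketch, not Wasabi's actual emitter; `Foo`, `Bar`, and the file name are invented): a transpiler interleaves `#line` directives so the C# compiler attributes generated statements back to the original source, but when one source line lowers to several C# statements, "step over" in the debugger visits several stops that all claim the same original line.

```python
def emit_with_line_markers(stmts, source_file):
    """stmts: list of (source_line_number, [generated_csharp_lines]).
    Emits C# with #line directives mapping back to the original file."""
    out = []
    for line_no, csharp_lines in stmts:
        out.append(f'#line {line_no} "{source_file}"')
        out.extend(csharp_lines)
    out.append("#line default")  # resume normal line numbering
    return "\n".join(out)

# One hypothetical source statement that lowers to three C# statements:
# the debugger now has three stops that all map back to line 10.
print(emit_with_line_markers(
    [(10, ["var __t0 = Foo();",
           "var __t1 = Bar();",
           "string x = __t0 + __t1;"])],
    "buglist.was"))
```

This is essentially the trade-off with any transpile-to-readable-source approach: the more the output diverges from a 1:1 line mapping, the murkier single-stepping gets.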
Having done something similar, but entirely different, several times, I'm surprised you didn't choose to slowly refactor the code to be more and more native C# over time. You start at 100% Wasabi / 0% C# and slowly build up the native C# parts, in units of code, until you reach a level sufficiently high that you feel confident doing a final push to switch entirely to C#.
(In my experience, you need to build up an inter-op layer first to make working in C# somewhat sane, but it's usually not hard to identify the necessary helper modules needed. Having the .NET runtime actually is a boon here since the IL is designed for inter-language inter-op.)
Why did you find yourselves maintaining a fork of Mono (versus fixing upstream)? Was it that forking, while problematic, had lower impedance than going through the necessary rituals to get your changes accepted upstream?
You can't exactly tell customers to go check out the latest subversion head and compile it themselves. Changes were pushed upstream, but that doesn't push them to customers. Neither could we ship nightly builds, because who knows what changes get introduced? So we had a fixed release with a pile of patches on top of it.
I think there's a useful lesson here about precision in writing. _Why_ the Fog Creek developers left isn't central to the essay's main point, and it's not interesting in itself. But the author included it anyway, so clearly he thought it was relevant somehow. Why? One salient hypothesis, in this context, is definitely "they left because of Wasabi".
Well, I don't think they actually did leave because of Wasabi; the chain of reasoning I described above isn't very sound. But it's easy and obvious, and the author could have avoided it by doing a little less.
Sure. As I said, I don't think skepticism is _correct_ here. But a critical reader will always be asking themselves, "why did they write it that particular way?", so as an author you have to continually ask yourself the same question.
I do not know anyone who has (or does) work at FogCreek.
But I can imagine a scenario where an employee leaves due to tech stack woes. The employee may not want to burn bridges during their exit interview by saying, "I'm leaving because this tech stack sucks: it's affecting my long-term career progression (no other companies use Wasabi); management is too slow to adapt despite our repeated kvetching; the 1 full-time guy who knows the language is overworked and doesn't have time for proper training." Instead, the employee just says, "I'm leaving for personal reasons" and everything is wrapped up nicely.
Edit: Glad to hear from another commenter that this wasn't the case at Fog Creek. I have known people to leave jobs due to tech stack woes; they didn't tell management the real reasons why they left.
"The people who wrote the original Wasabi compiler moved on for one reason or another. Some married partners who lived elsewhere; others went over to work on other products from Fog Creek."
Unless I'm mistaken, this was all that was written about the reasons people left, which to me does not seem very "bizarre". Sure, he could have included "... and some left for other companies" but there's a difference between omitting the obvious and saying something like:
"Let it be known that all who stayed were truly good programmers, and by extension, real human beings! Only the best stayed at Fog Creek. No one who was a good employee left for anything other than a personal reason. No one!"
It's not about the compiler, it's about the practical aspects of the language. The compiler isn't the hard part, the compiler is the easy part. You can write a compiler in a weekend if you want. Using it is a much different matter.
As GP notes, why would a dev bother inflicting upon themselves the brain damage of learning this language which will never be useful anywhere else? It's like requiring the devs at your company to use keyboards with a unique layout or something.
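As a toy illustration of "the compiler is the easy part": here's a sketch (in Python, leaning on Python's own `ast` module for parsing, so it's nothing like Wasabi itself) that lowers arithmetic expressions to a tiny stack machine in a few dozen lines. The hard part of a proprietary language is everything around this: tooling, debugging, documentation, hiring.

```python
import ast
import operator

# Map Python AST operator nodes to the corresponding functions.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def compile_expr(node):
    """Lower an arithmetic AST into a list of stack-machine ops."""
    if isinstance(node, ast.Expression):
        return compile_expr(node.body)
    if isinstance(node, ast.Constant):
        return [("PUSH", node.value)]
    if isinstance(node, ast.BinOp):
        # Post-order: operands first, then the operator.
        return (compile_expr(node.left) + compile_expr(node.right)
                + [("OP", OPS[type(node.op)])])
    raise SyntaxError(f"unsupported node: {node!r}")

def run(code):
    """Execute the stack-machine ops and return the result."""
    stack = []
    for tag, arg in code:
        if tag == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(arg(a, b))
    return stack.pop()

program = compile_expr(ast.parse("2 + 3 * 4", mode="eval"))
print(run(program))  # 14
```

A weekend gets you this; it doesn't get you an IDE, a debugger, Stack Overflow answers, or a hiring pool, which is the GP's point.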