Hacker News
Game dev: Linux users were only 0.1% of sales but 20% of crashes and tickets (twitter.com/bgolus)
414 points by belltaco on Jan 7, 2019 | 400 comments



Linux is the perpetual scapegoat.

They selected a middleware, Coherent UI, that didn't work properly on anything but Windows. They also didn't make proper use of the Steam runtime, a mistake that continues to cause issues. Most games don't make these mistakes, so this isn't really representative of the larger state of Linux gaming.

It's worth pointing out that the devs did make a legitimately good attempt at making Linux/Mac support first class, by making a purely OpenGL engine for the game. But it seems they weren't aware of some other best practices.


Technically speaking, I'm sure you're right. In practice it doesn't really matter, though: if they did it that way, it's either because it was easier or because that's how they were used to doing it. The fact that "technically" it's not Linux's fault doesn't really matter, unless you're more interested in the moral concept of guilt than in the practicalities of making a Linux port of a game.

Why should they change the way they develop games for 0.1% of their sales? So they'd get "clean code" kudos from people on HN? Most for-profit software cares very little about that, and for good reasons.

Gaming on Linux sucks because Linux is not popular for gaming. Linux is not popular for gaming because gaming on Linux sucks. That's the root of the issue.


> Technically speaking I'm sure you're right. In practice it doesn't really matter though

I think the details matter because they're relevant to how we as a developer / customer community move forward.

If the GP is correct, then the pain felt by these particular devs might not be a sign that targeting Linux is in general a bad idea. For example, we might help future projects be successful simply by spreading awareness of techniques known to make it easier to target Linux / SteamOS.

Regarding the 0.1% of sales issue, perhaps some of that number's smallness comes from a bootstrapping problem. I.e., there's a vicious cycle of: (bad drivers) --> (game is crashy) --> (poor sales) --> (gfx card vendors not motivated to improve drivers) --> (bad drivers) --> ...

I don't think there's much an individual game dev shop can do to break that cycle, but perhaps it's still useful to understand the problem.


Have you not seen Steam Play these last few months? Because of Proton the majority of the top 250 highest rated games now run fine on Linux.

I've been playing Skyrim, Witcher 3, Dark Souls 3, Castle Crashers, Overwatch, Heroes of the Storm, etc., all of which are Windows-only, but WINE is now so good that they run as if they were native.

Linux gaming doesn't suck anymore, and it's coming to eat Windows' lunch. Check out ProtonDB to see compatibility with your favourite games, and the Linux gaming subreddit for more news.


You know, WINE has existed for 25 years, though. In all that time, Windows has kept roughly a 95% market share ( https://store.steampowered.com/hwsurvey )

You could also argue that, because of WINE, developers treat Windows as the first-class target and then just check that the game runs on Linux under Wine.

Depends on the perspective.

PS: Yes, Steam detects Wine.


Sure, but my point is that it's only since Proton was released a few months ago that compatibility became good enough to actually matter to game developers and players.

By making Linux compatible with Windows games, it removes the "I'd move to Linux if it weren't for my games" objection, which was the last holdout for a LOT of people.

The fact that Steam tracks WINE is a very good thing, because developers can detect players who bought their games to play them on Linux.

This helps encourage Linux native gaming growth, because developers can see the chunk of Linux players rising as more get rid of Windows because they no longer have that remaining obstacle.


I'm not sure that this does encourage native Linux game development. Why bother putting in the porting work for 0.1% of potential users when someone else might do the work for me via Wine/Proton?


I used to think the same way about native-first and forcing devs to adopt Linux. The whole "No Tux, no bucks" thing (which has had pretty much zero impact, since it's not enough bucks). But over the years, and especially with Proton being so good with Steam, I've completely switched. If Linux gaming is to be a thing, then there needs to be an adoption-first perspective.

So, yeah... maybe that means most devs will just say "my Linux support is just it runs on Proton probably, good luck!" but the thing is... there are games on Linux now. Lots of them. Lots of good ones. I was playing The Witness last night by just pushing play on Steam. No winecfg or winetools or separate DriveC Steam installation. No messing with drivers. I pressed play, the game loaded, I played it. I've repeated this loop in the last few months with most of the games in my library. Endless Legend is back in my rotation again. All of the dumb anime Japanese games where they don't even know Linux is a thing that exists suddenly work. It's glorious.

Wine/Proton may be the lazy way to develop for Linux and might not give people the coveted title of Linux exclusive gamer, but it's working really well if all you care about is playing games and not installing Windows.


Compatibility is a stepping stone to increasing the 0.8% (actual numbers) Linux population on Steam to higher numbers. If 2% or even 10% of users were Linux-based then developers would have second thoughts about choosing DirectX over Vulkan for example when it makes it more difficult for them to reach those customers.

There's also the "no tux, no bucks" philosophy: many Linux users avoid paying for non-native games.


People keep forgetting that game consoles have their own 3D APIs.

Contrary to FOSS folks, professional game studios aren't religious about APIs, as long as there is money to be made.


> Have you not seen Steam Play these last few months? Because of Proton the majority of the top 250 highest rated games now run fine on Linux.

Whilst this is cool I think it may also, sadly, be commercially irrelevant. Why bother worrying about Linux compatibility when only a tiny (i.e., somewhere between 0% and 1%) number of players/purchasers will run your game on Linux?


Perhaps because you'll earn way, way more money by making your game cross-platform, given the three consoles and three PC platforms that exist, and once you've committed to that, Linux support might take only 1% of your effort.


Correct. If your game can't cope with Linux "fragmentation" (most of which is already abstracted away by Steam, so the remaining "fragmentation" is with hardware, which is the same problem you have with Windows), then you're in for a world of hurt if you try to port to a console with its far-from-ordinary hardware and programming APIs and such.


Porting to consoles is much easier than Linux, as each console is a very fixed platform. In my (limited) experience, middleware like Unity works better on consoles than it does on Linux.


Why should I get excited about decent compatibility when I can stick with perfect compatibility? Linux gaming doesn't suck compared to Linux gaming 5 years ago. It's still crap compared to Windows.


You’re coming at this from the perspective of someone who wants to set up a PC primarily in order to play games, and probably wants to try out every game under the sun. Of course, you’ll install Windows.

Try instead coming at the question from the perspective of someone who already has a Linux workstation (e.g. for work) and wants to do as little as possible in order to play a few games—maybe the ones their friends are trying to get them to play as a member of a team. Windows isn’t worth it here: you wouldn’t use it for anything else (so every time you boot into it, you probably have to spend two hours installing updates), and booting into Windows would also prevent you from multitasking to the Linux apps you rely on. Compatibility shims, if decent, are far more interesting to this audience.


People who own a Linux workstation at home and just want to play a few games are vastly outnumbered by people who own a Windows desktop at home and just want to play a few games. Probably at least 100 to 1. And the former group, with its "almost works" compatibility, will be a much bigger maintenance burden per customer.

Heck, I’d bet money that the Linux casual gaming crowd you described is also heavily outnumbered by people who have a dedicated Windows gaming PC (e.g. me, Mac user otherwise).


You're neglecting the crowd that have Windows at home and want to stop using it also. If gamers can have the exact same UX on Linux then that's one of the biggest obstacles to switching solved.


Like me. I will hug Windows 7 goodbye on my way to Neon OSville, with redoubled hope that my SimCity 3000 and RCT2 may now work without having to be in a VM.


"People who own a Linux workstation at home and just want to play a few games" make up a disproportionately large amount of the developer-base for pretty much any software, though, including games. It doesn't matter if none of your users care about a particular feature, if a fair number of your own devs do.


There are lots of developers pretty happy with macOS and Windows at home.


Yeah, but for a piece of software to acquire Linux support, you don’t need the majority of its developers to own a Linux workstation and want to use the software with it; you just need a non-negligible amount (i.e. enough developers with the spare man-hours to get the work done.)

Sometimes, in fact, it only takes one or two developers. I can’t think of a good Linux example here, but I know of a good few projects (Dolphin, for example) where the macOS target is supported entirely by the one or two developers on the team who use macOS.


Quite true, my comment was more against the typical HN remark that "developers" only use GNU/Linux, as if the software for the two biggest desktop environments would appear out of thin air.


In my case, it's because it removes the one thing left that tempts me back toward Windows. I'm significantly more productive on a proper Unix-like OS (like Linux), and it's wonderful to not have to dual-boot or maintain a second PC if I want to take a break and fire up some game or other. Between the native ports and the growing library of reasonably-Proton-compatible games, I feel that pull less and less.


> Have you not seen Steam Play these last few months? Because of Proton the majority of the top 250 highest rated games now run fine on Linux.

Put me in the skeptical camp, I've been hearing the same line with wine support for nearly 2 decades and never found it to be remotely true.

I've had enough trouble getting the actual linux supported games to work. Some work with gnome+wayland, others only work in gnome+xorg, some silently fail, some just freeze, open source AMD drivers still crash the whole machine, etc.


Proton was only announced in the past few months. You can view reports on ProtonDB from actual users, including what drivers/hardware they use.


Linux gaming might not suck anymore, but every combination of Linux-the-desktop-operating-system that I've tried certainly does. Both Windows 10 and macOS are so much nicer, more stable, and more consistent that it is not even funny. Just the other day, installing a pip package froze Firefox on Ubuntu for >30 seconds.

That is especially true if you are forced to use professionally maintained Linux without root access.


> Check out ProtonDB

You almost convinced me so I checked out ProtonDB, and found out you're overstating things.

Only 50% of games are rated "gold+", and most of those are native Linux versions. Most Windows-only games have issues.

https://www.protondb.com/


The moral concept of guilt is uninteresting here.

Developing for any platform requires a degree of competence and research. The fact that lots of engines are themselves cross-platform doesn't mean that Linux support is a box you can tick with no further effort; it means that it is a lot less effort.

I think the takeaway is that their life sucked because they were ill-prepared, and by all accounts not just ill-prepared to develop for Linux. We would do well to maintain comprehensive and evolving info on the current best practices for developing on this platform and, importantly, promote them so that as many people as possible are educated.

Incidentally, I honestly don't think gaming on Linux sucks. There are an absolute ton of games available. I tend to buy Humble Bundles and Steam sales, and I presently have about 10 games I haven't even played yet. Maybe I would feel differently if games were the primary use of my computer. I don't need every possible game to be ported to have a good gaming experience on Linux. I just need there to be more games than I can possibly play.


> Technically speaking I'm sure you're right. In practice it doesn't really matter though, if they did it that way it's either because it was easier or because that's the way they were used to doing it.

Yes but then the question is, why did they even bother? If GP is right that they picked a middleware that is known to only work properly on Windows, that pretty much means they decided to fuck themselves right from the start. I agree that generally, properly targeting Linux is harder for multiple reasons, but if you want to go multiplatform, make sure you build on a solid foundation.


But it's very relevant. Steam provides a runtime so developers have a stable foundation across several distros.

If you fail to use their runtime properly, you're doomed to suffer from Linux fragmentation problems.
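To make that concrete, here's a rough sketch of what "using the runtime" amounts to. The paths and game name are illustrative, not taken from this thread: the runtime ships a curated set of library versions and an entry script that puts them ahead of whatever the distro installed, so the game links against known versions on every distro.

```shell
# Illustrative sketch: the Steam runtime pins library versions by putting its
# curated directories ahead of the distro's on the dynamic loader search path.
# The runtime path below is the conventional location; adjust for your install.
STEAM_RUNTIME="$HOME/.steam/steam/ubuntu12_32/steam-runtime"
export LD_LIBRARY_PATH="$STEAM_RUNTIME/lib/x86_64-linux-gnu:$STEAM_RUNTIME/usr/lib/x86_64-linux-gnu${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

# The runtime also ships an entry script that sets this up for you, e.g.:
#   "$STEAM_RUNTIME/run.sh" ./MyGame.x86_64   # MyGame.x86_64 is hypothetical
echo "$LD_LIBRARY_PATH"
```

A game that bypasses this and links against whatever `/usr/lib` happens to contain is the one that inherits every distro's quirks.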


> In practice it doesn't really matter though

It does matter. Unless you just want Linux to perfectly emulate Windows APIs, you will always have to do work to port to a different platform. Choosing the wrong tools for the job and then blaming the platform is just bad engineering.

I don't bitch about how hard it is to assemble my desk because it requires a socket wrench when my last piece of furniture only required a screwdriver and that's all I own.


Counterpoint: There's a reason IKEA packs Allen wrenches into everything it sells.


Shipping games as live DVDs is an elegant...ish... solution:)


FWIW, I've been doing all my PC gaming on Linux for the last few months. (Before that there was a bit of back and forth; sometimes CSGO on my machine was best on Linux, other times on Windows. But I haven't booted Windows for gaming since late summer or so. And for me, the convenience of KDE makes me stay for work as well.)


Any multimedia on Linux sucks these days. I honestly think it's regressed from the days of running mplayer in a terminal. I can't watch a video in a browser without horrible screen tearing or general performance issues. It seems the poor quality of graphic drivers and X interaction are the underlying problem.


> Why should they change the way they develop games for 0.1% of their sales?

Your last paragraph provides the most important reason.

By changing the way they develop, they will increase their customer base.


It is relevant in this context, because other developers know they can avoid said issues if they do things a little bit differently.


> Why should they change the way they develop games for 0.1% of their sales?

Have you seen how tight the gaming market has become? You need a niche to get any attention. Coming out with a truly cross-platform game that has Linux as a first-class citizen would buy tremendous free publicity. It would be on the front page here alone, a site with an enormous audience, at minimum once a quarter. It would appear in countless tech-related subreddits; the Twitter hacker-verse would take off. The "maker culture" would adopt it as its son or daughter.

Sounds like a tremendous opportunity.

We are not representative of the culture but we are hackers and developers and eventually one spark is all you need to start a fire. I have the perception that gamers are dying to get off of Windows, but I could be wrong, I certainly was when I was a gamer.


I've been in games and game middleware (including in many cases with Linux support) for fifteen years and I have never once seen this happen. Generally there is a tremendous amount of commentary like this from Linux folks and the port neither gets meaningful (or profitable) publicity nor sales that outstrip support. This comment is the stereotypical Linux game post.

When supporting a platform can mean debugging and submitting patches for a user's graphics drivers in exchange for a shockingly low conversion rate, it's an easy no for me.

PS I built and maintain an Enterprise AWS GPU app on Ubuntu and the platform is great. But it was very non trivial to get working.

PPS if you spend less than $150/yr combined on all software purchases (including mobile/console) please don't make a case for Linux gaming. TuxRacer is your apotheosis.


> This comment is the stereotypical Linux game post.

I have absolutely no affinity for Linux. My days of twiddling with settings, packages, and getting drivers to work are long behind me. I have been exclusively Mac for almost a decade now, so I am in no way, shape, or form a "Linux gaming homer". I couldn't care less, but I am an opportunist, and I do see an opportunity now (and not 15 years ago).

> I've been in games and game middleware (including in many cases with Linux support) for fifteen years and I have never once seen this happen.

Yes, 15 years ago, Windows 10, with all its privacy intrusions basically making it a spyware OS, didn't exist. A Steve Jobs-less Apple, doubling down on its disdain for OpenGL and gaming in general, didn't exist (see relevant John Carmack posts). And forgive me, but I would be willing to bet we didn't hear about your game because I was specific when saying "first-class citizen", meaning the game worked just as well on Linux as it did on Windows. If you achieved that and still got no publicity, I would be shocked. All you would need is one popular Twitch streamer streaming Fortnite (which also didn't exist 15 years ago, and wasn't pervasive until recently) on Linux, or some similar big title, and you would have a spark.


I have seen this claim many times, but wonder about its validity.

It's easy to think that a truly cross-platform game with Linux as a first-class citizen would make a lot of stir. But I used to play Heroes of Newerth years ago, when Linux support was way worse than today. I also read HN then.

I never saw any special treatment for it, even though I loved the game and it worked really, really well on Linux.

Searching on algolia just proves my point: https://hn.algolia.com/?query=heroes%20of%20newerth&sort=byP...

I simply do not buy the story that just because you release it on Linux, have good linux support etc it will spread like wildfire.

S2Games, which made HoN, made a superior MOBA imo, but none of my friends play it anymore. They play Dota 2, though. I think the reality of the situation is that no one cares about whether a game runs on Linux, except extreme nerds who would never use Windows and go through the daily struggle that is desktop Linux.

I used to be one of them, nowadays I only use Windows.


Unreal Tournament 2004 was cross-platform Windows/Mac/Linux in 2004. But like this post indicates, the tech support headache isn’t worth it, so these days you can only get the Windows version on Steam despite the cross-platform binaries existing somewhere in Epic’s (back then still Epic MegaGames) vault.


Can you still run UT2004 on the current version of, say, Ubuntu?


>daily struggle that is desktop Linux

lol what year is this? 1999?

but I agree with the rest of your points


Every time someone mentions not using Linux Desktop because they had a lot of issues, someone like you comes out of the woodwork and pretends that Linux doesn't have issues anymore.

Maybe that's yet another reason people don't switch to Linux: the evangelists are annoying and untrustworthy.


> Every time someone mentions not using Linux Desktop because they had a lot of issues, someone like you comes out of the woodwork and pretends that Linux doesn't have issues anymore.

Did someone call me? Jokes aside, as a person using Linux (95% of the time, for ~15 years or so), I can honestly tell that Linux has its fair share of problems. However, for some time, the problems I experience are not more frequent than Macs that I have or the Windows PCs of my family members.

Is Linux perfect? No way. Did it improve over the years? Yes, tremendously. Also, I can say that advanced desktops like KDE can do very nice things for automation and productivity. I'm currently happy about the state of Linux, but that doesn't mean it's perfect or the very best.


Don't get me wrong, I agree that progress has been made. Sound, unless you need low latency, is pretty much a solved problem now, for one.

But there are a lot of reasons that Linux's particular brand of issues are actually still a deal breaker for people, and refusing to acknowledge that will never attract anyone to the platform.


For low latency, I played with JACK a little while I was playing bass. It wasn't very bad, but I don't have recent information on the issue.

I for one do photo post-processing and development on Linux mainly, and have no problems while doing what I want to do.

> ...Linux's particular brand of issues...

Can you please elaborate? I'm interested. Since I've been using Linux heavily and for a very long time, I might be blind to those problems.


Just google around a bit, even just on HN, and you'll find dozens of examples. A lot of it comes down to poor drivers, especially for laptops, but much of it is systemic.

I'm reluctant to go into detail about my own personal blockers because every time I do I end up in a multi-page argument with some evangelists who insist that everything I want to do is completely wrong and I should just change my entire workflow to match theirs.


You're right. Laptop support tends to be problematic, and it boils down to selecting the right hardware "platform" at the beginning. The worst part is, the right platform is not always budget-friendly.

I personally found that professional-class hardware (Dell XPS, HP EliteBook, Lenovo ThinkPad) has the best Linux support out there. I have an EliteBook 850 G2 at the office, and except for the fingerprint reader (which I don't use), everything works without any problems. Battery life is also great (~7 hours). However, "works for me" is not a valid argument, especially with hardware.

If you want to discuss further, you can reach me via my profile page.


No, it is 2019 and my Linux netbook still lacks hardware-accelerated video decoding and OpenGL 4.0 support, although the card is a DirectX 11 compatible one.


Well, yeah, it is of course much better today, I believe. But I wouldn't be surprised if you still have to spend hours trying to configure stuff if you have the wrong hardware.

Don't get me wrong, it's still mostly on laptops that I've experienced issues. On a desktop computer you just get a performance drop, at least with most graphics cards.

I think it's great that it has improved so much and I hope it will continue to improve so that sometime in the future I can return to the promised land.


Tried installing Linux on an MBP last month. Ran away screaming after 30+ hours of dealing with driver issues. I do this every couple of years, hoping that finally THIS time I can get off Windows. Next attempt will be circa 2021, probably.


Try this as an exercise instead: pick a random Dell. Attempt to install OSX on it. Post about how huge a hassle it was, how the end result was a non-functioning brick, and how OSX still isn't ready.

If you google "<computer model> linux" and the result is 17 pages of posts about how it didn't work, you may want to try a different model.

Generally, how well your machine is supported is a function of how hostile your OEM is toward openness, how different your machine is from existing hardware, how common it is, and how much time people have had to add support.

Current macs aren't well supported. Supporting all hardware under the sun is a Sisyphean task and ultimately an unimportant one. For Linux to be useful it doesn't have to support all possible machines just a good range of hardware.


I've installed Ubuntu 18.10 on an XPS 13, which everyone tells me is well supported by Linux; Dell even sells it with Ubuntu. It won't come out of sleep. Googling suggests others have this problem.


XPS is a range of models and 13 is a screen size; together they don't uniquely identify the model. Does it have the problem under the LTS version that Dell presumably ships?


I don't know; I just tried installing the latest Ubuntu. I could go and track down the version Dell ships for my laptop, and make sure I never upgrade it, but surely that proves the point that Linux is a pain to run?


How did we get from "run the latest long-term support release, which ships every 2 years like clockwork" to "never update"?


You said I should only run the LTS version that Dell ships...


> Linux on MBP

Well there's your problem. The companies that make the custom hardware that Apple uses in their laptops refuse to release driver support for Linux for basically the same reasons as the writer of this Tweet. Whether the fault for this is on Apple or the manufacturers is up for debate, but driver compatibility with Apple's laptops and anything but macOS has always been a crapshoot and only became decent for Windows in the last few years.

Note: This is coming from an Apple fan who has been wanting to try out dual booting a Linux distro or one of the BSDs but has watched support tickets get answered with "testing MBP drivers on Linux isn't worth our time" from multiple OEMs.


Installing linux on a Mac is your problem. Macs are notoriously a huge pain when it comes to linux compatibility. Tbh even windows isn't that great on a Mac...I'd just stick to OSX on a mac.

When it came time to replace my old macbook air, I got a dell xps 13 and linux works great on that. All of the hardware works out of the box without having to do anything with drivers.


Why not use OS X on an MBP, or Linux on a generic x86 machine? I'm not clear on why this is the only route to get you off of Windows.


Macs have good hardware. Or you might also want to dual boot.

Sad to say, but some Macbook models work great, and others are fucking terrible. I have two models - my older model where the only thing that has ever consistently worked is bluetooth, and a slightly newer one where nothing has ever broken.


Running Linux on new hardware is usually a bad idea; due to the nature of the process, you have to expect at least a year before drivers for new hardware have settled into distributions.

Then things should be pretty sweet for quite a while. Unless your hardware is really popular, things will bitrot away eventually, but expect a 5-10 year sweet spot where everything should just work out of the box.


THIS is the problem. Windows drivers start working on day 1 and continue working. The only breaks were when Windows went to 32-bit drivers in NT and when real-time hardware interrupts were disabled in Vista.

Linux needs a driver compatibility story this strong to even start.


On the flipside, while Windows has a greater quantity of drivers available for devices on day 1 of release, Linux tends to have a greater quantity of drivers available for devices at time of install. With Linux, there's no separate step of having to wait for Windows Update to pull the driver, since all the drivers are included alongside the kernel (the exceptions being printer drivers - which aren't developed alongside the kernel - and firmware for wireless NICs if you're going with a strictly-FOSS-only distro).

Meanwhile, I "fondly" remember having to have a USB stick on hand for Windows 7 installs because the default install didn't include wired (let alone wireless) NIC drivers for 90% of the laptops and desktops on which I installed it. Thankfully Windows 10 is better about this (at least on the wired front; wireless drivers are still hit or miss), but still.
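As a rough illustration of that in-kernel driver model (command choices are mine, not from the thread), you can see on any Linux box that the drivers were shipped with the kernel itself rather than downloaded afterwards:

```shell
# Modules bundled with the currently running kernel; these were installed
# together with the kernel package, not fetched per-device after the fact.
find "/lib/modules/$(uname -r)" -name '*.ko*' 2>/dev/null | wc -l

# Which driver the kernel bound to each PCI device (requires pciutils);
# errors are suppressed so this is safe to paste on minimal systems.
lspci -k 2>/dev/null | grep -A1 'Kernel driver in use' | head
```

This is the flip side of the day-1 argument above: the driver is there at install time if it exists at all, but a brand-new device has to wait for a kernel release that includes it.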


> Meanwhile, I "fondly" remember having to have a USB stick on hand for Windows 7 installs because the default install didn't include wired (let alone wireless) NIC drivers for 90% of the laptops and desktops on which I installed it.

I worked in an IT support shop at the time Windows 7 was released, and I imaged and installed hundreds of copies of Windows 7 over the time I worked there. While you're right about wireless drivers being a crapshoot, I cannot remember a single instance of missing wired NIC drivers on install. I'm not doubting that some were missing (there's a lot of hardware and a lot of manufacturers out there), but it was definitely not as huge a problem. The biggest issues were usually SD card readers and trackpads, which required downloading drivers from the manufacturer.

I've done a few Linux desktop installs (same job) and the situation was definitely more painful. Issues with sleep/wake, webcams, network drivers (usually wireless), and multiple displays were basically guaranteed, and the help process was usually "you're using the wrong hardware", which isn't really helpful.


"I can not remember a single instance of missing wired NIC drivers on install"

Were you pre-installing NIC drivers with your images? That'd be a good reason for the high success rate.

It might also have to do with specific manufacturers/vendors. Most of my installations were on Dells; it's possible HP or Lenovo stuck with chipsets that Windows properly supported out-of-the-box. Linux worked fine in all cases.


dota 2 works on linux though. ;)


> Coming out with a truly cross platform game that has Linux as a first class citizen would buy tremendous free publicity

It does not. Take a look at the Steam games supporting Linux. There's thousands of them, it's not special.


> would buy tremendous free publicity.

That would not put food on the table. Most money comes from Windows, and that's what games should be optimized for.

See r/choosingbeggars for more entertaining stories on the "value" of "publicity".


Choosing beggars literally think that publicity is payment enough. Linux gamers PAY.

Really a big difference.


> Coming out with a truly cross platform game that has Linux as a first class citizen would buy tremendous free publicity.

This particular game (Planetary Annihilation) was designed with Linux support in mind from the very beginning; as the person who wrote the tweets notes, he and numerous others within the company were huge proponents of supporting Linux. As events proved, it didn't work out for them.


They later admitted they overstated this issue and that their problems were self caused.


You're either underestimating revenue from Windows purchases or overvaluing that publicity bonus. There are already a lot of indie games with Linux support, and triple-A titles already sell millions of copies. If that added publicity made another 1000 people buy the game, it still wouldn't be a financially sound decision.

Heck, I'm mostly on Linux but boot into Windows for gaming... I'd probably continue to do so even if the games were released for Linux, just for that extra performance and the significantly lower chance of running into an obscure issue with my setup.


Win10 really messed things up with the SaaS model, telemetry, and integrated update rollups.

The thing is, Microsoft really painted themselves into a corner. You see, Windows was made for all sorts of skullduggery to occur; this was supposed to be a boon for MS, but they punched so many holes in Win32 that it was about as secure as a sieve. I know this first-hand, as I have spent a major amount of time decompiling and snooping through the binaries. The file structures in a Windows OS have big voids of zeros for padding, which leaves lots of room to insert malcode and fix up the file headers. Thread Local Storage is neat: a thread can move data packets through its kernel objects to other objects or to static files in swap. And the alternate data streams that Win32/Win64 executable files may have are just mind-blowing: a file that everyone expects to do one thing can be trojaned by the system, as a feature, using alternate data streams. It goes on and on. This, and Win10, is why I left Windows, gave Linux a serious try, and never looked back. Linux runs games and keeps getting better at it, with no cat-and-mouse game of you modding system files so things work, MS updates un-modding them, and then preventing you from making things workable without major kludging around.


Literally nobody cares about Linux gaming. It's not even press-worthy because of SteamOS. Anyone who games on Linux does it for the adventure of the process, not because it's convenient.

If you ported your game to the Atari ST or the Amiga you'd get more press and probably more sales.


There's nothing adventurous about using your preferred operating system and enjoying being able to have a working game experience. I've spent significant time in all 3 major OS options over the last few decades, and came to the conclusion that Linux is the one I prefer. A few years ago, I resigned myself to either rebooting to Windows or running Windows in a VM when I wanted to do any modern gaming. The significant advancements over the last year in terms of video driver functionality (both on Nvidia and AMD sides), binary compatibility (WINE/Proton), and more developers releasing native games have been a huge boon and very welcomed. I use Linux full time because it's what I enjoy and feel comfortable with as a primary OS, and I absolutely care about the gaming landscape improving. I'm only one person, but many others won't speak up.


It's like Mac gaming. It sucks. Unless it's a for-Mac native title, which is exceedingly rare these days, it's going to be trash because the publisher invariably uses some crappy DirectX to OpenGL wrapper that cripples performance and crashes constantly.

Developer kits like Unreal and Unity have helped a lot here, but those are far from flawless. Even those struggle with Linux because there's just way too many distributions and way too few standards.


I used to reboot into Windows to run games. Thanks to Proton I don't bother anymore. In fact I dread booting into Windows now because I know it's going to spend 30 minutes patching itself and then reboot on me.


Rhetoric like this is less than effective. Some of us do game on Linux because we don't want a Windows box. So "literally" is demonstrably false.

Now, statistically, I realize we don't exist. But at absolute levels, we do.


You've got to recognize you're an outlier.

In practical terms nobody plays games on Linux unless they're using something like SteamOS or, technically, Android.

Given how ridiculously hard it is to get a simple application to run across all the various distributions of Linux that exist, expecting something as complicated as a game to run at all is asking way too much.

Windows is ridiculously hard to support, but at least it has sales volume to justify the work necessary to get a game launched. Linux doesn't.


>adventure of the process, not because it's convenient

There's no need for adventure for me. I prefer to code on Linux and not need to reboot into Windows. Playing games on Linux IS the convenient way for me.


Most people just get a game console or play games on their phone. A very tiny group of people do what you do on Linux.


> Most games don't make these mistakes...it seems they weren't aware of some other best practices.

I see this time and time again. Game developer runs into one of the difficult subjects in computer science/programming. Dismisses the difficulty. Gets themselves into trouble. Blames the library/product/platform.

The last time I was at a game jam, I found myself explaining generational garbage collection to a game dev. He then proceeded to "solve" all of his problems by writing a 5 minute hack to ensure all of his objects would be collected as quickly as possible. Which didn't work, because the lifespan of his objects is dependent on gameplay, so he can't ensure all of his objects will die in Eden. I kept trying to explain that he should actually hold onto his objects (especially bullets in his bullet hell game) and recycle them as much as possible, so that he would reduce memory pressure and have those objects promoted to old space, so they are looked at less often.
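The recycling approach described above can be sketched as a simple object pool. This is a minimal illustration with a hypothetical Bullet class in Python; the same pattern applies to any runtime with a generational collector, where long-lived pooled objects get promoted to old space and scanned less often instead of churning through Eden every frame:

```python
class Bullet:
    """Hypothetical bullet with resettable state."""
    def __init__(self):
        self.x = self.y = self.dx = self.dy = 0.0
        self.alive = False


class BulletPool:
    """Pre-allocate bullets once; recycle them instead of allocating
    and discarding per frame, which reduces memory pressure."""
    def __init__(self, size):
        self._free = [Bullet() for _ in range(size)]
        self._active = []

    def spawn(self, x, y, dx, dy):
        # Reuse a dead bullet rather than allocating a new one.
        bullet = self._free.pop() if self._free else Bullet()
        bullet.x, bullet.y, bullet.dx, bullet.dy = x, y, dx, dy
        bullet.alive = True
        self._active.append(bullet)
        return bullet

    def despawn(self, bullet):
        bullet.alive = False
        self._active.remove(bullet)
        self._free.append(bullet)  # back into the pool, not the GC
```

The key point is that `despawn` returns the object to the pool instead of dropping the last reference, so the collector never sees a storm of short-lived garbage.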

I see this sort of thing time and time again.


> I see this time and time again. Game developer runs into one of the difficult subjects in computer science/programming. Dismisses the difficulty. Gets themselves into trouble. Blames the library/product/platform.

Members of this particular team worked on the highly acclaimed Total Annihilation (1997) and Supreme Commander (2007) so inexperience or lack of technical expertise seems unlikely to be the cause here.


Parts of PA are brilliant, absolutely brilliant. The team pulled off some incredible feats:

- In a 16-player game, each player can operate thousands of units across multiple planets orbiting a star, and afterwards, you can replay the match exactly within your own copy of the game.

- The planets themselves were mobile battlefields which could be destroyed, moved, and weaponized.

- The path-finding for those thousands of units crashing into an opposing army of thousands of units is smooth and there isn't total chaos on the field.

But the team also made some really questionable bets that ultimately doomed the title.

- For some reason, they used an out-of-proc UI compositor to draw the 2D elements, which (IIRC) would fork one process per layer and drop interaction events, breaking the very tool players use to interact with that wonderful simulation.

- Then, for some worse reason, they launched it like this, and reaped the terrible early reviews that doomed the title. Even after patches resolved the issues, the long-term damage was done.


So they have no excuse for missing out on those other best practices.


What on earth has a generational gc and can't handle a little bullet spam?


Nearly any per-frame instancing tends to hit generational boundary cases due to ambiguity over lifetimes that the VM can't account for. They usually don't result in GC pauses that are large enough or frequent enough to make the game unplayable, but they do prevent the desired "solid 60hz". It's actually a huge thorn in the side of fast gameplay code. Recommendations always turn towards engineering a static allocation scheme (value types if you can do them, object pools if not). However, it takes enough effort to build the latter that quite a few games running on GCs ship without doing it in all cases, and some runtimes make it borderline impossible (string processing).


OP was at a game jam, though. They were building a prototype. If they were going for a smooth 60hz, they were doing it wrong.

If there were pauses causing significant gameplay issues, it's doubtful GC was the specific cause. There was probably something very wrong being done - my guess would be instantiation of a managed resource like a texture or 3d model.

I doubt generational GC came into it, and I feel like the OP is prematurely celebrating his own insight into what was causing the issue.

(Then again, maybe it was a VR game and pauseless 120hz was the goal.)


Saw "Coherent UI" in your comment and thought, "I bet it's Planetary Annihilation" and was not disappointed.

The dev team struggled to get Coherent UI working on Windows, to say nothing of Linux. They switched to Coherent UI late in the development cycle, and the beta/launch was constantly glitched out. The problems with trying to run an out-of-proc UI renderer within a game loop were numerous -- the game would frequently lock up in full screen, and if you managed to escape, you'd find multiple instances of the Coherent UI render process still running.

... in Windows, not in Linux.

For fans of TA and SupCom, early PA was terrible. Later, the team worked through a lot of the issues, but the damages to their finances (and reviews) were done and PA never got the e-sports league it deserved.


Truth be told, I don't think PA ever had a chance to truly be big. So few people wanted the SupCom-style gameplay. Forged Alliance Forever only has a couple thousand players, and the best or second-best commentator/streamer of games (Gyle) gets only a few thousand views per video. I think the gameplay style just doesn't appeal to as many people.


He points out that a very large percentage of the crashes were graphics driver related, not dependency issues that would be solved by the Steam runtime.


I mention the Steam runtime because this affects the Linux version to this day: I am unable to launch the game (on a fully supported system and OS, mind you) without deleting some files from their install.

> He points out that a very large percentage of the crashes were graphics driver related

This was still largely related to the Coherent UI middleware and the strange way it was being used, as other posts have noted.


My gaming box is still Xenial while all my work machines are at least Bionic. While btrfs and my personal system-setup practices make it easy to test and also port setups from machine to machine, I'm not relishing the moment when I'll have to bite the bullet and upgrade.


From the article:

> "We eventually laid out a guide with known good versions of Linux and graphics drivers, but it didn't matter. Part of the allure of Linux is the customizability, so few actually stuck to it, and generally wanted to run the game on older hardware we didn't support."

It seems that it's because Linux users are unwilling to upgrade their hardware. From a philosophical standpoint, I agree with them, we shouldn't need to buy a new computer every 3-5 years, it's very wasteful and I am very against planned obsolescence and perceived obsolescence. That said, I can see the benefits of writing code to work on the machine you have in front of you, and not doubling your QA work by testing it on older machines that no longer even receive OS upgrades.


I bought PA early and tried playing it on a low-end 2007 laptop, and it was almost playable. The same machine could only barely handle SupCom on the lowest settings anyway. Meaning, if your hardware could handle SupCom well, then it could probably handle the basics of PA (at least that's my bet).

They should not be expected to support systems that could only kind of handle the previous game anyway.


But if the set of best practices that works for your target hardware platform is narrow-band, that set itself can severely constrain the set of game houses that can develop for that platform.


I think you're in denial.

If developers need to get this, that and the other thing right specifically for Linux, but it's still only 0.1% of sales, then it's just not worth doing - even if there aren't any bugs.


100% this.

Source: am a business person, a developer, a Linux user, and a Linux gamer.

I love the idea of being able to game and work/dev on the same machine. But it's not the current reality, nor the reality of the near future. I've recently settled on macOS because even though it doesn't offer the optimal solution for anything, it offers a really nice solution for everything.

I tried Linux gaming for years, and Steam really gave it a strong push. But the truth is, gaming on Linux is complicated and doesn't offer the same experience as on Windows or even macOS. I'd rather my favorite studios and developers dedicate their Linux resources to paying customers instead.


I keep a dualboot system because of the current situation.

Linux for "real work"

Windows for gaming (no-frills setup) and music production (a much easier and less clunky low-latency setup, way more free synths available, etc.)

I absolutely adore Linux for everything it has done to my career but I cannot drop my Windows system yet.


I likewise run and prefer a Linux desktop and boot into Windows for the occasional gaming foray. Also, to run stuff like Fusion 360.

I have to say that even on Windows, driver stability leaves much to be desired (for Nvidia in my case).


The thing is, if you only boot into Windows occasionally, you'll have to wait for updates about every time you use it. That's pretty annoying and has actually stopped me from doing it at some point.


I don't think I realized it until you mentioned it, but I think I'm in the same boat with my dual-boot box. Doubly so because it's a laptop and half of the time I only have phone-tethered internet.


> I have to say that even on Windows, driver stability leaves much to be desired (for Nvidia in my case).

I agree. That is why I tend to leave my Windows box really bare. It has Steam with some 4-5 games on it and my Music Production Software. Every other endeavor is tackled in Linux.


Take a look at cloud9 for your real work :)


> I love the idea of being able to game and work/dev on the same machine.

For what it's worth, I used to have that, and I am happy I don't anymore. Having two separate computers for gaming and work has been very positive for both my work and gaming experiences. I recommend this to anyone who can afford it, even if you work on Windows.


Settled on macOS for gaming? I built a Windows PC (for gaming only) because I was tired of every title I wanted to play being exclusive to Windows. Plus, graphics card support is much better on Windows than on everything else.


I travel a lot, so unless I want to carry two laptops, I had to compromise.


I agree with you that it is definitely not commercially worth it. But I guess you missed his point. He is not saying that "it was worth it for this game if well implemented", but that this game is actually a really bad example of how gaming on Linux is. It is like analyzing PC gaming by looking at Devil May Cry 3 or Resident Evil 4...


This is a question of margins, so you're neither right nor wrong.

It normally isn't just 0.1% of sales when developers do get everything right, although whether or not the usual 1-5% of sales is worth it depends on the particular game, budget, and sales numbers. Linux support clearly wasn't worth it for Planetary Annihilation due to its low sales numbers, but a game with (1) high sales and (2) easy porting process would be leaving money on the table not to support Linux.


In general you are actually looking at 2% of your potential user base, not 0.1%.


I work for a small company that produces a DAW and VST plugins. Supporting Linux is a huge amount of work compared to Windows and Mac.

The main issue is that 'Linux' is not a thing you can support. You have to pick the distros you want to support, and then once you've picked a distro, what versions you want to support.

And you need to use the C++ toolchain that ships with that distro, so if you want to support old versions, the entire project is held back from using the latest C++.

And distros aren't backwards compatible, i.e. when libcurl4 is released, they remove libcurl3. So you can't have one binary for Ubuntu 18.04 and 16.04.

So it means that for every product, you have a build for every distro/version combination you want to support.

So now the solution is AppImages, where you bundle up your app with all its dependencies, like a container. I haven't investigated this yet; not sure it will work for plugins.


> And distros aren't backwards compatible, i.e. when libcurl4 is released, they remove libcurl3. So you can't have one binary for Ubuntu 18.04 and 16.04.

If you don't want to depend on the libcurl provided by the OS, ship your own.

If you don't want to depend on the glibc provided by the OS, ship your own, with your own dynamic linker.

readelf -n <your binary> will tell you the oldest kernel that will run your stuff. Put that in the requirements document. Write a bash script that sets LD_LIBRARY_PATH correctly and make sure that's the main entry point to your application during deployment.

You're set.


This is not as easy as it sounds, because funny things happen if you ship your own libc - I remember last time I tried this there were problems with PAM and name services. But that was a number of years ago.

Can you ship with your own version of GNOME different from the currently running one?


> Can you ship with your own version of GNOME different from the currently running one?

I'm confused. GNOME is a desktop environment. It implements a task bar, a control panel, a way to set your desktop wallpaper etc. Why would you ship GNOME?

Maybe you meant shipping a proprietary application that uses a GTK version that is different than the OS-provided GNOME uses?

If so, of course you can. GTK is a bunch of shared objects that generate system calls, X messages, etc. It has no reason to interfere with the OS-provided GTK as long as it's got everything it needs in your bundle.


This doesn't work for things that are dynamically loaded into other applications (as VSTs are). In general, you need to be using the same version of the libraries as the rest of the application.


Your suggestion to ship glibc caused coffee to end up over my keyboard :) This is the road to hell, but you'd only know that if you had the slightest clue of the ABI interactions involved in swapping out a core library like that - starting with the reality that parts of the (probably binary-only) graphics stack must be linked and loaded in-process, and they naturally depend on at a minimum glibc. Your suggestion is to effectively ship the game in the form of its own Linux distro, which is of course complete nonsense.

Snap and Flatpak will hopefully help with the dependency-aging problem, but they're brand new, and they both still suck one way or another.

To give you an idea of what swapping out glibc would involve, here is the list of shared libraries loaded by 'glxgears' on my machine, arguably the simplest possible OpenGL program:

    /lib/x86_64-linux-gnu/ld-2.28.so
    /lib/x86_64-linux-gnu/libbsd.so.0.9.1
    /lib/x86_64-linux-gnu/libc-2.28.so
    /lib/x86_64-linux-gnu/libdl-2.28.so
    /lib/x86_64-linux-gnu/libexpat.so.1.6.8
    /lib/x86_64-linux-gnu/libgcc_s.so.1
    /lib/x86_64-linux-gnu/libm-2.28.so
    /lib/x86_64-linux-gnu/libnsl-2.28.so
    /lib/x86_64-linux-gnu/libnss_compat-2.28.so
    /lib/x86_64-linux-gnu/libnss_files-2.28.so
    /lib/x86_64-linux-gnu/libnss_nis-2.28.so
    /lib/x86_64-linux-gnu/libpthread-2.28.so
    /lib/x86_64-linux-gnu/librt-2.28.so
    /lib/x86_64-linux-gnu/libz.so.1.2.11
    /usr/lib/x86_64-linux-gnu/dri/i965_dri.so
    /usr/lib/x86_64-linux-gnu/libdrm_intel.so.1.0.0
    /usr/lib/x86_64-linux-gnu/libdrm_nouveau.so.2.0.0
    /usr/lib/x86_64-linux-gnu/libdrm_radeon.so.1.0.1
    /usr/lib/x86_64-linux-gnu/libdrm.so.2.4.0
    /usr/lib/x86_64-linux-gnu/libglapi.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libGLdispatch.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libGL.so.1.7.0
    /usr/lib/x86_64-linux-gnu/libGLX_mesa.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libGLX.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libpciaccess.so.0.11.1
    /usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.25
    /usr/lib/x86_64-linux-gnu/libX11.so.6.3.0
    /usr/lib/x86_64-linux-gnu/libX11-xcb.so.1.0.0
    /usr/lib/x86_64-linux-gnu/libXau.so.6.0.0
    /usr/lib/x86_64-linux-gnu/libxcb-dri2.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libxcb-dri3.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libxcb-glx.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libxcb-present.so.0.0.0
    /usr/lib/x86_64-linux-gnu/libxcb.so.1.1.0
    /usr/lib/x86_64-linux-gnu/libxcb-sync.so.1.0.0
    /usr/lib/x86_64-linux-gnu/libXdamage.so.1.1.0  
    /usr/lib/x86_64-linux-gnu/libXdmcp.so.6.0.0 
    /usr/lib/x86_64-linux-gnu/libXext.so.6.4.0
    /usr/lib/x86_64-linux-gnu/libXfixes.so.3.1.0
    /usr/lib/x86_64-linux-gnu/libxshmfence.so.1.0.0
    /usr/lib/x86_64-linux-gnu/libXxf86vm.so.1.0.0


Ignoring your personal attacks, you and I are on the same page.

I do agree that shipping your own glibc is "the road to hell" and that shipping with the glibc could be qualified as "shipping in the form of its own Linux distro", as amusing as that sounds :) But that's the price of freedom you pay if you want to draw your platform boundary at the kernel -- you need to ship the whole userland! Where's the surprise in that? That means you need to understand the interactions between different versions of /usr/lib64/opengl/nvidia/lib/libGLX_nvidia.so.0 (which you need to ship) and nvidia.ko (which is Nvidia's binary driver loaded by the kernel, which you do NOT ship).

That's what people mean when they say they "are targeting Linux", even when they don't really understand the kind of work that that statement entails :)

Linux is an OS kernel. I'd wager it's the most popular one by far, in terms of the number of platforms that use it.

Saying that "Linux is fragmented" is the wrong way to look at the problem. One should rather realize that Linux is just an OS kernel that is used by countless platforms. It's up to the developer to draw platform boundaries and focus engineering effort on the chosen ones.

If you need to support GNU/Linux with glibc-2.25, then you need to set up your testing bench to accommodate that. If you need native look and feel in Ubuntu, Kubuntu, and Lubuntu, you already have 3 platforms with at least 2 LTS versions each that you need to test with.

Each platform has its own quirks. Linux distros are the ones with the least amount of quirks, but in exchange we get many many platforms, because it's so easy to create them.

I guess that's life :)


Why would you want to take on that massive burden for such a tiny percentage of sales - most of which you'd have gotten anyway?


glibc is pretty good at ABI backwards-compatibility. The right thing to do would be to statically link libstdc++, libgcc, and extras like libcurl, but dynamically link libc, libX11, and libGL. Luckily libX11 and libGL are pure-C so you can get away with static-linking libstdc++, which means you can use whatever C++ version you like.


It's definitely technically possible to find a hacky combination that works today, but only by shouldering the cost of understanding all the possible deps and symbols and the structure of their underlying objects used by all possible e.g. graphics drivers and X11 libraries now and into the future. You can hack around it today, but it's a fool's game and absolutely not something that can be relied on to continue working in any sound manner, without risking e.g. some memory corruption due to a struct layout difference long in the future, etc.

I wanted to check whether the official Nvidia driver uses C++, but it's not installed on this machine just now, and of course that I even have to check just highlights the problem with this approach.


> I wanted to check whether the official Nvidia driver uses C++,

I don't know about Nvidia, but I had a problem two years ago on a machine with a Radeon card: I was developing GUI software which used LLVM at some point. Insta-crash at runtime on this computer whenever I'd open a window. The reason? The Radeon driver linked and initialized LLVM, which didn't support being initialized twice...


There's no other option, though. If you aren't integrated into the distros' builds, then you have no way of getting your binary updated when a dependency makes an ABI-incompatible change -- much less shipping a single binary that works across all distros. The only reasonable option is to determine a subset of your dependencies which offer long-term stable ABIs, dynamically link those, statically link the rest, and hope for the best.


But there is another option, it was mentioned in the original complaint :)

> So it means for every product, have a build for every distro / version combination you want to support.

Of course it's a total pain in the ass, but it's still vastly preferable to unfixable bugs sometime in the distant future.


I'm not sure how per-distro (or distro-version) builds fix the problem, unless you're committing to continuously produce new builds for future distro versions to account for ABI changes. If you can do that, you can also fix issues with static linking if/when they come up.


There's a better solution: don't ship on an OS that insists on being a pain in the ass to ship on.


This is added complexity that neither the Windows nor Macintosh runtimes demand of their developers.


Windows actually does require this, which is why applications tend to ship with (or otherwise require) some kind of C(++) runtime, like those provided by MSVC or MinGW. Some rely on these being installed separately, while others include them as part of the app's installation process. Same deal for .NET. I haven't done enough macOS development to confirm or deny, but I'd assume the situation is similar there, too.

The actual difference is that Visual C++ and XCode (presumably) automate most or all of this, since this is the standard way of compiling and distributing applications for those platforms. In contrast, Linux development tends to revolve around software that can be recompiled for each distribution, so the tooling is going to be optimized around a workflow of relying on system-wide libraries and binaries and other data managed by a package manager.


Great and now I have to also track security updates and ship those for all my dependencies. libcurl is a great example with a release cycle of 2(!) months and tons of vulnerabilities being fixed each time.


[flagged]


[flagged]


That is completely ridiculous. Pointing out that statically compiling dependencies or shipping them with a game can solve many so called compatibility problems is directly related to games working on linux.


It is also the solution they currently use on Windows ... And what games are doing on Steam that support Linux.

I won't comment on shipping your own glibc, because that probably could cause issues.


As you can see, both comments were indeed flagged by one of those stackexchange types that is always marking useful questions as off-topic. Probably the same person who thinks using anything but rpm is bad.

The thread has then been detached for the crime of discussing voting, so only we can see it.

The first rule of HN voting is not to discuss voting.


It's a wonderful idea - on paper, but all that's needed is for one of the many mandatory deps for running a GL program to link against the system libcurl or any other replaced dep and you're back in Crashville again, population 1 - cloth-eared developer


Are there mandatory dependencies for running a GL program that can't work without using the system libcurl shared library? How does that make any sense?


If you distribute and link libcurl v69 dynamically, the dynamic linker symbol table for your process will contain references to v69 symbols. If you then load an e.g. GL driver that in turn links libcurl v42, the GL driver will end up with some combination of v69 and maybe v42 symbols, despite being compiled against v42 headers containing v42 struct sizes and offsets.

Now there is an opportunity for silent memory corruption in your program, one that even once detected, which may not happen for years, cannot easily be fixed without completely rearchitecting the build.

When using a wide variety of system APIs, including at least NSS and OpenGL, there is little control over what dynamic libraries end up loaded; the exact configuration will vary across machines.

Substitute libcurl for libexpat or libX11 or libstdc++ and the principle continues to apply. I can't think of a good reason something like libcurl would be used by the Nvidia driver or some bizarro embedded board's GL driver, but that doesn't mean the Nvidia driver doesn't now or in the future link it, and if/when that day arrives, the problem lies entirely with the program that mixed random system deps with random self-compiled deps.


> If you distribute and link libcurl v69 dynamically,

This whole thread is about avoiding this.


> So now the solution is AppImages where you bundle up your app with all dependencies like a container.

But isn't this what you're doing with Windows? No version of Windows comes with libcurl (as far as I know) so you put the DLL in your application directory. It's no different.


On Windows, I use the Windows API for downloading files. I don't need to worry about curl.

On Linux, I could have statically linked curl, but didn't realize it was going to become an issue. Ubuntu 18.04 was released and curl3 was removed, so that means our software that was working was either automatically removed by the package manager or it started crashing.


Can an installer prompt the user to download missing libraries if needed? For example, Dota 2 on Windows downloads a bunch of C++ runtime libraries on first launch. SharpKeys and a bunch of other software do the same (a certain version of the .NET runtime).

libcurl may or may not be a good example, but surely all software has dependencies that aren't shipped with the OS. In Linux's case the special pain is that the library shipped with the OS might be outdated, but bundling compiled lib/.o/.so files should work.


On Windows, I often get prompted to install C++ runtime dependencies when installing. One big thing Steam did was manage multiple installs of DirectX for each game. It's just a matter of Linux being cost/benefit to overcome these things, like they have on Windows.


Which is probably way at the back of the developers' minds: the main audience of Linux is happy to compile from source (either themselves or via a downloaded package).


Exactly. Quite a few of the "problems" with Linux development ultimately stem from trying to shoehorn Windows-oriented expectations into Linux development/deployment. Very few applications in the Linux world target specific distros themselves; instead they publish their source code and leave the actual distro-specific finagling to distro/package maintainers (sometimes those maintainers are also developers, but they're regardless treated as separate concerns).

Of course, quite a few game developers are curiously averse to publishing any kind of source code, which tends to thus require compatibility efforts to happen on the developer side instead of the package maintainer side. This can still be worked around by targeting a specific distro (say, Ubuntu or SteamOS) and letting package maintainers for other distros apply whatever workarounds they deem necessary to get the app running outside of a "supported" environment (see also: Steam, Spotify, Slack, etc.).


I'll grab source code if I'm doing development or something obscure, but I feel like packages have been pretty solid and widely available since the early to mid 2000s. When working with source code I'll almost always install to a custom PREFIX--but if it's just an app the source is the last resort.

All of the commercial packages I've used at work for the past 15 years target either one or a handful of distros. Even the open source stuff that's either too new or outside of the default repos I much prefer to install from a package; turbovnc, grafana, chrome, etc. The few games I've seen have worked the same way as other commercial user-oriented apps.


You don't have to statically link, you can use LD_PRELOAD to point to local versions of all libraries. It isn't optimal, from space/security perspectives, but it works. You can use the local libs first or last in the path.


You don't even have to do that. When you compile your binary, you just need to set the rpath to your own directory of libs, and they will be searched automatically, falling back to the system if missing.


Last time I looked, the rpath was a static absolute path, which is inconvenient if you want to install the libraries elsewhere.

The automatic fallback to system libraries can also lead to mysterious problems.


The linker recognizes a variable that expands to the location of the binary so you can use relative paths.

${ORIGIN}/lib will be the relative library path.


Fascinating.

Some useful stack overflow answers on this topic. First, the trick to get GCC to pass the ORIGIN option to the linker: -Wl,-z,origin.

Secondly, how to pass it as the rpath option:

"-Wl,-rpath,'$ORIGIN' -- note that you need quotes around it to avoid having the shell interpret it as a variable, and if you try to do this in a Makefile, you need $$ to avoid having make interpret the $ as well."

https://stackoverflow.com/questions/38058041/correct-usage-o...

The chrpath utility is also fascinating, as it allows you to rewrite the elf headers to change the rpath. Possibly better for installs than passing lots of env variables.

Various warnings on this topic to not use actual relative paths unless you really want that behaviour.


I work for VCV, which also ships for Linux. I don't find it an issue at all, since the build system is a Makefile supporting Mingw64/MSYS2 on Windows, Mac, and Linux.

We statically link everything except glibc, which we dynamically link against version 2.23 (meaning that Ubuntu 16.04 is the oldest we support). We've had no problems with this approach, although the disadvantage is that we have to use an old version of GCC to compile the software, since I haven't figured out a way to make a new GCC version produce binaries which link to old glibc.

No need for containers or anything, just make a mostly-static-except-for-glibc binary.
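A minimal sketch of that link strategy, assuming a C++ binary and a GNU toolchain (file names are illustrative):

```shell
# Illustrative source file; assumes g++/binutils on Linux.
cat > t.cpp <<'EOF'
#include <iostream>
#include <string>
int main() { std::cout << std::string("ok") << "\n"; }
EOF
# Bake the C++ runtime into the binary; glibc stays dynamic.
g++ t.cpp -o t -static-libstdc++ -static-libgcc
./t
# Only libc & friends should remain as dynamic dependencies:
ldd t
# The newest glibc symbol version referenced sets the oldest distro
# the binary can run on:
objdump -T t | grep -o 'GLIBC_[0-9.]*' | sort -uV | tail -n 3
```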


> the disadvantage is that we have to use an old version of GCC to compile the software, since I haven't figured out a way to make a new GCC version produce binaries which link to old glibc.

I personally compile with the latest GCC or LLVM on CentOS 7; this way I can use the very latest C++ standards with a venerable glibc.


That sounds like more something to blame the Debian family over than the whole ecosystem. I might be running "bleeding edge" on Arch but I can still depend on and use libcurl3 with the libcurl-compat package.

I maintain about two dozen software packages in the AUR, including some really old stuff like the Heretic 2 Linux release from 1999 and RBDoom 3 BFG which has a boatload of dependencies. Breakages are extremely rare for the average package even with the rolling release and generally any breaking change in a common library will see the legacy version hang around since stuff will still depend on it.


I don't think it's Debian, because Debian tends to have compatibility packages for old versions too. For libcurl there's libcurl3. It depends on the package, of course.


Only for so long - try installing qt3 on a recent Debian or Ubuntu.


I don't have a Debian machine to test it on, but a casual search finds qt1-3 in the AUR on Arch. Of course, building the whole GUI suite would be a bit annoying, but it's better than nothing if you have some really old software that depends on it.


I might compare qt3 to Silverlight or ActiveX.


Why not pick a "distribution" like org.freedesktop.Platform/18.08 (i.e. pick a flatpak runtime) and support that? It runs on all desktop distributions. Updating supported flatpak runtimes is then at your leisure, not on their release cadence.
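For illustration, a minimal flatpak-builder manifest pinning that runtime might look like this (the app id and module contents are hypothetical):

```json
{
  "app-id": "com.example.Game",
  "runtime": "org.freedesktop.Platform",
  "runtime-version": "18.08",
  "sdk": "org.freedesktop.Sdk",
  "command": "game",
  "modules": [
    {
      "name": "game",
      "buildsystem": "simple",
      "build-commands": [ "install -D game /app/bin/game" ],
      "sources": [ { "type": "dir", "path": "." } ]
    }
  ]
}
```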


How long are flatpak runtimes supported? What is their update policy?


The runtimes are not locked down; anyone can publish theirs (under their domain name, of course), so it depends on who publishes it.

For the freedesktop one, they have following policy[1]:

  - When a new stable release is done, the changes on that branch will only be:
    - security updates
    - stable releases (tested carefully); no ABI breaks / API build breaks.
    - we will try to keep updating that branch as much as we can

  - We release a new major release every year, only if:
    - There is a ABI break
    - There is a "API build" break (apps might not compile because new major releases of important packages (GCC))
    - Looking at the GCC and other major project cadences, this is likely going to happen annually.

  - We only maintain the current stable release and the previous one, this means:
    - Stable releases get 2 years of security updates
    - We maintain maximum 3 releases at any given time:
      - Development
      - Stable
      - Old stable

[1] https://gitlab.com/freedesktop-sdk/freedesktop-sdk/wikis/rel...


Proprietary software on Linux is a bit of an alien there.

But there is always static linking, is it not? (And now flatpak+snap+...)


I am developing an open-source DAW with Qt (https://ossia.io); to solve the problem you mentioned, I used AppImage with great success. Of course you have to ship all your libs, but that's what you have to do on macOS and Windows too anyway.


Do you ship any plugins? Do you do a separate AppImage for each plugin?


> Do you ship any plugins?

I used to but I'm switching to a different model

> Do you do a separate AppImage for each plugin?

no, they are just loaded as .so / .dll / .dylib in the ~/Documents/score/addons folder. They can't have further dependencies though.


I concluded ages ago that no substantial applications should be integrated into an operating system. Dynamic linking if you're building something portable is just an awful idea because of how much variety you'll inevitably have to support (or choose not to support).

Some people have written about AppImages; this also applies to Flatpak and similar techs too. Isolating anything more complicated than a command-line tool is the way forward, and the tech exists. Not using isolation this way is a recipe for pain, whether on the desktop or the server (or the phone).

It depends how the plugins are loaded - it'd be great if they could use sockets so that AppImages were viable. No idea myself, though.


They are VST plugins, and they are dynamic libraries: call dlopen, then call a specified entry function.


Confusingly, you can statically link a dynamic library (i.e. the dynamic library includes all its dependencies statically instead of recursively depending on more dynamic libraries). You can even link it in such a way that the dynamic library gets its own version of each dependency regardless of what the main executable is linked to (otherwise the main executable's links could override yours).

I did this once with a ruby gem that had particular c++ dependencies that kept breaking when the build machine had different library versions than the production machines. If you aren't integrated directly with the development of a linux distro, it's best not to use their packages as runtime dependencies, since they really only consider their own use before changing things up.
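A sketch of that trick with GNU ld flags; libfoo.a stands in for the real (troublesome) dependency:

```shell
# Build a static archive that plays the role of the fragile dependency.
cat > foo.c <<'EOF'
int foo_val(void) { return 7; }
EOF
gcc -c -fPIC foo.c && ar rcs libfoo.a foo.o

cat > plugin.c <<'EOF'
int foo_val(void);
int plugin_val(void) { return foo_val(); }
EOF
# -Bsymbolic binds the library's internal references to its own copies;
# --exclude-libs,ALL keeps the archive's symbols out of the .so's
# dynamic symbol table, so the host process can't override them.
gcc -shared -fPIC plugin.c -o plugin.so \
    -Wl,-Bsymbolic -Wl,--exclude-libs,ALL -L. -lfoo

nm -D plugin.so    # plugin_val is exported; foo_val is not
```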


Music software's dependence on Windows is such a mess. Tons of cool plugins are stuck as Windows-only VSTs! I was really hoping that Propellerheads would use the fact that it can ship on any platform to ship Reason and all associated REs on Linux or in-browser, but instead they gave in and added VST support.


A historical lack of low-latency audio is likely why you haven't seen many music software releases on Linux. For years, you had to have a custom kernel to get "decent" performance.


Can confirm. I fondly remember late nights hacking the Gentoo kernel so my college radio station could reliably live stream shows via icecast and darkice.


The DAW isn't Bitwig, is it? (just because your username has the same number of syllables ;) If so, then PLEASE keep up the great work supporting Linux. I was only able to ditch my Mac a year ago because I switched from Ableton Live to Bitwig!


Renoise is also available on Linux; great product, works like a charm :D


also, Renoise is frigging awesome.

it is a blessing to be able to work with a different paradigm than piano roll and with such modern tooling (renoise supports vsts, rewire integration and whatnot), being essentially a supercharged tracker.


I've been meaning to pick it up, would you recommend any tutorials for figuring out how to use the tracker interface to sequence?


Not BitWig


Music Production is one of the last things I keep Windows around for. That sounds like a cool job though.


The biggest problem with plugins IMO is that they often don't support the same platforms as the DAW does, so they usually come without Linux installers (even though the code runs just fine on Linux...)


Why wouldn't you just create a statically linked binary?


AFAIK you can't call dlopen from a static binary, and if it's a DAW you have to support dlopen in order to open the VST plug-ins


Some binaries are meant to be linked statically


bitwig?


Why not use Electron? Take a look at VS Code: it has plugins and it seems to work everywhere.


Electron is basically Node+Chrome. There’s no way in hell it’s appropriate for writing VSTs.


In about ten years someone will post one to HN. Written in Javascript.


And it will use 1TB of ram to run.


Unfortunately DAW and VST plugins are not something you can make in Electron


Can you please explain why?


Audio software has strict buffer/latency demands which usually cannot be guaranteed on interpreted language platforms.

Doing audio synthesis with JS or any other interpreted language really is totally possible and has been done in a more or less serious way in several implementations and webtoys etc. But if you need extremely low latency and guarantees you cannot go that route, sadly.


You can always write a native extension in C++/Rust/C.


so then why use electron as well if the majority of your code is going to be native anyway?


exactly.

the UI is the minority of the code. the main engine is gonna need to be implemented in some language which has strict guarantees about performance. also, not having a garbage collector that fires in a seemingly random way helps a lot too :)


It's probably a decent way to package your program. And you can use all the glammy JS you want.


There are a few reasons:

- most of the time they run embedded in a DAW
- they are usually computationally intensive themselves
- they are meant to be instantiated as many times as memory/CPU allow

As to the third point: music producers already require and use pretty powerful rigs: 32-128GB RAM is not uncommon, CPU as good as it gets. There's great benefit when you can run 100 instrument synthesizer instances in parallel vs 14 instances; it's the difference between a differentiated orchestra and a rock band.


Because the plugins embeds into existing native applications and uses existing native SDKs.


Friends don’t let friends use Electron


> You have to pick the distros you want to support, and then once you've picked a distro, what versions you want to support.

What if game developers release the game's source code and let community developers help with porting to different distros and platforms? The game's assets can remain paid. For example, Doom has been ported to pretty much all platforms, and it's up to maintainers to ensure compatibility. I guess at this point it becomes a partly open source/free software game.

I'm aware this may not align with current business practices.


> And you need to use the C++ version that ships with that distro, so if you want to support old versions then the entire project is held back from using latest C++.

> And distos aren't backwards compatible. ie when libcurl4 is released, they remove libcurl3. So you can't have one binary for Ubuntu 18.04 and 16.04.

Release the source and the community will help you with many of these issues.


I don't know why people say that. You don't support Windows XP and Windows 10 either with the same binary.

In contrast to Windows you can provide or pay someone to provide libcurl3 if you want to continue to use it. MS will just say you can go F yourself.

In contrast to Windows you can study the source code and write a wrapper that provides a central API point that you can use in your single code base.

In contrast to Windows you can actually go there and provide patches. Even if the original authors won't merge it you can still use it via a fork etc.

And last but not least I bet there are actually still people supporting and providing libcurl3 binaries to this day and you just need to google their package server and add 2 lines to your installer script (one to add the public key for that package repo and one to add the package repo to your package manager).

PS: If you provide software to sizable numbers of people, you need to support one to three of the three reasonable distros: Debian, RHEL, SUSE. Even if you support just one, most people can deal with it thanks to VMs or Docker.


For a 32 bit version, a single binary compiled in VS2017 could support XP to 10. For 64 bit, a single binary supports Vista to 10.

In my experience, MS doesn't say go F yourself. They go to extreme lengths to keep old software working.


> You don't support Windows XP and Windows 10 either with the same binary.

I think you picked a terrible example there, because quite often people do; certainly Vista upwards is quite normal (that was the transition point for lots of APIs).

Windows is much clearer about how you're supposed to solve "DLL hell". You have the OS libraries, which provide a stable API; COM, where interfacing is done at runtime dynamically; and for everything else you put it in your application's directory.

Theoretically if COM components aren't interchangeable - the API has expanded - they should have a different CLSID and therefore not clash.


This is a common misconception. Microsoft is a little insane about backward compatibility. You can target versions of Windows with a single binary from the latest all the way back to unsupported OSs like XP. There are lots of companies that take advantage of this. It's one of the reasons why MS has so much trouble moving app developers to the new hotness, even if it's safer, faster, or whatever. Because of that, the new hotness doesn't get enough traction to support continued development, and they sunset it early, which draws even more ire from people.


I use a Windows audio software binary last compiled in 1997, for Windows 95. It still works just fine on Windows 10 64 bit.

That's 20 years of backward compatibility.


While I love Linux, I think this is a totally fair "state of the union". At the end of the day, gaming or kickstarters or similar... are a business. You need to figure out your ROI (Return On Investment). It doesn't make any logical sense to spend a huge amount of time on a platform that nets you tiny amounts of sales (and gains you a huge amount of support costs).

This is why gaming on the Mac is only now starting to be a real thing. A decade or so ago, there were still relatively few people using a Mac. It cost a lot to develop for a new platform and didn't make much money. Now it makes more sense.

Of course Linux fragmentation will probably hold it back for significantly longer.


Is it really fragmented? You support Ubuntu (in either its flatpak runtime or the Steam runtime) and you let everyone else figure it out on their own. Distros like Arch have been very good about providing the Steam native runtime and the Ubuntu flatpak runtime in their own package systems for just that reason.

The support burden for developers is proportional to the operating systems you officially say you support. If you support "Linux" you are supporting thousands of completely dissimilar execution environments. If you support Ubuntu, or Steam, or a Flatpak target you are only supporting that one operating system.

And that's fine. That's all Linux users really want anyway. If it breaks on your distro, it's your responsibility to fix it, so long as it works on the officially supported target OS.


> You support Ubuntu (in either its flatpak runtime or the Steam runtime) and you let everyone else figure it out on their own.

And isn't that already how it is? Pretty sure Steam only supports Ubuntu LTS; Steam on every other OS is an unsupported hack.


I think that is what zanny is saying. The fragmentation argument is FUD.


They don't support every LTS as far as I know. Also, they don't really make any effort to keep up with getting ready for the next LTS release.


> This is why gaming on the Mac is only now starting to be a real thing.

Is it? I mean, there's a good amount of support for gaming on the Mac, but I think it's always been so ... I wouldn't call it "a thing" though ... not in comparison to PC or Console ...


I remember when major games being launched on Mac was worth a mention, or even a full feature in an Apple Keynote. Nowadays games are being released on the Mac constantly and nobody bats an eye. It's a better time to be a Mac gamer than ever before.


It would be even better if they had decent GPU's available without spending $1000+ on an external setup.


Or if they went with Vulkan instead of rolling yet another half-baked graphics API while also deprecating OpenGL because reasons.


Agreed, have been a Mac user since 2004… and got the first Intel iMac specifically for the ability to boot Windows for gaming too. I think the only native Mac OSX games I had when I got that first 12" PowerBook were Q3, UT2004, Starcraft / WC3 / Diablo 2… and maybe Tux racer ;)

Have long given up on dual booting. Granted much of my gaming is also done on consoles, Mac gaming support is better than ever outside of still generally anemic GPUs (and who knows what will happen when they move to ARM… a great many games will be orphaned in x86 forever -- e.g. will Starcraft HD get ported to ARM?)

Thank you, to all the indies (and open source engine porters) who have supported Mac over the past 14 years!


This was definitely not the case 10-15 years ago.


I had the iMac G4 about 10-15 years ago and I had a slew of good games for it. Nowhere near the amount available on other platforms, but I was kept busy. These days when I look on Steam, for instance, there's perhaps a larger proportion, but still most games are Windows-only ... The Mac never was, nor will it ever be, a serious gaming platform.


The same claim can be made for Linux today. I have a large number of games I can play. I would still say it is a smaller market.


Not disagreeing ... it makes little sense to develop commercially for Linux and never has. That doesn't mean Mac isn't still a niche platform though, and banging on about the number of tickets raised isn't really an appropriate way to quantify the value of a particular segment.


They said the problem wasn't fragmentation of distros but the versions of graphics drivers, and it's not like Windows machines are all on the same version of the same driver.


Windows generally has less of an issue with a heterogeneous graphics-driver ecosystem, mostly for market reasons---Windows laptops and desktops are bought more often for the purpose of playing games than Linux machines, so a Windows machine hitting the market with a graphics driver that's too outside the norm tends to perform poorly in sales, and the market encourages driver homogeneity in basic functionality. I've seen some stinkers trying to write my own game engine (THANKS FOR LYING ABOUT THE OPENGL CAPABILITIES BY IMPLEMENTING IT IN SOFTWARE AT SECONDS-PER-FRAME PERFORMANCE, INTEL, YOU ARE A PEACH), but not as many as in Linux.

(in short, "What machine should I buy to play games" is a question that's easy to answer for Windows; there's reams of magazines dedicated to the question. It's a harder question to answer for Linux, which makes it a harder question to answer for developers trying to write games against a Linux-based OS).


This is just an out-of-context tweet (well, the context is there - it's a reply to another tweet, which is out of context)... so perhaps it was just auto-crash reports, but I wonder if you get more tickets from Linux users, because Linux users are more likely to file tickets?

When something crashes on my partner's computer she yells at it and restarts it. When something crashes on my computer I spend an hour trying to figure out WHY. If I can't figure it out, I log a ticket thinking I'm being a good citizen!


They still represent a vastly disproportionate amount of work for how much extra money they get you.

They get you 0.1% extra sales, but they cost you 25% (= 20/80*100) extra support work. And that's to say nothing of the extra time spent during development.

Your point would hold if fixing the issues reported by Linux users also fixed issues on Windows. Some issues would occur on both systems, but I bet the vast majority are weird Linux- and setup-specific bugs.


> I bet the vast majority

This is the crux of your assertion and requires substantiation.

On the other hand, all these Linux users might be doing you "a favour" by taking the time to log tickets that less conscientious users on other systems wouldn't.

Of course that does depend on the classification of the tickets but coming from that community I wouldn't expect them to be trivial issues ...


It's even more of a favor if the bugs aren't just linux-only. But if they're in gfx drivers like the tweet implies and auto-reported, the only thing you'd really care about is "newest drivers?" Same as the parade of crashes in Windows for the same reason, you want to just tell people to update.


Which is why chromium pulled support for certain drivers on linux


They didn't pull the support at all.

They've added another driver to a long list of blocklisted drivers that cause issues with the hardware acceleration. Nothing about that list is Linux-specific, as it includes a bunch of macOS and Windows drivers as well: https://src.chromium.org/viewvc/chrome/trunk/src/gpu/config/...

Firefox does that too: https://wiki.mozilla.org/Blocklisting/Blocked_Graphics_Drive...

Between a crashy browser thanks to a crappy driver and a stable browser without hardware acceleration, browsers will always just take the latter route. Don't like it? Run it with a flag and expect it to be a little less stable. That's it. You can blame neither the browser nor the platform for that, only the vendor of the driver (or in this specific case NVidia, since it's actively hostile to the vendor).


I'd expect that class of bug is easy enough to triage ... what's the problem?


> requires substantiation.

From the linked series of tweets:

"In the end they accounted for <0.1% of sales but >20% of auto reported crashes and support tickets (most gfx driver related)."

"So yes, fragmentation is still totally an issue."

"We eventually laid out a guide with known good versions of Linux and graphics drivers, but it didn't matter. Part of the allure of Linux is the customizability, so few actually stuck to it, and generally wanted to run the game on older hardware we didn't support."


Historically, that 0.1% income claim has been disproven a couple of times. Some indies reported a pretty even split between Windows, MacOS and Linux and even a willingness for Linux gamers to pay more. But that was before Steam for Linux, so I do not know how that has changed things. Just wanted to throw that out there.


Why do they have to do the work?

If someone releases a game and I play it, maybe I get a Linux-only bug where the audio sometimes becomes static-y until I reset (real world example). If I file a bug for that, they probably won't fix it and I probably will (and did) continue playing. I lose nothing, they lose the time to triage a bug.

If I can't launch the game and I file a ticket, it's probably because I either want to exchange my money for a working game - which on my own might not be enough incentive for them, and that's OK - or because I have to because of asinine refund policies.

The developer is framing bug reports as a burden, when really they're neutral or a benefit - the bug that exists because of the lack of platform-specific work is the burden. That's understandable, there are a lot of platforms and not a lot of paying players.

If bug reports are not a benefit, perhaps software will stop trying to send usage data back home? I doubt it.


Because having a buggy game on any platform is a reputational risk. If your game is buggy, people will complain and people on other platforms might reconsider buying even though the version they'll get might be bug free.


You'd think having another build target would help find bugs on existing platforms. Kind of like the BSDs.

Didn't id Software outsource the Linux ports to someone in the community and just provide them for free?


>They still represent a vastly disproportionate amount of work for how much extra money they get you. //

(With a sarcastic tone ...)

So you spend 1000 man-years developing your AAA title, which you release for MS Windows only.

Then someone makes a WINE wrapper on their w/e off and you start selling on Steam as being able to run (but not supported) on Linux.

Now you find so many more Linux users filing bug reports to help you fix your game ...

Bloody Linux users, eh, who'd have them.


That's not what happened in the article (tweet). They released a native Linux version.


Sorry, I thought the thread was about the broader subject of supporting Linux gamers rather than Planetary Annihilation alone. The title should probably be changed.


Similarly, if I hear from all my friends that a program crashes a lot on Linux, I'm much less likely to want to download it in the first place.


https://twitter.com/bgolus/status/1080544133238800384

Follow-up tweet by the same author:

> As a follow up to this, I've been told by those actually involved with Linux stuff that this wasn't true. I probably just stopped paying attention to Linux issues at a time when everything was broken.

Just so the discussion does not overly focus on the (apparently wrong?) numbers.


This is a follow up to another tweet though


This is a pretty well-known effect in gamedev. The specific numbers change, but Linux sales are only a small percentage of total, while generating much more development and support work.

At least dev work can be drastically simplified with middleware like Unity, which is how small studios can even consider Linux. But support difficulties are real. Players have so many possible distros, configurations, and drivers - actually supporting all of them would require a level of Linux expertise that gamedevs simply don't have (being typically Windows or OSX based themselves). Limiting yourself to something like Ubuntu LTS helps a bit, but there are still plenty of gotchas.

So it becomes a simple matter of numbers: given relatively small amount of extra sales, is it worth the extra work, and spending the time to learn Linux development and administration in sufficient depth? Sadly usually it is not.


But for a game that is built on Unity, wouldn't these support issues be Unity's responsibility? Assuming the issues are triggered by their glue layer.

What's the arrangement there, does anyone know?


Unity does not do your customer service for you. Also, bug fixes might only go into a future release of Unity (months out) and updating to the new release to get the bug fixes might break your game. Unity games that spend a few years in development often ship on a very old version of Unity.


Gaming is the #1 reason I haven't fully switched to Linux on the desktop at home (at work I exclusively run Linux).

A lot of my Steam games work flawlessly with Linux but others will just silently fail to launch and it takes hours to properly diagnose and fix, which isn't what I'm after when I get some time to play games.


I've switched to 100% Linux at home due to Steam Play. All of my Windows games (except 2) work perfectly on it. I really hope Valve keeps improving it, it has the potential to be a game changer for Linux gaming.


Your comment is the perfect example of how there's an "invisible" demand for gaming on Linux.

Nobody can see it, because those who want it will mostly make do with Windows.


+1.

Personally, I spend 98% of my time in Linux, but I dual-boot just for the sake of the few games I like that need Windows to run. If not for that, I'd have dropped Windows years ago.


I haven't logged my hours, but substitute 98% above with whatever my number is (90+%). I don't even have Windows installed on my main, but older, desktop. I just have it on my newer laptop, which is where I do any gaming of significance (when I have time nowadays).


Invisible demand is indistinguishable from no demand, as far as B2C is concerned.


I agree that they can't distinguish it but there was no visible demand for an iPhone before the iPhone.


Only if we forget about the work done by Symbian, Psion, Microsoft, Compaq, Dell in portable Touch devices.


How did the demand work out for Symbian, Psion, and Microsoft phones?


Pretty well in Europe and Asia, until a certain company in Mountain View decided to screw Sun and offer a mobile OS for free to OEMs, in exchange for user data.


People forget that Windows Mobile 5 and 6 unseated Palm to become the sales leader in smartphones in the mid-2000s, shortly before the iPhone came along and blew them and everybody else out of the water.


Not true. Invisible demand is a potential market and a bunch of prospects in waiting.

However that niche is way too small to be worth it.


For all the prophets out there: no one stops you from building a business around that "invisible demand". Let's see how well you can predict the market.


This happens all the time. Some endeavours fail, others succeed. Some are more likely to succeed or fail than others, and some succeed or fail unexpectedly.

Nothing to do with prophecies though.

Not every demand can be fulfilled, no matter what.


I'm in the same boat. I'd love to be done with windows but since I spend a large amount of my free time playing games with friends I need to be on windows.


Despite the problems and shortcomings, I'm actually pretty impressed with how well Valve has managed to make Linux gaming work over the past decade. Of course, a lot of credit is also due to other contributors to the WINE project, but Valve used Steam to create a targetable set of dependencies for native games in addition to their WINE contributions. I'm frankly a bit amazed that they solved that part of the problem. And a bit disappointed in the people blaming them for not somehow solving the GPU problem by continuing to push Steam Machines no one was buying anyway.

For me, the gaming situation on Linux has become tolerable enough that, in comparison to Microsoft's Windows 10 bullshit, it is no longer a barrier. However there are several other barriers that are unlikely to be dealt with any time soon.


The situation seems to be similar for Mac: https://twitter.com/bgolus/status/1080327518983315457

> Yeah, I didn't want to touch on Mac support in the original post, but Mac support also cost way more than it made in sales.


Innumerate person here, does that mean that supporting Mac is a net loss for a business?


It's impossible to say for sure, without knowing people's behavior and motivations.

In the past, I've certainly been more likely to buy software if I see it's cross-platform, even if I know I'll never use it on those other platforms. Or it might be a corporate requirement. There are many possible reasons.

For example, if a restaurant serves only steak, is it a net loss for them to put a pasta option on the menu, if they know they'll never sell enough to make up the cost of ingredients and preparation? Maybe not -- it could mean a party of 10 (with 1 vegetarian) is willing to eat there, when they otherwise wouldn't have been.


A "loss leader"


For this particular business, maybe. I don't think he's representative, though: using tools like Unity makes it much easier to deploy on Mac and Linux.


Please mark it with (2014):

https://twitter.com/bgolus/status/1080380532108619776

> The world of 2014 Linux graphics drivers was not a friendly place. We absolutely encountered issues where some driver revisions only worked on certain distros, and AFAIK we did not use anything specific to a Linux distro.


"I didn't want to touch on Mac support in the original post, but Mac support also cost way more than it made in sales."


> Issues specific to Linux were almost entirely graphics driver related, and unique to the platform.

Just want to point this out before this discussion starts blaming the devs for 'not writing with portability in mind' or other such nonsense like the twitter thread did.


Part of the problem in 2019 is that the vendor with the best drivers on Windows is the vendor with the worst drivers on Linux. (aka Nvidia).


Care to elaborate?

Nvidia used to be the only discrete GPU manufacturer that had Linux support worth mentioning. I know that AMD/ATi has made a lot of progress, but is Nvidia really the worst at this point?


I just went through the process of getting NVIDIA drivers working on my brother's dual-boot Ubuntu. Here's the issues we hit:

1. You don't get drivers out of the box. The NVIDIA driver is non-free, so Ubuntu's (and other Linux distros') hands are tied here: they're not permitted to distribute it. Getting it is relatively easy (check a box, install a package), but you need to know that you need it, and then figure out how to get it.

2. (For laptops) You need to install optimus/primus/bumblebee so that the system isn't stuck on the integrated GPU. This is another package that you need to know you need, and install. Ubuntu is particularly bad here: it apparently installs the drivers in the wrong spot, so even with everything correctly installed, you have to find a config file in /etc and edit it to match what a couple of comments on a GitHub issue (found after Googling the error) indicate. (Other distros, in particular Arch, do not have this issue.)

3. (For laptops) You need to configure the game in Steam to take advantage of primusrun. Steam does not do this automatically.
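For reference, the whole dance on a recent Ubuntu might look something like the following sketch. The `ubuntu-drivers` tool and the `primusrun %command%` launch option are the standard mechanisms; the exact package names vary by release, so treat them as illustrative:

```shell
# 1. Install the proprietary NVIDIA driver. Either tick the box under
#    "Software & Updates" -> "Additional Drivers", or from a terminal:
sudo ubuntu-drivers autoinstall

# 2. (Laptops) Install the Optimus offloading stack so games can run on the
#    discrete GPU instead of the integrated one:
sudo apt install bumblebee-nvidia primus

# 3. (Laptops) Tell Steam to use it: right-click the game -> Properties ->
#    "Set Launch Options..." and enter:
#        primusrun %command%
```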


Last time I checked, both AMD and Nvidia expect you to download manufacturer drivers for Windows. I'm willing to jump through similar hoops for Linux. (Incidentally, when someone complains about how hard it is to install Linux, that person has usually never tried installing Windows without using the OEM disc from the manufacturer. Take that away, and Windows becomes a lot harder.)

Linux support for Optimus and friends is abysmal, I agree. When I say good things about Nvidia's Linux support, I am referring to their discrete desktop and workstation cards.


Recently I tried to install Windows on some HP Elitebook, a few weeks back.

It didn't have drivers for the graphics card, nor for either network interface (and several other internal devices). How am I supposed to download the network drivers without a working NIC?? (The same machine, like many others, has no trouble with Ubuntu out of the box.)

It was only so my wife (honest) could play The Sims 4. Fortunately, thanks to the hard work of the WINE developers, it runs flawlessly on Linux (small amount of trickery required for Origin).


USB tethering long enough to download drivers has been a reliable way for me to get around this problem.


I have perfectly working nVidia binary drivers that allow me to subject OpenGL to novel and creative abuses without problems. The rate at which I discover driver bugs has gone down a lot.

Mesa, in contrast, had floating point texture formats blocked because of suspected patent issues until a few weeks ago. These formats are mandated by the OpenGL standard, so Mesa was never technically conformant. And they matter a lot in practice.


I believe nVidia is the only major GPU manufacturer left without full-featured open-source drivers. nVidia was once the best GPU manufacturer for Linux on PCs but they proved unable to make the leap from proprietary drivers to OEM-supported open source; Nouveau exists as a community effort but remains well behind the AMD/Intel equivalents. IMHO, assuming you get to choose your GPU, the only reason to mess around with nVidia's proprietary drivers at this point would be support for CUDA—there are still some apps which only support CUDA acceleration and not OpenCL, and AFAIK CUDA remains exclusive to nVidia hardware.


The thing to understand about Nvidia is that they use the same model as Apple: Produce really nice software, and use it to drive hardware sales.

On the driver side, a lot of the features and performance advantages of Nvidia cards are rumored to be largely implemented in software. (Think of things like having a really good optimizing compiler for shaders.) Open sourcing that code would allow other brands to copy a lot of their techniques.

On the application side, Nvidia produces a lot of developer tools (like CUDA) that are either tied to or heavily optimised for Nvidia hardware.

All this means that Open Source is largely antithetical to the way they do business.

OTOH, ATI Linux drivers went through a very rough patch when ATI decided that Open Source meant, "Hey, we can get a bunch of suckers to write drivers for us!" Everyone involved seems to have actually come out the other side stronger for the experience, and I'm very optimistic about their future.

That said, my understanding is that if you are committed to Linux, you still get a more robust OpenGL implementation (ATI used to be buggy as sin) and better shader performance with Nvidia hardware/software. The situation is changing rapidly, and I would love to know if the information at the start of this paragraph is out of date.

Now that I've provided an infomercial for Nvidia: I'm willing to defend Nvidia's products, but I'm not really ready to defend Nvidia the company. Even if Nvidia's hardware beats the competition on all the benchmarks relevant to your use case, it's entirely reasonable to prefer choosing a company that is more willing to play nice with the greater community.


> That said, my understanding is that if you are committed to Linux, you still get a more robust OpenGL implementation (ATI used to be buggy as sin) and better shader performance with Nvidia hardware/software.

That's probably true, if your use case requires the absolute best GPU performance. For me, though, the Intel and AMD open source drivers have gotten good enough that the marginal improvements possible with proprietary drivers simply aren't worth the hassle. At least with AMD you are free to choose between Mesa and Catalyst, and can be assured of a decent experience either way. If you opt for nVidia hardware you're effectively locked in to their proprietary driver if you want to use even half of your GPU's capabilities.

It helps that Intel and AMD and all the non-PC Mesa drivers share common infrastructure, so improvements to one driver tend to enhance the entire ecosystem—except nVidia. As you noted, the situation is changing rapidly; I think it won't be much longer before nVidia's proprietary drivers forfeit whatever performance advantage they have left to the combined efforts of the open source community.


You know, my advice is colored by the use cases I have: I'm usually in one of two situations:

First, situations where performance doesn't matter, and it needs to be cheap and Just Work. Intel beats everyone else here. (I'm really eager to see what Intel does with their upcoming discrete video card.)

Second, situations where performance is a Big Deal, and I'm willing to both pay money and jump through hoops to get it. Nvidia was winning here the last time I looked closely.

ATI products exist somewhere in the middle, and I can't remember the last time that I needed anything like that.


Huh? The nVidia binary blob drivers work perfectly on Linux. Nouveau is garbage and it's a shame that it's the default on popular distros.


I don't think it's fair to call nouveau developers' work garbage – they did a tremendous job with the resources they had at their disposal. They received close to no support from nvidia.


and no drivers on macOS.


I don't think the Linux situation is good per se, but I haven't heard anything this negative from the devs of other multi-platform Kickstarter games, like Obsidian and their Pillars of Eternity games.

(Not that they were terribly positive about it.)

To paraphrase the Obsidian take:

Supporting Linux for the first time was much much more work than they expected, but it was mostly coming to grips with everything, learning all the gotchas, and integrating that into their build system and software dev practices. Once they got over that hump it was much easier/not so bad and worthwhile enough to continue. But at the same time, if they'd known what they were getting into the first time they think they'd probably not have promised Linux support. So first game it had a huge marginal cost to support, but once the tooling and experience was there that marginal cost to support was much much lower.

Of course Pillars was a Unity game and probably not as demanding performance-wise as PA.


> We started working on the game in 2012

> Issues specific to Linux were almost entirely graphics driver related, and unique to the platform.

So this is essentially a rant about linux graphics drivers circa 2012?


Are they any better in 2019? Last I checked, anything beyond desktop compositing was still a tyre fire of lying capabilities, broken functionality and wishful thinking, with nVidia being by far the most stable while lacking integration with modern Linux APIs.


As a full-time linux user and occasional gamer, I have been using the AMD open drivers since circa 2010. The improvements have been massive, both in stability and performance. Since the availability of steam, I have been doing all my gaming exclusively on linux, logging a probably unhealthy number of hours on various games (Dota, various iterations of Civilization, City Skylines, Stellaris, etc.).

Not to say the situation is perfect by any means, but the difference from 2012/2014 is night and day.


"Are they any better in 2019"

..Yes.

Not to say they don't still have issues, but the linux gpu driver situation is vastly better than it was in 2012. AMD drivers especially has improved massively since then.


I'll go a step further and say that most developers make the mistake of treating Linux like any other proprietary system, one they can target with stable interfaces and call it a day.

Free software changes more often, but when it does, it is not that big of a problem: you have control over both your code and the code it interacts with, because all of it is accessible.

If game developers opened up (and here I don't demand the optimum, which would be open sourcing their game[0], but just getting people involved who are actually active in both their code and the free software community), they could deal with the ecosystem much better and make their code actually work on most of the many configurations that exist.

Of course, the usual argument is that game developers don't have to care about this on other systems. But they did have to gather experience and training on those other platforms too, and with free systems, once you are involved, you can deal with things much more efficiently than it seems at the start. There is nothing magic about filing a bug against Mesa.

I am one of the people who contacted Uber about PA not working on my Linux system. It was a recent Debian stable without modifications. They didn't offer meaningful support. I just asked them to install a Debian stable with free Mesa drivers on a computer with a recent AMD GPU, try to build and run PA, and see why it doesn't work (there were obvious, easy to reproduce issues). That is not a lot of engineering time, and I bet solving those problems would have solved many problems on Linux systems.

[0] This starts to become a cultural problem that seems to need laws to deal with. As a society, we shouldn't accept to never have our cultural heritage, which games are part of, in the public domain in a usable form (source code and documentation).

Edit 1: Writing closed programs for a free software community is difficult. I think that it should be difficult. That pain is a constant reminder that you're doing it wrong.

Edit 2: Steam counts unsuccessful launches of a game where the game is laughably broken as playing time, so after some extensive testing, you can't even refund a game on Steam because it says you've played more than 2 hours -.-


> "Free software changes more often, but when it does, it is not that big of a problem because you have both control over code and the code that your code interacts with, because all code is accessible."

This is akin to saying "When you're bleeding internally it's not that bad because all of your blood is still on the inside."

Games are a hit-driven industry. In most cases, it is strictly better to be working on a new game than to improve incredibly niche support for existing products. It would be one thing if fixing Linux bugs effectively future-proofed it from more bugs down the line, but that does not seem to be the case for most of the devs I know who've shipped on Linux.


> it is strictly better to be working on a new game than to improve incredibly niche support for existing products

I'm not sure about this one. Compare Feral's and Uber's approaches as two data points.


AAA games rely on a single purchase fee for revenue. Once sales taper off because the game isn't hot and new anymore, there's nothing to be gained from further investment. (Ever notice how games steadily decrease in price the first few years after release?)

Uber gets recurring revenue from their app, so expecting them to provide updates is like expecting a rental property owner to maintain the property--entirely reasonable. Expecting a game developer to release patches years later is like a homeowner calling up the original real estate developer and demanding improvements. Not gonna happen.

(With the obvious caveat that homeowners are allowed to do the work themselves without the original developer's permission, but this is about economics, not the absurdities of software licensing.)


This advice is only strictly true for AAA games. From what I've read from the more indie side of the games industry, there seems to be a common thread: the more time spent working on features for a game, the fatter the long tail became, and the spikier the stegosaurus tail tended to become. The advice seems to be that simply adding support for random Steam features led to sales (think things like trading cards, achievements, badges...). Language support can be an unexpectedly big one as well. What's also counterintuitive is that porting games to other systems tends to boost the existing sales channels.

Now, it's not entirely clear from what I've read, if working on all these extended features for a game have the highest expected value, since it really is a hit driven industry, but they seem to be lower effort work, and are definitely worth it from a lowered risk payout point of view, particularly for a small dev shop that can't risk too many failures.


If their one shot release isn't good enough, they deserve all the flak they are getting. I do have empathy with them getting frustrated, but the root cause is their approach.


The original post in the thread basically said that Linux support is easy if you just make a new release every time the libraries you depend on make a breaking change. (Or a GCC update breaks binary compatibility for everything.)

Even if an initial release of a Linux game is flawless, it tends to stop working with the next major distro release. (Unless it's statically linked to everything, or uses something like Flatpak to distribute all its own dependencies. Both options have their own disadvantages.)

You also have to understand that, from a business perspective, game companies have much more in common with a Hollywood studio than a normal software company.

Movies don't get patches. Occasionally they get re-releases for new formats, but at that point consumers are expected to go pay for it again.

None of that applies to today's slot machine^H^H mobile game companies that rely on in-app purchases, or MMOs that rely on monthly subscriptions. The former would never survive in the Linux world anyway, and the latter are often quite successful on Linux, even if it's only in the form of official support for Wine/Cedega.


I understand the business aspect that the Hollywood model is possibly incompatible with games in free software systems (i.e. "Linux"). There are other business models to make games, though.

The problems are real and I think it is natural that they are there, because the movie model is suboptimal. I see similar pains with Android device manufacturers and updates to released devices. That pain exists because they are doing it wrong (i.e. not mainlining their changes and shipping binary-only blobs). That pain is a feature, not a bug. It is a constant reminder that the approach is wrong.


Ignoring that most publishers will recoil in horror if you suggest open sourcing their product, there are two problems your post fails to address:

1. The game industry is stuck on a "release and forget" model. Except for games designed up-front to have some recurring revenue component, few get more than a couple token patches after release. (There are lots of industry horror stories about studios that find out that they can't even compile a game anymore two years after release.)

2. Most games today rely on proprietary middleware, and the industry is steadily moving farther in that direction. Expecting anyone who wants to rebuild the game to pony up for a site license sounds like a nonstarter. In fact, even sharing information on the middleware is sometimes an NDA violation (everything related to consoles is shrouded in an absurd amount of secrecy) so the codebase might have to be scrubbed of all comments prior to release. I'm sure the games community will be thrilled.

What I think we really need is for someone to revive the Loki model where a studio dedicates itself to publishing Linux ports of existing games. At this point, I'd be willing to pay a company like that a lump up-front fee for a game that includes, say, a year of bug fixes/upgrades, and some recurring fee after that for compatibility updates.


> Free software changes more often, but when it does, it is not that big of a problem because you have both control over code and the code that your code interacts with, because all code is accessible.

Except for video drivers. And, as the tweet notes, the bulk of their issues on Linux were related to graphics.


Why except video drivers? Both AMD's and Intel's drivers are free and open, as is mesa. The other closed drivers are alien to a free software system anyways.


What was your expectation? That they would find a bug and submit/fix it to Debian and then walk through the contribution process so it makes it into the next stable Debian release?


No. I expected them to compile their game for the system and try to run it. If it didn't run, that would be immediately apparent, because the issues were so glaring.

The next step would be to upgrade the system to Debian testing, and then check again. If it still doesn't work, upgrade to Debian Sid, and then check again.

If it worked on any of those, report that, and everybody knows in which version the issues will be fixed. If it doesn't work on any of those, go over your marketing and remove the claims about Linux compatibility. The engineering side of this is less than one work day for one developer. It's literally installing, compiling, starting, upgrading, compiling, starting, upgrading, compiling, starting.

The next step is to figure out in what component an error is, and get the latest upstream version of that component. Then, if the problem still persists, file a bug report against upstream, not against Debian. At that point, you are mostly done, and can at least claim to having done your part.

This isn't magic, nor is it difficult nor time-consuming. Heck, I would do it for free on some weekend if I were given access.
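The loop described above is mechanical enough to script with throwaway containers, one per suite, rather than upgrading a single install in place. A rough sketch, where the game path and the `--smoke-test` flag are placeholders (real GPU testing would additionally need device passthrough):

```shell
# Hypothetical smoke test of the same Linux build against each Debian suite.
# "pa-linux/PA" and "--smoke-test" stand in for the real game binary and a
# headless launch mode; stable/testing/sid are the actual Debian suite names.
for suite in stable testing sid; do
    echo "=== Debian $suite ==="
    docker run --rm -v "$PWD/pa-linux:/game" "debian:$suite" \
        /game/PA --smoke-test \
        && echo "$suite: launched OK" \
        || echo "$suite: FAILED"
done
```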


Nobody's going to do that, they'll just say "Use ubuntu" and close your ticket.


I _know_. That stonewalling is a reason why they have so many issues. I'd rethink that approach, and weigh the cost of the ticket answers and the marketing effort spent fighting the flames against that one day of a developer's time.

I don't understand why they wouldn't do it, though. I see no plausible explanation.

However, what if it doesn't work on Ubuntu either?


>I don't understand why they wouldn't do it, though. I see no plausible explanation.

Money seems like a good reason. If a tiny fraction of your sales and a large fraction of your bugs are from a particular subset of those users, you don't want those users. They are too expensive.


> Money seems like a good reason.

I'd get that argument when the question is whether to make a Linux version or not (using the 'movie model'). But my non-understanding was regarding Uber's refusal to spend one developer's work day to do the right thing (you don't even need a trained developer for most of that work). If it doesn't work on either version of Debian, it is pretty safe to claim that it won't work on Ubuntu, now or in the near future.

My principal point is that games on Linux can be made just fine, but it must be done differently than for proprietary systems. If you try it without changing the approach, there will be pain. And people will complain. Rightfully so.


The flip side of this is: I only game on Linux. If your game doesn't run well on Linux, chances are, I'll never play it.

There's already a backlog of games on my wish list that I'll get to as soon as I find the time.


I've been Linux only since 2012, which coincidentally was when RadeonSI came out and I "took the bet" on AMD drivers getting good with a 7870 purchase.

It's 100% no tux, no bux at this point. My last Windows title was Darksiders 2. Darksiders 3 came out in 2018, and I was totally hyped for it, but it was Windows / Xbox / PS4, so no deal there. Not like I have a shortage of dozens of other games to still get around to; my library is overflowing with titles I need to finish, like Tomb Raider and Metro.

The worst part was that Nordic was apparently intent on rereleasing the Darksiders remasters on Linux but just dropped it at some point. If they did the math and decided it wasn't worth the time, fair enough.


This! I've been gaming exclusively on Linux since 2009, though it was rough until 2013. I also think quite a few Linux gamers are great at diagnosing their own problems. When we do report bugs, I bet the reports are more useful on average.


What are your favorite games that run on Linux? I'm looking for new games and these days the few times I boot into Windows is when I can't easily get a game working in Linux.


I was one of those users: for every problem I had, I would go out of my way to reliably reproduce it, then create a detailed description with full system specs, terminal output (on both Windows and Linux) and all the log files I could find.

Today, I still do that on Linux. But if a game excludes Linux as a platform, it must be rock solid on Windows, right? If it isn't, all it gets is a negative review going on about how it "crashes all the time", is "glitchy" and "unplayable". That saves me a lot of time.

Being an indie GameDev myself, I personally love well written tickets. But the passive aggressive responses I often receive from big devs make me wonder if I'm actually the only one.


Why don't we just support the Steam version of Debian and be done with it? Anyone running custom things should be ready to make their custom fixes...

Oh well, this guy sounds full of hubris. Just because something isn't immediately profitable doesn't mean the community as a whole doesn't benefit from you ironing out bugs.


> Oh well, this guys sounds full of hubris.

From a later tweet by the same person: Linux support was a passion for many on the project, and I was a proponent in favor of that support.

If the Linux community is going to bite the hand that feeds them, don't be surprised when other developers aren't anxious to support Linux in the future.


This sounds like complaining about users being able to report bugs.


Do you run a business, say software? Or even a non-technical business like window washing or pet walking?

If .1% of your pet walking clients made 20% of the complaints, what would you do?


Complaints or tickets? I would bet that what's happening in this game's case is that Linux gamers are techy and will properly file tickets and try to be helpful, while Windows players will simply do as I do with Windows games and leave a negative review.

Looking at the game's reviews, first two negative ones:

> "wont fit my screen. no setting to scale it correctly... Looks like i have to go buy a new monitor to play this game to its full extent"

> "Constant crashing. When it wasent crashing there were so many bugs involved it wasent funny. The classic one was way better and im happy I still have it so I can play. I will change my review if they fix the game and get it so people can play it, but fornow its just contant crashes. Beware! "


They are bugs due to issues with Linux drivers, not helpful things to improve.


Depends. Sometimes the complainers are telling you about issues that everyone else doesn't like about you, so you better fix those issues, otherwise the other customers will leave to someone else. Other times those complainers are just complainers who will never be satisfied and I'd fire those customers. Generally it is a mix so you need to read and consider every complaint.


That depends on the complaints.

If those complaints are valid and lead to a better service for everyone, then I would rather have them instead of not having any feedback.


Wait, what? What if only 0.5% of your customers complain?

Or suppose 0.1% of the pets you walk are cats and 99.9% are dogs, and you get complaints that you are bad at walking both cats and dogs. Is it a good use of the metric to conclude that the cats are the problem?


The conclusion here is stated in the article as

"Issues specific to Linux were almost entirely graphics driver related, and unique to the platform."

So it's somewhat similar to having 0.5% of your pet walking customers complaining about issues you can't do much about, or having to do extra work to make them happy - their doghouse at home is too small, their car has an issue so they can't come and pick up the cat - basically a subset of your customers giving you a whole lot of extra work for very little gain.


Yeah, well, I agree it's probably not worth the fuss to support Mac or Linux for gaming.

But I mean, without numbers on how much extra dev time the Linux port demanded, it's hard to reason about what a 0.1% revenue increase means. 0.1% is still 0.1%, and it could have been more if the game hadn't crashed at that rate and discouraged buyers.

Since they released for Mac the game must have been programmed with cross platform functionality in mind.


It's a good metric to conclude that you should stop walking cats, no matter what the underlying issue is.


Maybe cat owners complain because their cats were eaten by the dogs, and you scared away other cat owners and lost profit.


Same outcome. How am I going to convince those cat owners that I totally fixed the dogs-eating-the-cats problem? Cats don't even need to be walked really, certainly not alongside the dogs. Putting the cats into the equation was a questionable business decision to begin with.


If that .1% was using a specific version, I’d conclude the quality of that version was far lower than the mainstream offering.


It depends on how much the .1% cost, how much they provide, and how ok it is to ignore them.


I read it more as an admission that they underestimated how hard it is to get a game to work consistently across Linux graphics drivers. The user reports are used as a metric, not to damn the users.


In this case it sounds like the bugs were mostly Linux-specific ("gfx driver related").


Reading the thread it seems this all happened a while ago:

> The world of 2014 Linux graphics drivers was not a friendly place. We absolutely encountered issues where some driver revisions only worked on certain distros, and AFAIK we did not use anything specific to a Linux distro.

Around the same time, the Chromium project found a lot of GFX bugs and created a lot of workarounds, but when I disabled all of them now in 2019, the bugs didn't seem to exist anymore. Anecdotal experience, but I think things have improved and it wouldn't be this hard any longer.


I first started using linux in 1998 with RedHat 5.2, installed from a CD off a magazine.

At almost any point in my time using linux there was something "temporarily broken" that would eventually get fixed/improve some time down the road (modems, sound cards, wifi, power-management modes, GPUs, etc), at which point it didn't take long for something new to break.

There are of course those calm eras of peace in between but anecdotally it never seemed to last long until the next nuisance.


Ha, I started with Slackware around the same time (downloaded over 2 days to about 10 floppies).

When I got bored of fixing stuff I moved to Ubuntu. It pretty much just works now except when I break it (which is often). Point being I could just install and run with zero issues.


Yes, it's not as bad as it used to be, but installing Ubuntu on any random laptop (i.e. not one bought specifically because it's known to be well supported by Linux) is still a crapshoot, with at least something minor not working 100% (like a touch screen or some other "weird" feature).

It is rarely something truly fundamental but it happened just often enough for me not to want to bother and just pay 2x for a MacBook Pro every 5 years (a trivial expense for a well-compensated software developer).


In particular, AMD released the open source graphics drivers (AMDGPU) in 2016.

There are still problems with nVidia/nouveau, but that's easily solved by not buying their hardware.

The Intel iGPU drivers have also been solid for a long time, though the hardware may not be fast enough depending on the game.


AMD first started with open source graphics via the radeon kernel module and pre-SI drivers around 2009. It was just that Mesa in 2010 was nascent and tiny, only reaching OpenGL 3.0 support in 2012. It was less that AMDGPU was a big deal in 2016 and more that that's when Mesa hit approximate parity with the latest OpenGL.


> It was just that Mesa in 2010 was nascent and tiny with it only reaching OpenGL 3.0 support in 2012.

Which meant that people who wanted to play games were stuck with the proprietary fglrx and all the compatibility and other problems that come with a proprietary driver on Linux.

The big advance of AMDGPU was that it was a usable open source driver that made it into the kernel tree, which meant that kernel changes would no longer break the driver people were actually using because drivers in the kernel tree get patched when the kernel changes.


Well as he mentioned those were autogenerated reports. That being said... I tried Planetary Annihilation and it worked without issues for me on Linux.


what?


Too bad Planetary Annihilation was a poorly optimized, buggy, under-delivered piece of shit on Windows too.


I almost stopped playing games when I switched to Linux and I think it contributed to free time in my life for learning, side projects, sports, etc. I can always go back to OpenTTD if I crave gaming.


I'd say us Linux folk are probably more likely to identify an issue and report it as well. It takes a special determination to run Linux, and I'd wager that Linux users have a keen eye for undefined behavior, feature requests, and wonky workarounds.

One time I updated a Linux box and the Nvidia drivers downloaded and started to build like usual, but there were tons of errors. After the graphics drivers failed, GCC updated and rebuilt itself. Switching kernels fixed the problem. It took me a week to submit a ticket to Nvidia and then another 3 days of working with support before we realized that the Nvidia drivers had been built with an outdated GCC, and GCC was updated afterwards. So when we checked the GCC version it was the proper one, but the drivers were already built with the old version.

If it was a Windows box I would have formatted it after 12 hours and called it a day.


The conclusion I draw is that we should raise linux game prices by 200x


Except Linux users want all their software to be free.


I'm a Linux user and supporter of free software and I certainly want source code for my software. I'm quite happy to pay for it, though (and I do, believe it or not). Not a lot of games offer source code, though, which is a shame. I've bought several games simply because they offer source code and I want to encourage it. I've bought a few games that don't offer source code as well (the only such software on my computer).

It's a bit frustrating seeing facile comments like these both conflating the "free as in beer" vs "free as in freedom" as well as implying that those who care about "free as in freedom" are just a bunch of deadbeat freeloaders. You can imagine how insulting it must feel to be tarred with such a brush when it is completely untrue (at least in your specific case). If you want to make a statement about a population, at least back it up with some evidence. Do you have access to any studies showing that people using free operating systems are unwilling to pay for software?


I'd be happy with code escrow too, which would mean we can keep the game forever and the company gets to protect their secret as long as they exist.

I think we should talk more about code escrow, in general.


The Humble Bundle statistics refute your claim ;) http://cheesetalks.net/humble/

Linux users pay more per bundle than users of other OSes.


Maybe the game devs should do like RedHat and release the Linux version for free but charge for support.


There are Linux users who buy MS Windows just to enable them to play games. Win10 is £120.

I'm building a PC now, that's going to be about 20% added to the cost; I'd rather spend that on games.


Nearly every Humble Bundle release of games shows that Linux users spent more than Windows or Mac.


In terms of numbers of users, I thought it was interesting that the first Humble Indie Bundle had about 16% of downloads being Linux downloads, whereas the newer releases are like < 5%. Did the novelty of Linux gaming wear off? I bought the first few but haven't bought any since, so I'm contributing to those stats - but also I got older and stopped playing as many games.

All these numbers are way higher than 0.1% though. I admit I just don't believe the OP that Linux sales were 0.1%. I would believe 1%.

Also my first interpretation of the headline was "Linux users provide valuable feedback!", which matches my experience. I report so many bugs against everything all the time as a Linux user, if a game had a bugtracker I would definitely use it if I encountered a bug. Whether the issues were Linux-specific or not, I don't know.

Finally, my experience with cross-platform development is just not nearly as bad as I keep hearing. I do it. Bundle, and use an already cross-platform library for your GUI or graphics. I admit I don't make 3D games, but I'm super perplexed as to what is supposedly so hard about the cross-platform aspect of it.

For my (non-3D) graphics I don't even use Electron, just Qt (I'm also perplexed by those thinking Electron is the only way to do cross-platform).

I've never tried to freeze my code so it will work on future systems (it is open-source and maintained to stay up to date), but I know exactly what I would do if I had to: I would bundle literally everything except glibc and the kernel and call it a day. Would anything trip me up if I were to do that? Am I naive and waiting to be burned by something I don't understand? Maybe - but it seems like the devs complaining about cross-platform development being hard aren't even doing this step.
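For what it's worth, the "bundle everything except glibc and the kernel" step mostly amounts to walking the binary's shared-library dependencies (e.g. via ldd) and shipping everything that isn't glibc or the dynamic loader. A rough sketch of the idea — the skip list and sample output here are illustrative, not exhaustive:

```python
import re

# Libraries that must come from the host system rather than the bundle:
# the glibc family plus the dynamic loader. Illustrative, not exhaustive.
SYSTEM_LIBS = {"libc", "libm", "libdl", "libpthread", "librt"}

def libs_to_bundle(ldd_output):
    """Parse `ldd ./game`-style output, returning paths of libs to ship."""
    paths = []
    for line in ldd_output.splitlines():
        m = re.match(r"\s*(\S+)\s+=>\s+(\S+)\s+\(0x[0-9a-f]+\)", line)
        if not m:
            continue  # vdso and loader lines have no "=> path" part
        name, path = m.groups()
        base = name.split(".so")[0]
        if base in SYSTEM_LIBS or base.startswith("ld-linux"):
            continue  # leave glibc and the loader to the host
        paths.append(path)
    return paths

# Made-up ldd output for illustration:
sample = (
    "\tlinux-vdso.so.1 (0x00007ffd4c5f2000)\n"
    "\tlibSDL2-2.0.so.0 => /usr/lib/libSDL2-2.0.so.0 (0x00007f1a2c000000)\n"
    "\tlibc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f1a2ba00000)\n"
)
print(libs_to_bundle(sample))  # -> ['/usr/lib/libSDL2-2.0.so.0']
```

The copied libraries then go next to the executable with a launcher script setting LD_LIBRARY_PATH, which is roughly what the Steam runtime does in a more disciplined way.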


I found that the first few were actually indie, but progressively the bundles became more and more un-indie: more DRM, people buying for Steam key resale, etc.


It sounds like they just want it to not crash


Many Linux users would pay for games, and they would pay more for a Linux version than a Windows version.


Maybe, but many Linux users would also complain that the Linux version is more expensive.


It may be better to solely target SteamOS, Valve's GNU+Linux distribution, and let the other distributions adapt to and follow SteamOS.


Seems to me there are no sales on Linux because the game barely works on Linux? Maybe you'd have a bigger sales ratio if you fixed those 20%?


Anybody competent enough to use Linux is competent enough to dual-boot into Windows. I would love first-class support for Linux gaming, but dual-booting is such an easy fix for me that I frankly don't care that much about Linux support.

I totally get there are other potential issues revolving around having to use windows, but for me at least, I don't really have any issue with using windows for games.


I was one of the PA Kickstarter supporters and played the game on Linux. It never worked flawlessly (graphical glitches). Today, however, I can play the latest DOOM (through Steam) on Linux using Vulkan without problems, on an AMD card with open-source drivers. There is no reason not to build games for Linux: just use Vulkan and graphics problems are a thing of the past.


As I recall, the development of Planetary Annihilation was a clusterfunk of bugs and really bad performance. I kickstarted it, and remember getting it to work on Linux. But even though it ran, it did so at 15 frames a second, even on a powerful machine. I actually just gave up on the game when the developers released it, even though it was still unplayable for me and, going by Steam reviews, numerous others (mostly Windows users, I would guess).

I guess it makes sense now that they 1.0'ed a barely working game, with such animosity towards users reporting issues.


It barely worked on Windows. I don't know what he's trying to pin on Linux.


How do they measure the 0.1%? Many times I've purchased games because of the promised Linux support, and played it on Windows until the promised Linux support arrives.


Judging by its Wikipedia article this game had Linux support since the beginning: https://en.wikipedia.org/wiki/Planetary_Annihilation#Release


I'm sure it is measured by number of installs or downloads. Either way the statistic is still very telling. If 0.1% of installs/downloads/purchases created 20% of the bug reports, I can understand the developer's frustration.


The stats might be measured differently: Steam might not report a Linux user as such if they played on Windows at first (or something like that), while the tickets might be correctly labeled as Linux-specific by the user.


For sure, it's harder to understand what platform people WANT to play it on based on an action they performed at checkout.

I know personally, when it comes to purchases on Steam, I've bought ~200 games/applications, and I've probably only installed a third of them.


The original release was online only, so I imagine they have a fairly accurate idea of the number of clients connected, even when the same user is using multiple different clients.


Most of the people I've met over the years who are tech literate say they have some flavor of Linux installed, love it, but keep Windows around because 'gaming.' I would venture the number of folks who had Linux installed but didn't buy the game for that platform because of instability is much higher than 0.1%. It's a chicken/egg problem.


Considering my graphics driver is buggy enough to crash the KDE panel, I'm not at all surprised games on Linux have issues.


Well if the game crashes a lot on Linux then those players would likely switch to an alternative platform. This becomes a self-fulfilling prophecy where games don't run well on Linux and game developers don't see Linux gamers as a large demographic.

If most games ran smoothly on Linux, you can expect the % of Linux sales to rise... no?


>There is no denying that there are plenty of issues with some parts of Linux, but still I believe it will become predominant gaming platform, especially in the more "appliance" space. It is worth investing in the know-how early.

Bold claim to make. I doubt Linux will ever be a predominant gaming platform.


Umm... Phone games? All Android devices are based on Linux. Android is a major player in the mobile gaming industry.


Which skills/tech that you use for Android games are "Linux"-specific and would transfer from/to a desktop Linux system or another Linux platform (instead of being Android-specific, or specific to an engine, etc.)?


Quite a bit, actually. If you've only worked with the Android Java ecosystem you probably won't see it, but if you're working with the Android NDK, as many game developers are, you're frequently using the same libraries and compilers you'd need to get your game working on desktop Linux.

Especially when it comes to things like graphics libraries and game engines: OpenGL, SDL, and even higher-level engines like Unity are quite popular on both platforms. You could say this is just due to having cross-platform libraries, but the platform-specific steps for them (compiling, linking, initialization) are definitely more consistent between Android and desktop Linux than they are going to Windows, for example.


Anything from Android can be forked into any number of Linux distros you want to make. So none of it is Android-specific, really.

For example Android software worked on Jolla's Sailfish OS.


Sailfish has a crappy Android simulation layer. You don't really want that on the Desktop. Android and Desktop Linux are completely different platforms, except maybe for the graphics APIs (Vulkan/OpenGL).


I'm not saying I would want that. My original point was that the person I was replying to said they didn't believe Linux would ever be a predominant gaming platform. I'm saying it already is.


Would you say the same about FreeBSD gaming then, given the PS4 runs FreeBSD?


You can easily fork Android and create an OS on top of it that connects with the Android ecosystem. Can't do that with PS4 and its OS and ecosystem.


The tweet was about a game on desktop platforms Windows, Mac, and Linux. The discussion was not about mobile games.

https://en.m.wikipedia.org/wiki/Planetary_Annihilation


Steam and the work that Valve has done on the Linux side seem to show that's not the case. (They've managed to port quite a lot of games to Linux.)

This sounds like a case where a platform that wasn't a priority for their business showed its bugs.


I game on Windows because of the momentum behind it but Linux support is a selling point that goes into every purchase, because the day I migrate I will still have my purchases.


2007's 'Supreme Commander' is still by far the best game in that series, and the best RTS ever, imo. It's still going strong, and has an awesome community-built matchmaking service that effortlessly supports mods and maps, called 'Forged Alliance Forever' or 'FAF': https://www.faforever.com/

Imo the idea of projecting an epic RTS onto a sphere is just bafflingly bad. In Supreme Commander, you zoom out the view to get a strategic overview of the entire map. You can't ever do that with a sphere. I am just bemused by this game.


I'm aware that if you buy a game on Windows and play it on Linux, it might be tagged as a Windows game purchase. Wonder if this 0.1% truly tells the whole story.


I should note Android runs on a Linux kernel. It has been estimated that games like Pokemon Go have made over $1 billion in revenue from Android devices - larger than their revenue from iOS. Candy Crush Saga, Clash of Clans, Dragon Ball Z, etc. do well on Android as well. So the "would totally skip Linux" advice is not heeded by every company for every platform.


This comes up all the time from people who usually lack Android developer experience, especially with the NDK.

Nothing on the Android app stack exposes the underlying Linux kernel to app developers.

Yes, it is possible to access certain paths or syscalls; however, none of them are part of the public API, nor are they guaranteed to work across devices.

Only OEMs have direct contact with the underlying Linux kernel, which could be replaced tomorrow without any issues for app developers, e.g. Fuchsia.


Linux in the thread specifically refers to the Linux Desktop, and even more specifically to the only two supported distros: Ubuntu and Mint. Citing the success of an Android game does not in any way invalidate the advice.


Pokemon Go sold because it had the "Pokemon" IP. This is a very bad example to generalise from.


It would be nice if companies could ship to Linux and just say "yeah, good luck with that", doing some fixes where they feel the time is well spent. But unfortunately that's not acceptable in the market either, so for many it's either all-in or nothing.


While I am a Linux user, I boot my workstation into Windows for gaming. So I would not even be counted as a Linux gamer by the stats.

While rebooting is a bit cumbersome, I don't really mind now. I also wouldn't trust all the games I play to run as my user on my workstation, and it would take a bit of effort to properly sandbox them.


Pretty much every game runs fine on my Linux box. You guys are just doing it wrong. ;)


And these numbers are flipped for Mac users.


I'm sure those statistics are entirely real and not pulled out of his ass at all.

Also, didn't Planetary Annihilation have that awful Chromium-based UI?


So, free beta testing?


> In the end they accounted for <0.1% of sales but >20% of auto reported crashes and support tickets (most gfx driver related). Would totally skip Linux.

OK. So just another dumb developer being wrong. Nothing to see here, move along.


Developing software for Linux is like painting a moving train.


SDL has been binary compatible across its major release (2.0) and probably will be forever at this point. libstdc++ only broke its ABI once, when C++11 required it.

What else do you need to make games for Linux? If you are using Unity or Unreal, it's going to statically link all its dependencies anyway. Don't use libraries that break semver.
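The semver rule being relied on here boils down to: a newer build of a library can stand in for an older one only if the major version matches (and, below 1.0.0, only if the minor matches too, since 0.x minor bumps may break things). A tiny sketch of that compatibility check:

```python
def semver_compatible(installed, required):
    """True if `installed` can satisfy a dependency built against
    `required`, under the usual semver rule."""
    i = tuple(int(x) for x in installed.split("."))
    r = tuple(int(x) for x in required.split("."))
    if i[0] != r[0]:
        return False                     # major bump = breaking change
    if i[0] == 0:
        return i[1] == r[1] and i >= r   # 0.x: minor bumps also break
    return i >= r                        # same major, at least as new

print(semver_compatible("2.0.9", "2.0.4"))  # -> True  (any SDL 2.0.x works)
print(semver_compatible("3.0.0", "2.0.4"))  # -> False (major bump)
```

Under this rule, any newer SDL 2.0.x can replace the 2.0.x a game shipped against, while a hypothetical 3.0.0 cannot — which is why sticking to libraries that honor semver makes Linux binaries age gracefully.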


SDL is a 2D library. Any modern game will use 3D APIs, and there you'll quickly hit the insane minefield of OpenGL bugs that are graphics drivers on Linux. Those are pretty much optimized for driving desktop compositors; for everything else, it's a stroke of luck if things work as defined in the spec.


Any modern game should be using Vulkan for just that reason.


I really, really don't get it. Just bundle all your dependencies, what is so hard?


But wasn't this supposed to be the year of linux on the desktop ?


Also, I wonder how many of those crashes would have happened on Windows or Mac. I suspect the code is shared and they would have happened there too.


"gfx driver related" code is unlikely to be shared between Windows, Linux and OSX ...


Nvidia shares a huge chunk of its (proprietary) driver code across operating systems.

I suspect the crashes are coming from bad OpenGL usage since he mentions the same thing happens with Mac.


Then redirecting those issues to the right mailing list could still be very welcome to someone.

IMO the issues do not need to be fixed by the game developers, and driver developers might be happy to get them along with some code that reproduces them. Not sure why the game dev complains here.


Because dealing with the users who experience the problems costs more than they make in sales? If every additional Linux sale costs you more than it makes?

There's being a good citizen. There's wanting to offer support to a small set of customers. Then there's losing money. At some point, you just can't afford to service the long tail.


If the issue the user has found is in the driver, then redirecting them there or maybe sending an automatic email for each unique issue to the driver mailing list might help fix the issue without much effort on the game dev side.



