> * Extroverts: some people just prefer to be in an office, surrounded by and interacting with lots of people, rather than sitting at home in relative isolation. (I'm definitely not one of them, but I have good friends who are like this.)
You know what, since the dawn of work - it's always been the extroverted way. Introverts suffered through. Now that the tide has changed, let the extroverts suffer too, it's their turn. We have done our part.
> I think that folks overseas aren't as capable of communicating with Americans as other Americans are
I speak Italian, English and French. I speak English to you because it's the only language you know. We are not the same.
> I think that American tech companies would prefer a motivated at-will employee at 3x the cost of an unfirable European with a statutory month off every year.
We work to live, not the opposite. Again, we are not the same.
> I speak Italian, English and French. I speak English to you because it's the only language you know. We are not the same.
I have worked with overseas coworkers who spoke English. You're right, it's not always the same as having native fluency.
> We work to live, not the opposite. Again, we are not the same.
You're making my point for me - offshoring work fails for cultural reasons, not because overseas workers are dumb. Making work remote is not gonna change the cultural factors.
Also a very funny take because if you have ever visited the Bay Area, you can't throw a rock without hitting someone who struck it rich and retired at 30.
> Also a very funny take because if you have ever visited the Bay Area, you can't throw a rock without hitting someone who struck it rich and retired at 30.
What does that have to do with companies looking for "motivated" Americans?
Also
> This is a gigantic effort from Larian, who, among other things, is still updating its software instead of resting on its laurels.
What makes this story even better is how it actually came about - this wasn't initially a top-down corporate initiative, but rather a passion project from a single engineer who worked on it after hours. The fact that Larian immediately recognized the value and threw their full support behind it says everything about their culture.
Swen Vincke shared the backstory:
> The story of how this came to be really is one of true passion. The Steam Deck native build was initiated by a single engineer who really wanted a smoother version of the game on Steam Deck and so he started working on it after hours. When we tried it out, we were all surprised by how good it felt and so it didn't take much to convince us to put our shoulders behind it and get it released. It's this type of pure passion for their craft that makes me fall in love with my developers over and over again. Considering myself very lucky to have people like him on my team. Try it out!
They probably would have to get the permission of the engineer to name them publicly. With how the gaming community behaves on social media I wouldn't be surprised if the engineer doesn't want that. Because that could mean death threats for you and your family the next time a subset of the community gets upset with your employer.
Not sure why this is getting downvoted, you are absolutely correct. The unhinged weirdos are still a minority, but less and less ashamed of their own behavior online. No doubt that dev is better off remaining unnamed in this instance.
They may be a minority but they are more empowered than ever. Both by the new owner of Twitter and the current politics in the US.
It’s a shame that large companies like EA/Bethesda/Valve/etc. don’t do more to fight against it, instead of cowering and leaving indie devs that are barely surviving to fend it off themselves.
> all of mid/late 2010s online politics was colored by one reviewer giving a favourable game review to a game that some people disliked
That's kind of a twisted interpretation of events. It was coloured by one incel who thought he owned the developer of a game, and a whole lot of incels who sympathized because they too felt they were owed a vagina by the ones who controlled them. Now it's spread to broader issues and higher levels of politics and is still going.
I remember the start of GamerGate well; it was all people screaming about "ethics in games journalism". But you're obviously right that it wasn't really about that - your description is probably a better reflection of the actual psychology of the people involved.
And then there are people, gamers, who were genuinely just dismayed by the conflicts of interest that ran rampant in the orthodox "games journalism" space and didn't give two shits about the personal-drama side of the story (a problem mostly solved these days by finding your favorite YouTube reviewer). Those who were genuinely focused on improving discovery of good indie games were subjected to some pretty horrible commentary that completely missed the point. Now there are smaller dedicated publications and channels that actually do regularly (weekly/monthly) review a decent volume of promising new indie games to help discover standouts, but that turned out to be a niche the existing publications didn't want to keep up with, and a niche that suddenly many people denied even existed, for some reason. People who can't contemplate that there are amazing passion projects out there to be discovered, I suppose, because they can't imagine actually working hard on something people would enjoy and would rather spend their time raining on others' parades instead.
But it was too close a tangent toward criticism of establishment journalism in general, so of course establishment journalism countered with the only weapon it has, and suddenly the vast majority of people forgot any of it had to do with reviewing and promoting good indie video games.
People who make indie games are not losers. People who want good games to be promoted are not losers. It is an art. It's not for everyone. People who just want to play the latest AAA sequel can stick to those. But if you've ever tried a niche indie game and been more impressed than you expected, you know it's art, and you'd want other people discovering and promoting the good ones, and talking about what makes them special.
I am not going to re-litigate GamerGate here. There were people who were genuinely concerned about ethics in games journalism, sure. But it did not become the defining event in the online-political sphere of the mid/late '10s simply due to genuine concerns about ethics in games journalism.
Correct, because a large portion of the public has no idea what indie games are, or how the software industry works, but they know that angry nerds are funny.
What I remember is that there was a subset of people I was acquainted with online who, when this started, all /immediately/ started posting things exactly like the comment this is a reply to; "these people just don't respect women, you all need to sit down and listen to women and center women" kinds of things. They were all men, mostly straight men although some were bi, and all generally thought to be fine, if known for being a little performative and mildly, as they say, horny on main a little too often.
Every single one of them later turned out to be a sexual predator. This type is now known as the "softboi" or "male feminist". This kind of person is still out there and is as dangerous as ever, so it's important to keep an eye out.
(None of these people were in tech; instead all my tech coworkers who were men and lived in SF also heard "we need to respect women", but being kind of autistic engineers took it too literally and didn't seem to know any women, so they seemed to think the right thing to do was go out and find a woman and literally just start respecting them. This didn't work out for them and they mostly ended up getting scammed by scammers who happened to be women.)
No, you are the one twisting it. It was about conflicts of interest, regardless of how hard you try to throw around ad hominems and rewrite history.
A game reviewer should not be in sexual relationships with people selling games that get reviewed. I think anybody not ideologically captured would agree.
I also find it tasteless to use the same rhetoric here as it was used back then to slander someone into suicide.
Not that I'm aware of. I thought that was weird at first as well, but I assume it might be in a way to protect the engineer.
Unfortunately, singling out any individual developer, even for praise, can attract unwanted negative attention online. By acknowledging the passion and the work without naming the person, Swen gives them full credit internally while shielding them from becoming a public target.
This doesn't even necessarily have to be intentional harassment: if this engineer is now the "Steam Deck guy" at Larian, their personal social media accounts might get flooded by people who mistake them for a support channel.
I'm sure the engineer has the option to self-identify if they wish, but this approach feels like a sign of good and thoughtful leadership.
This is an interesting perspective... I'd be at a loss to think of an example of an engineer who's been publicly pilloried (having been highly regarded for great work) for the failings of their company. Perhaps you could cite an example?
Seems enormously more likely to be the all-too-familiar story in the games industry of not providing credit to individual devs - something that goes back to the earliest days of Atari.
> I'd be at a loss to think of an example of an engineer who's been publicly pilloried (having been highly regarded for great work) for the failings of their company. Perhaps you could cite an example?
Because these guys and gals are not famous enough to warrant large coverage, and because the phenomenon is unfortunately so widespread that no one is going to cover every case.
Thanks, really appreciate the concrete examples. They're not quite what I was referring to (developer praised by company / media - then attacked for issues with the company beyond their purview), but they do point to a (largely invisible from outside the industry / twitter bubble) truly worrying and frightening level of animosity and aggression pointed towards devs that I wasn't sufficiently aware of.
I don't think you need a case quite this specific because of the following:
> then attacked for issues with the company beyond their purview
Ultimately, whether an employee is praised or not is completely irrelevant to the nutjobs taking their anger out on them because of something their employer did.
I agree. It's bad in either case. No issue with a game or game engine should ever result in threats of violence or harassment. It's vile to publicly shame or cancel, let alone attack, individuals for the mistakes of their companies.
My initial skepticism was based on the voluminous amount of false allegations of harassment, and of valid criticism misportrayed as harassment, that happened at one point several years ago in the games industry.
I'm not necessarily saying they'd get pilloried. I'm saying that having your personal digital space colonized by people who think you're customer support is insanely disruptive.
Think replies full of "I only get 8 fps in Act 3, pls fix" when you just wanted to post a photo of your vacation.
I can't think of specific names anymore since it's been a while since I have played it, but a lot of the developers for World of Warcraft used to be and likely still are active on Twitter. For a lot of them, the community knew fairly well which features of the game or which class they were responsible for. When I used to look at the replies to some of their Tweets (even ones completely unrelated to WoW), they were often full of complaints about their area of perceived responsibility.
I fully understand every engineer who just wants to put their head down and work on their stuff they're passionate about without having to also be public-facing. Even in a small company like mine, some of our devs constantly complain that some customers know that they are responsible for certain features of our product and email them directly rather than going through the proper support channels.
Your point about the games industry often struggling with providing proper credit to devs is well taken - it's absolutely an issue. But in this case, Vincke did actually do that, in a way. He could've just kept quiet and let the playerbase think it was a company effort, but instead he publicly highlighted and recognized the passion and work of one of their engineers (even though anonymously). That engineer can look at the countless positive replies to that post and get the nice fuzzy feeling without getting dragged into the spotlight.
I take your point about being inadvertently made a point of contact for customer support / complaints about technical issues with the game.
Disagree, however, about the value of credit - personal credit has concrete value (career-wise, status-wise, etc.), warm and fuzzy feelings less so. Right now we can only guess whether the dev had a say in the matter.
You're absolutely right that named credit has tangible career benefits that go well beyond feelings. But I think Vincke threaded that needle well with the anonymous public credit - it creates a documented public record of innovative work at the company level while preserving the engineer's privacy.
The engineer can still leverage this (LinkedIn, internal promotions, industry networking) without being forced into a public-facing role they might not want. When they're interviewing or networking, they can point to Vincke's public acknowledgment and say "that was my project" in contexts where it's professionally relevant, without having their personal social media permanently associated with it.
Considering Vincke was impressed enough to publicly acknowledge this individual's passion and initiative, there's no doubt in my mind that this engineer could get named credit or something that would acknowledge their role in the project if they wanted it.
But to go a bit meta:
I think it's strange that we are discussing this in the context of a CEO publicly acknowledging one of their engineers (even if anonymously). Vincke is, at least in the context of the broader industry, going above and beyond. I doubt you'd see Ubisoft, EA, or Blizzard publicly acknowledging a single engineer's after-hours passion project in this way.
Feels a bit like misdirected energy, I guess?
Why are we debating about the nuances of named vs anonymous credit and recognition when industry leaders don't give any?
It's like calling someone out for only tipping 10% while ignoring the guy in the top hat who's tipping 0. If you want gaming companies to get better about giving credit and recognition, you should support the companies that are at least moving in the right direction. I know it's easy to be cynical, but don't let perfect be the enemy of good.
I'd cite that as an example of the tyranny of diminished expectations. To be clear, I was criticising not providing named recognition; of course providing some recognition is better than none. Perhaps you're right, perhaps the engineer involved can leverage this in interviews (or perhaps not - it might be difficult to prove, be NDA'd, etc.), but you're giving the CEO the benefit of the doubt here.
I very strongly agree all creative workers should receive fair recognition (and compensation) for their work. I disagree with directionality as a moral framework. Doing something similar to the right thing is not necessarily doing the right thing. In this case my immediate assumption would be that the CEO is boasting about their anonymous hardworking impassioned employees as a way of 'glazing' the company, rather than shielding them from public criticism. It's impossible to know, but CEOs are not generally known to be good and ethical people. Larian may well be exceptional in this regard, but giving the benefit of the doubt to CEOs in general is a poor heuristic.
I've worked enough with customers to know they're mostly fine, until you get that one weirdo that finds out where you work and follows you home. You get a few every year. Knowing that, who would want their name associated with something in a space that produces as many incredibly motivated folks as the videogame industry?
The Steam Deck is really not that limited. Every game could be made to run well on it if some time were spent actually making the low settings work well - something often skipped in modern games, which optimise only for people with a $1000 GPU chugging 400 W.
It's not like we have seen anything in gaming that wouldn't be possible on PS3/Xbox360 era hardware, certainly not in terms of complexity.
Just remember that stuff like Red Dead Redemption ran on those things with all of 512 MB of unified memory. It ran and looked better than Borderlands 4 does on current consoles.
I think you're looking back with rose-tinted glasses.
The 360/PS3 was a huge jump forward but very limited by today’s standards. RDR was one of the better looking games of the generation but could not maintain a steady 30fps at 1080p/i (and I’m not sure it was even true 1080).
The PC version came later, had higher resolution textures and other graphical improvements so it compares more favourably to modern games when you play it today. It still had problems running on all but the highest-end PCs of the time.
Of course even low-end PCs can run it without breaking a sweat, because they’ve become much more powerful.
Most Xbox 360 and PS3 games were 720p at 30fps. 720p was mostly fine because 1080p TVs were luxury items back then.
The performance problems in modern games are often not caused by fillrate-vs-resolution bottlenecks though, but by poor engine architecture decisions (triggering shader recompilations in the hot path).
Shader recompilation causes stuttering, not general performance problems. Shader complexity will, though, and that is a function of render quality.
But I'm confused about why you think fill rate isn't an issue? If you are now upgrading from 1080p to 4K, your GPU needs at the very least 4x the pixel-pushing power, and even then that's only to maintain the same detail; you bought a 4K screen for more detail.
> But I’m confused about why you think fill rate isn’t an issue?
Because this can be dealt with easily via upscaling or by buying a more expensive GPU, but fixing shader recompilation in the hot path requires a complete engine redesign.
There aren’t faster GPUs affordable to most consumers; that’s the point. Yes, DLSS is used as a crutch because it’s easier to do AI upscaling than to render at a higher resolution.
You don’t need a full engine redesign. UE5 provides tools for PSO bundling and also pre-caching, but you need to use them.
Also, good material design and structure help reduce the number of PSOs needed, but again, you need knowledge of how the engine's materials system works.
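For illustration, here's roughly what opting into those mechanisms looks like in config. Treat this as a sketch from memory: the exact cvar names and defaults vary by UE5 version (automatic PSO precaching landed around 5.1), so verify against your engine release before relying on any of it.

```ini
; DefaultEngine.ini -- sketch only; confirm cvar names for your UE5 version.

[SystemSettings]
; Automatic PSO precaching: compile likely pipeline state objects during
; loading instead of on first use in the render hot path.
r.PSOPrecaching=1
; Use bundled PSO caches (recorded during playtests) at runtime.
r.ShaderPipelineCache.Enabled=1

[DevOptions.Shaders]
; Emit stable shader keys so recorded PSOs can be matched back to their
; shaders when the bundled cache is built.
NeedsShaderStableKeys=true
```

The other half of the work is exactly what's described above: recording PSOs during representative playthroughs and keeping material permutations under control so the cache stays both complete and small.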
Presumably people do this because they hate money; as you say, it's much harder to make the pixels just slightly more crisp and you'll pay dearly for the privilege.
I might be misremembering, but I seem to remember most games of that era were 540p scaled to 1080p. 720p would have been an upgrade. But your point still stands.
Remarkably, RDR1 was only released for PC late last year, ~14 years after the original release.
Maybe that is even related to its good performance on consoles back then: Rockstar invested a lot of development time and sacrificed portability for performance - basically the opposite of what modern games achieve with Unreal 5.
The Deck can run Witcher 3 and MH:World decently (maybe with some hiccups and lower graphics settings). It shouldn't be a big problem to make games run on the Steam Deck (ignoring controller support, since that's a separate matter).
I tried CP2077's Deck mode, but it seemed more like a tech-demo level of "you could do this if you really wanted to" than something actually playable.
The game felt like it had significant input lag, and at 720p with upscaling, text becomes very hard to read. The game's visual style of "glitch" effects also translates badly with upscaling, and I had a really tough time understanding what I was looking at on the screen.
I thought it was playable on the LCD Deck. I did turn things down below what the Steam Deck preset was at. It certainly wasn't the smoothest 100% of the time but it was better than Fallout New Vegas on a PS3 IMO. It still holds up pretty well against the Switch 2 version in handheld mode.
That's true for GPU bound games but with CPU bound games like BG3 in Act 3 there's no easy toggle on the user side, and often no easy toggle on the dev side either, because the nature of the game necessitates CPU intensive work.
The problem is how horribly unoptimised Unreal Engine 5 itself is - with that sort of foundation there's not a lot you can do. The Deck has a GTX 1050-equivalent GPU; there's only so much that can be expected of it.
UE is easier to ruin a project with but it's not inherently cursed.
The real reason many of these games run like shit is over-reliance on real-time lighting systems. RT lights are easy: you can throw a bunch of artists into a box and hope for the best, and a complete idiot can make a scene look mostly good without much thinking. Baked lights require a lot of anticipation and planning, impact iteration time, etc. The tradeoff is that baking is orders of magnitude more performant than RT lights. Imagine watching Toy Story after the offline render vs attempting to render it live. This is literally the same scaling problem.
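To make that scaling argument concrete, here's a toy sketch (plain Python, nothing engine-specific; `expensive_lighting` is an invented stand-in for a real global-illumination solve):

```python
# Toy model of the baked-vs-real-time lighting tradeoff.

def expensive_lighting(point: tuple[int, int]) -> float:
    """Stand-in for an expensive global-illumination computation."""
    x, y = point
    return sum((x * i + y) % 97 for i in range(10_000)) / 10_000.0

SURFACE = [(x, y) for x in range(64) for y in range(64)]

# Baked: pay the full cost once, offline (the "Toy Story render farm"),
# after which every frame is a cheap table lookup.
lightmap = {p: expensive_lighting(p) for p in SURFACE}

def baked_frame() -> list[float]:
    return [lightmap[p] for p in SURFACE]

# Real-time: pay the full cost again on every single frame.
def realtime_frame() -> list[float]:
    return [expensive_lighting(p) for p in SURFACE]
```

Baking moves the expensive solve out of the frame loop entirely, which is where the orders-of-magnitude gap comes from; the price is that moving a light means re-baking, which is exactly the iteration-time hit mentioned above.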
UE has always been a damn huge toolbox. Yes, sure, you can just cobble together all sorts of libraries and get a visually very appealing game, or, if you want, photorealistic rendering decent enough to back those giant virtual studios for triple-A blockbuster movies, but you will need the hardware to match if you want performance.
If you want performance on everyday hardware, there is no way around it (and I'd say this holds true for any engine, not just UE5!): you have to dig down into the engine and the libraries and invest the money in testing to tune the performance appropriately.
When EVERY game stutters and has the same kind of issues, you can't put the blame on individual developers.
This isn't a case of "these developers are lazy", UE5 issues are the case of "every single UE5 released game has shader stutter issues on PC". That's an issue with engine architecture and its APIs, not an individual thing.
> This isn't a case of "these developers are lazy", UE5 issues are the case of "every single UE5 released game has shader stutter issues on PC". That's an issue with engine architecture and its APIs, not an individual thing.
Just because an engine offers you a way to shoot yourself in the foot with a sawn off shotgun, you can't blame the engine maker when you do shoot yourself in the foot with a sawn off shotgun and end up with a bleeding ugly stump.
The thing is, of course game studios will go for "we want to use ALLLLLL the newest features, we want to show off with Nanite and god knows what else". Who wouldn't? But game studios aren't willing to put in the effort surrounding such an implementation to properly tune it.
And it's not just tuning engine components, for what it's worth - often enough the culprit ends up being ridiculously oversized textures; there's nothing else that could cause dozens of gigabytes worth of patches [1], and it's not a new complaint either [2].
It's not that I think UE5 is good for low-end hardware; it's not.
One of the reasons a lot of studios struggle with bad performance on UE5 is that a lot of studios fired their most experienced devs and hired a bunch of cheaper new programmers, because they bought into the whole make-games-with-Blueprints idea.
I have several friends (I know, just one datapoint) who were in the games industry for 6 to 12 years and got fired, just for the studio to replace them with cheaper, more inexperienced devs.
Basically, UE5 overpromised how easy it was. You still get some great working games that use UE5, but those are from studios that have experienced devs.
It’s not terrible on low-end hardware. Fortnite has been able to run on phones for a long time now. It’s not as lightweight as Unity or Godot by any means, and they still remain the optimal choice for low-end platforms.
What you can’t do is hit compile out of the box and expect it to work well on those low-end platforms, because it will try to use all the high-end features if it thinks it’s allowed to.
I don’t think it exactly overpromises how easy it is, but unlike a lot of software it has a learning curve that seems gentle at first and then exponentially increases. It’s high-end AAA-grade development software aimed at professionals, it expects you to know what you’re doing.
Agree. Oblivion Remastered on low settings looks pretty bad while also performing dreadfully. And it's not that the hardware is too limited, because Skyrim looks better than Oblivion on low while still running at a full 60fps.
Yeah. I just ran Goblin Cleanup, Mars First Logistics and Peak on a Framework 12 - that’s an Intel integrated GPU. They all ran fine. Just a solid reminder that you can actually make a fun and good-looking game without asking the player to spend hundreds to thousands of euros on future landfill waste.
Or in the case of Borderlands 4 and a plethora of other Unreal Engine 5 titles: they’re optimized for nothing and there aren’t even options to turn off most of the expensive graphical effects, despite the engine being able to scale down to mobile devices.
This is absolutely unacceptable, and if it happens with nearly every big release, then that also speaks badly of the engine itself. It's similar to how languages like C++ are very powerful and can be used to great effect... and people almost inevitably still write code that has memory safety issues. That comparison should make a few ears perk up; my point is that fewer developers should use Unreal Engine 5 if they can't use it well (same as with those languages).
Frankly, I place more trust in studios that have their own engines or use literally anything other than UE5 - see KCD2, a modern game that looks good and runs great across a wide variety of hardware, or how they fixed Cyberpunk 2077: it took a while to get there, but now both the visuals and performance are quite good across the board.
> In parallel I don't understand gamers with 15 years old hardware leaving bad reviews or whining when a game chokes above 720p with minimum settings.
Because they bought the game. After decades of PC gaming, it's totally absurd that there is no system that tells you how well or how badly a game is going to play on your system. And if that's too difficult to build, how can we expect regular people to figure it out themselves?
As soon as what you have in your machine doesn’t literally match the stated system requirements, you’re on your own. It’s up to the user to research and understand which CPU or GPU is "better" or "worse" than the required one. These things are nontrivial when comparing between generations and across tiers, not to mention across different vendors.
A knowledgeable user might be able to predict their performance reasonably well based on publicly available benchmark databases, but you still can’t really get a good FPS estimate unless you find someone with exactly your hardware setup who benchmarked the game (and is willing to share).
Most minimum/recommended game specs reference mainstream gaming CPU/GPUs, and most gamers know the strength of their own hardware relative to mainstream components.
If you're a very casual/young/inexperienced gamer then sure, you might have trouble comparing your own system with the min specs.
Which is so bad it barely means anything for lower-end PCs. I played and enjoyed plenty of hours of Elden Ring while rocking hardware well below the minimum requirements.
Steam could probably build in a system to guess the performance if there were some benchmarking data, but game performance can change dramatically after release, between updates to drivers or the game itself.
I think one factor in this is that PC gamers are hostile to telemetry, and they couldn't give a damn whether the reasoning for it is advertising, real-world feedback on game design that would feed into future patches or the next game, or a mutual benefit of "hardware like (this) generally performs like (this) at the low/med/high quality preset".
> I think one factor to this is that PC gamers are hostile to telemetry
Is there any data to support this? IME most PC gamers I know don't give a shit about telemetry. They are stock Windows and Android users, love Google products, etc.
They only care whatsoever when it comes to adblocking, because they don't want to watch ads.
Steam makes it easy to get a full refund for a game you don't like, for any reason. So there's no risk in trying an install of a game that might not work well on your below-spec device - but then you shouldn't give it a negative review.
Unless most of the problems come later on, after the 2 hours of game time.
I've heard about multiple games that were Steam Deck verified but had choppy performance. If it can't hold a steady 30fps, a game shouldn't be Steam Deck verified, in my opinion.
It could be survey based. Heck, it could be coupon based. Something like:
1. Enroll in the discount program by running the Steam hardware survey. Steam holds onto your system specs.
2. Steam offers discounts for games that have insufficient benchmarks for your rough system.
3. For these games, steam collects performance data (either 5 minutes of benchmark before, during the game, first run, or maybe when the PC is idle (screensaver mode)).
There's all sorts of ways they could do it. I'm guessing a large portion of people would be fine with a "Folding at home"-style system that just runs benchmarks as a screensaver (with some coupons or whatever granted) - roughly sketched below.
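To sketch what steps 1-3 might look like (entirely hypothetical - every name below is invented, and nothing like this exists in Steam's actual API):

```python
"""Toy sketch of the survey/coupon benchmark idea described above."""
from dataclasses import dataclass
from statistics import median

@dataclass(frozen=True)
class SystemSpecs:
    """Step 1: the specs Steam would hold onto after the hardware survey."""
    gpu: str
    cpu: str
    ram_gb: int

# Crowd-sourced samples: (specs, game) -> list of reported average FPS.
BENCHMARKS: dict[tuple[SystemSpecs, str], list[float]] = {}

def record_benchmark(specs: SystemSpecs, game_id: str, avg_fps: float) -> None:
    """Step 3: store a sample collected in-game, on first run, or while idle."""
    BENCHMARKS.setdefault((specs, game_id), []).append(avg_fps)

def discount_eligible(specs: SystemSpecs, game_id: str, min_samples: int = 20) -> bool:
    """Step 2: offer a coupon when this hardware config is under-benchmarked."""
    return len(BENCHMARKS.get((specs, game_id), [])) < min_samples

def estimate_fps(specs: SystemSpecs, game_id: str) -> float | None:
    """Median of samples from identical hardware; None if no data yet.
    A real system would also fall back to 'nearby' hardware tiers."""
    samples = BENCHMARKS.get((specs, game_id))
    return median(samples) if samples else None
```

The hard parts this glosses over are exactly the ones raised elsewhere in the thread: performance shifts with every driver and game update, so samples would need to expire, and "nearby hardware" is nontrivial to define across vendors and generations.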
> In parallel I don't understand gamers with 15 years old hardware leaving bad reviews or whining when a game chokes above 720p with minimum settings.
IMO it's because a lot of these newer games just don't need that much horsepower. BG3 is not one of them, but looking at the broader industry.
A lot of times we're seeing maaaaaybe a 5% bump in fidelity or graphics quality in exchange for a 4x performance cost.
Like ray tracing. Does ray tracing look good? Yes. But not that good. It's not the PS1-to-PS2 jump. I've seen baked lighting indistinguishable from ray tracing in 99% of scenes.
It's just not a good trade-off in modern games, usually - unless they really optimize them.
The only company still optimizing its games is Nintendo, from what I've seen.
There is an interesting discussion about the need for ray tracing in one of the recent Digital Foundry videos. The argument goes that baked lighting is sometimes impractical due to the size of the maps and the amount of dynamic lighting you need; in the latest Doom game, the light maps would be hundreds of GBs. But I guess most other games are fine with baked lighting.
There are also much cheaper methods of dynamic lighting than real-time ray tracing. You can approximate, you can cheat, and it will look almost as good.
> I don't understand gamers with 15 years old hardware leaving bad reviews or whining when a game chokes above 720p with minimum settings.
Depends on what the game can be reasonably expected to run on. Most games don't even approximate what would be technically possible on today's hardware and waste your electricity on lazy coding instead. "15 years old hardware" is what was cutting edge when Crysis 2 and Skyrim came out, so that's not a good excuse in the majority of cases.
> In parallel I don't understand gamers with 15 years old hardware leaving bad reviews or whining when a game chokes above 720p with minimum settings.
I game at 1080p and never have issues with any games I play, though I am on a 3080. It's definitely people trying to max out every setting for the 4K monitor they overpaid for. On the other hand, I might give 2K monitors a try soon.
As for the Deck... it's not a powerhouse, but it's still impressive how much it can run with decent tweaks. BG3 on a handheld at all feels like sci-fi to my teenage self.
> I don't understand gamers with 15 years old hardware leaving bad reviews or whining when a game chokes above 720p with minimum settings.
15 years old? Have you seen many examples of this (I have not), or are you exaggerating to make a point?
Regardless, some very popular gaming hardware from 10-12 years ago is still in use and still very capable in modern games, so long as they allow tuning the graphics down. People running an i5-3570K and RX 480 at 1080p don't generally expect the imagery or frame rates of a modern gaming rig, but it is reasonable for them to expect roughly 60 fps with (for example) low texture and shadow detail, no reflections, static lighting, etc. Perhaps this is what you meant by "minimum settings", but:
While low-spec options like this have been the norm in 3D PC games practically forever, several very popular games released in the past 5 years have adopted anemic options menus that have negligible impact on performance at the low end. To someone with much experience tuning for older hardware, this is a striking and disappointing change. Especially now that gaming hardware upgrades are far more expensive than they were, and more people are struggling just to pay their living expenses.
The change is almost certainly unnecessary. It smells like the developers just aren't putting any effort into it anymore.
And it's not merely disappointing; it's also wasteful, both by pushing older hardware into the landfill and by denying opportunities to reduce power consumption.
Of course they don’t; it would be crazy to promise support for all the different possible distros and configurations people might run when the majority of users are on the Steam Deck. But that doesn’t mean it won’t run, just that if you have issues, they don’t promise to fix them. Seems reasonable to me.
BG3 already ran well enough on Linux, so I imagine this will only make it run better, official support or not.
I'd expect it to work anyway, under Steam at least. There is nothing special about the Steam Deck/SteamOS that's not available on other distros when running Steam, afaik.
Well, most game companies will only tell you the game works on something very specific, say Ubuntu 24.04, and that everything else is untested/unsupported. That doesn't mean the game won't work perfectly fine on other distros, which is usually the case.
The Proton version will always "work better" if no one sets an example and encourages the use of native support. With Proton you are guaranteed never to reach the optimal potential or get the full advantages of the Linux/Wayland ecosystem, while with native versions you at least have a chance of getting there.
It's like judging someone for taking advantage of new CPU instructions that accelerate processing just because the general instructions are already good enough.
Native doesn't automatically mean better - there are quite a few examples of games running better on Proton than with native executables (and yes, we can then start arguing that this just means the native port was done poorly, but I'm just saying: don't assume native will always run better).
It seems like a similar argument around the popularity of third party engines, whether studios should use Unreal, or whether they have the expertise/resources to change to and use another engine, or make their own bespoke engine, and if that will produce better results.
I think that's not a fair comparison. Proton adds an additional layer which affects runtime performance and could be removed entirely. Switching to a different game engine changes the layer's implementation instead of removing it.
When Proton started to get good, there were multiple stories of small game studios just dropping their bespoke Linux builds because the Windows-on-Proton version ran much, much faster and required zero effort from them.
That'd be nice, though at the moment I hope that if the update instead breaks something on Linux - a distinct possibility - I can go back to the Proton version, which has been working pretty much perfectly.
Something I wonder: if this new version is a Linux build specifically targeting the Deck hardware+OS setup, have Larian now committed themselves to following whatever Valve does to that setup in the future? In any case, they've got a fallback, which is the Windows version on Proton, but this inverts how Valve has trained many to behave: make just a Windows version and delegate Linux support to them.
There's also been persistent speculation about whether Valve would take on the burden of releasing SteamOS as a general distribution anyone can install on their own hardware (which I think is unlikely), which could in turn affect how Larian has to treat this port, even if that just means communicating what it is and isn't.
> The demographic for Linux has a higher than average enjoyment of cuteness, especially with regards to anime
... nope. I loathe the art style of anime, and whatever I see using it I associate immediately with trash.
> While some Linux users may be anime fans, Linux is a production grade kernel used in servers, embedded, distributions and even your local gas pump. ASCII anime art undermines the professionalism and neutrality such a project requires.
Even less is needed to create danger: I found myself checking my phone twice during a car trip because, when I listen to music through the USB-C-to-jack dongle, it believes I am blasting music at full volume into my ears and cuts the volume to 10% after 20 minutes.
Don't, and I mean DON'T, decide things for the user.