
I ended up doing something similar a few years ago. Picked up a MacBook Pro M1 Max back when the M1 stuff was new to replace an aging Lenovo running Linux. I actually really loved my Lenovo + Linux, but the M1 was new and shiny and I desperately wanted better battery life.

The hardware was great, but life on a Mac always felt a bit convoluted. Updating the OS was especially frustrating as a software developer because of all the interdependent bits (Xcode, brew, etc.) that often ended up breaking my dev environment in some way. It also always amazed me what was missing. Like, how isn't the default terminal app fully functional after all these years? On the plus side, over the time I used it they did add tiling and the ability to hide the notch.

Finally at the start of the year I moved back to Linux and couldn't be happier. Had forgotten just how nice it is to have everything I need out of the box. The big thing I miss is Affinity Photo, though that looks like it's in the middle of dying right now.


Exactly! I too bought the M1 Macbook Air in 2021 because of its great battery life. I wanted a powerful device for hacking on personal projects at home (I use a Dell running Ubuntu at work) but every time I opened it there was always something frustrating about OS X that made it unsuitable for dev stuff (at least for me)

* Finder - this is my most hated piece of software. It doesn't display the full file path, and there's no easy way to copy it

* I still haven't figured out how to do cut/paste - CMD + X didn't work for me

* No Virtualbox support for Apple Silicon (last checked 1 year ago)

* Weird bugs when running Rancher Desktop + Docker on Apple Silicon

But still, Apple hardware is unbeatable. My 2015 MacBook Pro lasted 10 years and the M1 is also working well even after 4 years.


> * Finder - this is my most hated piece of software. It doesn't display the full file path and no easy way to copy it

View -> Show Path Bar to display the full path of a file.

When a file is selected, press Option-Cmd-C to copy the full file path. Or just drag the file anywhere that expects a string (like the Terminal, or here). That strikes me as quite easy.

Cmd-X, -C, -V work as expected, what exactly is the problem? (Note that macOS, unlike Windows, doesn't allow you to cut & paste files, to avoid loss of the file in case the operation isn't completed. However, you can copy (Cmd-C), then use Option-Cmd-V to paste & move.)

Now, that might not be completely easy to discover (though, when you press Option the items in the Edit menu change to reveal both "tricks" described above, and show the keyboard shortcuts).

At any rate: when switching OS, is it too much to ask to spend a few minutes online to find out how common operations are achieved on the new OS?


FWIW, VirtualBox did eventually get ported to Apple Silicon. Long-time Mac software developer Parallels also offers consumer-grade VM management software; theirs supports DirectX 11 on ARM Windows, which is critical for getting usable performance out of it. Conversely, VMware's Mac offering does not, making 3D graphics on it painfully slow.

There are also a couple of open-source VM utilities: UTM, Tart, QEMU, Colima, and probably others.


Re Finder: you can drag and drop the little folder icon into other apps, which will insert the full path.


In Finder, moving a file is Cmd+C to copy and then Cmd+Option+V, or something like that. Cmd+X and Cmd+V work for text.


I have an M1 Air and an 8th-gen Intel Dell (OpenBSD) and I'm much happier with the Dell for hacking on stuff. MacOS is pretty much a nightmare if your workflow is not apps and IDE centered.


What’s missing in the terminal app?


It is maybe one of the most featureless terminals out there. Slow, poor color support, weird and frustrating permission interactions, limited font options, incomplete terminal emulation, etc.

It has improved a bit over the years and is generally fine if you just need to knock out a few commands. But I don't find it to be a very pleasurable experience compared to the alternatives. It feels very much like Apple implemented "just enough" and no more.


For the tmux-ers, tmux integration. iTerm2 integrates really well with it, for both local and remote sessions.


Download iterm2 and see for yourself


iTerm2 is a must. This is probably the only Mac app I miss on Linux. Kitty and Ghostty are missing so many important features, they feel like hobby proof-of-concept terminals to me. The closest alternative for Linux IMO is WezTerm.


It’s a TextEdit vs Pages thing. Try Ghostty, Kitty or iTerm2.


The one that always surprises me is that there is absolutely no image editor of any kind.

But really, I just don't use that many desktop apps (or at least, not generic ones) so I don't have much of an issue on MacOS.


I frequently edit images in Preview on my Macs.


Yeah, you can resize and change format and that's about it. No drawing tools of any kind.


It does have drawing tools, as well as tools for working with exposure, sharpness, color, text, shapes, selection, etc. I’d suggest exploring the features in Preview. It can do a surprising number of things with images.


Preview is also great for saving and applying wet signatures.


Preview very much does have drawing tools


The lack of a native simple image editor does indeed suck. I've been using Paintbrush for years and it's good enough.

https://paintbrush.sourceforge.io/


Pretty sure you can run Asahi on that? Might have been worth the effort instead of swapping out the machine as it's still pretty capable.


I didn't actually buy anything new for my transition back to Linux. I have a gaming system that had traditionally been running windows. It's a powerful system, but has always been a "toy" running Windows for playing games. Last year I moved it to Linux and have been incredibly happy with the move.

These days I am also now working from home full time, so it kinda hit me. "Why the hell am I trying to work from this MacBook when I have my really great gaming desktop that runs Linux now?" Moved my work over and have been incredibly happy.

I'll have to give the Fedora Asahi Remix a go on my MacBook Pro though. That's a great idea!


You can run Affinity Photo on Linux via Wine, though


Took me a moment to realize this isn't related to 86Box, the low level hardware emulator.

Cool project though. We needed something like this in Linux.


I used this for a couple years many many years ago. I even made several dock widgets for it for various purposes. The source code for these widgets even helped me get my first programming job!

Good memories!


My mom lived in a historical house when she was a kid in the 60s. Since then, the house has become a museum. There are a lot of "artifacts" on display that "came from the 1800s" that are actually just toys my mom's brothers made. My mom got a good laugh about it when she took me to visit the place.

I'm sure these finds must have been dated in some way to verify their authenticity, but I always think back to seeing my uncles' toys on display as if they were historical artifacts when I see stuff like this.


A lot of these historical house “museums” are a pleasant diversion for tourists more than anything else. Note how they are all haunted - ghost tours are pretty easy money


I have never come across a house museum that claims to be haunted. Must be a cultural thing that doesn't exist in Belgium. Possibly because loads of things are ancient here anyway, no need to embellish with more nonsense I guess.


yes, probably an american thing. i am very much not a fan of these “ghost tours”


>> sure these finds must have dated in some way to verify the authenticity ....

What happens if the uncle used very old wood or cloth to make the toys? Will the dating technique be able to find the actual age?


Dating is done with more than just material analysis. Evidence of tools used to make the toy, techniques for things like joins and stitching, etc. can all be indicative of methods that can give at least a lower bound. How applicable method differentiation is to this specific case obviously depends on a number of things.


Has your mom got in touch with the museum to tell them that, so they can improve? If not why not?


I honestly don't know. I imagine not.


It’s not their mother's responsibility to correct their incompetence, or (imo) more likely negligence and information falsification.


A) I didn't say it was their responsibility

B) I don't wanna assume malice where incompetence will do


If you walked into an old house and found a random toy, would you automatically assume the toy is as old as the house?


If you saw something broken that's not your responsibility would you fix it?

I know the answer in your case, but believe me that a lot of people would.


With his rhetorical question, he's saying it must be malice because no museum operator would be that incompetent. They're actively making a false claim to their customers. They could have just not said anything about the age of the toys since they know they didn't verify it. I think you can see this must be the case since you didn't answer his question.

In my country, if a business makes a factual claim about its products, it has to have already verified the correctness of it to some reasonable level and have the documentation to show that. There's no room for this "oh, I just assumed it was true because I'm incompetent" excuse.


> you can see this must be the case

What I can see is the case is that you're both way too confident in your inferences for a bunch of mind readers. Why not just call them up and find out? Oh right "not my job".

And btw "no X could be that Y" last time I heard that line was in a sitcom.


If you care so much, why don’t you call them?


Because I don't have their contact.


Is that another way of saying “not my job”?


> If you saw something broken that's not your responsibility would you fix it?

If it were a genuine mistake and not fraudulent, yes.

> I know the answer in your case, but believe me that a lot of people would.

Weren’t you just talking about not assuming malice?


I wasn't assuming malice from you, I was assuming you were either too lazy or too self-centered. No malice.

You can't get one in today huh.


I’m not sure that you see my point.


Maybe they just made a mistake.


Most people think of swap as "emergency memory in case I run out of memory" and while it's true that it can get used in this way, it usually serves a much more critical purpose in your OS's ability to reason about and use memory.

For a good article on why this is true for Linux: https://chrisdown.name/2018/01/02/in-defence-of-swap.html

I believe that most operating systems are going to make use of memory in a similar manner.

With that said, I'll turn off swap on devices that have unreliable storage (anything using an SD card).


I've always hated one bit of that article. It doesn't address the "low" and "no" memory contention cases separately, even though they're quite different and it's possible to get a system to stay in the "no memory contention" case all the time (unless there's a memory leak, but swap just fills up then and provides no benefit).

> Under no/low memory contention

> With swap: We can choose to swap out rarely-used anonymous memory that may only be used during a small part of the process lifecycle, allowing us to use this memory to improve cache hit rate, or do other optimisations.

> Without swap: We cannot swap out rarely-used anonymous memory, as it's locked in memory. While this may not immediately present as a problem, on some workloads this may represent a non-trivial drop in performance due to stale, anonymous pages taking space away from more important use.

This is under the low/no memory contention. For the low memory contention case, this can make sense, but for the no contention case it's nonsense. There's no more important use, all uses are fulfilled and there's still free memory (that's what "no contention" means)!

So clearly that very page has presented a case where swap is useless: when you have enough RAM to ensure there's never contention.

Swap also only extends the amount of memory by the size of the swap partition. If you've got 64GiB of memory and an 8GiB swap partition, you could just as well have 96 or 128GiB of memory and reserve an 8GiB zram for swap.
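To make the zram trade-off concrete, here's a toy back-of-the-envelope model in Python (invented function name, assumed compression ratio, and it ignores zram metadata overhead): zram backs its swap device with RAM, so swapped-out pages only cost their compressed size in real memory.

```python
def effective_memory_gib(ram_gib, zram_gib, ratio):
    """Toy model: rough usable memory with a zram swap device.

    A zram device that holds zram_gib of (uncompressed) swapped
    pages only costs about zram_gib / ratio of real RAM when full.
    """
    ram_cost_when_full = zram_gib / ratio
    return ram_gib - ram_cost_when_full + zram_gib

# 64 GiB RAM, 8 GiB zram device, assumed ~2:1 compression ratio:
# the full device costs ~4 GiB of RAM but absorbs 8 GiB of cold
# pages, for roughly 68 GiB of usable capacity.
print(effective_memory_gib(64, 8, 2.0))   # 68.0

# With incompressible data (1:1), zram gains you nothing:
print(effective_memory_gib(64, 8, 1.0))   # 64.0
```

So under favorable compression you come out a few GiB ahead, which is the point of swap-on-zram when you already have plenty of RAM; it's not a substitute for actually buying more.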

Indeed, Fedora changed to use zram instead of swap-on-disk by default in Fedora 33[1].

Swap does allow some fancier optimizations to memory layout, but again swap-on-zram is better for this than on a disk if you've got enough RAM.

The big benefit of swap is on laptops where hibernation may actually be desirable (assuming encrypted swap & disk) and RAM is harder to come by. A laptop with 96GiB+ of RAM is a LOT more of an expense than a desktop with the same.

One huge disadvantage of swap-on-disk that the article neglects to mention is that sensitive data from RAM can be written to persistent storage (swap) and thereafter leaked more easily (assuming unencrypted swap). Swap must be encrypted if it's disk-backed.

[1] https://fedoraproject.org/wiki/Changes/SwapOnZRAM


Scheme, Racket, Clojure.


Awww. "Please don't leave me! I bring you gifts!"

I had a cat that would gift give if I was away for a while too. I always figured it was a bribe to not do it again. He also liked to play fetch, but it predated the gift thing.


This isn't related to SWEET16 directly, but a workflow I enjoy for 6502 assembly is to use the zero page as a data stack. With a small set of macros and functions you can easily pass a very large number of values to a function this way. This also makes 16 bit (or 24, 32, etc) math a lot easier as well.

You can use the actual stack for this, but it gets weird with JSR, especially when dealing with nested function calls. Of course, on systems that use the zero page for other purposes (like the Apple II) this isn't really an option. You can use a different location in memory, but it isn't as efficient.
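As a rough illustration (in Python rather than 6502 assembly, with invented helper names), the idea looks like this: a dedicated pointer into the zero page, push/pop helpers, and 16-bit math done one byte at a time with an explicit carry, the way ADC works on the 6502.

```python
# Sketch of a zero-page data stack, as the macros might behave.
ZP = bytearray(256)   # the 6502 zero page: addresses $00-$FF
sp = 256              # data-stack pointer; grows downward

def push16(value):
    """Push a 16-bit value as two bytes (low byte at the lower address)."""
    global sp
    sp -= 2
    ZP[sp] = value & 0xFF             # low byte
    ZP[sp + 1] = (value >> 8) & 0xFF  # high byte

def pop16():
    global sp
    value = ZP[sp] | (ZP[sp + 1] << 8)
    sp += 2
    return value

def add16():
    """Pop two 16-bit values, push their sum; done 8 bits at a
    time with an explicit carry, as CLC/ADC would on real hardware."""
    global sp
    lo = ZP[sp] + ZP[sp + 2]               # CLC; LDA lo1; ADC lo2
    carry = lo >> 8
    hi = ZP[sp + 1] + ZP[sp + 3] + carry   # LDA hi1; ADC hi2
    sp += 2                                # drop one operand
    ZP[sp] = lo & 0xFF
    ZP[sp + 1] = hi & 0xFF

push16(0x1234)
push16(0x0BCD)
add16()
result = pop16()
print(hex(result))  # 0x1e01
```

Extending this to 24- or 32-bit math is just more bytes per slot and the same carry chain, which is why the data-stack layout makes wide arithmetic so much more pleasant than juggling registers.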


A few years ago, I wanted to switch my leisure time computing to a device that would basically keep me from being able to work when I'm not actually working. I'll often sit in front of a screen from the couch or whatever and I wanted to make it impossible for me to keep working.

I ended up buying an ipad for this and have been using it ever since. I will say that while the iPad has achieved the goal I set out, I do find it pretty frustrating to use at times.

- You are basically stuck with Safari in some form. You can install an alternative browser, but the "engine" behind rendering is basically Safari.

- The limited amount of system RAM can become an issue while browsing. Now granted, I have the last iPad model before the M1 came out, and it has a little less RAM than those do. But things can get really weird when you have multiple tabs. Some pages will even stop functioning until you reload them. Occasionally you have to restart Safari (though that's rare).

- Many native apps have an iPad version, but they are just a stretched out version of the iOS app and look terrible. You can mitigate this by using the "slide over" feature to dock the app to the side of the screen.

- I have the 12.9" iPad and it ends up being a weird middle ground. In landscape, especially when docked to the magic keyboard, it works pretty well. But it feels awkward and heavy (especially one handed) in portrait. When docked to the magic keyboard though, it's very back heavy and will often tip over if not supported.

I'm at a point where I am wanting to do a hardware upgrade, but having a really hard time deciding on "what". The 12.9" iPad I currently have is such a weird middle ground. A smaller device would be nice for reading technical books (or D&D stuff) in portrait. A model with more memory might sort out some of my browsing issues. An ultralight laptop would solve many issues and fit many of my use cases, but not all of them. I'd love to just have one device, but I kinda feel like I'd be better off with two.


Your issues are valid, and as a never-Mac user for 10 years I've been baffled watching tech friends buy and use Apple products.

Google says your ipad is 1.5 lbs, my thinkpad X1 nano 13" is exactly 2 lbs and does everything iPad does, WITH A KEYBOARD! 1 tb, 16 gb ram, even a sim card slot. Oh, and actual PORTS, for a COMPUTER, imagine?

It honestly looks like brainwashing. I've interrogated smart friends about why they use Apple and I can NEVER get a straight answer. I think it is because they don't want to say the truth: they use Apple so other people know they can afford Apple. That's it. The products suck, the marketing is hype, and they have no direction as a company.

Also, the Apple VR headset is idiotic and sure to be a flop.


I appreciate your stance, though it's dripping with the zealotry you are complaining about. Just in the other direction. :)

I like the nano, but it suffers from the "I may find myself doing work on it when I'm trying to not be working" problem. The X1 series are great though, I've been a big fan and user of the ThinkPad X1 Carbon for several years.

I will say, my iPad had nothing to do with any kind of love for Apple. Buying the iPad was to prevent myself from being able to use it for work, and I do think it's served that purpose pretty well. It also beats the hell out of my X1 Carbon for watching movies.

But I don't really understand how people can seriously claim they use it to entirely replace a laptop. It's just too awkward of a device for that use case.


> Also, the Apple VR headset is idiotic and sure to be a flop.

As someone who spent a lot of time looking into it, as well as talking to a couple friends working at Apple who use it near daily, I have a feeling that this comment is gonna age about as well as the infamous “why cloud storage, just use rsync, it is that simple” comment.


I have the opposite problem. I kind of wish you could get more work done on a phone, perhaps it would replace some of my pointless scrolling time....

To actually use anything digital for leisure, I use a TV or an e-reader to avoid going in the scroll hole, which is like counterfeit leisure.


Which accessories would you recommend for the large iPad? Which keyboard? Cover? Pen? Docking station?


I have the magic keyboard, which doubles as a case. It's actually a pretty decent keyboard, though it certainly adds to the weight of the device. Pretty easy to recommend though.

I have the apple pencil and it works pretty well for drawing, but I don't otherwise use it for anything else.

I don't really have anything else accessory wise :).


This is so funny to me, and I will never understand the logic in buying a more expensive stripped-down laptop, then spending MORE money to upgrade it back into a laptop. Take away all the ports, then just buy a dock to add the ports back. Take away the keyboard and mouse? No problem, just buy a wireless trackpad and keyboard! Oh, the speakers suck because they are too small? Just get Bluetooth speakers! They are made by Dr. Dre!!!

Are we just showing off how much money we can waste at this point?


My wife who is allergic to all things computers loves iPads. If I let her touch a PC or MacBook it is instantly broken and full of malware.


Very misleading article title.

The licensing changes target big commercial usages outside of game development. (With revenue thresholds, similar to how it already works right now for game development.)

For example, up until now Unreal has seen use in vfx for movie and tv production. The licensing model for Unreal was primarily oriented for game development, which meant that this wasn't generating any revenue for Epic unless that company opted into the optional support plan.

Unlike the crazy situation with Unity, these changes are being announced in advance without affecting usage of previous versions of Unreal.

(Not saying I like or care for subscriptions for software. But context helps understand what's going on here.)

I'm surprised they didn't make this change sooner.


Based on your bullet points here, I'm not sure what you find misleading?

The headline doesn't specifically say outside of games, but I don't think that makes it misleading, especially because the "for all" makes it clear that it's only about some cases. And people generally know that games pay.

Being announced in advance, not affecting previous versions... none of that is implied otherwise by the headline.


The title, especially in light of the stuff that went on with Unity, makes one think that this will affect a much wider group of people than it actually does.

Unreal was never "free for all". For game development, there have always been revenue thresholds.

The new licensing is around commercial use outside of game development, and will also be revenue threshold based. Meaning, just like with game development, if your project is making you money over X threshold, then the licensing kicks in.

The title is misleading.


> The title, especially in light of the stuff that went on with Unity, makes one think that this will affect a much wider group of people than it actually does.

If you look at the title and think of a different company, that's not the headline's fault. It doesn't even try to reference Unity.

Also what I said in my previous comment is relevant here.

> Unreal was never "free for all".

I accept that it's bad wording, but I don't see how that misleads anyone unless you thought the engine was completely free.


"no longer free for all" implies that it was "free for all" and the change is making it no longer "free for all". In other words, the title is presenting information to the reader that is literally not true.

For your personal reference, here is how the dictionary defines the word "misleading". (Cambridge and Merriam Webster, respectively)

> causing someone to believe something that is not true

> to lead in a wrong direction or into a mistaken action or belief often by deliberate deceit

> to lead astray : give a wrong impression

I would say that the title manages to hit on all 3 of these definitions, with a possible note that perhaps the author "misspoke" rather than intentionally creating a deliberate deceit.


I think the sense in which the title is being criticized for being misleading is not the interpretation where the engine was completely and entirely free. I'm sure a few people thought that by accident but it's not what people are talking about when they bring up comparisons to Unity. That particular wording issue is not something that gets a top comment callout. There isn't any motive to cause that particular confusion on purpose.


> If you look at the title and think of a different company, that's not the headline's fault. It doesn't even try to reference Unity.

Things don't happen in a vacuum. The Unity fiasco is still fresh in everyone's mind. Any company anywhere in the gaming sphere changing its prices so soon is going to draw comparisons to Unity. The fact multiple people are discussing this with you should prove that.


Do they need to explicitly contrast the situation with Unity in the headline to avoid being misleading?

Headlines don't have a lot of space!

And as I said in a different comment, the subheading seems to address the main complaints, and in most situations you'd see the headline and the subheading together.


I think the mere fact that this is an engine license change happening so close to the Unity fiasco means that yes, it is impossible to talk about this without automatically invoking thoughts of what Unity did. Some news sites and blogs will take advantage of this association and use it for clicks.


Are you suggesting it's impossible to avoid being misleading, or do you have a solution in mind? Since you didn't really answer my question.

If it's the former then I think that absolves the author.


I just read through this chain of comments and I do think there is a lot of talking past each other.

I do not think the author was purposefully being misleading. I do not think they need absolution for anything. I think it may be interpreted as misleading by some (as it was for the original commenter) based on the reader's recent experiences with Unity and the latest drama; which is not a property of the article, the headline, or the author, but of the reader.


>Headlines don't have a lot of space!

"Unreal Engine starts charging for non-game development".

It's not about space, and we know it's all too easy to bury the lede and leave the internet to lash out as it is wont to do. But I don't blame the author. I know that on larger sites, editorial will make the title without the context of the writing, independent of the contents, in order to maximize traffic.


>"Unreal Engine starts charging for non-game development".

Exactly. Surprised that argument was coming from a 2010 account.


> If you look at the title and think of a different company, that's not the headline's fault. It doesn't even try to reference Unity.

Being aware of current events and how they may shape how people come to conclusions is a skill that not everybody possesses I suppose.

The headline is most definitely misleading and poorly worded. A better headline would be "Unreal licensing change targets tv production use cases" or something like that.

As the other user pointed out, Unreal Engine was never free for all. So wording it like this misleads the reader into thinking about the most commonly talked about use-case, video game development. And when they read "no longer be free for all" it leads them to believe that people that currently don't pay for Unreal Engine may now have to. Hence the title being "misleading." But hey, journalism is a skill that you work on over time. The editor should have caught this.


> I don't see how that misleads anyone unless you thought the engine was completely free.

That's how i understood it, was surprised, clicked the thread to find out more.

Sample size of 1, but I was (maybe accidentally) misled.


It was very much intentional, don't blame yourself. Or I don't know, blame yourself for not reading the article? Either way, this is why an accurate non-clickbait headline is necessary; there are many, many more people like you, and some like to shout and spread misinformation on social media.


"Unreal Engine will no longer be free for non-gaming companies" is 100x less misleading. You are right in that technically speaking it's not misleading, but to me it feels like it's skirting the line of lying by omission. Obviously now with hindsight I can see how the "for all" changes the meaning, but it wasn't obvious (at least to me).


No, the headline is blatantly trying to elicit a negative response towards Unreal in light of Unity's recent pricing change.


Blatantly!

Please explain how this headline connects to Unity at all?

Yes we can assume a lot of people reading the headline will know about it... but I don't see this blatant connection. Would they have to explicitly say it's not like Unity in the headline to escape this?

They even put "non-gaming creatives" in the subheading. Maybe whoever posted should have copied that? If you post the link on most social media you'll automatically see the headline and the subheading.


It's very obvious clickbait. It's clearly exploiting the Unity news.

Clickbait can only be analyzed in terms of the title, not the sub header or content. Because the title is what gets reposted and makes people follow a link.


The headline assumes the point. If Unreal Engine will no longer be free for all, that implies that, right now, it is free for all, like e.g. Godot. Which is just emphatically untrue.

It's like saying (to use the classic example), "I will no longer beat my wife." I never did, just like Unreal was never free for all.


Yep, it seems pretty fair.


>I'm surprised they didn't make this change sooner.

Same here, I thought they were also earning big money from TV and Hollywood. As they have been constantly improving non-gaming usage in every release.


Rather than make excuses for more revenue, I think it's apparent that programmers cannot trust commercial engines or their business models to not up and change one day to suit Epic's revenue.

In-house development doesn't suffer from this issue, and you'll have full control over the code.

Unreal doing this around the same time Unity pulls its shit isn't a coincidence.

As an aspiring game dev I kinda want nothing to do with these tools that want to jerk you around on pricing and aren't absolutely crystal clear on costs.

Blender is free and can do a ton.


>In-house development doesn't suffer from this issue, and you'll have full control over the code.

Proprietary engines suffer from plenty of issues which you haven't stated:

- They can be a mess to read or understand, with many hacks done to accomplish a certain feature because it helped ship feature X.

- Tons of tribal knowledge. If you've worked with a proprietary engine before, you already know documentation will be lackluster, and through little fault of the engineers - there's so much to know about the engine that developers don't have the time to chart out what everything does while pushing out fixes and features. Often, you need to poke the principal programmer who's been with the studio since its inception to understand how a certain long-existing feature works. That's a major point of weakness for the studio!

- Engine limitations! Ask the Bethesda devs about their experience building multiplayer for Fallout 76[0]. Imagine building multiplayer in an engine that has never needed to support it. That's a huge refactor and a ton of time spent on something that's already handled by Unreal Engine. Developers will need to maintain that engine in the future too, so the pain doesn't stop after the game gets shipped!

Your post sounds like someone who hasn't worked in game development before. I advise listening to GDC talks, noclip documentaries, and more if you want to get a better understanding of what game development actually looks like. It's a lot more complicated than "your change in price policy makes me mad" (by the way, most AAA studios already have contracts/price agreements with these engines given the amount of revenue they generate for Unity/Unreal).

[0]:https://www.youtube.com/watch?v=gi8PTAJ2Hjs


Game development is not exclusive to business, but you are correct I have not worked for a firm to make a game. Nothing about that arrangement attracts me, especially given the abusive nature of the industry and the frequency that they go through crunch, lack any real worker protections, no unions, etc.

The way AAA studios make games and do business puts me off as well, so pointing to them as an example doesn't really change my outlook. I already don't buy their games and disapprove of their business models.

If I was interested in being exploited for my passion I would consider entering that industry, but as it stands I will be going solo dev.

No game dev company out there seems to treat its staff well during a game's development, so even if I wanted to work on a game as part of a team, I'd be looking at a poor work/life balance and a stressful work environment. I'm too old for that kind of BS.

If I can't build and release the game myself, then it simply isn't good enough to release. I cannot trust collaborators to not take control of my projects, nor would I entrust creative ideas to a for-profit entity without my cut.

Long story short, I might work in the industry if it was a healthy one. Because it's not, and I still want to make a game, it falls to me and only me to make it happen. That's kind of comforting, knowing your failure or success ride on your own action instead of someone else's. Nothing is more disappointing in a group project than failing because of someone else's fuck up.


>Nothing about that arrangement attracts me, especially given the abusive nature of the industry and the frequency that they go through crunch, lack any real worker protections, no unions, etc.

I'd beg to differ on this point. Lots of changes have been made in game development culture including less crunch culture[0]. Worker protection/unions aren't exactly something that's afforded to many white collar jobs in the first place, not sure why that would be an expectation here. Even so, there have been improvements to this - e.g. the Game Workers Alliance. I encourage you to ask developers this question today.

>No game dev company out there seems to treat its staff well during a game's development, so even if I wanted to work on a game as part of a team, I'd be looking at a poor work/life balance and a stressful work environment. I'm too old for that kind of BS.

There are game companies that do treat their staff well! I don't think it's fair to make blanket statements like this when there are a ton of studios with a ton of varying cultures. It's not like solo development is stress-free or immune to crunch either, even if you choose your own hours. Solo development calls for highly varied skills - it's one of those things you underestimate until you've actually tried it.

>If I can't build and release the game myself, then it simply isn't good enough to release. I cannot trust collaborators to not take control of my projects, nor would I entrust creative ideas to a for-profit entity without my cut.

Nothing good in this world gets built in a vacuum. A hyperbole, potentially (e.g. Stardew Valley, Rainworld), but game development really is a road best driven with a team - people to help out in different disciplines, lighten the load on others. Finding a good team is hard, but once you do, it's hard to want to forgo them. I don't think I can convince you on this front, but the vast majority of solo developers who don't release a game should be proof enough.

No hard feelings from me - I just wanted to clarify what the game industry is actually like today. The Kotaku articles can be frightening, but talking to people in the industry today and getting thoughts from different roles (e.g. producers, designers, engineers, QA, artists, etc.) and different industries within game dev (indie, AA, AAA studios,etc.) would help form a more informed opinion.

[0]: https://twitter.com/GrantPDesign/status/1402325020890652672


I appreciate your input but I'm not in it for a career. I'm in it for personal satisfaction. If I can't build a game on my own then I'm not good enough to call myself a game developer.

A team can't bring that satisfaction to me. Kudos to those who enjoy working in groups. For me, I end up doing more than my share of work and correcting others' mistakes. At that point, you may as well make it yourself. People are more of an obstacle to my progress than they are an enabler.


I mean, why would you expect them not to charge you money for something they clearly spend tons of money in developing? They are a business.

The bait and switch is real, but after N of them we have to start asking ourselves whose fault it is if we fall for the N+1th one.


In my case I haven't fallen for anything, but this community seems to have a problem with anyone pointing out that it's mostly businesses engaging in this behavior. If you stay away from commercial software, this crap disappears.

It shouldn't be normal to expect to be exploited imo.


It was a rhetorical "you" ;) Businesses will do business things, I don't think we should really be surprised when they decide to not subsidize everyone anymore.

In other words, a company giving you a "free" product like Unreal should be assumed to be a loan that you will have to pay in the future.


> but this community seems to have a problem with pointing out it's mostly businesses doing this behavior.

Wow. I was typing "It is fairly recent behavior, at least it wasn't like this before 2014."

Then I realise that was nearly 10 years ago......


It isn't a coincidence. But the situation is larger than Unreal or Unity. It's the beginning of Q4/end of Q3, which tends to be when companies launch new initiatives, and it's been no secret that big businesses' low-interest-rate borrowing has been over for months now.

Same reason why we have yet another flurry of layoffs happening.

>In-house development doesn't suffer from this issue, and you'll have full control over the code.

Yet many AAA studios have at least dabbled with Unity/Unreal. A few have switched entirely. Engine programmers aren't cheap to keep in house and it's much easier to have entry level workers come in with existing engine knowledge than teaching them on the job. Even if this all feels shitty, full control for a business may not necessarily be the answer.

>As an aspiring game dev I kinda want nothing to do with these tools that want to jerk you around on pricing and aren't absolutely crystal clear on costs.

I wish you the best of luck. That's my endgame. But I'm not at a point where I can disengage from big corporate and I have bills to pay. I'm laying the groundwork slowly but maybe in a decade.

>Blender is free and can do a ton.

Blender isn't a game engine. And Blender's attempt at making a game engine is exactly why it can be harder to switch away from big corporate than it should be. A game engine has a lot of moving parts and is hard to maintain. Open source's biggest weakness is interest, since there is no financial incentive to keep supporting a free product.

That said, look into UPBGE as a spiritual successor if you rely a lot on Blender for development.


I am in 2D for now, so anything SDL based is enough. But, part of what's put me off of 3D is the business side. Modeling tools are difficult to use and take years to learn adequately. I'm not interested in paying for a sub while I'm learning, and terms in a license that are subject to change do nothing to inspire confidence in any particular solution. This is a social and ethical problem imo.

As you pointed out, doing it correctly requires experienced developers who will stick around. I think that's a better, more rewarding thing to spend money on. At least the worker won't try to modify the terms of what you're building on.

I'll check out that project sometime. 3D is still a ways out for me but any libre software that can make it easier to learn sounds great.

