So yes and no. I often just let it work by itself. Towards the very end, when I had more of a deadline, I would watch and interrupt it when it was putting implementations in places that broke its architecture.
I think only once did I ever give it an instruction that was related to a handful of lines (There certainly were plenty of opportunities, don't get me wrong).
When troubleshooting I occasionally did read the code. There was an issue with player-to-player matching where it was just kind of stuck, and I gave it a simpler solution (conceptually, not actual code) that worked for the design constraints.
I did find myself hinting/telling it to do things like centralize the CSS.
It was a really useful exercise in learning. I'm going to write an article about it. My biggest insight is that "good" architecture for a current-generation AI is probably different than for humans because of how attention and context work in the models/tools (at least for the current Claude Code). Essentially, "out of sight, out of mind" creates a dynamic where decomposing code leads to an increase in entropy when a model is working on it.
I need to experiment with other agentic tools to see how their context handling impacts possible scope of work. I extensively use GitHub Copilot, but I control scope, context, and instructions much tighter there.
I hadn't really used hands-off automation much in the past because I didn't think the models were at a level where they could handle a significantly sized unit of work. Now they can, with large caveats. There is also a clear upper bound with Claude Code, but that can probably be significantly improved by better context handling.
Imo there is nothing wrong with generating code with AI, it's the effort spent on supervising the quality of product that matters.
But that requires you to have certain levels of knowledge on that domain to begin with, which is not something you can just "vibe" your way out, at least for now.
Curious, if you don't mind answering: do you mainly use Ubuntu or NixOS, and which one do you like more ATM?
Regarding Steam, do you install it with the distro-provided package or through Flatpak?
What are the specs of the machine you do Linux gaming on? I've noticed a notable performance penalty (around 10%, even higher in GPU-heavy games) when running games with Proton, which is mainly why I haven't dropped Windows yet.
I try to use Debian, since it's a bit older (read: stable) than Ubuntu, and I've found that if something compiles and runs on Debian it'll run on Ubuntu and others, but the inverse is not true.
I quite like CachyOS currently. I see no performance penalty (but I also have only a 75 Hz monitor and I haven't tested VR games all that much yet). Currently I'm playing through Kingdom Come Deliverance 2 on ultra with no issues.
CachyOS provides packages for Steam, handles nvidia drivers for you and they even provide their own builds of proton and wine, allegedly compiled with flags for modern hardware + some patches (not sure how much they help though - before Cachy I used Pop OS and also had no problems with performance).
Cachy is based on Arch though, so unless you're ready for your system to potentially break with an update, maybe use something more stable (again, I quite liked Pop OS; it was extremely stable for me).
I've been using Arch for 1-3 years now, and as far as I can remember the only time my system "broke" was when the pacman lock got stuck somehow. Aside from that it's pretty stable in general.
> I've noticed a notable performance penalty (around 10%, even higher on GPU heavy games) when running games with Proton, which is mainly why I haven't dropped Windows yet.
I don't mean to dismiss your comment at all, but I'm surprised that such a low overhead would be the primary reason holding you back from switching. The difference between, say, 100 FPS and 91 FPS seems so negligible in my mind that it would be pretty near the bottom on the list of reasons not to switch to Linux.
If you don't have an adaptive sync (variable refresh rate) monitor with everything set up to use it, and you don't like screen tearing (so you enable vsync), then overrunning the frame budget (e.g. 16 ms for 60 Hz) can mean dropping down to half the frame rate.
But I'm hunting for reasons here. A gaming setup should be using adaptive sync so those concerns mostly go away. But there may be problems with Linux support.
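The half-rate drop described above falls out of simple arithmetic: with double-buffered vsync, a frame that misses the vblank deadline waits for the next one, so frame time rounds up to a multiple of the refresh interval. A small sketch (hypothetical helper, illustrative numbers):

```python
import math

def vsync_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    """Frame rate you actually see with vsync on (double buffering assumed)."""
    interval_ms = 1000.0 / refresh_hz  # 60 Hz -> ~16.67 ms budget per frame
    # A frame that takes longer than one interval waits for the next vblank,
    # so it occupies a whole number of refresh intervals.
    intervals = math.ceil(render_ms / interval_ms)
    return refresh_hz / intervals

print(vsync_fps(15.0))  # fits the budget -> 60.0
print(vsync_fps(17.0))  # just misses it -> 30.0 (half rate)
```

So a game that renders in 17 ms instead of 15 ms (barely 13% slower) halves the displayed frame rate on a fixed 60 Hz display, which is why a ~10% Proton penalty can hurt more than it sounds. Adaptive sync removes this cliff by presenting frames as soon as they are ready.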
I remember my laptop breaking at the beginning of COVID, which left me with a smartphone that couldn't handle Termux stuff.
To cope with that I ended up making some toys, like a Discord bot that evaluates code, and requested access from Insomnia 24/7 to SSH into a Linux environment for programming purposes.
It was a fun experience, and I ended up learning a lot about programming before I even started studying computer science at university.
The issues mostly link to repositories that were created more than a year ago. So I think what happened is that they already forked the repo and made changes on top of it to get it to work on HarmonyOS, but there was no bandwidth for ongoing maintenance. Someone finally realized that this is unsustainable and demanded that the changes get upstreamed, but the people charged with implementing this directive don't know how you upstream changes. Hence this hamfisted attempt.
I remember researching the early era of the internet while trying to make a game about online shopping for a game jam, and damn, it sure is a deep rabbit hole.