"I found that using only one monitor allows me to focus more on what I’m doing, and I don’t miss anything about a multi-monitor setup."
This.
I had the same realization maybe 7 or 8 years ago, and it holds true for me. With multiple monitors it's like constantly moving your hands away from the keyboard to hunt for the mouse; the same thing happens when I'm looking for anything across a 2- or 3-monitor setup. It just weakens your focus... If I'm on a laptop (and I usually am), I just use the laptop's screen. Getting used to a different screen size at my desk would also slow me down slightly whenever I'm bound to the laptop alone.
Except you have a huge monitor and you lay stuff out in multiple side-by-side windows, which makes the whole argument moot. And when you use a laptop you are kind of forced to use a single screen, because multiple screens are inconvenient at best.
When I started development 15" monitors were standard and 17" were a luxury. Setting multiple monitors was a must or you would have to constantly switch between editor, terminal and documentation.
If you look at the setup, that's exactly what is happening. On a single screen there is a browser with documentation (presumably), and there are four (!) IDE/editor/terminal windows.
It's not about how many monitors you have but rather what you are doing with them. You can put Netflix in one of the windows and still get distracted on a single-monitor setup, or you can have bad eyesight like me and require two 27" monitors side by side so you can magnify fonts and have documentation, IDE and terminal visible at the same time with no strain.
> If you look at the setup, that's exactly what is happening. On a single screen there is a browser with documentation (presumably), and there are four (!) IDE/editor/terminal windows.
Just to clarify: there are two i3 containers side by side (each using 50%). The left one’s active window is a browser, the right one’s active window is Emacs, in which I have two buffers side by side at the top, and compilation mode and magit status side by side at the bottom.
I have three monitors, but I only use one (a 24" one, the second is for when I want to watch some show or movie while doing stuff on the main one, and the third I basically never use). I have my apps always maximized/fullscreen in the main monitor and I alt-tab between them. I never could get into having multiple apps on-screen at the same time, since I can only focus on one.
Writing this comment, though, I realize that a better alt-tab switcher would be a godsend. I sometimes get confused by the window order and I'm not sure why; maybe I should write a better switcher.
Try i3 as a window manager. Put every app in a separate single window, and instead of alt-tab you use, e.g., alt+1 for the terminal, alt+2 for the browser, etc. You can have as many workspaces as you need and assign applications to specific workspaces. This way I always know where every app or category of app lives. And this mostly works on the default i3 config, with no learning curve.
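For reference, a minimal sketch of what that looks like in ~/.config/i3/config (the workspace names and window classes below are just examples, not anyone's actual config; check your own classes with xprop):

    # jump straight to a workspace instead of cycling with alt-tab
    bindsym Mod1+1 workspace 1:term
    bindsym Mod1+2 workspace 2:web

    # pin application categories to their workspaces
    assign [class="Alacritty"] 1:term
    assign [class="firefox"] 2:web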
A better alt+tab switcher would be great! Why isn't that a thing? I'm imagining something where you could use alt+f1, alt+f2, alt+f3, etc. to switch to specific windows rather than just rotating between everything.
A long time ago I made a Python script to manage my 2-monitor i3 setup.
The idea was to be able to quickly select things to be made visible on both monitors.
The two monitors are named primary and secondary.
To select something to be visible on primary you would press the combination (for example alt+f1) and the selected desktop would be placed on primary. Whatever was on primary would be pushed to secondary.
To select both primary and secondary you would first press what you want on secondary and then what you want on primary.
The idea is that people typically have what I call "scenes". A scene is a particular arrangement of windows across both monitors.
This means that you can quickly learn your "scenes" and make them visible very quickly, in a fraction of a second, without having to hard-code each scene in your config file.
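The core of it, sketched from memory with the i3ipc Python library (this is not the original script, and the output names are placeholders; replace them with your own from "i3-msg -t get_outputs"):

    #!/usr/bin/env python3
    # Sketch: make a workspace visible on the primary output,
    # pushing whatever was there over to the secondary output.
    import sys
    import i3ipc

    PRIMARY = "DP-1"      # placeholder output name
    SECONDARY = "HDMI-1"  # placeholder output name

    def show_on_primary(conn, target):
        # Evict whatever workspace is currently visible on primary...
        for ws in conn.get_workspaces():
            if ws.visible and ws.output == PRIMARY:
                conn.command('workspace %s' % ws.name)
                conn.command('move workspace to output %s' % SECONDARY)
        # ...then pull the requested workspace onto primary and focus it.
        conn.command('workspace %s' % target)
        conn.command('move workspace to output %s' % PRIMARY)
        conn.command('workspace %s' % target)

    if __name__ == '__main__':
        # bind e.g. alt+f1 to "scene.py 1" in the i3 config
        show_on_primary(i3ipc.Connection(), sys.argv[1])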
On Win10 you can use win+1, win+2, ... to select the apps pinned to the taskbar; it launches them if they're not running, and if they are running it switches between their active windows. That sounds like what you're after.
Sorry if this is really well known; I'm new to MS Windows as a user.
I use dual monitors, but have two virtual desktops, giving me 4 "screens" to lay out windows in. I keep the same window in the same place, but I often have to use 3 docs at once and so would love a 3rd screen. Then I'd have reference material on 1), the doc I'm responding to on 2), the doc I'm writing in also on 2), and citations on 3).
If you can live with switching between different applications rather than windows, this is the default behavior on Windows 10 and Gnome 3 using win+1, win+2, ….
I work the same way, and having a keyboard shortcut set for each one of my apps has been game-changing for me. Worth trying out to see if it fits your flow:
KDE's window manager, KWin, supports this functionality built in.
Just right-click on the title bar of any window, go to "Special Application Settings" -> "Arrangement and access" -> "Shortcut". You can manually type "F8" into the shortcut field and it'll work just fine.
You can also go to "System Settings" -> "Window Rules" to manage all rules you've defined.
Same here. I full-window everything and use my second monitor basically only for YouTube/Spotify. It's really useful when I need to keep some reference up, too.
Oh, you're going to love this: there's an extension that launches custom programs with the current page as an argument, and I use it to launch mpv, which is then configured to open full-screen on the second screen with a high-quality stream of the YouTube video I'm on.
I have also configured the window to always stay on top and to resize to a small window on the bottom right, so if I press escape while the window is focused, I get picture-in-picture.
Also, mpv exits when it's done, so it's a great way of watching a video in full screen on the second monitor at the press of a button.
I don't remember the name right now and I'm not at the computer, but anything that will launch a program will work; the extension itself does very little.
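The mpv side is reproducible with a handful of config options. This is a rough reconstruction rather than my exact config, and the screen number and geometry are values you'd tune:

    # ~/.config/mpv/mpv.conf (approximate)
    fullscreen=yes       # start full-screen...
    fs-screen=1          # ...on the second monitor
    ontop=yes            # stay above other windows when not full-screen
    geometry=30%x30%-20-20                # small window, bottom right
    ytdl-format=bestvideo+bestaudio/best  # prefer the high-quality stream

The extension then only has to run something like 'mpv <URL>'.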
The main issue for me is figuring out which monitor the focus/mouse is on.
So yes, while in some situations (like debugging a complicated app, or for some "standby" information) multiple monitors are a must, for general use they end up being too distracting.
"Except you have huge monitor and you lay stuff in multiple side by side windows which makes the whole argument very moot."
Not my case. I don't lay stuff out like that. My apps are in the menu, the running ones in the dock (Mac or Linux), and when I do need a few windows open at the same time (the second one being a terminal, a mobile emulator or whatever) I arrange them to fit on the same screen. My smallest screen is the 15" on my ThinkPad, then 17" on the MBP, and an external Dell 23". I've had no issues so far with my workflow.
For dev work, though, I typically need to see multiple things simultaneously: input (IDE/editor), output (console/browser) and instrumentation (debugger etc.).
I'm one-monitoring these days (WFH) because my laptop can't output to more than 1 display without a docking station -- I had left mine at the office. The context switching is tiring to say the least. I'm making do by tiling my windows but with a 24" display, there's just not enough room.
I think the author's setup works because he's using a 31" monitor and a tiling window manager. Most window managers don't tile as nicely.
That said, there are tools like Terminator and tmux, which are always useful.
I've come to love the ultrawide 34" format @ 3440x1440 for "I want things side by side". Most of these monitors can act as two displays and split the screen if you plug in two cables, but you'll obviously have zero gap between them.
Same, that's what I have been using for GIS work, media production, and some light coding. Only thing I regret is not being able to play Xbox games at 21:9 (unless you know of a way).
When I started at my current company they gave me a single 43" 4K monitor. It felt ridiculously huge at first, but now I'm not sure I could ever go back to anything smaller. At the distance I sit from it, the ~102 PPI is perfect for 100% scaling. I bought a 43" Samsung TV to use at home.
As you mention, I find using a laptop screen for development really painful now.
Actually, I like having two screens. However, I arrange them vertically: one is my primary screen and the other is additional space I can use when I need it.
For many tasks, one screen is enough/better, but sometimes it makes things a lot easier when you can look at them without having to rearrange your windows.
Yeah, one big screen is great. Only my kitchen table isn't big enough to fit a large screen, since I need to sit at the short end: the landlord decided it was great to buy one of those weird tables with a little storage table thingy [1] below it, so I keep hitting it with my legs.
I have bad eyes and use HUGE fonts, so I can basically only get 1 window per monitor...
so I have 3 monitors setup:
middle for editor
left for browser/docs
right for email, Slack, IRC
Similar to you, I've had to increase text sizes gradually over the past decade. Currently I have 3 monitors, from left to right:
Portrait 1080x1920 22" Dell. Landscape 4K LG. Landscape 4K LG. Plus the macbook pro screen, but I hardly ever put stuff on it.
I'm not a fan of the LG monitors I got, model "27UD68-W" in 27 inch. The standby light is bright white and flashes slowly; you have to switch the screen off to stop it. Which is a huge #firstworldproblem, I know, but no other monitor I've used has done this.
I wish I had gotten a higher-refresh-rate screen with less than 4K resolution. I would really like to play some games at faster than 60fps.
>Similar to you, I've had to increase text sizes gradually over the past decade. Currently I have 3 monitors, from left to right
I find using dark themes for everything really helps... Maybe I'm returning to my roots, but I use a lot of ambers, greens and yellows for good contrast without blinding myself. If you're a vi(m) user, I found the 'elflord' colorscheme comes pre-installed on a lot of machines (just ':colorscheme elflord' to try it). It's a decent starting point.
On the other hand, I never seem to have enough light on my workbenches. I've fallen in love with ultra-bright LED fixtures from Home Depot for work-area lighting... Damn screws get smaller every year!
(An extendable magnetic 'wand' is awesome for finding them when you drop them...)
> I'm not a fan of the LG monitors I got, model "27UD68-W" in 27 inch.
I run 3 x LG 27UK650_600. Most of my gaming tends to be MMORPG stuff, so I don't need anything more than 60Hz. But I am wondering how they'll handle Cyberpunk 2077.
To be fair, i3 is so good that having multiple monitors becomes unnecessary. It's much faster to open a second or third window on the same screen than to look over at the other screen.
For what it is worth, not too long ago I came across the concept of "portable monitors". I took a risk on a 4K 15" model and it ended up being decent quality. Skinnier than a #2 pencil, and the perfect size when propped next to my 15" MBP, without getting in the way.
edit: Almost forgot to mention this: it is powered via a USB-C cable. One end goes to the monitor and the other into my 15" MBP (a 13" MBP may not have enough "juice" to power it).
Agreed. I’ve just dumped my second monitor as well. I’m now down to one Iiyama 22” 1080p unit. I’m considering replacing it with a larger 4K monitor but am demotivated to do this because I don’t want to futz with it all now.
The WH-1000XM3 are just about the only recommended Bluetooth headphones left that make switching devices this much of a pain. Most will either automatically switch between a few active devices or let you switch by simply initiating the connection from the target device.
Have you looked at installing the patched pulseaudio-bluetooth modules to gain LDAC support? :)
Have not looked at any fancy pulseaudio setups. I don’t want to maintain them on my machines over time, as I use these headphones with 4 different devices :)
Good news is that there's some more activity towards upstreaming them; at least in my experience, they Just Worked(TM), with the system automatically selecting LDAC or AAC as needed.
"Probably another macbook setup" "Oh, nope, lots of terminals." "Hey, is that i3?" Then I realized who the author was.
Honestly: I tried i3 and it wasn't for me. The psychology of slightly overlapping windows (i.e. "just put this over here, out of the way but still visible" as a way of making a physical reminder for myself) is just too much part of my mental model to give up.
Also want to give you a big thank you for creating i3. After using it for a long time I can’t imagine going back to a setup without it. I almost always need 2 windows side by side and it makes that workflow so effortless.
I have one, also 31.5" which is 3840 x 2160, without any scaling, and the text is sharp as fuck. Given my experience the only thing I'd do if I bought another monitor given what I know now is to buy one same res but larger because 3840 x 2160 is a lot of pixels in a small space. A single black pixel on a white background from 60cm away is very, very close to invisible (just tried it).
I guess it's one of those things that you need to use for a while to 'get it'.
Just like switching from FHD to 4K (on a laptop): you won't be blown away immediately when you first use it, but when you go back to FHD you suddenly realize how much nicer the 4K screen was.
I run a 43" 4K display for work an I wish it was 8K, because when switching from my 15" 4K laptop to the 43" 4K desktop setup suddenly the desktop monitor doesn't look that crisp anymore.
It is so sad that for those of us not in the Apple ecosystem it is nearly impossible to find a desktop HiDPI display. We have been stuck at 4K for the past 5 years. This Dell is the only "option", but I don't know how many of us can afford to spend $5k on a monitor with a lifetime of ~3 years that has not shown great reliability.
Meanwhile, Linux has made the lives of 4K display owners unlivable due to the lack of non-integer scaling. A 5K display at 27" with 200% scaling would help the situation, but we don't have that option.
Many times I have considered jumping onto the macOS boat, just for the advantage in HiDPI display availability and scaling support.
There are some very hacky ways that involve special add-on cards for the motherboard, special motherboard model requirements, GPU connector requirements and even special software just to adjust the brightness.
It is doable, but it is expensive, and it may or may not work properly. One example I remember is where some users who attempted the above setup reported that they needed to unplug and re-plug the monitor after it sleeps.
LG does not support non-Apple software/hardware for that monitor. You can hack your PC with TB cards and special motherboards that support those cards to get a signal, sometimes. This is far from "works fine".
It's still ridiculous. The only people I see using these monitors to their full ability are sitting less than a foot away from them. Even then, it's kind of overkill.
If you sit really close to your monitor, then whatever. I can't imagine it's good for your posture or your eyes to focus that close so often.
I prefer numbers and measurement units to trademarked, fuzzy descriptions of experiences. Since we're talking about display density, DPI is the term I want to see used.
That's all well and good; you're entitled to your preferences. But it doesn't change the fact that it's not a 'fuzzy' description of experience. DPI alone isn't enough information to describe what the term is meant to define.
Again, retina refers to the minimum resolution, at a given viewing distance, where individual pixels are not recognizable.
The part about the viewing distance is essential. As I sit at my desk, my monitor is about a full arm's length away from me. When I use my phone, its distance from my eyes is about 1/3 that of the monitor (ish).
Thus, the DPI required for my phone to have its pixels indistinguishable is a lot higher than what my monitor needs to achieve the same. So in reality, DPI is a fuzzier description of experience than 'retina' is, even if it is an annoying, trademarked, marketing buzzword (which it is, no arguments there).
Is he me? Same CPU, CPU fan, SSD, monitor, RAM, case, WM, Linux distro... Except I never made the switch to vim from Emacs.
In all seriousness, I have great respect for Stapelberg and I thank him for posting this. I love seeing other dev's setups and learned a few things in his post.
That's a lot of money between the screen and GPU just to get crisper fonts (as the rest of the GUI is scaled 3x and thus you don't get that much more screen real estate). Is this really worth it for you? I regularly switch between a 5K iMac and a regular 1920x1200 display, and sure, I can see pixels, but I don't find myself caring that much.
Considering how many hours a day he likely uses his machine, it may work out to something like the difference between a few cents and 10 cents an hour.
That's only going to be true if the extra $4000 they spent is used for 40,000+ hours. It's likely closer to a dollar an hour more for slightly crisper text (assuming they use it full-time for 2 years).
The monitor/GPU have a lot more than 2 year lifespan. Depreciation is likely less than 50% in 2 years.
And he didn't pay for slightly crisper text, he paid for higher productivity. At $100/hour that 25 to 50 cents is trivial to make up. Which is why devs should never, ever skimp on hardware.
Okay, first of all let me say that I don't want to poo-poo anyone's choice of environment. If it "sparks joy" for hours every day, it's probably a good personal investment.
On the other hand, it does require additional resources, both for producing the items -- monitors/GPUs indeed have a long lifespan, which means the previous ones would still work -- and in increased energy use per hour. So saying that no $ spent on hardware can be wasted for devs...
And whether it will actually result in increased productivity is a good question. Readability, eye strain and enjoyment all factor in. There are plenty of studies that focus on only one aspect, so it's easy to cherry-pick a conclusion...
Again, I don't believe this is more wasteful than a few spa days or a vacation, so good for the OP. My initial post was about how that works out for them (or others chiming in). I find myself not too affected by this, i.e. my last big jump was from a 21" CRT to a 24" Dell, especially when it comes to the simple shapes of monochrome fonts.
“For redundancy, I am backing up my computers to 2 separate network storage devices.”
Hopefully he understands that this is not a true backup... he's basically created a more complex RAID 1 running across two NAS devices.
For example, a flood, a house fire, theft, or a power surge is probably going to destroy/fry both, and then everything is gone. This is a perfect example of thinking you have something backed up when, because you store it all in the same place, you have just created a more complex RAID 1...
If you want a backup you need to store a copy in a different location, preferably a different region, because if you put it at your friend's house down the road they are also likely to be hit by the same disaster (i.e. flood, tornado, hurricane, war, etc.) as you.
TFA explains why he didn't use RAID1: "I put in one hard disk per device for maximum redundancy: any hardware component can fail and I can just use the other device.". If the motherboard dies on one device, he still has a working device.
A backup to one computer is better than none.
A backup to two computers is better than one.
A backup to four computers in two geographic regions is better than two local.
There's no end to how good a person's backup can be. Everyone has their limit.
That’s definitely an improvement and a cheap solution. I always have to remind people about offsite backups... I had to learn the hard way, and I’ve now migrated to Backblaze for one of the backup copies of my cold data.
I’m a couple of weeks into using a single giant (32”) monitor at macOS’s 3K resolution, and I’m finding it works pretty well with a 3x2 set of windows for code, terminal and docs. The center third is usually a single Emacs frame with two windows, web browser on the left, and inspector window or terminals on the right.
I’m also really liking the USB-C power delivery, so it’s one cable to the work laptop, and from there it’s got a USB hub for the keyboard/mouse.
The killer feature is the built-in KVM switch: by switching the input to HDMI, I can switch the keyboard/mouse to my mini and use it when I’m not at $day_job.
Absolutely! Every now and then you pick up some methods or tools you hadn't thought of that make you better.
You just have to remind yourself to bring the critical eye so that you don't start wearing black turtlenecks believing it'll make you a business genius :)
Why doesn't Kinesis produce a higher-quality build variant of its keyboard? One that doesn't feel like flimsy/cheap plastic. It seems they are charging a premium just for the layout and PCB.
I don’t think the Advantage feels flimsy. Certainly I wouldn’t want to be hit around the head with one (but maybe this is a bad metric for “flimsy”). I think the premium is largely for the difficulty of putting all those keys in that 3D layout. I think the old Advantage 1 had dome switches and was quite similar in price to the Cherry-switch Advantage 2. The Maltron, which also has a bowl-shaped layout, is also expensive. But then the only part of the keyboard’s body you interact with is the rest for the heel of your hand, and people often put something on those anyway (e.g. gel pads, tape to reduce slipping, ...), so it seems bad to optimise for it.
I got one of their 'Freestyle' keyboards years ago... complete crap; I swear it has rubber dome switches :-P
I got an ErgoDox EZ last year and put some nice heavy Kailh Box switches in it... now I just gotta find time to learn it (and Dvorak/Colemak).
Currently using a Vortex Cypher with the split spacebar, which I tore down and rebuilt: new switches, added LEDs (south-facing mounts, so they're useless, but I figured I'd try).
btw, have you seen/tried anything like the Tractyl [0] or the Ultimate Hacking Keyboard [1]?
I used to use a trackball full-time, and I'm wondering how well a little 'thumb ball' works...
Tractyl looks pretty neat, but I think I'd miss my thumb keys too much.
I looked pretty closely at the UHK, but I think it would have the same issue as the ErgoDox: I'm too used to the scooped keybed of the Kinesis, which makes the reach between e.g. K and 8 short enough that I notice when it's not there on a flat keybed.
You could have the trackball on one thumb and the thumb cluster on the other... I have a couple of crooked fingers from breaks that didn't heal completely straight, and I think I'm getting arthritis, so I'm trying to maximize usage of my 'strong' fingers.
I should really try a Kinesis sometime. Everyone who has one seems to love it...
I got an ErgoDox a few months ago and am still getting used to it. I feel like this other Kinesis keyboard promotes typing with your hands rested; AFAIK that can be a culprit for RSI.
I second the Logitech MX Ergo "thumb ball mouse". Once you've transferred your mouse motor skills to this (takes about two hours) you'll never go back to a moving mouse. Especially if you've got a sore shoulder, elbow or wrist, do yourself a favor and try one of these. (Full disclosure: I own no stock in Logitech, I'm just a very happy customer of this product.)
Thirded :). I switched (to its predecessor) over 10 years ago and never looked back. It even works perfectly well with hectic games after a bit of practice. Additionally, I'm using keyboards without a numpad to further reduce the distance when reaching for the "mouse".
Given that this is the closest thing to a monitor discussion thread I've seen on HN, I'd like to ask if anyone has a recommendation for what I should upgrade to next.
My progression has been the 2560x1440 Thunderbolt display to the Dell P2715Q, which has been my main monitor at work and home for the past 5 years.
I've tried to upgrade from the P2715Q three times now, and always end up going back to it because I'm dissatisfied by the quality of the other panels in comparison.
I want a larger screen that can support 4K, or even 5K, with comparable panel quality. I avoided the LG 5K display sold by Apple because I heard bad things about it, but never actually tried one. It's 27" as well, so I think it would still be too small, even if the panel quality and resolution are better.
I'm starting to think I should just get a TV and use that as my display, but I don't know which would best suit my needs for writing software. One nice thing about a TV is that it would be much easier to do returns/swaps if I need to play the panel lottery for something that will make me happy. I have never done this with a previous display, since I always bought second-hand from people who had already done that for me. The original owner of my P2715Q apparently returned theirs three times before they were satisfied.
I've toyed with the idea of getting an Apple Pro display, but I really don't want to spend more than $2000 on a monitor.
The UP3218K described in the article seems like it would be a good upgrade, but it definitely exceeds my price range. I would be open to spending that much on a monitor if I knew I wouldn't be replacing it for at least five years, but I can't know that for sure. Although funnily enough, that's what has happened with my P2715Q. It's the only piece of computing equipment I haven't replaced since I first started using it.
I am also of the opinion that only one monitor is needed, because I can only focus on one thing at once. Whatever is not the work I'm currently focused on goes into another desktop space. I am also a power user of a macOS window manager called Spectacle. The tools and practices that help me focus are key.
I've always been curious about higher-end USB microphones (the Rode is ~$230) compared with a potentially more upgradable XLR setup like:
- Behringer UM2 USB Audio Interface
- Behringer Ultravoice XM8500 Dynamic Cardioid mic
This would give the upgradability of an XLR audio interface and a seemingly decent mic for about $125. I imagine once everything is plugged together, it similarly comes down to just plugging in a USB, and I'd be surprised if it wasn't plug-n-play on Linux. I don't know how well the audio quality would compare, though. I'd be interested to hear more about the challenges of getting a working setup with XLR audio gear that are alluded to in the post.
On switching peripherals between work and home computers: I've been using a UGREEN USB switch for a few years now. It's around £21 for the USB 2 version at the moment in the UK. It's basically what the author uses, except you connect both machines to it and click a button to toggle between them. Mine has only 4 ports, but maybe that's enough for most people; I'm only using half.
One minor benefit of TFA to me this morning: I have a 120mm case fan that has been vibrating/buzzing, but I've been delaying buying a replacement just because I didn't want to choose the wrong SKU (again).
Scrolling through the site, I see the recommendation of the Noctua NF-A12x25. Probably 3x what I paid for the fan being replaced, but it could be the only component in the case that stays through the next rebuild!
I do not have any experience with it. My expectation would be that the IP network quality itself is good (provided by init7), and hopefully the underlying platform is stable and well-provisioned enough.
The hardware section makes it sound like he's rebuilding his computer multiple times a year. Do people really find upgrading that often is worth the cost and the bother? What sorts of workloads do people deal with that having top of the line hardware even matters?
Some folks like having shiny hardware, and it's almost certainly partly a hobby rather than a need, barring specific niche professional uses. I upgrade my workstation setup quite rarely because my work-provided laptop is what I use 90%+ of the time. Top of the line is one thing (Xeons and Threadrippers or EPYC), but pretty powerful, cost-effective CPUs like a 3900X or 9900K are another. Even though I do ML stuff occasionally, I still do it as a useless hobby, and it's not worth bumping up to a Threadripper or a crazy GPU, especially when I can use a cloud machine for a fraction of the price and, as a bonus, force myself to make it somewhat repeatable and deployable for someone else.
If one’s been building machines in the past year or so, we’ve just had the biggest CPU performance gains in literally 10 years with the latest AMD CPUs. I went a long time with an E3-1230, but honestly got really bored after 5 years and built an i7-4790K ITX machine, which wasn’t as big of a change as going from 8GB of RAM to 16GB, or to my first SSD. I got a 3900X this past year, and the substantial performance gains are not a big enough reason for me to upgrade yet again, compared to something like the new Ampere GPUs for even hobby machine learning purposes like mine. In the future, a bigger reason for me to build again would be space efficiency: a small case that could fit on top of my desk and drive 4K AAA games and encode HEVC video day and night, all with a 600W SFX PSU.
Interesting about switching from the 3900X back to Intel.
I recently built a desktop around the 3900X. I’m happy with the performance, but coming from a decade on MacBooks, I'm feeling the FOMO about my choice now :)
Cool, thanks for sharing. My work-from-home setup has a legal marijuana vaporizer within reaching distance to help deal with the bullshit from managers & other bumbos in our digital corp.
I’ve used a Kinesis Advantage at work for 7 years now. Started after I began to get wrist discomfort once I started programming full-time. It’s been wonderful, and removal of the discomfort and pain is worth the $300 several times over. I bought one for home shortly after. It does take a bit to get used to typing on. It took a few days to get from 6 wpm to ~35 wpm, and about two weeks total to get to ~80 wpm. I can usually type a bit faster on a “standard” laptop keyboard, but 80 wpm is still plenty fast. I’m a Vim user, so I mapped the End key to Esc (keyboard has hardware mapping built-in).
I do experience the stuck-key issue referenced (annoying, but infrequent and easily worked around); I'm going to look into swapping the PCB as linked.
I wish there were a way to try this keyboard for a month or so without committing to buying it. It seems like there will be a learning curve, and you might not even like it at the end of it.
If you have a keyboard programmable with QMK, you can make the upper layers work in hardware too (though you still need to decide which OS layout you want to map them to).
I'll answer it literally: seeing what equipment other engineers use can give you ideas for your own setup. stapelberg is an effective engineer, so I'm glad he shared his.
It’s interesting because the author is a software artisan, like many of the people responding in this thread. Personally, I love reading about the tools used by other people in the same trade. The choice of tools highly influences one’s productivity, which is probably why I find it fascinating to read about his choices.
I read this as “Steven Spielberg uses this: my 2020 desk setup” before I clicked through. I was so amazed that Spielberg used a tiling window manager before I went back and figured out my mistake!