I bought the Asus 6K ProArt on launch, replacing an older 4k 27" Dell monitor. The new monitor is definitely an upgrade, but not as great as I was hoping. The matte coating is by far the worst part of the monitor. It's not bad enough to return the monitor, but the graininess is noticeable on white windows. I've definitely enjoyed having the extra screen real estate over the 27" monitor, and the extra resolution has been very helpful for having a bunch of windows open in Unity.
This year at CES there were a number of new monitors unveiled that compete in this space. There's a new Samsung monitor (G80HS) that is a 32" 6k with a higher refresh rate than the LG or Asus. Unfortunately it has the matte coating instead of glossy, so clarity will suffer.
Also of interest are the new OLED offerings with true RGB stripe subpixel layout. This should fix text rendering problems on systems with subpixel antialiasing. Both Samsung and LG are making these OLED monitors with the true RGB layout. There will almost certainly be glossy coatings offered with these panels, and they'll have higher refresh rates than IPS.
Debating getting that ProArt to replace my 27" 4k. Do you find the productivity benefits to be meaningful? I'm wondering if I'll just end up making everything bigger and not benefiting, or having to move my head around too much.
I bought the 6K ProArt on launch, replacing an older 4k 27" Dell monitor. The new monitor is definitely an upgrade, but not as great as I was hoping. Like you said, the matte coating is by far the worst part of this monitor. I would say that it isn't bad enough to return the monitor, but it's definitely noticeable on white windows.
I've definitely enjoyed having the extra screen real estate over the 27" monitor, and the extra resolution has been very helpful for having a bunch of windows open in Unity.
This year at CES there were a number of new monitors unveiled that compete in this space. There's a new Samsung monitor (G80HS) that is a 32" 6k with a higher refresh rate than what you'd find with existing offerings. Unfortunately it has the matte coating instead of glossy, so clarity will suffer.
Also of interest are the new 27" 4k offerings with true RGB stripe subpixel layout. This should fix text rendering problems, especially on Windows. Both Samsung and LG are making these OLED monitors with the true RGB layout. There will almost certainly be glossy coatings offered with these panels, and they'll have higher refresh rates than IPS. The main downside will be brightness for full screen white windows. I think the Samsung panel is a bit better than LG in terms of brightness.
This is akin to how I've (technically?) stepped back from a 5K 27" to a 4K 32". Likely due to scaling and how far I sit from the screen (about 24" -- average, I think), things look the same? At least, I don't notice that the 4K is any worse.
Me being me, I can't help but think I should have a 5K or 6K or whatever, but the price is... high. So I figured I'd try a 4K 32" since the OLED was cheap, and the result was this post, because the subpixel pattern messed with me. But now for the replacement I'm looking at a simple (but nice color / high-end) 4K 32" IPS LCD.
And having used one for the last day, I'm pretty content with it. It's like everything I wanted from the OLED without the eye strain.
I actually had a 5K iMac that I sold when I got the Mac mini. As I was deciding on the display, I looked at doing that conversion, but I wasn't super keen on the unfinished look. And IIRC it was going to cost about $250 in parts at the time. I was able to get the ASUS for about $700 and sell the iMac for ~$300. So it was really only about $150 more to not DIY it and have a more finished final package.
It is a really neat looking project, I just determined it wasn't for me.
It has gotten a bit cheaper, a lot easier and somewhat better lately.
But I agree that it does not make that much sense because you end up with a product that has many flaws and is a bit annoying to use.
The main factor is being able to sell the iMac for that relatively high price. I can't figure out why they are still so expensive, because most of the early 5K models are kind of useless nowadays (on the low-end versions, the compute just can't cope with modern media/files at such a resolution). But maybe it's people converting them to displays that's driving the market...
The person who bought mine was a family friend who wanted a large display for her kid to do 3D printing stuff. Since he was just going to be running a slicer and some basic modeling stuff, it seemed perfect. I got a bit of cash, he got a computer with a good display, and it was a general win all around.
Ah yes for those use cases it makes perfect sense.
Apple's excuse for stopping the big iMacs was that when you bundle the display with the compute, it makes it hard to upgrade and the whole thing becomes useless.
But in reality it just looks like some bad "reasoning" to force people to spend more on a less elegant solution that probably won't get upgraded that much.
At least an old-school 5K iMac can have some secondary use case, like the one you demonstrated, or even just watching movies, doing some light document editing and such.
And you can convert them to displays if you really want, but that should have been built-in functionality in the first place.
I guess this is why they still command a relatively high price: a good large display still has many uses even if the compute is weak.
The Studio Display is kinda useless outside of Mac use, so even though it's great quality, it's really not a good deal.
I really hope Apple finally releases another big iMac, because I won't get another Mac mini or a Mac Studio. I like macOS (less so nowadays), but the whole point of the Mac was the integrated hardware for base/mid-range power. Their small desktop boxes that cannot take any upgrades are really pointless as a desktop because you end up with cables galore and not much space saved. This is so stupidly inelegant, I can feel Jobs rolling in his grave.
They added front accessible ports, so that's something I guess, but come on, they stink of greed and profit maximisation at all costs...
Will the move to CoreCLR give any speedups in practice if the release build is compiled with IL2CPP anyway? On all the games that I've worked on, IL2CPP is one of the first things that we've enabled, and the performance difference between the editor and release version is very noticeable.
Yes, you can relax logic gates into continuous versions, which makes the system differentiable. An AND gate can be constructed with the function x*y and NOT by 1-x (on inputs in the range [0,1]). From there you can construct a NAND gate, which is universal and can be used to construct all other gates. Sigmoid can be used to squash the inputs into [0,1] if necessary.
This paper lists out all 16 possible logic gates in Table 1 if you're interested in this sort of thing: https://arxiv.org/abs/2210.08277
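Not the paper's implementation, just a minimal sketch of the relaxation described above; the `soft_*` and `sigmoid` helper names are my own:

```rust
// Relaxed ("soft") logic gates on inputs in [0, 1]. At the corners {0, 1}
// they reproduce the usual truth tables; in between they are smooth, so
// gradients can flow through them.
fn soft_and(x: f64, y: f64) -> f64 { x * y }
fn soft_not(x: f64) -> f64 { 1.0 - x }
fn soft_nand(x: f64, y: f64) -> f64 { soft_not(soft_and(x, y)) } // universal gate
fn soft_or(x: f64, y: f64) -> f64 { soft_nand(soft_not(x), soft_not(y)) } // OR via De Morgan

// Squash an unbounded activation into [0, 1] before it reaches a gate.
fn sigmoid(x: f64) -> f64 { 1.0 / (1.0 + (-x).exp()) }

fn main() {
    // At the corners these match boolean AND / NAND / OR exactly.
    for &(a, b) in &[(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)] {
        println!("AND({a},{b}) = {}, NAND = {}, OR = {}",
                 soft_and(a, b), soft_nand(a, b), soft_or(a, b));
    }
    // Between the corners everything is smooth, e.g. d(soft_and)/dx = y.
    println!("soft_and(0.3, 0.8) = {}", soft_and(0.3, 0.8));
    println!("sigmoid(1.4) = {:.3}", sigmoid(1.4));
}
```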
I have aphantasia and have been practicing meditation with the goal of improving the condition for a couple years. I have seen some minor improvements - when I'm in a pretty relaxed state I can see some visuals, but am not able to control the stream of images.
I haven't been working on this quite as much recently, since the meditation seems to be connected with triggering an ocular migraine with aura.
In my experience the main benefit of functional programming is function purity. I am completely fine with mutation inside of a function, since all of the mutation logic is self-contained in a single small block of code.
I think everyone should take a shot at writing a non-trivial functional program to see the benefit. Once you understand what makes it great, you can apply what you've learned to the majority of OOP/impure languages.
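As a tiny illustration of that point (a made-up example, not from any particular codebase): a function can mutate freely inside its own body and still be pure, because the caller never observes the mutation.

```rust
// Pure from the outside: the same input always yields the same output and
// no external state is touched, even though the body mutates a local buffer.
fn normalize(values: &[f64]) -> Vec<f64> {
    let mut out = values.to_vec(); // local, mutable working copy
    let sum: f64 = out.iter().sum();
    if sum != 0.0 {
        for v in out.iter_mut() {
            *v /= sum; // the mutation never escapes this function
        }
    }
    out // the caller only ever sees the finished result
}

fn main() {
    let weights = vec![2.0, 3.0, 5.0];
    let normalized = normalize(&weights);
    println!("{:?} -> {:?}", weights, normalized); // the original is untouched
}
```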
I'm not so certain that non-desk jobs will be safe either. What makes the current LLMs great at programming is the vast amount of training data. There might be some other breakthrough for typical jobs - some combination of reinforcement learning, training on videos of people doing things, LLMs and old-fashioned AI.
I do networked game development on Windows and I've found the clumsy program to be very valuable to simulate adverse network conditions. You can set it up to simulate arbitrary network latency, packet loss and so forth.
Perhaps the reason modern programs use so much memory versus what I remember from the Windows XP era is precisely that we went to 64 bits. Imagine how many pointers are used in the average program. When we switched over to 64 bits, the memory used by all those pointers instantly doubled. It's clear that 32 bits wasn't enough, but maybe some intermediate number between 32 and 64 would have added sufficient capacity without wasting a ton of extra space.
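A rough way to see the effect (the `Node` struct here is just a made-up, pointer-heavy example): the same layout that fits in 12 bytes on a 32-bit target takes 24 bytes on a 64-bit one, with the growth coming entirely from the pointers plus padding.

```rust
use std::mem::size_of;

// A made-up node like you'd find in any linked structure.
struct Node {
    value: u32,
    next: *const Node,
    prev: *const Node,
}

fn main() {
    // 64-bit build: 4 (value) + 4 (padding) + 8 + 8 = 24 bytes.
    // 32-bit build of the same struct: 4 + 4 + 4 = 12 bytes.
    println!("size_of::<Node>() = {}", size_of::<Node>());
}
```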
> Imagine how many pointers are used in the average program. When we switched over to 64 bits, the memory used by all those pointers instantly doubled.
This is a very real issue (not just on the Windows platform, either) but well-coded software can recover much of that space by using arena allocation and storing indexes instead of general pointers. It would also be nice if we could easily restrict the system allocator to staying within some arbitrary fraction of the program's virtual address space - then we could simply go back to 4-byte general pointers (provided that all library code was updated in due course to support this too) and not even need to mess with arenas.
(We need this anyway to support programs that assume a 48-bit virtual address space on newer systems with 56-bit virtual addresses. Might as well deal with the 32-bit case too.)
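A minimal sketch of the arena-plus-indexes idea from the comment above (types and names invented for the example): nodes live in one `Vec` and refer to each other by 4-byte index, so the "pointers" stay 32-bit no matter how wide the platform's real pointers are.

```rust
// Arena-style storage: nodes reference each other by u32 index into the
// arena instead of by native (8-byte) pointer.
struct Arena {
    nodes: Vec<Node>,
}

struct Node {
    value: u32,
    next: u32, // index into Arena::nodes; NONE means "no neighbor"
    prev: u32,
}

const NONE: u32 = u32::MAX;

impl Arena {
    fn new() -> Self {
        Arena { nodes: Vec::new() }
    }

    // Allocate a node and hand back its index, the stand-in for a pointer.
    fn alloc(&mut self, value: u32) -> u32 {
        let idx = self.nodes.len() as u32;
        self.nodes.push(Node { value, next: NONE, prev: NONE });
        idx
    }
}

fn main() {
    let mut arena = Arena::new();
    let a = arena.alloc(1);
    let b = arena.alloc(2);
    arena.nodes[a as usize].next = b;
    arena.nodes[b as usize].prev = a;

    // 4 + 4 + 4 = 12 bytes per node on any platform, versus 24 with two
    // native pointers on a 64-bit target.
    println!("bytes per node = {}", std::mem::size_of::<Node>());
}
```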
I agree that's wasteful, but if software were only 2x bigger, we'd be in really good shape now. Unfortunately there are still one or two more orders of magnitude to account for somehow.
SGI used three ABIs for their 64-bit computers: O32, N32, and N64. N32 was 64-bit except for pointers, which were still 32 bits, for exactly that reason - to avoid doubling the memory needed for storing pointers.