I'm unfamiliar with the majority of these customization apps as I'm still on Windows 10. Are the two apps mentioned (ExplorerPatcher and StartAllBack) the most common user tweaking utilities at the moment?
There are many other utilities out there. Most are normal apps, and some tweak the registry. Taskbar replacement requires deeper tweaks, hence the claim that they create security issues.
Since this is on the front page I guess I'll post the resource that the graphics programming industry actually uses (note, I'm an author): https://raytracing.github.io/
It's included in the "Useful Websites" in the article above.
Also note that graphics is a large enough field that there is no longer a one-size-fits-all path to learning it. If you want to learn graphics, I'd recommend finding a mentor.
It is mostly older Millennials/Xennials with professional jobs. They earn more than 2x the national median income, but have a net worth that's less than 3x their income or $1M (whichever is larger).
I imagine it's the same way they measure Scoville units.
It's sensitivity.
How much of the substance is required for a group of people to consistently guess correctly which of two otherwise identical glasses of water has been spiked with it.
If it's measured like that, then only 1/3000 as much is required compared to sugar.
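As a rough illustration of that relative-potency arithmetic (the sucrose threshold here is an assumed, illustrative figure, not something from the thread):

    # Rough sketch: how little is needed if it's ~3000x more potent than sugar.
    sucrose_threshold_g_per_100ml = 0.5   # assumed detection threshold for sugar in water
    relative_potency = 3000               # "1/3000 compared to sugar"

    substance_threshold = sucrose_threshold_g_per_100ml / relative_potency
    print(f"Detectable at roughly {substance_threshold:.5f} g per 100 ml")  # ~0.00017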
> The reason gamers want higher framerates is mostly for lower latency,
"Gamers" are absolutely not a monolithic group, but in general, they mostly want higher frame rates because higher frame rates does a better job of feeling and looking fluid.
> faster reaction times,
60 fps is ~16.7 ms per frame. The end-to-end latency for a decent video game setup will be ~50 ms, and it's pretty well decoupled from frame time; the frame time itself only constitutes a small fraction of the overall latency. Many setups will have an end-to-end latency closer to 100 ms, so the difference between 60 fps and 120 fps matters even less.
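A quick back-of-the-envelope sketch, using the rough ~50 ms figure from above (not a measurement):

    # How much of end-to-end latency is actually frame time?
    def frame_time_ms(fps):
        return 1000.0 / fps

    end_to_end_ms = 50.0  # rough figure for a decent setup
    for fps in (60, 120):
        ft = frame_time_ms(fps)
        print(f"{fps} fps: {ft:.1f} ms/frame, "
              f"{ft / end_to_end_ms:.0%} of a {end_to_end_ms:.0f} ms pipeline")
    # Going from 60 to 120 fps shaves ~8 ms off a ~50-100 ms total.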
> and a bigger advantage in online play.
Most gamers play at 30 or 60 fps on console. There are definitely a lot of folks playing framerate-sensitive games (like CS:GO) on PC who do benefit from increased framerate, up to ~300 fps (above that it gets complicated). But the vast majority of gamers are limited by skill and will see marginal benefit above 60.
> DLSS 3 'fakes' its higher frame rates. As a result you might have 120 fps, but it's going to feel like 60hz.
Visually it looks a lot like 120. Digital Foundry has a video on this. Aside from some definitely noticeable visual artifacting, 120 fps with DLSS 3 looks and plays like... 120 fps. Saying it feels like 60 is a wild thing to say.
It's possible that all games running DLSS 3 today poll input at the true frame rate rather than at the frame-generated one, but if we're talking about the difference between 60 polls per second and 120 polls per second, that latency difference is only going to affect the top tier of players, in the kinds of games where millisecond latency matters (an inherently niche set).
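For a rough sense of scale (assuming input is sampled once per poll interval and arrives at a random point within it):

    # Average added sampling delay is about half the polling interval.
    for poll_hz in (60, 120):
        interval_ms = 1000.0 / poll_hz
        print(f"{poll_hz} Hz polling: ~{interval_ms / 2:.1f} ms average sampling delay, "
              f"{interval_ms:.1f} ms worst case")
    # 60 Hz vs 120 Hz polling differs by ~4 ms on average, ~8 ms worst case.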
You could say that all gamers want the absolute best possible 300+ fps setup for their online play, but where they're not primarily limited by skill, they're going to be predominantly limited by budget. Doubling your framerate from raw compute alone will more than double your cost.
There are valid criticisms of DLSS 3. I've never heard or seen anything like what you're describing here.
Source: I'm a graphics engineer in the industry (Not Nvidia)
I play an FPS shooter (Overwatch 2) weekly at a minimum. I'm far from being "pro", advanced hobbyist rather. When Windows spontaneously decides to reset the refresh rate back to 60, I'll notice it the second I aim by flicking my mouse. In addition to the strobe effect of low fps, the increased input lag in mouse controls is absolutely noticeable and really messes up my aim. So no need to be a pro to enjoy (and notice) increased responsiveness in FPS games, as long as you clock a few hours a week at it!
I know nothing about DLSS (I use AMD parts), but 100 ms sounds extreme to me. Like way above playable. Did you try Nvidia's Reflex experiment? At 85 ms it feels like the mouse is stuck in tar. I read that the increase in aim precision was near 60%, and often up to 80%(!), when comparing low vs. high latency.
I don't know what my current system's total latency is, but I use a mouse with custom internals and an LG OLED screen for the lowest input lag. The difference between 60 and 120 Hz is huge, even with my at-least-low-ish input lag system.
This isn't meant as a criticism; as I said, I have no clue how DLSS feels, and I quite enjoy playing games on my PS5 too (with the LG OLED and VRR), but the 100 ms just sounded off to me.
Actually, before posting I had a quick look at Nvidia's article. Their example of the worst system latency was 77 ms and the best was 12 ms.
> because higher frame rates do a better job of looking and feeling fluid
But a higher frame rate also does a good job of reducing input latency with the same number of frames in flight. Just try a game that uses the system mouse pointer and lets you move things around with the mouse. The distance between the mouse pointer and the dragged object at 60 vs 120 fps is quite dramatic, even more so at 30 vs 60 fps. This is just an obvious visual representation of what is a bit more subtle with a first-person control scheme (which many non-hardcore gamers probably don't pay much attention to, but they still might wonder why one game feels "laggy" and another feels "crisp").
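A rough way to see why that gap is so visible (assumed numbers: the dragged object trails the pointer by a fixed number of frames, and the mouse is moving fairly quickly):

    # Gap between pointer and dragged object = mouse speed * latency in seconds.
    mouse_speed_px_per_s = 2000.0   # assumed quick drag across the screen
    frames_of_latency = 2           # assumed frames in flight

    for fps in (30, 60, 120):
        gap_px = mouse_speed_px_per_s * frames_of_latency / fps
        print(f"{fps} fps: object trails the pointer by ~{gap_px:.0f} px")
    # ~133 px at 30 fps, ~67 px at 60 fps, ~33 px at 120 fps.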
I'm pretty sure Digital Foundry indicated that DLSS 3 adds latency versus turning it off, so in the 60 fps vs 120 fps example, it would actually feel like 45 fps.
It's hard to say. Change is happening so fast that it's hard to predict.
Disclaimer: Massive hand-waving here
But I think these exotic materials will be stepping stones to other things. I don't think we can get next-gen AI until we can build neuromorphic chips with memristor-type arrays at massive densities.
You can't get to nanotech until you can predictably arrange atoms. And once you can do that, you can build other exotic materials.
High-temperature (or room-temperature) superconductors will allow massive magnetic fields for common applications -- not just high-end MRIs that require cooling.
Graphene is so weird it can pretty much do anything.
I think the underpinning for all these advances is materials science. For me, it will be tech that looks like magic.
The alternative is that corporations can (or rather, can continue to) factor in selectively breaking laws as a line item in their operating costs, which is unappealing for certain people for a variety of reasons.
Escalating punishment based on repeat offense until we find a level that is convincing enough to stop you from offending is not a particularly new idea.
Fines are a fine balance between motivational and liquidating.
I think 6% is quite a lot, even if one has a 40% margin. Investors will be highly distraught and will seek remedies from the current management. But at, for instance, 20%, they will blame the regulators and push the company to fight in the courts.
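A quick sketch of why those percentages land so differently (assumption: the fine is a share of revenue, and margin means profit/revenue):

    # What share of profit does a revenue-based fine consume?
    revenue = 100.0
    margin = 0.40                 # assumed 40% profit margin
    profit = revenue * margin

    for fine_pct in (0.06, 0.20):
        fine = revenue * fine_pct
        print(f"{fine_pct:.0%} of revenue as a fine eats {fine / profit:.0%} of profit")
    # 6% of revenue = 15% of profit; 20% of revenue = 50% of profit.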
In any case, the government wants to motivate a change in behavior, not put companies out of business.
What was it like working on these projects? Was it clear that the performance per watt was falling behind Apple’s expectations, and if so how was that felt in your workgroup?
Yeah. Just look at the performance-per-dollar ratio on mobile Intel chips right around Cherry Trail. The Bay Trail and Cherry Trail lines were very good deals, and then Core M threw all that right out the window.