I have three requirements that I would like to satisfy in a new laptop:
1. AMD Ryzen, Zen 2 or later (meaning 3000 desktop, 4000 mobile or 5000 any). At this point assume that I’m not concerned about class of processor, though I think H-class is probably my ideal.
2. Minimum screen resolution of 2560×1440, preferably 4K or similar. (And since I’m talking, ideally a squarer aspect ratio like 3:2, but those are so rare I won’t even bother looking. But I do really like my Surface Book’s 3000×2000 display.)
3. No NVIDIA GPU, because I want to use Linux and NVIDIA hates Linux, and I want to be able to run Wayland also. So either integrated graphics only (with a U-class or H-class APU), or an AMD dGPU (which would be the only option on the higher-end desktop CPUs). Sure, with U-class or H-class I could just disable the NVIDIA GPU, but the dead weight and giving money to NVIDIA for something I will never use galls, quite apart from the mild nuisance of figuring out how to disable it.
I looked through all the manufacturers I could find in Australia, and all the Clevo OEMs I could find worldwide.
I have not found a single device that satisfies all three of these requirements. Any two, sure, but not all three.
A related peeve lies with the H-class APUs: why can’t I get one without an NVIDIA GPU? I have found only one place selling laptops with H-class APUs and no dedicated NVIDIA GPU: TUXEDO, in their Pulse 14 and Pulse 15. But even then, they’re only available on paper—order now and you won’t get it for over three months since they’re currently out of both 4600H and 4800H APUs. (For myself, I rule TUXEDO out for not shipping to Australia, anyway; any forwarding service arrangement would be a bother and probably expensive, and leave me with a useless warranty.) But you’d think that more than one manufacturer would look at how light yet powerful a laptop they can make this way and do one without a discrete GPU. Look, the Pulse 15 with its 91Wh battery even advertises over 20 hours of battery life when idling at minimum brightness (and the Pulse 14’s 47Wh battery, 12 hours).
These things make me sad. I hope the portfolio expansion mentioned here will include something to satisfy me.
Matches everything you mentioned, and has some other good points as well:
> 4800U (not H, but the 4800U is within 10%, and this laptop can be put into a 25W power mode, closer to the 4800H)
> 2560x1600 - good resolution, and a tall aspect ratio (16:10, not 3:2, but still tall). Also check out reviews: this is a very good quality screen, beautiful, vivid and crisp
> no NVIDIA GPU - the 4800U has 8 cores of AMD Vega integrated graphics, plenty powerful for general use
It's a good laptop with good build quality (aluminium body, great screen) and I'm debating whether to buy one, but I'm actually waiting as I don't need a laptop now.
There are a few shortcomings from my ideal: it's only 13.3", the USB-C is only Gen 1 (5 Gb/s), and I've heard the keyboard is average, where I'd like an exceptional keyboard. But basically this is an awesome little workstation. And if you can get the sale price I posted, it's exceptional value.
Ah man, you are describing my ideal laptop as well.
I don't have a solution for you, but I have had good experiences with System76 in the past and am hoping that upcoming laptop refreshes will start to hit those points for me as well.
I have a system76 but it uses an NVIDIA GPU for the external monitors.
It works fine when plugged in, but I basically don't use it as a portable computer because it's such a pain to switch back to Intel graphics. There are some options for hybrid graphics but it limits the options I have for external monitor orientation.
Graphics switching remains somewhat painful in the Linux ecosystem. There are options but nothing that quite achieves the seamlessness of Windows or Mac solutions.
Did you get the System76 with PopOS? I was impressed with my last one with the desktop menu that allowed graphics switching (albeit after a restart).
Which I think is just a wrapper around the nvidia drivers, but at least I didn't have to install it myself and the menu location is a tad more convenient.
yeah the restart is the problem, it adds a lot of overhead if I'm just running to a meeting or something. There is a Prime option but you can only use external monitors in standard landscape and I use mine in portrait.
2160×1440 (185ppi) is a touch on the low side, though admittedly very close to 2560×1440 on a 15.6″ display (188ppi; I confess I had that resolution more in mind for a 13″ or 14″ chassis). But definitely far less than the ~267ppi of my Surface Book’s 3000×2000 display!
This definitely looks an interesting machine, but it seems they’re not selling it in Australia, so alas, I can’t get it. Otherwise I’d be very strongly considering it.
They were selling it in Australia via eBay from the official store, but all stock disappeared shortly before Black Friday. The 4800H model was AUD 1750.
I found a video review showing a great screen, and the machine is overall pretty good. One strange shortcoming is the use of 2666 MHz RAM instead of 3200; for Ryzen of that generation RAM speed matters to overall performance, and the RAM also serves as graphics memory for the iGPU. Very strange to use such slow RAM, but benchmarks still seemed reasonable.
I looked at the MateBook X Pro in JB Hi-Fi with the 3000x2000 screen - the best screen of any laptop there. Unfortunately there's no AMD option for that one.
I have a Huawei MateBook X Pro and it's great! It's got the 3:2 aspect ratio and a higher-resolution screen, touch, 16 GB of RAM, etc. It is super nice, but it does have NVIDIA.
> No NVIDIA GPU, because I want to use Linux and NVIDIA hates Linux
Is this a voting-with-your-wallet position for open drivers / better Linux support, or have you experienced issues in the past with the proprietary drivers?
I ask because Linux with an NVIDIA GPU is my daily driver (desktop), and I'm likely to build another system like that again; I was just curious what I don't know.
I’ve never tried to run Linux on a machine with an NVIDIA GPU, or on a machine with two GPUs. But from what I have read, this is the impression that I get:
It’s quite common to have a bit of trouble getting anything working, and you will run into problems far more often than with any other brand of GPU (which will just work perfectly).
Many things won’t work with the proprietary drivers, because they implement everything their own way, and so software has to be written against the NVIDIA proprietary drivers, rather than using the standard tools that everything else uses like OpenGL and Mesa.
https://arstechnica.com/gadgets/2020/04/linux-on-laptops-asu... is the sort of thing I’ve heard of: there, the NVIDIA GPU is trouble from start to end. Admittedly it might have gone more smoothly with something other than Ubuntu, with more recent versions of kernels and other such stuff (I like Arch, so this wouldn’t be a problem for me), but it’s still just bad.
But then too, I mentioned Wayland: because of how NVIDIA made their own world on Linux, the proprietary drivers are completely incompatible with Wayland—basically NVIDIA hardcoded support for X (again, rather than using the standard approach everyone else settled on) and haven’t done so for Wayland.
If you use Nouveau instead of the proprietary drivers, you’re left with a GPU missing a substantial fraction of its functionality, stuck at a low clock speed in power-draining mode. But I think you might be able to use Wayland.
I gather that Linux has problems in any dual GPU environment, but that the issues are far worse with NVIDIA than with anything else. If you’re operating a single-GPU machine and are content to use the proprietary drivers, I gather it’s not such a problem, though many applications may be unable to use GPU acceleration.
I welcome any corrections to what I’ve written here. As I say, this is all just hearsay.
If you're trying to do anything with the latest hardware, it's always going to be a pain on Linux. Even Windows doesn't have flawless support with every game on day one.
> It’s quite common to have a bit of trouble getting anything working, and you will run into problems far more often than with any other brand of GPU (which will just work perfectly).
That's a joke. I've been using Linux on Nvidia since the Riva TNT. It was always Radeon that had garbage drivers and really didn't give two shits about Linux. Not sure how much this has changed with AMD owning them, but I never bothered to look at Radeon since Nvidia was always the best card and always worked. It's almost never more than just doing a single package install.
> software has to be written against the NVIDIA proprietary drivers, rather than using the standard tools that everything else uses like OpenGL and Mesa.
I think you fundamentally do not understand Mesa or OpenGL to say this. And no, nothing has to be "written against" Nvidia. It's literally OpenGL (or Vulkan, today). Unless you're talking CUDA or something, which I don't touch.
This all just sounds like so much FUD that you wrote.
I certainly don’t understand all that I’m writing of—it’s based purely on what I’ve read, which has mostly been casual rather than deliberate research. It’s almost certain that I’ve made errors.
> The proprietary NVIDIA driver doesn't provide the same user space API as the open source drivers. While the open source drivers allow the display server to use the Generic Buffer Manager (GBM) and Kernel Mode Setting (KMS) APIs to manage hardware buffers, set modes, queue page flips, and configure hardware planes, the NVIDIA driver forces the display server to treat it differently. Instead of these APIs, the compositor uses a combination of KMS, to set modes, and EGL (the EGLDevice & EGLStream extensions, to be precise) to indirectly queue page flips by linking an EGLSurface, corresponding to an area of the screen, with a CRTC of an EGLDevice, using an EGLStream.
My OpenGL and Mesa remarks are very probably wrong.
The impression that I have received is that the window manager is probably the main thing that needs to be aware of these differences and NVIDIA ignoring the standards (this is why Sway doesn’t support NVIDIA’s proprietary drivers: https://drewdevault.com/2017/10/26/Fuck-you-nvidia.html), but that anything wanting to actually use GPU capabilities may well need to be aware as well.
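As best I understand it, here’s a minimal sketch of the GBM + KMS buffer path a compositor uses on the Mesa drivers—my own illustration, not taken from anything quoted above, untested, error handling omitted (compile with -lgbm -ldrm):

    /* Sketch of the GBM + KMS path; illustrative only. */
    #include <fcntl.h>
    #include <stdint.h>
    #include <gbm.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main(void)
    {
        int fd = open("/dev/dri/card0", O_RDWR);        /* DRM device node */
        struct gbm_device *gbm = gbm_create_device(fd); /* Generic Buffer Manager */

        /* Allocate a scanout-capable buffer. This only works on drivers
         * that implement the GBM backend, i.e. Mesa (amdgpu, i915, nouveau). */
        struct gbm_bo *bo = gbm_bo_create(gbm, 1920, 1080, GBM_FORMAT_XRGB8888,
                                          GBM_BO_USE_SCANOUT | GBM_BO_USE_RENDERING);

        /* Wrap the buffer in a KMS framebuffer; a compositor would then
         * queue page flips onto a CRTC with drmModePageFlip(). */
        uint32_t fb_id;
        drmModeAddFB(fd, 1920, 1080, 24, 32, gbm_bo_get_stride(bo),
                     gbm_bo_get_handle(bo).u32, &fb_id);

        /* ... render loop: drmModePageFlip(fd, crtc_id, fb_id, ...) ... */

        drmModeRmFB(fd, fb_id);
        gbm_bo_destroy(bo);
        gbm_device_destroy(gbm);
        return 0;
    }

On the proprietary driver, the gbm_* entry points simply aren’t there; the equivalent flow goes through EGL (EGL_EXT_platform_device plus EGLStreams), which is why a compositor needs a whole second code path to support it.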
—
What I have heard on the GPU drivers situation is that Radeon used to be terrible (and NVIDIA less terrible despite doing things its own way rather than the standard way), but that they overhauled it all completely so that for at least the last five years it’s been great.
For what it's worth I've had a fairly easy time with nVidia on Linux. My daily driver is an Ubuntu 20 machine where I just selected the nVidia binary blob option from a menu and it just works, even through kernel upgrades.
But this is also just about the easiest setup. I might feel differently if I were on a laptop with dual graphics cards.
Just a few data points: on the Dell Ubuntu machines (4+) we've received, I just open "Additional Drivers", click NVIDIA, reboot, and stuff like TensorFlow just works. Same on CentOS (RHEL).
I think if you start messing with one of the distributions not on the CUDA download page, you can get stuck easily, but at least for work purposes I don't see the point of using something other than RHEL/CentOS/Fedora/Debian.
I see, I guess I must just be on Nvidia’s happy path: I use a Debian distro similar to Ubuntu at work and Ubuntu at home with X windows and have experienced no problems; I didn’t know that there might be dragons if I try to stray from that.
Not OP but I'm also avoiding anything with Nvidia in it. Half of the reason is voting with my wallet, as you mentioned. This other half is feeling much more confident that the hardware will be supported and compatible with Linux in the long term.
OpenBSD obviously doesn't have NVIDIA support, but amdgpu(4) works pretty well. I'd also prefer to use AMD's own GPU, not only because I can't use NVIDIA, but also because I don't support NVIDIA's business practices.
I've run intel/nvidia with Arch Linux for the last 14 years.
I upgraded recently from an intel/nvidia setup (GTX 1070) to a zen3 w/ a Radeon 6800.
Honestly, both nvidia and AMD have excellent support on Linux. Both hardware decode, play steam games, are stable, support the latest kernel etc..
However, the latest Steam games (and Proton in particular) are better tested against Mesa (the AMD OSS drivers), so it's the place to be for gaming on Linux.
AMD also tears less on Linux, and supports Wayland (though Wayland support could be the reason for less tearing).
Either way, I'm fully AMD now and wouldn't go back. I see zero downsides, and a few upsides.
About better linux support:
Browsers don't do hardware-accelerated video decoding through VDPAU yet. It seems both AMD and Intel already have this feature via VA-API.
However, I had to choose NVIDIA only because of CUDA.
That requirement could be a roll of the dice. Get the laptop with the Ryzen processor and hope that the BIOS just happens to have ECC enabled, and buy the right DIMMs. It's not something you're likely to see as a bullet point in the feature list except on some kind of hideously expensive "industrial" laptop, which would probably fail all of the other criteria.
Beware NVIDIA hybrid GPUs in laptops. I have had 4 such laptops, and all 4 suffered from graphics problems. 2 ended up dead. 1 works great, but the NVIDIA GPU is no longer detected (???). And the last one still works, but graphics crash about every 4 hours when the GPU is under heavy load (I think due to heat).
My Surface Book is 267ppi (3000×2000, 13.5″). It’s very good, but at typical viewing distances it’s still unquestionably lower than the human eye is capable of resolving. I doubt there’s much value in going higher, but it’s not useless, either—if all else was equal, 400ppi would certainly be more pleasant to look upon than 267ppi.
4K on 17″ is 260ppi, which I reckon is pretty good. On 13″ or 14″, full 4K is probably a little bit of overkill at present (315–340ppi), but 1440p is definitely less than I’d like even on a 13″ monitor (226ppi).
Of course, it’s always a matter of balance, because all else is not equal; higher resolution means higher cost, higher power consumption, higher memory usage, and higher processing requirements. But I’d definitely prefer 4K to 1440p even at 13″.
External monitors are typically used farther away from the eye than a laptop screen - have you tried 4K at 24"? It may achieve what you are looking for.
But it’s unlikely to be great at Linux out of the box. The Surface families have a history of doing things their own way, which means being poor at running Linux until people reverse-engineer things. Per https://github.com/linux-surface/linux-surface/wiki/Supporte..., Surface Laptop 3 (AMD) still isn’t perfect: like most of the Surface families it requires a special kernel for most stuff to work, and the touchscreen and pen support don’t work (admittedly functionality outright missing from most laptops, so perhaps not a big deal), and if you suspend it, you need an external keyboard to wake it up again!
Also I’m shocked at the price hike from 16GB RAM, 512GB storage to 32GB RAM, 1TB storage: it goes up from AUD 2931 to AUD 4399. A $1,468 increase. I would consider $468 not unreasonable (even though the retail cost delta on the actual parts should be under half of that), but it’s like they hoped you wouldn’t notice them slipping an extra $1,000 onto the price. But then, given that the second and third configurations increase the first’s $1,699 by $425 to increase 128GB of storage to 256GB (that’s more than even Apple charge for such things!), and then by another $255 to increase 8GB of RAM to 16GB, perhaps I shouldn’t be surprised. Still am, though.
I didn't see Linux support in your 3 points, though!
As an owner of a Surface Pro 4, I feel the pain of almost-good Linux support - and am a little surprised and dismayed at the rate of mainline kernel support.
Also agreed on the pricing model - but in my experience the hardware is very good. Arguably, I prefer it to Apple hw (except for the m1 cpu, that looks nice).
Eh, I reckoned it was kinda implied in my reasons for wanting no NVIDIA.
I also didn’t say “must have more than 4GB of RAM”. :-)
(A funny fact about it all is that I purchased the Surface Book a few years back simply because its hardware was so good on paper for what I wanted to do that I was willing even to switch from Linux back to Windows. Admittedly I would never have done it without WSL existing, but still. Anyway, a few years later I’m hankering to get back to Arch Linux and i3, or perhaps Sway now. The Surface Book has been good hardware, except that unit #1 was developing some problems at the age of 19 months, #2 was basically DOA, #3 had Battery 1 disappear after four months, and a couple of weeks ago #4 at the age of 2¼ had its Battery 1 die in a more unpleasant way: the computer will spontaneously lose power typically 1–4 times per day. But I have probably used it an average of over 10 hours a day, and regularly very heavily at that, except for its dGPU which is almost untouched.)
So, I looked at the AMD online shop, and it looks like the MSI Bravo 15 satisfies all of these requirements. It is surprising that there was only one laptop in the AMD store that seems to satisfy them.
It’s the resolution I care about more than the aspect ratio. A squarer aspect ratio would just be the icing on top. But yeah, if you’re happy with 1920×1080, then it’s not too difficult to find machines satisfying the other two requirements.
I think it's just that unfortunately a lot of these laptops are geared towards gamers, and pretty much all gaming laptops have 16:9 aspect-ratio, and they also usually use high refresh-rate screens, which are also almost entirely in 16:9 aspect-ratio.
I have heard, and keep on hearing of, many tales of problems with NVIDIA drivers on Linux, and there’s some functionality from the GPU that simply isn’t exposed in a usable way on Linux.
But I can only think of hearing of one problem with AMD (integrated or discrete GPU) drivers within the last five years, and that one was promptly fixed (a missing break statement that caused, I think, the RX 570 to be misclassified). But generally speaking, provided you have a recent enough kernel, I gather that it all just works, perfectly.
Now admittedly AMD GPUs are less common than NVIDIA ones, but even taking that into account the evidence is overwhelmingly against NVIDIA.
Everyone using CUDA on Linux is usually on Ubuntu + NVIDIA drivers, so no, it doesn't suck. The only downside is that it's closed source and does not work with Wayland. Overall, NVIDIA just has more people working on those drivers than AMD.
What I repeatedly hear is that even with the proprietary drivers, it’s not terribly uncommon for people to still have serious problems, and that not everything can use the acceleration (e.g. from elsewhere in the thread, “VDPAU does not have browser hardware accelerated video decoding yet”).
I’m by no means sure, but the impression I’ve received of AMD GPUs is that all functionality of the GPU is available and functional. And no one ever seems to have trouble getting normal things working.
It's so dumb that there are a few Zen 2 SKUs under the 5000 name. They skipped 4000 on the desktop so that mobile and desktop could share the same name for the same generation, only to immediately screw it up.
My thoughts exactly. Couldn't they have released those Zen 2 parts as a refreshed 4000 series, which is what they are?
This feels like they're trying to scam less knowledgeable consumers. WTF is going on at AMD?
Also, the Zen 3 iGPU is still Vega, not RDNA2. WTF, AMD?! They could integrate RDNA2 into the console Zen 2 APUs, but the PC Zen 3 chips are still running a GPU from 2017, based on an architecture from 2012! How is this possible? It just seems crazy to me.
I don't know if "scam" is exactly the right word, though the sentiment is likely close enough. At those lower levels anyone buying those laptops is going to have much more power than they need. Someone buying a Ryzen 5 4500U to do regular spreadsheets, email, documents, video conferencing, etc. will have ample horsepower. And a Ryzen 5 5500U is not much different. The clocks (base, all-core turbo) might even be lower, though the GPU is slightly better. However, for that kind of user, they might get better battery life, and if they fire up games, a better experience.
This isn't ideal if you upgrade every generation - that wouldn't be worth it. But in those $500-600 laptops, it probably doesn't hurt. I suspect they have Zen 2 chiplet supply and buyers in that price range will be plenty happy. If they really need more CPU power from Zen 3, they could step up to Ryzen 5 5600U. So you need to be better educated on those little nuances, but only if you really need the CPU power.
Presumably anyone buying these who doesn't dig into the details reads reviews, and looks at the benchmarks in those reviews based on CPU-intensive programs they run. Otherwise, they probably won't be affected greatly by the difference.
It's not ideal but given supply chain issues, it might be a necessary evil.
I remember the lowest SKUs almost always being a last-generation part, which seems to be more about economics than uninformed buyers. You can buy a last-generation laptop with a 4500U, but there is merit to buying a new laptop for all the new features the laptop itself provides. Also, the design might be 'last generation', but that does not mean it's exactly the same, which is important because architectural changes can and have caused problems that were addressed in later revisions of the same CPU. You'd be surprised how much effort most people put into researching consumer electronics decisions. Their knowledge might only scratch the surface of what technical people know months in advance of a CPU launch, but it is significant nonetheless.
Rebadging last-gen products into the current-gen lineup is a genuine value add for me as a customer.
If I'm buying a new product, I want to see a lineup of exactly the products a given company thinks are competitive, ordered by price or performance. I don't want to have to pore over review articles and youtube videos to figure out how many performance grades correspond to how many generational jumps and cross reference to older generations. I just don't.
"I don't want to have to pore over review articles and youtube videos to figure out how many performance grades correspond to how many generational jumps and cross reference to older generations. I just don't."
If they named things properly and wouldn't mix generations within a single generational series marketing campaign (5xxx series), you wouldn't have to - as a consumer, if you wanted to buy the best, you would buy the 5000 series. They would still carry 4xxx stock - if you wanted something at a different price point, you'd buy the 4xxx series.
AMD's choice to mix generations within the same series makes things more confusing, not less confusing.
That's only true if generations don't overlap in performance. If they do overlap, either the company "ports" the old products into the new lineup or you're in for considerable homework.
What is the point then in up-branding a chip with identical performance that spans two different process generations?
Do the two different generations have different TDP envelopes? I thought the 5xxx and 4xxx series both use Vega - what is the appreciable product difference between Zen 2 and Zen 3 if the performance/TDP/graphics performance is the same?
I think you are kinda answering your own question here? The difference between Zen 2 and Zen 3 is that they are different architectures. There may not be an appreciable product difference between a mid/high-bin Zen 2 part and a low/mid-bin Zen 3 part. This is why we have SKUs; they take a spreadsheet's worth of details and compress them into a rough total ordering of price/performance.
The point is that it's annoying to compare 4000-series parts against 5000-series parts to figure out what is the best budget AMD CPU. Outside of enthusiast circles, no one cares about being on the latest architecture. They care about what is currently the best-performing part within their budget.
I would bet it's more likely that the OEM who is using and asked for those chips demanded they be a 5000 series part so that customers weren't skipping it over because of "last gen".
I completely agree it's dishonest; I'm not sure you can blame AMD though. If Dell (I'm not saying this was Dell) came to you and said "either this chip is 5000 or I don't carry ANY of your chips", you give them what they want.
To me this shows AMD has learned its lesson: without the OEMs onboard you aren't going anywhere. Selling individual chips to gamers is profitable, but that only goes so far.
They probably also learned it from their time in the GPU arena, where such shenanigans have been in play for a long time (although usually more so by the green team).
The difference in thermals and power really just impacts boost clocks. It's not a very significant difference otherwise, at least not in the CPU space and especially not in single-core performance.
A max-draw zen3 desktop core is only 20w. You'll see that in these laptops, or near enough to make no difference.
There are just two models with AMD and HiDPI displays: the Microsoft Surface Laptop 3 and the Asus ROG Zephyrus G14. It seems that the laptop market simply ignores AMD's mobile CPUs for middle- and high-end laptops.
Not HiDPI, but the ThinkPad T14 AMD is a pretty good business laptop with an AMD Ryzen 7 Pro. Also with expandable memory and a replaceable NVMe SSD as a cherry on top. There is also the less expandable but slimmer T14s.
Source: I own a T14 AMD. I am pretty happy with it after 13 years of MacBooks and returning a MacBook Air M1. With 8 cores it's fast for development, and the fan is not very loud (nowhere near as loud as Intel MacBooks).
I recently received the ThinkPad P14s AMD (same as T14) that I ordered back in November. I only paid about $800 US for this, with some corporate discounts that most people can easily find. Comparing it to my Macbook, I'm pleasantly surprised. It's only 1080p, but I really don't mind it at all because it's matte instead of glossy. And the keyboard is pure bliss compared to just about anything.
> with extensible memory and replaceable NVMe SSD
Not only that, but there is a WWAN slot that you can add an additional NVMe drive to. You can find 512GB 2242 form factor drives on AliExpress for about $50. There are 1TB ones for about $100. That's my current experiment. Hoping to dual boot Windows and Linux.
But this thing has an ethernet port. In 2021. And all the other ports you'll ever need. Mine came with an Intel Wi-Fi 6 card installed in a slot which, it would appear, can be upgraded or replaced as well. Just insane utility. I even like the soft feel of the case more than the cold aluminum of the Macbook.
Lenovo does not sell ANY AMD-equipped Thinkpad in Spain as of right now because they are all sold out [1]. The Ryzen 7 version in particular was sold out a mere 1 week after launch and hasn't been available since.
To add insult to injury, they have announced 5 new thinkpads (X1 Titanium Yoga, X12 Detachable, X1 Yoga Gen6, X1 Nano, X1 Carbon Gen9) but all of them are intel-based. [2]
How is Linux on it now, in 2021? I'm currently not needing a new laptop, having the T25, which I just fitted with a replacement US keyboard; it'll be great for the next 10-15 years.
But, if I need a laptop from the company, I'd definitely look into the AMD series of ThinkPads, if I could get a good Arch Linux experience with it.
Yeah I think the OEM landscape is rapidly evolving for AMD.
Before Ryzen 4000 / 2020, you could pretty much only get budget laptops with relatively cheap chassis, abysmal screens (1366x768!), low-end graphics, spinning disk HDD, that sort of thing.
Starting in 2020 Ryzen 4000 you started to get reasonably good 1080p Ryzen laptops, but they seemed quite artificially limited to RTX 2060, rarely over 300 nits, often 16GB or lower, no more than 144Hz in most cases. If you'd shuffle over to Intel, you'd find RTX 2080, 500 nits, 32GB, 300Hz screens, and QHD/4K.
Already in 2021 we're seeing (announced) QHD screens, up to 360Hz, RTX 3080, 32GB.
(I've been looking for high refresh and prefer gaming laptops, so I'm less clear on what's available in business laptops with 4K, etc.)
Even during the Ryzen 2000 generation you could find reasonably priced laptops with 1080p screens, though the compromise would be a tiny SSD or spinning rust.
I'm wondering at this point whether it's TSMC supply constraints. Intel still wins in sheer number of fabs producing their product. If you can't actually get enough AMD chips to ship your laptop...
Design is a pipeline and updated models are a large component of the pipeline.
AMD CPUs weren't really compelling for high end in mobile until the Zen2 Ryzen 4000 APUs released last year. There weren't a lot of high end models to update to put that in, so it took a while to get those chips into nicer models.
TSMC supply limitation and pandemic induced demand for computing devices doesn't help.
This probably comes off as way off base, but it's worth mentioning that Apple's new MacBooks also offer greater performance than most Ryzen 4000 chips (faster than all in single core, faster than most in multi) while offering pretty HiDPI displays. Granted, no Linux or W10. So the market is clearly there for them to hit.
These things are so frustrating... The 5700U probably performs worse than the 5600U in real-life scenarios. I liked that in the previous laptop generation I could just tell my girlfriend to look for a 4000-series AMD CPU, and she got an amazing laptop deal in the end.
Now you also have to make sure the second digit is even... I feel like this has happened in the past, where odd/even signified a different architecture... I don't remember if it was Intel or AMD though.
Is it just me or does it feel like in the past 5-7 years it doesn't really make that much of a difference when new CPUs come out? My laptop is a i7-4800MQ and it still feels incredibly snappy. Anything I need to do for work I just farm out to AWS. All of my gaming is on Stadia. Personally its just easier for me to use the cloud for all of these things instead of cycling through devices all the time.
CPUs have been progressing faster in the last few years, mainly because of AMD's Ryzen line. In laptops, 4 cores are now standard, and if you can get one of the AMD 4000U CPUs, 8 cores with good single-threaded performance are relatively cheap. Same thing on desktop: you can get 20%+ more performance by upgrading the CPU.
Sure, it's not the same as 486 vs Pentium, but the Ivy Bridge-era CPUs that were used for such a long time are now equivalent to low-end CPUs, it's no longer universally better to get an old i7 vs a new i5. That said, if you're happy, there's no reason to upgrade.
I'm in the process of returning an i3-9100-based Dell Precision. It's a true 4-core, and despite having no hyper-threading, on the benchmark sites it beats my Haswell Xeon E3 in both multi- and single-threaded tests. In reality, however, it's a dog. It gets bogged down by the lightest workloads.
It's also using MORE electricity at the wall than the old PowerEdge T20 system, which was an ATX12VO design with a Xeon 1275L.
I'll extend your comment to say that old i7 is universally better than new i3 (and I should have known better).
I definitely disagree. When having a video conferencing app open can use 100% of a single core, 2 cores just isn't enough for anyone even remotely multitasking. Same with a browser running some demanding JS for whatever reason or system updates.
No, that's literally been the case with Intel processors. Especially mobile ones. There's practically zero difference between Haswell and Whiskey Lake (5 years newer) core performance. Even Ice Lake was a minor improvement.
But the power consumption has dropped, heat generation has dropped (a lot of it because they abandoned FIVR), IGP performance has increased and they started supporting DDR4, so overall performance has increased.
I personally lament the death of the mobile socket. I hope AMD still has socketed mobile processors, but I doubt it. They're only useful for people who want to upgrade or fix their laptops themselves, people who want to resell their old laptop/parts, and the environment.
Your observation holds on desktop. In laptops, the thermals of the 14nm generation were horrible: essentially you have to deal with a whining fan or CPU throttling, or both.
AMD's 4000 series has much improved upon this, and I assume that the latest 10nm Intel will be much better.
Performance-wise I agree with you. Minor changes, probably more noticeable changes come from the massive adoption of fast SSDs (1-2GB/s).
There have been absolutely massive gains in desktop performance, if you're doing CPU-bound tasks locally where work can be split across the high number of cores that are now available. The performance for my workloads is about 4x what it was 7 years ago, for the same price.
They've progressed notably in terms of total draw power and therefore battery life. Admittedly, not much of a difference when it comes to the overall system responsiveness in basic user workflows. Quoting from the article:
> Ryzen 7 5800U as its most efficient mobile processor to date, citing 21.4 hours battery life on a 53 Wh battery during 1080p video playback with Wi-Fi on, or 17.5 hours in MobileMark 2018’s battery life test
As others pointed out, Zen 2 has upped the game. My Ryzen 7 4800H benchmarks very neatly in line with my Ryzen 7 2700X desktop (which is Zen+ so half a generation older.) Basically I give up very little switching between my desktop and laptop. Up until a few years ago I only ever gamed on my desktop but since I got this laptop (80W GeForce GTX 1660 Ti) 1080p laptop gaming is quite good.
Do you do PC gaming on Stadia or is that all Chromecast+TV type gaming? Personally I'm much more of a PC gamer, and while I don't have the experience to have an opinion on Stadia, I'm currently assuming having good hardware locally is the better experience.
In the poster's case, going from their i7-4800MQ to a Ryzen 5800U would result in roughly +50% single-core performance while also doubling the number of cores, resulting in ~3x the performance at 1/3 of the TDP.
Stadia I mainly play with the controller and stream to either my iPad or my ThinkPad. I would love to use the Chromecast I have, but I don't have much free time where the kids/wife aren't using the TV, and even if they weren't, it might not be good to have Doom Eternal playing with my 3 kids toddling around.
This is absolutely true in my case as well - on the desktop/laptop. I use Fedora fwiw also so Windows/macOS slowing down over time isn't applicable in my case.
That said, it is definitely not true in the case of server hardware.
I have a home lab with a few R620s that have 2014 era Xeons in them, and while there are a lot of parallel cores, single core performance is abysmal compared to newer in-class hardware. For CPU-bound single-threaded tasks, I actually spin up a Linode and use that, rather than my on-prem hardware.
I would also note, I have only Intel. I'm going to build an AMD rig here within the next few months though because I've had numerous people tell me that it's noticeably better.
Intel CPUs still provide only 16 PCIe lanes on the consumer chips, despite any modern graphics card being able to use all of them.
(Meaning your NVMe drives all compete somewhat.)
Compare that to phones, which have real, huge generational improvements every couple of years, which is sometimes jarring.
However, this is different.
AMD's previous Ryzen laptop line performed incredibly well, both in power consumption (which equates to battery life) and thermal envelope (which also equates to battery life), while also delivering jaw-dropping performance.
I am a "devops" and farm out a lot of compute to the cloud, but that comes at a cost (iteration times, price, bandwidth; not all flows can be remote). And gaming does too (Stadia has mixed results, though I am a fan).
I would posit that your workflow has become remote likely _because_ of this stagnation.
Intel has really been holding a monopoly on CPUs. AMD's YOY performance gains in Zen really are exceptional. Their mobile processors blow intel out of the water. They are more efficient and performant.
I still have a 6700k which is plenty fast enough for anything I do. But in reality it's quickly getting outpaced by Zen. The only reason Intel has been getting faster in the past few years is because of Zen and now Intel is shitting the bed.
The next few years, especially because of Apple Silicon, are going to be really interesting for desktop processors. I think we're going to see some really crazy innovations.
The AMD Ryzen 7 4800H is around 35% faster than the Intel Core i7-4800MQ at single-threaded workloads, up to being about 3× as fast when loading all cores (e.g. compiling code). I gather that the 5800H should be something like 25% faster again (~19% IPC improvement, 5–10% clock speed bump). Compared to the i7-4800MQ, that should be up to about 70% faster single-threaded, and 4× as fast with all cores loaded.
That’s H-class APUs, since you were talking about an Intel MQ. The U-class APUs seem to be surprisingly close in performance, mostly only 10–20% slower, with a third of the TDP.
I am still using a Sandy Bridge Core i7 laptop from 2011, and my only real gripe with it is its comically large bezel around the screen by today's standards. I can't game with it, but general usage and development are not an issue.
I knew I was getting a big architectural jump when I bought the Sandybridge, but I wouldn't have believed you if you told me that in ten years I would still be happy with its performance and still using it.
Yes, CPU performance is no longer increasing by factors of 2 with new generations, so a CPU stays relevant for a lot longer, unless you have compute-heavy jobs that can leverage the new core counts (compilation sometimes fits that description).
I'll be honest - I doubt it qualifies as compute heavy, but I'm definitely fantasizing about getting some of these new CPUs in a firewall appliance. Some NICs offer various offloading options, but pfSense's website cautions users that support is mixed at best. Then there's VPN acceleration: there are no AES-NI-like CPU instructions to accelerate WireGuard's crypto choices.
Not to mention network speeds even at the consumer end are slowly increasing. So having enough beef to push 1-gigabit NICs is good, but planning for the future and adding 2.5 Gb or higher is better. And finally, having enough compute to layer fq_codel or better on top, along with some monitoring, all adds up.
Like a lot of projects the demand will act like a gas and fill whatever volume is given to it. Given more compute I'll find a way to use it. :-)
>So having enough beef to push 1-gigabit NICs is good, but planning for the future and adding 2.5 Gb or higher is better.
I'm using i5-2500Ks for firewalls. 1 gigabit hardly moves the CPU usage. I have all NIC offload turned off.
Btw, I would really consider OPNsense instead if you are building custom firewalls. pfSense is run by an atrocious company: they tried to kill off OPNsense by buying domains that told lies, and AFAIK no one outside Netgate can actually build it from source. They are not open source even though they claim to be. They are also moving to using DRM.
I've been drag racing various operating systems after my little Protectli box proved only capable of pushing 500 Mbit/s using pfSense + FreeBSD. Even the usual TCP tuning tricks didn't help. Meanwhile, every OS I've tried except OpenBSD proved able to saturate a dedicated one-gigabit link in a VM under ESXi.
You wish! Intel only very recently (i.e., 3 years ago) brought 4 cores to the mainstream in the mobile segment.
Low power mobile chips were dual core until Q3 2017 when Kaby Lake refresh was released and brought 4 cores to the i5 and i7 line-up. Even then, entry level and mainstream versions (i3 in particular) remained dual core configs.
I have a cheap Thinkpad E-series from 2012 with Intel i7-3612QM 4c/8t CPU. At the time it cost a third less than a Macbook. As far as price was concerned it was mainstream.
I recently had my work laptop refreshed. The most noticeable differences are that the new one is thinner, lighter, and has better battery life. Performance is roughly comparable, and it was excellent with the 3-year-old machine as well.
> Is it just me or does it feel like in the past 5-7 years it doesn't really make that much of a difference when new CPUs come out? My laptop is a i7-4800MQ and it still feels incredibly snappy
Your CPU is a 4C/8T 47W one. Not bad for being in a laptop. My ThinkPad, for example, is a 2C/4T 25W one from the 7xxxU series, which doesn't feel snappy at all.
Your CPU would probably be handily beat by a recent 4-core or 6-core Ryzen, and it would be done so while outputting far less heat and fan noise.
From Geekbench, your CPU has 780 single-core, 2784 multi-core. A 25W Ryzen 5 4500U: 1079 single-core, 4260 multi-core.
I ran a 4700MQ from when it came out, and waited for a similar but better processor to come out. It took a long while, and I upgraded to an 8750H because it had more cores and was being sold in $500-600 machines. There is now an 8-core chip of the same class in the next-to-most-recent Intel generation; this will be my next upgrade. My 4700MQ machine runs Win10 and Ubuntu Linux just fine. I don't do heavy compute work on it anymore, but I did back when it was current.
I guess that holds if you're not gaming on local hardware. Although Steam just released numbers and their base is bigger than ever; VR went up a lot too. You can't do that without newer CPUs. It also means you're handcuffed to the internet to do anything.
I really hope there will be more high quality non gaming laptops with good availability. Especially in the upper tier it seems most manufacturers kept their Intel designs (not sure if due to availability, costs of adapting to the AMD platform, pressure by Intel or a combination of these).
In January 2020 at CES, Lisa Su said AMD had 100+ laptop design wins. This year she said they had 150+.
I am not sure how this compares to Intel, but I saw two slides for Intel's Tiger Lake and Rocket Lake, and one said 40+ design wins. The other said 40+ refreshed designs.
Oh I get that! I have different preferences (think HP Omen 15, Lenovo Legion 5) but those weren't available at a good price until late summer (and really only when sales occurred). Anyone wanting QHD or 4K was out of luck last year. Wanting Ryzen but also wanting a 500-nit screen is not a great feeling.
I'm actually selling my HP Omen 15 to my brother because I discovered it uses PWM for brightness, and I have a very high sensitivity - if I turn brightness down from 100% I get a headache within an hour or two. (I mostly prefer 100% brightness but when on battery I'd like to turn it down.)
Fingers crossed 2021 is a big improvement from 2020 for Ryzen laptops.
Since people are probably curious: the Ryzen 7 5800U (8-core, 15W TDP) will probably be close to (but not equal to or better than) Apple's M1 in performance (based on specs and desktop Zen3 Geekbench scores). But they will probably have significantly worse efficiency. With idle / low-CPU workload, it's probably not too bad (17-21hr on 53Wh battery), but if you're compiling on your laptop all day I'd expect much worse battery life than Apple's M1 laptops.
The only C++ M1 time and battery benchmark I remember is from Matthew Panzarino[1]. He found that it took the M1 Air 25 minutes to do a fresh compile of WebKit, and that the Air had 91% battery remaining after finishing the compilation.
So, roughly assuming that you could compile WebKit 10 times before the laptop runs out of juice, that'd give you 4 hours 10 minutes of battery life.
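(Spelled out, my own back-of-envelope, assuming linear battery drain—the parent figure rounds down to 10 builds:)

    /* Back-of-envelope from Panzarino's figures; assumes linear drain. */
    #include <stdio.h>

    int main(void)
    {
        double pct_per_build = 100.0 - 91.0;   /* ~9% per 25-minute build */
        double builds = 100.0 / pct_per_build; /* ~11.1 builds per charge */
        printf("~%.1f builds, ~%.0f minutes of sustained compiling\n",
               builds, builds * 25.0);         /* ~278 min; 10 builds = 250 min */
        return 0;
    }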
> but if you're compiling on your laptop all day I'd expect much worse battery life than Apple's M1 laptops.
On the flip side, you get 8 full fat cores instead of 4 fast & 4 not so fast, so your compile jobs might go a bit quicker. Have to wait for the benchmarks to find out for sure.
Note that Intel has always excluded "turbo" from the TDP, and AMD switched to doing so with the Zen 2 rollout (and continues to now). I don't know if Apple does the same or not.
I really hope they can compete. As much as I would love Apple to pull ahead by a mile, it’s not going to help the computer space that much.
I have some Android friends who are super happy with their phones. When they ask why I’m upgrading my iPhone again and I say “it’s got a lot better performance” they usually say something like “why do you need a faster phone....my midrange 3 year old android phone is blazingly fast”.
It’s not until we lift all users in general that we will be able to create new and innovative software. I like to compare an old school project where they had to create a spell checker...but this was back when you didn’t have enough ram to keep the dictionary in memory.
More memory and faster CPUs make yesterday’s challenges trivial for today’s developers.
Reading this, I can only assume you want to be one of those people that Ubuntu targeted years ago, who can plug their phone into a dock and have the same power as top-tier desktop computer.
Otherwise, I don't really see the point. Every machine has a different purpose.
My telephone is for phone calls, texting, extremely light emailing, checking weather, using it as a GPS, occasionally taking a photo or a quick video.
My Surface Book 3 is for actual on-the-road document work, programming, light gaming, and digital painting.
My Ryzen Threadripper / RTX 3090 workstation at home is for no-shit, actual income-generating work and super-heavy-duty gaming.
My Panasonic Lumix DC-S1H is for videography, but my Canon EOS R5 is for highest-quality stills, and my Sony A7C is for travelling.
Every one of those cameras has different purposes, different use cases. Just like the rest of my hardware. There is no "one size fits all" anywhere, for anything.
> More memory and faster CPUs make yesterday’s challenges trivial for today’s developers.
Today's developers are by and large utter shit at designing software. It's usually poorly thought out, poorly documented, and poorly maintained. The number of truly exceptional software programs out there is incredibly small.
It's hard to agree completely with this. On the one hand, you can accomplish just about anything with a $150 phone, and I used one for many years. Recently, though, my midrange 2-year-old phone could barely keep up with certain basic apps, and an upper-mid phone has been a night-and-day higher-quality experience.
I suspect a mix of Wirth's law plus a lower level of polish on things like drivers on cheaper phones (inexplicably poor WiFi stability, for example).
I know that the situation with mid-range phones was not always so rosy. I was myself buying mostly high-end phones in the past.
But recently my friend bought a Motorola for 200 Euro - Snapdragon 730, 6 GB RAM, 128 GB UFS 2.1 storage, a pretty good main camera + a (so-so) ultrawide and a (gimmicky) macro camera, 5000 mAh battery. I was very impressed by what 200 Euro can buy these days. I'm pretty confident this phone will do just fine in 2 years too...
But really, what do you need more CPU power on a phone for? Email, WhatsApp and a browser are about everything I can envision using my phone for. Nothing innovative I can think of is going to change what people use and need their phones for. Not that I'm against progress; better screens, lower latencies and faster loading are always appreciated.
For many people, their phone is their primary computing device. Sometimes their only computing device.
Large numbers of people use their phones for games, video/photo editing, videoconferencing (which can involve real time image processing for virtual backgrounds and appearance filters).
At the low end of the phone market, phone hardware performs significantly better today than it did two years ago for those tasks.
I guess I'm far from the target audience; video/photo editing beyond cropping images and adding text is going to feel frustrating on a mobile phone. Mobile games always felt suboptimal to me, and I wouldn't buy a phone based on gaming performance (is that a thing?).
Real-time image processing is an interesting one. I don't see any precedent that it's a big use case; most people just set and forget Zoom calls. But you've brought up a good feature.
I wonder how they will tackle the availability issue since on the desktop side, it's close to impossible to get a 5900x/5950x CPU at MSRP or close to MSRP price.
I assume they cannot even supply enough parts to fill the demand from laptop OEMs.
I also wish I could replace my desktop PC with an H-class CPU setup. The iGPU is enough for my needs, but I could definitely use the 8C/16T in a reasonable power envelope. Intel might do it, but it's probably going to be a very expensive NUC platform targeted at gamers who also want a dGPU. In the end it's cheaper and easier to build a mini-ITX setup using desktop parts...
If you're not dead set on the mobile chip, Lenovo makes a ThinkCenter M75q Tiny Gen2 with a Ryzen 4750GE [1], although the ship dates look ridiculously far out at 5 weeks or more.
I just ordered the larger SFF model (M75s Gen 2) this week with a Ryzen 7 4750G and it is scheduled to arrive within 14d.
MSRP is pricey, but in Canada, I was able to score mine for less than half of MSRP due to a sale and some other promos.
What's frustrated me is that their mobile CPUs are gamer-centric. I still can't find an interesting/decent/ready-to-ship notebook for development since the 4000-series APUs launched. I'd appreciate it if somebody could point me to one to buy.
No troubles this side.
Ryzen 5 + RTX 2080 daily driver on Ubuntu 20. I even use Steam for Linux mostly in the off hours. Very little noticeable difference in game performance versus Win10.
I have been waiting for Lenovo to release the Legion 7 Slim with the 4900H to replace my aged Lenovo ThinkPad 540p, but a 7 Slim with a 5000-series chip would be drool-worthy for me.
I run Arch on the HP Envy x360 with a ryzen 4500u and it's a very smooth experience, I've had no issues. I did have Ubuntu on it at first and it may have been a bad install, but it was terrible. Half the time I'd turn on the laptop and something different would be broken.
I have a 4700U; some device issues at first (the touchpad, namely) that work now (as is the case with all brand-new laptop models) - AMD CPUs are beasts though - can't wait until they start putting them in higher build quality laptops.
Ryzen 3 3100 was launched in May 2020 about ten months after the Ryzen 7 3700X, so I would expect a pretty long lead time following the Ryzen 7 5800X launch.
Still if you're shopping in this category, I'm not sure it's worth it to wait for next generation.
Ryzen is consuming 45W; the M1 is said to consume 13W.
Ryzen is on TSMC 7nm, the M1 on 5nm. It's said that TSMC's 5nm is 1.8x denser than 7nm, 30% more efficient, and 15% faster.
So if both were on 5nm, one could extrapolate that single-threaded performance would be similar, but Ryzen would still lose on battery life. But that's speculation.
As to whether it's fair, Apple can pay premium for 5nm exclusive access, and then charge the users hundreds of dollars per 8GB of RAM or storage upgrades.
AMD is selling the chips to the OEMs which have to make money themselves, which means that using cutting edge nodes might not make sense economically.
But that doesn't matter; both are businesses, and it's AMD's fault that they didn't rush to 5nm and are comfortable staying behind.
The ones to lose will be the premium laptop manufacturers that sell laptops at 1000+ USD, as the aggressive Apple marketing will most likely cut into their sales.
It would be interesting to see if the rumors about Intel using TSMC 5nm this year are true. Intel had the single-threaded perf lead even on their less dense 14nm node, vs AMD on TSMC's 7nm. It could be that on TSMC's nodes they would be faster than both Apple and AMD, but still probably at a higher TDP than Apple's chips.
My understanding is that there is little benefit in upgrading the iGPU to RDNA2 because of the bandwidth limitations imposed by sharing DDR4 with the CPU. We should see some real improvements in iGPU performance once DDR5 rolls out in late 2021/early 2022.
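(A rough sketch of why that shared-DDR4 budget is so tight—my own numbers for a typical dual-channel laptop:)

    /* Peak DDR4 bandwidth shared between CPU and iGPU. */
    #include <stdio.h>

    int main(void)
    {
        double transfers = 3200e6;  /* DDR4-3200: transfers per second */
        double bytes = 8;           /* 64-bit channel */
        double channels = 2;        /* typical laptop dual-channel */
        printf("%.1f GB/s total\n", transfers * bytes * channels / 1e9); /* 51.2 */
        /* A midrange discrete GPU has several hundred GB/s all to itself. */
        return 0;
    }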
Probably just a no need type of thing. The Vega 8 is plenty fine for normal desktop usage, and they already had it integrated into Zen2. So just plop in the new Zen3 CPU and call it a day?
Which is for comparison basically what they did on the desktop side. The Ryzen 5000 desktop CPUs use the exact same IO die as the Ryzen 3000 line.