If anyone out there is in a similar situation, building a new computer, especially if you have a lot of parts from an old computer to cannibalize, is actually quite easy and much less expensive than buying a new pre-built PC.
Don't be intimidated! If you can turn a screwdriver, you can build a PC.
(For an upgrade, you can often buy just the motherboard with CPU already attached, potentially also upgrading the power supply and RAM. Or upgrade the drive if you're running out of space. Or upgrade the graphics card if you want to play games. Everything else is pretty static and just plugs in.)
Easy things include installing M.2 drives, RAM, hard drives. (Just read the motherboard manual to ensure you use the right slots/ports for optimal performance.)
Medium difficulty might be installing the motherboard into the case (getting the rear I/O ports aligned just right) or installing the CPU with thermal paste into that motherboard, along with the heatsink.
Hard difficulty is just getting all the wires run without obstructing airflow (or just looking "ugly"), and the little wires from the case to the motherboard in all the right places. Also if you install AIO (all-in-one cooling), you may have to try a few things to make sure everything lines up and it all fits nicely in your case.
None of the above are difficult if you're experienced, but on your first run it might be tricky. And this doesn't include setting up the OS, tweaking and optimizing things... or knowing what to do if something doesn't work as expected. Sometimes you'll get a bad stick of RAM, but that usually gives off symptoms of half a dozen other issues, though with experience you start to default to that assumption, and find ways to rule it out earlier.
Having done it dozens of times, I find it quite easy, but I also know that when things go wrong, they can be scary, and occasionally expensive. (A friend once dropped a CPU and bent the pins irrecoverably!)
> the little wires from the case to the motherboard in all the right places
This difficulty is often underrated, especially for those whose hands aren't small. I've never outright failed, it's just physically difficult: I break a sweat plugging in a connector located in a small space, and it takes a bunch of time, especially with a bigger air cooler. The ASUS Q-Connector is a great tool, but it's not widely adopted.
I love building PCs and haven't bought a pre-built system ever in 30+ years. That said, thankfully, for inexperienced folk there are innumerable videos on YouTube explaining each step of the process.
I admit I sometimes watch these videos to see if there are any new tips/tricks/secrets that builders have figured out for e.g. working with a new case and getting all that wiring done correctly.
PC building is the easiest thing out there, right along with soldering, which has a similar reputation for being "hard". If you can put on pants in the morning, you can build a PC and solder anything, even 0402 components and SOT-23 packages, with zero experience. You can't fuck it up even if you tried. Millions of people build IKEA furniture; PC building is much easier than that.
It's definitely nowhere near as hard as a layman might think, but I remember my PC causing me some stress when I built it, between fixing random issues and trying to fit things in properly.
That's the actually difficult part. Everybody can reasonably follow instructions to put together a PC, barring any mental or physical disabilities. But if something doesn't work for whatever reason, a layperson doesn't have the requisite knowledge or experience to properly diagnose and fix the issue.
This is something nobody ever mentions when they say building your own PC is easy.
Also the power connectors for graphics cards... depending on your PSU and card there are, what, at least four combinations: plugging in 4 pins, 8 pins (or was it 2+4?), or not plugging them in at all. Why are there extra cables that look just like those? And so on.
But not being able to troubleshoot components because you only have one PC or one compatible PC is the worst.
Not if you have a Corsair One like the article author.
On that particular model, even upgrading the SSD is a pain in the ass since (IIRC) the 2.5in drive is at the bottom of the case and the M.2 is on the back of the motherboard, which has been perfectly engineered to fit in that case.
It also uses water-cooling on the CPU and for the upper models the GPU, using a custom loop that was created specifically to fit in the tight dimensions of the case. So good luck upgrading either the CPU or GPU easily.
Sounds like every mITX build. My M.2 slot is on the back of my motherboard. That drive has been full for more than a year, and I don't care enough to take apart my entire PC just to replace the damn thing. Going mITX was a mistake for me. I wanted to build a NAS and reuse parts from my mITX build last year; the only things I could salvage were the CPU and RAM. Now I have an old CPU on a new motherboard for my NAS and a newer CPU on my old mITX board for my main PC. SFF really limits your upgrade and reuse paths going forward: limited PCIe slots, usually limited SATA ports, limited RAM slots, and the case is just too small and horrible to work with. Looks pretty sitting on a desk, though.
In order to support that, they had to make a lot of custom components. This means that you wouldn't even be able to use the water coolers, for instance, since the loops are tiny. To get to the m.2 slot you'd have to basically disassemble the whole thing, and you weren't the one that put it together in the first place so good luck!
They're absolute beasts in a tiny form factor, which is the only real thing going for them. They cost far more than any mITX build.
Another problem with SFF: proprietary case and/or power supply designs. I got bitten by that one with an old Shuttle cube. Great-looking design, but it was not mITX, it was its own thing. When it became too obsolete, there was no upgrade path.
But ATX cases are pretty obnoxiously large, especially for anybody who has been buying HP or Dell desktops or the like, next to which full-size ATX towers look monstrous.
microATX is probably more of a sweet spot: smaller, but at least you're not completely boxed in.
I would have done microATX on my last system update, but I was going with Ryzen, and the only microATX boards available at the time with the better chipsets just had poor overall quality, failed to pack in a lot of rear IO to compensate, etc. I think the intel side microATX offering may have been better.
I like my mITX case, but I'm only going to open it up again if something breaks. I accidentally put my M.2 SSD in a PCIe 3.0 slot instead of a 4.0 one, so I'm not getting the full bandwidth. That's just how it's gonna be for a while.
The sheer number of YouTube videos showing someone building a PC makes it much easier than in the pre-YouTube days: you can follow along with the people in the video as they build their PC. If you buy a popular case, you can probably find a video of someone working with that exact case.
You can also look up the fastest CPU that your motherboard supports and buy a used one on eBay. A top-of-the-line CPU that cost $1000 years ago will be trading for far less today. This is particularly true of Xeon CPUs pulled from retired servers.
Retired servers with Xeon can readily be bought used. Be aware the RAM will be ECC, which can also be bought used. Some motherboards are set up for two CPU sockets.
I have three older server-class machines like this. With SSD they are workhorses.
To add: registered ECC RAM sells rather cheap used. But servers in general tend to use much more power and generate much more noise than their home/office equivalents.
>This is particularly true of Xeon CPUs pulled from retired servers.
Too bad Xeon CPUs have different socket/chipset requirements compared to desktop CPUs, so chances are the best you can do is upgrade from a 2/4-core to a slightly faster 4-core.
Intel Xeons come in desktop-compatible versions. Up to Coffee Lake (8000-series in desktop numbering) they were sold under the Xeon E3 brand; E5 is the aforementioned incompatible server chip. With Comet Lake they were rebranded to Xeon W, and it does become slightly more confusing, but you can definitely still buy Xeon chips that run on the last desktop socket: https://en.wikipedia.org/wiki/Rocket_Lake#Workstation_proces...
Check your motherboard's CPU support list and run the model number through Intel Ark to make sure it only has 2 under "Max # of Memory Channels".
The linked article also says "These CPUs support ECC memory and require Intel W480 or W580 chipset". That seems to suggest that unless you already have a workstation motherboard, those CPUs aren't upgrade options.
If that's the direction you want to go, retired Xeon-based workstations (HP Z, Dell Precision, etc.) also sell for less than new high-end desktop CPUs, have well-engineered, if proprietary, case designs, and still allow you to bring your own storage and PCIe cards. Power consumption may or may not tip the scales against them, though, depending on your location and applications.
> I'm guessing that's just typical Intel behavior requiring constant motherboard updates with almost every tick/tock CPU release?
Yep, but even within the same tick/tock cycle there are different socket/chipset combinations.
>And... when was the last time Xeon had a 2/4 core?
My point is that most people probably have a desktop motherboard, presumably with a 2/4-core CPU. Upgrading to a used 8-core Xeon from eBay is not an option because Xeon chips use a different chipset/socket. Therefore the highest option available to most people is a 4-core desktop CPU.
I came here to say the same, and to add that with a good case (I've a Fractal) there is plenty of room to work.
I'd go as far as to claim that it's almost impossible to mis-assemble. Throw a bunch of computer parts on a bench and treat it like a jigsaw puzzle. You'll observe that it's much easier to assemble, as a) there are many fewer parts and b) they can only go together the right way.
I built my new desktop four years ago and it's been purring along ever since. It runs near 24/7 and has all the local VMs involved in my life and work.
> I'd go as far as to claim that it's almost impossible to mis-assemble.
You may still need to know how to read the fine motherboard manual so you don't leave performance on the table (which is the problem you're trying to solve in the first place).
I don't know about brand-spanking-new motherboards, but I remember that around the time Intel's 8th/9th-gen Core CPUs were all the rage, they had a ridiculously low number of PCIe lanes, even though some motherboards came with multiple x16-sized PCIe slots. Only one actually had the full 16 lanes; the others had at best 4.
You may or may not care about this, but seeing how OP talks about GPUs, he may.
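On Linux you can check what a card actually negotiated, since the kernel exposes the live link parameters in sysfs. A minimal sketch (assumes Linux; the `current_link_speed`/`current_link_width` attributes are real sysfs files, but not every device exposes them):

```python
import os

def pcie_link(devdir):
    """Read the negotiated PCIe link speed and width for one device
    from a sysfs-style directory (/sys/bus/pci/devices/<addr>/)."""
    def read(name):
        with open(os.path.join(devdir, name)) as f:
            return f.read().strip()
    return read("current_link_speed"), read("current_link_width")

if __name__ == "__main__":
    base = "/sys/bus/pci/devices"
    if os.path.isdir(base):  # Linux only
        for addr in sorted(os.listdir(base)):
            try:
                speed, width = pcie_link(os.path.join(base, addr))
                print(f"{addr}: {speed} x{width}")
            except OSError:
                pass  # not every device exposes link attributes
```

A GPU sitting in an x16-sized slot that only got 4 lanes shows up here as x4.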
One fun bit is that you need 2 RAM sticks, installed (usually) with one socket of gap between them, or you're leaving memory bandwidth and CPU performance on the table (especially on AMD).
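If you want to verify the sticks actually landed on separate channels, `dmidecode -t memory` (Linux, needs root) lists a bank locator for each slot. A rough parser sketch, assuming the common "ChannelX-DIMMn" style locator names, which not every board vendor uses:

```python
def populated_banks(text):
    """Return the set of bank locators that have a module installed,
    parsed from `dmidecode -t memory` output. Populated banks on two
    different channels means dual-channel is active."""
    banks = set()
    for block in text.split("Memory Device"):
        lines = [l.strip() for l in block.splitlines()]
        # A slot is populated when its Size line isn't "No Module Installed".
        has_module = any(
            l.startswith("Size:") and "No Module" not in l for l in lines
        )
        bank = next(
            (l.split(":", 1)[1].strip() for l in lines
             if l.startswith("Bank Locator:")),
            None,
        )
        if has_module and bank:
            banks.add(bank)
    return banks
```

Feed it the captured command output and eyeball whether both channels appear.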
Honestly, I don't think it's any easier than in the late 90s.
If you research well enough and get components that work together (or are just lucky), then it will work if you put it together correctly; otherwise it won't.
I mean, there are YouTube videos that probably help if you've never done it, and then you still sometimes make the small errors you make when you don't do it all the time. I certainly had to reassemble everything twice due to some cabling mistakes (new case) and not having done it in about 7 years, because hardware just doesn't go obsolete as fast anymore.
But I can totally understand why people would not find it easy. Motherboard manuals are still mostly a joke of a few sketches, you can count yourself lucky if your case has a single sheet of instructions, there are now even more different sorts of screws, and so on.
I don't remember IRQ problems since the removal of the last ISA network card, with just one sound card it worked most of the time. Maybe I should've written very late 90s/early 00s to be fair :)
I have to agree, it's like building Legos, except you're getting paid ~$1k to do it (the savings over a prebuilt!). Of course there can be good prebuilts as well, but in my experience they're usually cutting corners somewhere.
As a kid, I always saw brands like Alienware as fancy brands for rich people and never understood their value (kind of like paying €100 for a t-shirt). I'd never met anyone who came close to affording such things either.
As an adult, I understand that not everyone has the know-how or time to assemble their own computer, but I still don't get why somebody would buy from Alienware instead of having their local computer shop assemble one for them. The price difference is still ridiculous. Is it just good marketing?
I've been using an old Lanboy 2 case for years; I just update the motherboard and CPU. Swapping out parts is really easy IF you have some instruction materials to help out. (I don't need instructions anymore because I've done it so many times now.) Adding memory, as mentioned by others, is easy. Buying pre-built computers is both expensive and frustrating because they become outdated, parts-wise, fairly quickly. When my computer becomes slower or new tech comes along, I just do the updating myself and save a ton of money, not to mention learning something new.
I have to agree. Fit and finish on cases has come a long way in the last decade or two. I do agree with GP that there used to be plenty of sharp edges and burrs lurking inside cases demanding their due of blood from the fool unwary enough to poke around inside, but that has greatly fallen by the wayside.
Also have large hands! I have had issues with screws, especially when my case was under my desk and I was trying to swap out a GPU in place. During the build process, though, when I place the case on a table or counter, I have far fewer issues with the screws! (And none of those issues involved sharp edges!)
My most recent build was a micro atx case. The hard drive cage was especially a pain to work with. I've been thinking about a mini ITX but I'm not sure my poor knuckles could handle it.
The giant power connector that connects to the mobo. The one I have practically cold-welds itself in place: getting them to separate is on par with getting two stubborn Lego flat 2x2s apart. You can kiss your fingernail goodbye while trying to wedge it between the plastic.
One of the best parts about desktops is being able to upgrade a piece at a time. I built my computer about 5 years ago. It had been slowing down a bit over the last six months, and I realized that I only had 16 gigs of RAM. That was pretty good at the time; now 32 is much more standard. Upgraded to 32 gigs for ~$120 and it's been great.
I have been doing this for over 15 years. The only original component at this point is the case. My strategy has been:
1) buy a modern motherboard w/ a mid-range CPU and memory
2) a year or two later, max out the memory
3) another two years later, buy the most powerful CPU the motherboard supports
4) repeat from step 1
I also upgrade hdds / video cards as needed. It seems these components are upgraded more frequently than mb / cpu / memory. Though, my video card has been plenty fast for a few years now, probably because I don't play as demanding games any more. I also didn't skimp on the power supply and it's been performing well. I use a UPS to make sure the power stays clean.
When upgrading the motherboard, I also evaluate the technology space to see if it makes sense to upgrade any other components. For instance, I may buy into NVMe in the next cycle depending on what my research reveals.
I'm still hanging on to Windows 7. I'll probably upgrade that in the next cycle but it's been smooth. I use VMware for software that needs newer version of Windows.
The next AMD CPU to be released will almost certainly have a new AM5 socket. I'm waiting for this reason alone, considering the amazing support AM4 has had.
Both RAM and CPU sockets have changed within the lifetime of a build; any replacement in the set {CPU, Mobo, RAM} necessitates replacing the entire set, now. (Like the author, I'm at the EOL for a machine, and … all of those might have to go.)
Yeah, maybe I'm old (and/or just haven't done hardware upgrades as often as I used to years ago), but it feels like nowadays builds and their associated hardware have a shorter lifespan than in the past. Or maybe it's just me? I almost never play games anymore, and have long since switched to Linux for my daily computing needs, so admittedly that has allowed me to stretch things waaay longer than in the past... but hardware just feels like it needs to get replaced more often and has plain less life than it used to. Oh well.
Yeah, a bit, but it sort of depends? I have some parts that have lasted quite a while. The CPU in my current build is from ~2012 and is fine; even stats-wise it's respectable, as a lot of the improvements of recent years have been rolled back by Spectre et al.
But manufacturer warranties are crap these days. It used to be that a machine was warranted for 4 years and a good one would last 10, but these days it seems like they're 1-2 years. Part warranties aren't great either, but the stuff I've built seems to hold up.
The 10-year-old machine before my current one has a failure; I think it's the PSU, but I'm not sure. In the 10 years it ran, though, the only failure it had was an HDD that died 4-5 years in. And that was a used datacenter drive that was sold to me with an "it's been abused in a datacenter, 30-day warranty" disclaimer, so it did fantastic. I think its replacement is now out of warranty too, though I'd like to see it last a few more years, of course.
> ...the only failure it had was a HDD died like 4-5 years in. And that drive was a used datacenter drive...
Wow, I've heard that datacenter drives (more than any other datacenter equipment) get trashed the most, so it's awesome that it lasted even that long! Nice!
There are some exceptions to that, the AM4 socket my system uses for example has been used for quite a long time. I could upgrade my R7 3700x CPU to something like a much newer R7 5800x without changing the motherboard.
Intel on the other hand seems to change sockets basically every CPU generation, making significant upgrades more difficult.
And one of the unwritten things in this article: he's going from a "difficult, but probably doable" scenario for upgrading piecemeal to a "hope you like what you get" scenario with the Alienware he had ordered.
It's not 100% non-upgradable, but there are a lot of bespoke oddball parts in them, notably the motherboard and power supply.
I do wish these companies were more upfront when it comes to how upgradable systems are. Some may not care, but I'm the go to tech support person for a bunch of people, and I hate having to tell them they can't upgrade their machine like they want to, despite being a desktop in principle.
Yeah, I do the same. Although my 1070 is very long in the tooth thanks to the last couple of years of GPU madness. And now that prices have come down significantly, it's not really a good time financially for me to buy a new GPU.
1. You'll own nothing, rent anything you want and be happy
I had no idea that you could now rent a cloud video card. We all basically started renting our music from Apple/Amazon, then that moved to movies with Netflix etc., but hearing you can rent desktop components feels like a big shift to me.
2. The Second Life GoogleEDU talk
There is a talk given by one of the founders of Second Life where he mentions that they took, at the time, the very bold step of rendering the physics of the game on their server farm and then piping the video feed to the user. He mentions that this happened to coincide with cable modems becoming popular; had that not been the case, he goes on to say, SL probably wouldn't have happened. At least, not the way it did.
Given the above, plus the "it's still cheaper to FedEx hard drives" point, what will consumers do who need more data than can be streamed but don't want to own hardware? (Consumers could also include the SOHO folks.)
That's not quite correct, though, at least with classic SL. While a lot of the calculations are done server-side, the actual graphics are rendered on the client. It's not a video feed.
I have used NVIDIA's service on and off since 2018: https://taoofmac.com/space/blog/2018/09/30/1600 - I did that because I'm (privately) a Mac user, can't stand fan noise and like to game, but have mostly moved to Game Pass for multiple reasons (full disclosure: Microsoft FTE with kids who prefer Xbox games to PC ones).
The experience of streaming a full blown AAA title to anything (provided you have good connectivity) is pretty great these days.
I think that some people don't factor in the amount of their time it takes to buy a new computer and get it set up sufficiently so the old computer is no longer needed. In addition to the cost of a new computer and environmental concerns, also remember the value of your time.
BTW, the guy who wrote this article, John Scalzi, is a very good sci-fi/humor author; I have a bunch of his books.
That is one good reason why I don't rush to buy a new computer.
When I replaced my computer the last time, the old one was 10 years old. (And we were moving, so I dropped off the old computer before and got the new computer after the move; that way I offset the work on the new computer against not having to bother moving the old one. I kept the monitors, though, and I took the disks with the data.)
I built the new one myself.
Technically I could have reused the case, but I wanted a smaller case anyway. (I really miss the pizza-box form factor. Current cases are either too big, or they can only hold one disk, and I need two for a mirrored RAID.)
But now I am moving again, and this time I can't take the monitors with me; they are just too large, and shipping is not an option. I can probably take the case because it is a smaller one. But instead of buying new monitors, I am considering just getting a laptop instead. I'll need one anyway so I can travel for work, but that means external disks (2x5TB) if I don't take the case too. Tough choices.
John Scalzi:
I read Redshirts some time ago, and recently The Kaiju Preservation Society. Very enjoyable!
In the comments he mentions he has an M1 Mac Mini dedicated to music. I realize he's probably in the Apple Music ecosystem, so he can't just swap them, but it feels like he could put that hardware to better use if he actually had an issue.
I've been using the same cloud GPU service for about 2 months. It runs on Manjaro, installed through snap. The service picks up input pretty well, including audio from your mic, but as the author said, it depends on the speed of your internet connection: the audio/video resolution adjusts, so sometimes it looks and sounds like a 480p YouTube video. It has been very pleasant for a city-building game I enjoy playing, and I'm more inclined to try other sim-like games since I don't have to worry about storage. It uses Steam and other game marketplace accounts to check that you own the games you want to play. It may not be good for multiplayer games like CS:GO or Dota; I've been disconnected from a match, and since the game is running in the cloud I'm still "connected" in-game, appearing as AFK.
Overall, I've been pleased with the service and plan to use it on a laptop for an upcoming vacation. Of course, nothing beats having a bare-metal GPU. I don't play as much as I used to anymore, so it's definitely a viable service for people who are sunsetting their interest in video games and do not want to pay for a gaming rig.
This one is tricky because everything is loaded into memory. If you have DLC, that also gets loaded into memory; same with mods and asset packs. I found that with 16 GB I had to be quite choosy about how many mods I had installed, or else the game would run out of memory and crash.
Instead of cleaning out "some" apps, I would have suggested a complete wipe and clean install of his OS (after backing up, of course).
Better yet, an SSD is always a great speed upgrade (I'm assuming he has a mechanical HD), as well as adding/upgrading RAM. They can really breathe life back into old computers.
I'm not a gamer, and this is an honest question. Isn't streaming now just as good as local, assuming you have performant internet? If so, what's the attraction of running locally? Just to tinker with hardware? Running offline? Don't most games now require an internet connection anyway?
I signed up for Stadia and was really impressed with it. I played dozens of hours of Destiny 2 and really enjoyed my time.
Then I installed Destiny 2 on my home computer and streamed it to my work computer. (I got permission, it was during lunch, etc.) The lag was noticeably less.
Likewise, I had the LOTR Lego game installed on my home PC, and streamed it to a laptop in the same house. The lag on the timing-related events was absolutely noticeable.
Sure, you can play from the cloud and have a lot of fun, but it's definitely not as good as playing locally, and can't be because of the lag.
To further elaborate on your "probably never," many people like myself live quite a distance from the usual locations of both cloud servers and game servers. If you're not on the east or west coast of the US, it's probably thousands of miles. You can have the fastest gigabit fiber connection and it won't matter; response time will never be acceptable, simply because of distance and the limits of the speed of light.
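The back-of-the-envelope math here is easy to run yourself. A sketch, assuming light in fiber travels at roughly two thirds of its vacuum speed (real routes are longer than straight-line distance, and routers add more delay on top, so this is a hard floor, not an estimate):

```python
C_VACUUM = 299_792_458        # speed of light in a vacuum, m/s
C_FIBER = C_VACUUM * 2 / 3    # rough speed of light in glass fiber

def min_rtt_ms(distance_km):
    """Hard lower bound on round-trip time, in milliseconds, to a
    server distance_km away: light there and back, nothing else."""
    return 2 * distance_km * 1_000 / C_FIBER * 1_000

# A server ~3000 km away can never answer in less than ~30 ms,
# no matter how fast your connection is:
print(f"{min_rtt_ms(3000):.0f} ms")  # → 30 ms
```

Competitive shooter players routinely complain about anything above ~20 ms, which a distant server can't beat even in principle.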
I imagine that many people think internet latency is better than it really is, thanks to the sophisticated predictive technologies currently used in games.
Input lag will likely always be a differentiator between streaming and local. Most games are generally not designed to handle input lag at all, especially input lag at network speeds. The streaming services today have put in extraordinary work into this by using things like input prediction and "input smoothing" and those tools make games more playable than not, but it's certainly not an exact match in play to how the games were designed to be played and sometimes it shows, even when you have performant internet and good quick ping to your nearest streaming server.
Of course, it especially shows as network performance degrades. (I tried streaming a game the other week on cell tower tether from a hotel with bad wifi and the input prediction was a bit like trying to semi-blindly navigate a PowerPoint or HyperCard deck of the game. Felt like classic Myst nodes with a worse game play user experience.)
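The "input smoothing" mentioned above is, at its simplest, a low-pass filter over the input stream. A toy sketch (real streaming clients use far more sophisticated prediction; this just shows why a smoothed signal trails the raw input, which is the lag you feel):

```python
def smooth(samples, alpha=0.4):
    """Exponentially smooth a stream of input values: each output
    blends the newest sample with the running estimate. Higher alpha
    tracks raw input faster but jitters more; lower alpha lags more."""
    out, est = [], 0.0
    for s in samples:
        est = alpha * s + (1 - alpha) * est
        out.append(est)
    return out

# A sudden stick deflection to 10.0 takes several frames to register
# fully in the smoothed stream:
print(smooth([10.0, 10.0, 10.0]))
```

The first smoothed sample is well short of the raw value and only converges over several frames, which is exactly the "not how the game was designed to be played" feel.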
>Isn't streaming now just as good as local - assuming you have performant internet?
No and it's not really close, especially for first person shooters. That few milliseconds delay makes a HUGE difference in the feel of the game and ultimately how well you perform.
It's not. I demoed Google's beta of Assassin's Creed Odyssey on a fiber connection. It basically behaves like a YouTube video: one second it's working pretty well, then at the most inconvenient moment it bogs down to 144p and you get stabbed in the face and can't read any text.
One example: Geforce Now's streaming machines skimp on CPU strength, so certain CPU-intensive games like Planetside 2 run poorly. There is no way for an end user to solve this problem.
… like the author, I'd like to rebuild most of my current machine. I think I have a bad PSU, and I did the research, settled on what I thought was the replacement and clicked "purchase" and … the site reliably times out.
The component has even gone through a sale, in the time I've been checking. The price isn't $70, EVGA, it's $∞.
GeForce Now has been a great experience; running Windows-only games on the M1 Mac Mini just works. Still, many games download content, require you to log in, and generally annoy you on first startup with a new computer. With GeForce Now you get, each time, an (as good as) new computer...
tl;dr: You can improve performance by clearing up some space from a full storage device. Also, video game streaming exists and can be a more affordable alternative to buying a boutique PC from Alienware.
Counter point: it's okay to turn down the specs on your games. Modern "AAA" PC games are usually pretty good about helping you tune your settings to find a good balance between performance and visual fidelity.
It looks like this author has a Corsair One, which I suspect has an RTX 2080 graphics card. He considers that "questionable for a number of new games [he] wanted to play". As someone who has been using a GTX 980 Ti for over six years, I find that very hard to believe. Maybe he's a very competitive gamer who needs high frame rates but doesn't want to completely sacrifice visual fidelity. But then how are the inherent latency and artifacts of streaming an acceptable compromise?
It seems the Corsair One has been around for ~5 years, and launched with a GTX 1080[0].
Though I agree with the overall point that such a GPU should be sufficient to run modern games, unless you're insisting on 4K and higher quality settings with a certain frame rate target... which seems likely; he espouses GeForce Now, but laments having to play in 1080p. So it's odd that he isn't getting enough performance at 1080p out of a GTX 1080 (or higher). At 1080p, his CPU might even be the bottleneck (Core i7-7700K).
Huh, I didn't realize the Corsair One had been around that long. The GTX 1080 would definitely be a more compelling case for an upgrade. Hard to believe it will be three generations old soon.
And yeah, it's odd that a GTX 1080 would struggle enough at 1080p to justify streaming. Latest "AAA" game I've played is Elden Ring and I didn't need to make many visual compromises to get 60 FPS at 1080p. But I haven't played many other recent "AAA" titles so maybe ER isn't very representative. I'm also on an i7-9700K, so your CPU bottleneck theory might hold water.