Not long ago I accidentally ruined the keyboard on my oldish white MacBook, and I debated spending cash I really didn't need to spend on a nice new MacBook Air. But after giving it some good thought, I decided to just go ahead and replace the topcase myself.
While deciding this I realized it's rather a shame that in our hacker culture there's still somewhat of a fetishism for having a sleek new laptop, and that it would be much better if the culture valued the patched-together, upgraded, and heavily worn.
There are many great reasons for this: it cuts down on consumer waste, it's more in tune with the hacker ideal of squeezing the most out of what you have in front of you, interesting performance issues aren't solved with a simple hardware upgrade, and of course there's Zed's point that we remain more in touch with what actual users are using.
It shouldn't be a point of shame to have the latest and greatest, but it would be nice if more cred went to the hacker in the room with the oldest, most beat-up, but still productive laptop.
Don't replace it if it works for you, but due to Moore's Law computer hardware doesn't age that well. Older guitars that are well taken care of sound as good if not better than shiny new ones, but after a certain age your computer begins wasting your time.
Most programming time is spent sitting and thinking, but when I act I want the computer to respond as quickly as possible, whether compiling or running tests or opening a website.
That's not to say I don't share the sentiment about computers becoming less hackable and more consumer toys. It's a shame that laptops these days have almost zero user-serviceable parts.
Macs are actually remarkably serviceable, thanks to their limited product range, and iFixit.
My only computer is a tweaked five-year-old MacBook Pro, underclocked to 1 GHz for stability reasons. I've replaced the optical drive with a big hard disk, and the primary hard disk with an SSD. Stock 2 GB of RAM.
Except for CPU-bound tasks (mostly HD videos, and the occasional ./configure && make && sudo make install), it flies.
The earliest MBPs probably need the thermal paste redone, but 2008-and-later Core 2 Duo MBPs generally last as long as you don't flex the logic board (don't carry it by one corner) and keep it reasonably cool. One of the older MBP generations had very fragile clips for the logic board.
For Linux, the US$700-800 Core i5 Toshiba and HP laptops at Costco work pretty well, except for Wi-Fi on the Realtek chipset. 8 GB of RAM vs. 2 GB on a five-year-old laptop is a big difference.
My workstation started life as an HP I bought at Best Buy in 1998. I've just upgraded it piecemeal over the years, and while none of the original hardware remains, I've never replaced it outright.
My two development boxes, also under my desk, are whiteboxes cobbled together from previous iterations of my workstation - the hardware gets handed down, and once it hits four generations old, it gets shelved in the garage. Those pieces occasionally get used to patch friends' computers. I've quadrupled RAM with old unused DIMMs for more people than I care to think about.
If that isn't patched together, upgraded, and heavily worn, I don't know what is!
That sort of thing makes a great deal of sense with a "desktop" computer. Laptops are not designed to work like that, and in most cases are not robust enough in any event.
I'm as guilty as anyone of riding my hardware pretty hard and for far too long, but you're just moving into silly mode. Case entry alone would make a prudent upgrade.
What do you mean by case entry? I've replaced the case on my primary workstation twice; literally none of the original machine remains now. The current case is a nice big zero-tool-entry type with six drive bays and room for unusually large hardware like CPU coolers and video cards. I think I ended up selling the original case on eBay.
I don't do it out of some "hacker ideal" - I do it because it's cost-effective, and because I get a lot of pleasure out of building my own rigs. It just so happens that that ends up leaving me with machines that are patched together, well-worn, and well-loved.
That's pretty much the point. I'm more about the results I get with the tech than what brand or style the tech is in. Since my software is used by people, I have to try to use what they have but maybe a bit better so I don't waste time.
One thing I didn't mention in that article, since it's not the point of the article, is I really only buy computer equipment when there's a real time/money advantage involved. It's not about "junk machines == bad ass hacker dude". It's more that they're just tools for me and I don't fetishize them.
Now, guitar stuff. That's a whole other bag of fetish fun for me. :-)
I agree with you. That is one of the reasons I kept using my Newton 2000 until last year when it broke because someone accidentally sat on it and cracked the screen.
There's lots of "old" or "outdated" technology out there that can get the job done well. I don't see the need to always keep buying more and more stuff as if computers were disposable.
I'm on a similar course - the further I dive into systems, the more I learn to use the ubiquitous software (vi, for example) rather than being picky about my tools and environment.
People with very complex environments make me think they spent far too long developing a theoretical methodology, and too little time actually hacking and using stuff.
This is bad because he was 0% productive for many days: he couldn't input text on the computer where the problem was, because it didn't have his editor.
We're 0% productive most of the time. Sleeping, commuting, meetings, lunch. Adding a negligible amount more for tooling is not really an issue.
I'm essentially dependent on Eclipse when doing Java. It takes me less than an hour to get it set up the way I like it. For the next 2,000-3,000 hours before I need to set it up again, I will outperform and produce better quality code than anyone using vi or emacs, guaranteed. Sounds like a fair tradeoff to me.
Back when I was on a Civil Engineering path, an elderly engineer said "I bet you don't even know how to use a slide rule, what do you do when the power goes out", to which I responded, "I go home, same as you, unless you've got candles in your desk too, and let's not even get into air conditioning".
I bet you can. But you can likely use emacs as well. I think if I had to pick one piece of software to live with on my desert island, it'd be emacs (assuming I had a reasonable boot loader and a filesystem to sit under it).
Everyone I see in these kinds of articles uses old hardware and small screens. I feel like it's selection bias - it makes for a better story when the rock stars don't use the best tools.
When I moved from an old Mac to a tricked out iMac, my apps compiled 10x as fast. I suppose it's nice to know you can write good software on any box, but it's worthwhile to have good tools for something you spend many hours on each day.
Aside from an SSD, a 2006 computer (from the introduction of the first Intel Core 2 Duo) with 8 gigs of RAM is really the last time things got noticeably faster for the average programmer.
Sure, game programmers and the like 'need' to upgrade more often, but for almost everyone else, it's just been a waste of money in relation to what you get.
Laptops are a different story, though I think that's quickly coming about (or it already has with the i5/i7s).
I have noticed a tendency for some of the bulkier tools to get slower over time, making it seem like you need a better computer, when in reality, we just need better tool makers (VS 2010 was a notable step back that made you feel like it was time to upgrade).
The more cores, the better. I feel I'm really restricted from targeting the CPUs of tomorrow by lack of parallelism; if you're writing a piece of software you expect to possibly last 20+ years, you really shouldn't bake in assumptions about today's limited CPU counts, in particular.
Work per clock cycle is improving at a not too shabby rate either. When I upgraded from a Q6600 to an i7 920, build times reduced by about 60%; Handbrake transcode times improved by even more.
I have a mix of ages in hardware, so it's not all old. The linux laptops I have sit on a desk more often, so those have 17" screens and I pair them with a 21" monitor. So, I do use big monitors.
But, that's not the point. I can also code just fine in an 80x25 ssh connection using vim and screen. I agree you should probably do most of your work using the best visual setup possible, but if you simply can't function without 3 Apple Thunderbolt 29" displays then you've got to rethink how you do things.
"If you're a developer, you need to spend money on a great computer, an awesome monitor, a fantastic chair and a good bed"
Personally, I think Zed's approach is far more responsible/sane. If you "need to spend money on a great computer", you are either doing something very different than me, or very wrong.
If you are working on software that will run on customers' machines tomorrow, it makes sense to do at least a good chunk of development on very average hardware. These days, most client development is web stuff, so you don't often have the excuse of compile times.
But if you're working on software that's supposed to run well on high-spec machines and servers, or targeting machines a few years in the future, then you're better off with a higher spec machine; one with lots of RAM and CPU cores, so you can play around with the various tradeoffs of time vs memory vs parallelism. Or if you have a big source tree - the one I have is perhaps 10GB in size, and takes about 14 minutes to build today - then it makes lots of sense to reduce turnaround time by throwing hardware at it.
For example, I work on a compiler that is used by the build tree. I can't really be sure the compiler is "good" unless it builds the whole tree and the tree's tests run; if I checked it in as-is, the integration server could find the problem, and then I'd be in everybody's bad books. Reduce the build time by 5 minutes, iterate over perhaps 5 or 10 builds in a day, and it starts adding up to non-trivial productivity advantages.
Maybe I'm being a closed-minded idiot, but "I write compilers for a living" isn't something a lot of programmers can claim. I guess I'm working on the assumption that most programmers are building enterprise apps. I don't know why I think that.
I'm working on backend systems for a small startup, and I am writing software that on my work laptop (MacBook Pro, Core 2 Duo at 2.4 GHz, with 4 GB of RAM) takes about 30 seconds to compile. Tests take another minute or so to run. That is 1 minute and 30 seconds too long, because my attention is now elsewhere ...
It's certainly possible to develop with less than top quality tools, but when a developer costs $100k-$200k/yr (loaded cost), spending $10-20k on tools vs. $0-2k is pretty reasonable, if it either provides a bump in performance (and I believe it does), or just makes the developer happier.
If you go on-site with clients, it's pretty sketchy to have a taped together Gateway 2000 vs. a MacBook Pro/Air, too.
I'd be concerned if I took my car to a mechanic who was using Chinese crap tools, or went to a doctor with duct tape holding every piece of equipment together.
Probably one of the better things to spend money on is having complete spare hardware lying around in case someone's laptop is stolen or fails.
Shouldn't it be worth it to decrease friction in something you are spending 8-10 hours a day on? Personally I can't stand when I run into the physical/computational limitations of a system I'm on. I also can't stand it when someone on my team complains about this.
A nice computer, a good amount of RAM, good input devices, a good chair and a desk are paltry compared to what you should be paying a great engineer or designer.
Tools don't make the talent, but they sure do make the extraction of talent into product/technology much easier all around.
Like I said in another comment, in my opinion, a 2006 computer (intro of Intel's C2D) with 8 gigs of RAM + an SSD was pretty much the last time your average programmer saw any appreciable gains on the desktop.
Things _have_ moved forward since then, especially if you are multi-thread/core sensitive. For everyone else, even at 8-10 hours a day, I'm not sure it'd be considered a great investment for such incremental improvements.
Unfortunately, in some (most?) big enterprises we are still stuck with Windows XP (so 3.5 gigs of RAM) and shitty hard drives that are further killed by the antivirus :(
RAM and I/O will certainly make the biggest difference for the average person or engineer. That, and HIDs and your display.
My computer before this one was not, at its core, the latest/greatest Mac Pro, but I had upgraded the parts (RAM, video, HD) that made the biggest difference.
Well I have an awesome bed, so I agree with the article there. I use a piano bench for my chair though. Keeps me from sitting there too long. I have a lot of computers since I use them actively for different tasks, so I do agree as a professional you have to spend money on good tools. And I also agree having a dual monitor setup is the way to go. I do laptop+monitor for most of my work.
I think the difference though, is I don't immediately go out and buy everything Apple shits onto the market or the latest greatest PC possible when Intel drops a new CPU. I go and get the tool that fits what I need to do, and base the decision on which one will get things done.
I agree hardware is no big deal for development in general. I tend to buy one generation behind in terms of hardware. Not only have all the bugs been worked out, but I also get great bargains. Initial depreciation is steep for technology-related products.
Since I use Mac, I've been forced to upgrade sometimes. Like going from my Powerbook (G4 chip) to a Macbook Pro in 2007 (for the intel chip).
Last week, I finally broke down and bought a MacBook Air. It wasn't for the i7 chip or the SSD (which, by the way, means I will NEVER go back to spinning drives again); it was the mobility. The Air is the lightest, most powerful machine I've ever had. Amazing machine.
My scratched up, stained, and dented MBP sits in a bookcase. On top of my old Powerbook.
I have always been dismayed with us techies and our toys. Like women and their purses. Not that there is anything wrong with that. But it's nice to know there are great programmers out there who don't care about that stuff.
But geesh, even I would upgrade from a white Macbook, and I am using a 2006 MBP.
While some toys can definitely be attributed as luxury (you don't really need that iPad), I can say that my i7 Macbook Air made me a significantly happier developer. Incredibly fast for everything I do, be it Xcode or Photoshop. Check. Long battery life and lightweight - great for coffee shops. Check. This is one "toy" that actually improved my productivity.
Also, something should be said about efficiency. If you're primarily coding on a slow/old computer where compiles, etc. take longer, you may be wasting precious time. It's worth having the best tools for the job.
I definitely agree, if I wasn't doing Windows development, I'd be perfectly happy with a machine that was that level spec, Vim and screen don't take up many resources at all. Lousy battery life would be a dealbreaker though.
Old hardware might work for him, but not for me. I don't want to wait for my PC, and if I can noticeably improve speed with an SSD, a better CPU, or more RAM, I do so. What's wrong with wanting fast hardware? I spend 8-12 hours each day in front of these things, so if I can upgrade stuff to make my job easier, I do so.
I'm not sure why you wrote that post. I mean, Zed is not saying that having/buying faster hardware is bad, just that it's unnecessary for him most of the time. What is it that you're arguing about?
Oh, I don't have all old hardware. I have some pretty bad ass gear for power, I just don't go running out to buy the latest crap whenever it shows up. Instead, I find I need to accomplish something, realize my current gear sucks for it, then go find the best option for what I'll need to do. Sometimes that's a monster Linux laptop, sometimes it's a Mac, and other times it's parts from sparkfun.
The point though, is the brand and type of computer doesn't matter for what I do. It's more what I do and uses my software that matters.
Reaper is one bad ass piece of software. I would kill to work for those guys, 'cause damn they could use a new manual or two. I really think a great manual with a video course would turn Reaper into a powerhouse in the market.
Anyway, I do all my audio editing in Reaper and find it's the most intuitive and best performing DAW out there. And, I do it on a Mac. I do not do audio or video on Linux. Linux is just horrible for anything media related.
I enjoy Zed's enthusiasm. While people seem to argue with him frequently or vice versa, he is prolific in his books and software and is definitely helping improve things.
I thought it was a bit odd that he wrote 7 paragraphs about music equipment and software, but didn't mention which GNU/Linux distribution(s) he uses on his "junky" laptops.
I'm not a programmer, but I suspect the detail on the music gear might be because the sound you get depends on the instruments and effects chain.
The code you get out of your coding session, or the LaTeX source for your book, does not really depend on which method you chose to install and update GNU/Linux.
I suspect you're right - I spend a lot more time programming than playing music but have a stronger emotional attachment to my music gear than my coding environment.
Musicians call it GAS, Gear Acquisition Syndrome, e.g. I've got major GAS for an Ampeg amp.
Shiiiit, that's because the music gear is both more fun, has more style, and will actually exist and work well in 20 years. The computers will be gone in like 3 years, so why bother talking about them?
But, to answer your question, I use nearly every OS there is: Linux (Arch, Ubuntu, Debian, Fedora), NetBSD, FreeBSD, OSX, and Windows. I have to use all of these to test different things on various projects. As I get more into writing books though this narrows down to Ubuntu, OSX, and ArchLinux.
I understand the gearhead mentality (I was raised by one), and I actually enjoyed reading about the music equipment. I just didn't expect it because when I hear "Zed Shaw" I think "programmer", not "musician", even though I realize that few people, even programmers, are one-dimensional. I guess it's just one of my many mental biases. Thanks for putting my question to rest.
Don't know why I am answering because my response is most likely inaccurate, but IIRC (specifically from following him on Twitter) he used to use Arch linux for a while. I also remember him complaining about it and IIRC he made a switch. I am sure once he sees the post he will correct me and or answer you.
Nope, you can install awesome on Ubuntu and then switch to using it. The best setup is to make a .xsession file and fire up the various GNOME services you need, then have GDM boot a "User Defined" session (or whatever it's called, I forget).
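A minimal ~/.xsession sketch of that setup might look like this - note the particular GNOME service and applet names below are just examples of what you might want; check what your Ubuntu release actually ships:

```shell
#!/bin/sh
# ~/.xsession - picked up when GDM launches the "User Defined" session.

# Start whichever GNOME background services you still want.
# (Example names; substitute whatever your release provides.)
gnome-settings-daemon &
nm-applet &

# Hand the rest of the session over to the awesome window manager.
# "exec" replaces this script, so logging out of awesome ends the session.
exec awesome
```

Make sure the file is executable (`chmod +x ~/.xsession`), or GDM will fall back to the default session.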