
Maybe it is, these days? As much as I appreciate all the functionality brought to us by these tools, when I started web-dev circa 2005, LAMP (Linux, Apache, MySQL + PHP) was the go-to for hobbyists.

As much as I look back fondly on that simplicity (Apache config was not that difficult for a small site, at least with Apache 2.0), the part of me that operates production software these days gets anxious at the idea of it all.

And yet, when I built a small site to host my wedding website last year, it was indeed Linux, (some webserver), Postgres and PHP, with me copying files manually over FTP. It was probably nginx, but you know what, I paid a company £50 for a large amount of storage, bandwidth, a domain and an SSL certificate for a year, and everything went dandy. Horses for courses and all that.


I realize the chemistry is different, but in my head the idea of submerging Lithium in _water_ to _extinguish_ a fire is pretty funny.


Well, if it is already on fire, what is the worst thing that can happen? Yes, you get an energetic reaction, but you were having one already, and afterwards it is not that big of a deal... I'm just wondering how long a piece of metallic lithium lasts in a container.


And of course it's even more complex when you mix UK vs US English:

- an herb (US), where the "h" is silent

- a herb (UK)

Cue confusion about "an historic".


This is one of the strongest pushbacks against USB-C that Apple have:

USB-C: You break the stem, you have a useless device and a functioning cable.

Lightning: You break the stem, you have a functioning device and a useless cable.

One of these is clearly preferable, considering the cost difference between the two. Anecdotally, I have had problems with USB-C ports that I did not have with Micro-USB or (so far) with Lightning (admittedly I have only been an iPhone user for a year or so).

Of course, this directive is the correct stance and direction - having a standard and forcing it on everyone. It's just a shame the one they chose may be inferior.


You forgot the springs. Lightning has the springs in the device, while USB-C has them on the cable. That is why Apple stopped improving Lightning and developed USB-C. The stem can break, but that is far rarer than a tired spring, which is an inevitability.


The Lightning connector standard specifies it must survive 20k plug/unplug cycles; the USB-C standard specifies 10k.


> That is why Apple stopped improving Lightning and developed USB-C

They did not develop USB-C. That was a thing started by Gruber.


Apple was involved in USB-C, and was the first major adopter.


I recently repaired a family member's iPhone. A crappy Lightning cable had been used, and the metal tip of the plug had somehow broken off and gotten stuck inside the socket.

Like this:

https://www.ifixit.com/Answers/View/163391/lightning+connect...

Here's a decent 2 minute video that explains the problem and the fix:

https://www.youtube.com/watch?v=eujHf-ry8zw


Let me tell you a story about playing The Settlers IV on a 233MHz, 64MB RAM, Voodoo 3 2000 machine. This game was cool in that it had two audio features: a soundtrack on the CD itself, played using the CD drive's DAC [^1], and you could also drop MP3 files into a folder and the game would play those instead. It was common practice to use a NoCD crack when playing online, because the CD check took long enough that you could time out of the lobby, and if you forgot the CD you got booted. That meant most online gamers had MP3 files, and no CD.

The minimum requirements to play this game were a 200MHz CPU w/ MMX and 64MB RAM - I was pretty close to this baseline. So anyway, I discovered that the game ran at a much better FPS (like 30 instead of 5) if you turned the music off - but only when playing MP3s; CD audio took no hit. Now perhaps the game used a sub-par audio codec, but that single MP3 decode stream was enough to make the game unplayable.

Anyway, that's not to say that I would expect MP3 decoding to be a problem in 2014 - in fact you can likely play audio with no noticeable increase in CPU usage - but when you have multi-stream audio (think voices, background music, sound effects from various channels - guns, explosions, etc.) I can see it starting to add up, especially when the CPU is already constrained by the graphics, game logic and, of course, everyone's favourite anti-piracy/anti-cheat logic.

[^1] For younger readers, yes, CD drives used to come with built-in DACs and a special cable you could hook directly into the audio card, allowing you to listen to CD audio on a PC for "basically" free in terms of CPU cycles.


Yeah, CD drives were also CD players. I had rigged my computer with a separate tiny power supply for the drive so I could have it on without the computer being on: put a disc in and it would start playing audio through the headphone port on the front and out the special wires on the back, and if you had the right audio card that would even play out of the speakers with the computer off.


Even if it's not an always-available thing, how you control Slack really depends on expectations - especially when working remote. Do my colleagues and mentees really want to be blocked for two hours because they need two minutes of input from me?

I have set Slack up so that certain things alert me, and most channels are just muted, but as others pointed out, finer-grained controls would be even better. Really, just "only notify me for DMs from these people" would be a great QoL improvement.


On the other hand, your organization could be built to be more async, or, if you do require input from people who can't respond immediately, there is probably other stuff that can be worked on.

80% of my team is located in India while I’m in the US so none of our working hours overlap (meetings are scheduled early US time / late India time). Stuff can still get done without same-day responses.


This was my initial thought as well, but from the text I gather there is a flow like this:

[Input Data, maybe null] -> Validate field is not null -> Call this method with the assertion.

This is a small bugbear for me with nullable types, and I wish there were a better way to do it. Many languages allow you to smart-cast away nulls, but only within the local scope. If you want to pass a struct-type around which has nullable fields, but you have already checked for non-null (like this one) you need to convert to a different struct-type, which doesn't have the nullability on its fields. I can't think of a good way round this - as you say with the unit test remark, there is nothing to stop another piece of code calling this method with nulls.
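
To illustrate (a minimal C# sketch; Event, Handle and Process are made-up names, not anything from the article): the compiler tracks a null check within a single method, but that knowledge doesn't travel with the value into the next call.

    #nullable enable

    record Event(int Id, string? TeamId);

    class Handler
    {
        public void Handle(Event e)
        {
            if (e.TeamId is null) return;   // null checked here...
            Process(e);                     // ...but the check doesn't travel with `e`
        }

        void Process(Event e)
        {
            // The compiler can't see the caller's check, so this still warns
            // (CS8602) unless you re-check or repeat the `!` assertion.
            var length = e.TeamId.Length;
        }
    }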


> If you want to pass a struct-type around which has nullable fields, but you have already checked for non-null (like this one) you need to convert to a different struct-type, which doesn't have the nullability on its fields.

Which is exactly what IMO the author should have done. It's actually a reasonable use-case for inheritance:

    #nullable enable

    // Base event: TeamId may be absent.
    record SlackEvent
        ( int EventId
        , string Content
        , string? TeamId
        );

    // Team-scoped event: TeamId is guaranteed non-null.
    record TeamSlackEvent
        ( int EventId
        , string Content
        , string TeamId
        )
        : SlackEvent(EventId, Content, TeamId);
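
And, just as a sketch of how that might be used at the parsing boundary (AsTeamEvent is an illustrative name, not anything from the article): the null check then happens exactly once, at conversion time, and everything downstream takes the non-nullable type.

    #nullable enable

    static class SlackEventExtensions
    {
        // Convert once at the boundary; downstream code accepts TeamSlackEvent
        // and never needs to re-check TeamId.
        public static TeamSlackEvent? AsTeamEvent(this SlackEvent e)
            => e.TeamId is null
                ? null
                : new TeamSlackEvent(e.EventId, e.Content, e.TeamId);
    }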


I was hoping there was going to be some minimum definition, and that I could call myself Lord, but alas, my ~300m^2 of land doesn't seem to qualify me either (I live there, I didn't get scammed ~3k times over for 1 sq foot).

I mean, honestly, if you could be a lord with 1 square foot, there would be a lot of lords in Scotland, given that buying a house there usually means owning the land outright (very little of this "leasehold" thing that's prevalent in England). My parents would be a lord and lady, so would their parents...

I've not encountered these ads though, I guess it would be pretty dumb to geo-target it to people living in Scotland.

Edit: on the other hand, if you buy a house and let it out, I guess you can be a land lord, even if you still can't call yourself lord.


Or you could just call yourself lord regardless, and the monarch won’t appear and throw you in the oubliette.


It's Britain, the country that wants to get out of the European Human Rights Convention because ... it protects human rights. I wouldn't be so sure about oubliettes...


That's a misrepresentation. The UK government merely wants to make the UK supreme court the final authority on some human rights cases in the UK, which seems perfectly reasonable to me. The UK is not abandoning the convention, and there are plans to create a UK Bill of Rights.

(Putting people in oubliettes is illegal under UK law anyway.)


> That's a misrepresentation. The UK government merely wants to make the UK supreme court the final authority on some human rights cases in the UK, which seems perfectly reasonable to me.

Perfectly reasonable, until the executive decides to pack the court, and then renege on / "creatively reinterpret" their shiny new "bill of rights" (or even just simply amend it).

You say "merely", but that's the whole dog and pony show. This isn't some abstract, ideological exercise, the current government wants to pass laws that would currently not fly in the ECHR court. If they didn't, there would be zero practical need to change the current state. Ask yourself which bits of the ECHR you think are worth doing without?

Remember, the whole point of the ECHR is to prevent the oppression of a people by their government. Having the court be drawn from a range of different nations and not having any single government exclusively in control is a feature, not a bug.


And what happens when the ECHR starts oppressing people? It's turtles all the way down!


Pshhh, we all know turtles would never oppress people. Gödel, Escher, and Bach, though, would be an Eternal Golden Braid.


Those two things are mutually exclusive. The convention establishes the ECHR as the highest court ruling over matters of the convention, and states that are party to the convention are bound by its rulings. Establishing the UK supreme court as the highest court would violate the obligations under the convention. So yes, if you want the supreme court to be the last court of appeal, you'd have to abandon the convention.


What the UK politicians say they want and what the actual bills say are not necessarily aligned. FWIW, the rhetoric that gets hate-quoted at me is mostly in the camp of "rights are bad".

(Would it be illegal to make a law whose punishment was an oubliette? Can the queen be detained at her own pleasure, at least for anything other than the detonation of a nuclear device in her personal capacity?)


> Would it be illegal to make a law whose punishment was an oubliette?

No, it would not be illegal to make a law whose punishment was an oubliette. The British Parliament is sovereign and supreme, and can make any law it wishes.

The devolved legislatures are limited by the Act(s) establishing them. Such a law passed by the Scottish Parliament would probably be challenged under S29(2)(d) of the Scotland Act 1998 which prohibits breaches of the European Convention on Human Rights.

In this case, such a law in Scotland could potentially breach Article 3 of the Convention which prohibits inhuman or degrading punishment. The British Parliament can simply choose to ignore it, but the Scottish Parliament can't (unless the British Parliament chose to give it such powers).

> Can the queen be detained at her own pleasure, at least for anything other than the detonation of a nuclear device in her personal capacity?

No, she cannot be prosecuted for anything while she is the monarch. She would need to abdicate or otherwise be legally removed (e.g. an Act of Parliament abolishing the Crown) from her position.

Even if she were able to be prosecuted, the Nuclear Explosions (Prohibition and Inspections) Act 1998 is not currently in force, and even if the Act were in force, Her Majesty would be exempt under S14(4):

"Nothing in this section affects Her Majesty in her private capacity; and this subsection shall be construed as if section 38(3) of the Crown Proceedings Act 1947 (meaning of Her Majesty in her private capacity) were contained in this Act."


> No, it would not be illegal to make a law whose punishment was an oubliette. The British Parliament is sovereign and supreme, and can make any law it wishes.

Sure, but this is the same argument that comes around whenever previously signed treaties become 'inconvenient'.

Parliament can, of course, choose to ignore the ECHR/Geneva/pick-any-other-ratified-treaty. But there are both predictable and unpredictable consequences for doing so, so in practice it's prudent for Parliament not to wield its unlimited power to just start murdering whoever it fancies.


I think prudence really depends on the whim of the executive, since the executive generally is Parliament, given the majority it normally possesses. A number of British governments have historically shown little regard for the unpredictable consequences of violating international law.

For example, Iraq, the Chagos Islands, the Brexit trade deal concerning customs checks in Northern Ireland, etc.

I suppose it's all really just one big political matter: do they want to chance it? Probably not, but they could, legally. We just haven't had a government that's willing to do that yet, but given the current government's willingness to flout international law, prudence seems like it might have vanished some time ago...


> In this case, such a law in Scotland could potentially breach Article 3 of the Convention which prohibits inhuman or degrading punishment. The British Parliament can simply choose to ignore it, but the Scottish Parliament can't (unless the British Parliament chose to give it such powers).

Well, if Westminster choose to ignore the convention, I don't see why Holyrood would not choose to ignore Westminster?

This whole politics thing only works if we DON'T ignore stuff we all signed.


Holyrood derives its power from Westminster, and Westminster can remove it. The UK Parliament is still supreme, but it refrains from making laws in areas that are devolved to the local Parliaments (the Sewel convention). It could overturn the Scotland Act that created the Scottish Parliament. In theory, Scotland could ignore Parliament, but that would be akin to unilaterally declaring independence, which is unlikely to go well.

The ECHR has no such powers over signatories.


I think this is the point. All of these arrangements are ultimately by consent. Holyrood agrees to recognise Westminster's power, just as the UK agrees to recognise the ECHR.

This idea of 'power' is a complicated one. What 'power' does England have over Scotland other than the power that Scotland has agreed to accept (see Northern Ireland for what happens when one of the devolved nations is conflicted over whether to accept that power)? What 'power' does the ECHR have over any of its signatories other than that which they've chosen to accept?

There's a well-written and detailed exploration of the negative consequences of withdrawing from the ECHR here, and what is power except the ability to impose negative consequences for not following instructions?

https://verfassungsblog.de/uks-potential-withdrawal-european...


> what is power except the ability to impose negative consequences for not following instructions?

Yes, I think power is exactly this.

> What 'power' does England have over Scotland other than the power that Scotland has agreed to accept

A monopoly on violence to enforce its territorial integrity as the United Kingdom. Withdrawing from international treaties is generally accepted to be within a state's general powers. Seceding from an existing state? Generally recognised by the current international order as unlawful and solely a matter for the state to resolve internally.

Would there be international condemnation if Scotland seceded and England tried to take it back by force? Yes, but doing so would embolden existing secessionist movements in, e.g. Spain or the United States, so those states (plus the ones who promote non-interference in domestic affairs) aren't going to intervene in any meaningful sense.

I think power must go beyond consent and incorporate the expectations and realities of the nature of sovereign state power on the international stage. The UK has the right to ensure its territorial integrity, while the principle of self-determination is limited within that framework.


The monarch can be 'detained at his own pleasure' and even charged with treason and executed, as Charles Stuart discovered, or simply run out of the country like his son, James.


Why is becoming a Scottish Lord desirable in the first place? Asking as someone who doesn't live in Scotland or any of the neighbouring isles.


I'm guessing that many of these are targeting Americans.

One of the weird side effects of being from a melting pot country only a few hundred years old is a yearning for connection to things that feel permanent and historical. "Owning" some land in Scotland and "having" a title may be appealing to an American who otherwise feels cut off from their Scottish ancestry or other longer-term cultural practices.


It would have an appeal to me, because it's absurd. I'm an American and was raised with the notion that royalty and monarchies are patently absurd, a feeling that has only grown as I've aged. Having a Lordship would be like owning a three-wheeled car or living in a house that looks like a giant seashell, to my mind. It'd be a funny thing to tell folks about at parties. I might even go about creating knights or whatever if the party was good enough.


It's pretty absurd in the UK, too, and no-one really cares about it.

Where I grew up, Lord NotGoingToDoxxHim, Clan Chief of the MacDonalds, was the local TV repairman and satellite dish installer who drove an admittedly rather nice Volvo and had a house with a garden nearly as big as ours.


Eh, I’m German and almost bought a royal baker title some years ago. It’s mainly fun ;)


I don't think it's related to the melting pot thing - New World history is just not that rich compared to European/Asian history, due to its recency and (for the most part) lack of conflict.


Isn't that sort of related? People in the mixing pot want to find some rich historical legacy to connect to so they can feel distinct from the rest of the pot.

Personally, I've always found it a bit silly -- the folks I most admire in my ancestry are the ones who looked around and said "Yeah current events are shaping up to be part of some nation's rich history, time to get the hell out of dodge." Rich history is largely made of poor people being shoved into the meat grinder of 'glorious' national struggles.


There was a tremendous amount of conflict in the new world, just mostly not between countries (the conquest of the Americas had a death toll of 70-100m people).


Sure, but those are relatively few events (though large in magnitude) compared to what was going on in Europe/Asia. And the Americas are fucking huge compared to Europe and coastal East Asia. Turns out that when lots of people are sandwiched into tight geography, competing for resources for thousands of years, a lot of stuff happens!


Counterpoint: New World history is/was just as rich and valuable; it was just systematically erased by one of the largest genocides in history spanning from pole to pole.


They had 10% of the world population at the time the conquest began, so just by that metric, nah.


What level of population is required for one's culture to be considered interesting or worth preserving?


Yikes, do you always twist someone's words like that in a discourse? All I said was that the history is not as rich, which makes sense given there were like 10x fewer people here on a huge territory. But also, North American tribes didn't have much written down, unlike the South American civilizations.


You literally said that by the metric of population, their history was not as rich. I don't see how that's twisting your words at all.

My point is that Native Americans do have just as rich and interesting a history as European countries. It was just largely deliberately erased by European settlers.


What other genocides do you know of that killed more than 10% of the world population?


1. That's not my point. 2. You should brush up on world history before trying to zing others: https://en.m.wikipedia.org/wiki/Destruction_under_the_Mongol...


It wasn't an attempted zing, I was curious. So then, the genocide of the indigenous peoples of the Americas also would rank as "some of the most deadly acts of mass killing in human history."


Ok apologies for anti-zing then.

> So then, the genocide of the indigenous peoples of the Americas also would rank as "some of the most deadly acts of mass killing in human history."

I agree. That kind of strengthens my point - I wouldn’t want to associate my heritage with that sort of thing (unless you’re in Mongolia where GK is still considered a god-like figure by many)


Titles are a form of addressing people. Sometimes it is just nice to be addressed by something that isn't Mr/Ms/Mx(/Mrs/Miss). Plenty of people have been motivated towards a PhD or MD for, among other reasons, the hope of getting to be addressed as Doctor. For various reasons, most of the available Titles to be found in our world are old and baroque (sometimes literally), which has both its appeals (a feeling of "tradition") and its detriments (hard to get).

I've many non-binary friends who especially lament that Mx isn't widely accepted and that most of the non-gendered titles (including Doctor) are hard to acquire. I keep joking that if I were Governor of Kentucky, I'd make a simple form for any non-binary person to acquire the Colonel title if they wished.

(In Kentucky the Colonel title, the same one used by, for instance, the well-known Colonel Sanders, is a title gifted to anyone for general service to the Commonwealth. Its origins are military, but it's been much more of a philanthropic thing for much of the past couple of centuries, including in the case of Colonel Sanders, who was awarded it for his role as a businessman in the Commonwealth. It's a fascinating title.)


I mean, you know, would you rather have a chat with Ben Kingsley or Sir Ben Kingsley? I know I'd rather chat with Lord Banana699 than just plain-ol' Banana699


And this is why marketing is a thing, for the record.


I doubt it confers any practical modern day benefits aside from the Right of the Braggart.


I imagine it isn't. It's simply something you can mention during drinks.

And the correct term is laird btw.


> And the correct term is laird btw.

The OP quite clearly notes that this is false:

> In the United Kingdom, “lord” and “lady” are peerage titles, meaning a person can only hold the legal recognition, privileges, and protections associated with those titles if they rightfully inherit them, if they marry into a noble family, or if the queen grants them a peerage.

> Companies that sell souvenir plots with the claim of granting titles often rely on a conflation of the titles “lord” and “lady” with “laird,” which is not a peerage title, but rather solely a courtesy title akin to the English phrase “lord of the manor.” In Scotland, this title is traditionally applied to a member of the landed gentry who owns a large estate that has a long history and who generally has servants and tenants. Because “laird” is merely a courtesy title, it has no legal significance

The title of nobility is "lord". "Laird" is not a title at all, but a conventional designation for a somewhat related concept.


Scottish Lords and Ladies get a free pint from their local pub the third Tuesday of the month between 3-5pm.


Only if they loudly declaim their status to all present...


Time was, you could be a Senator in the American South without the land... or the actual job. They'd call you that in the bar or restaurant, just in case you were.


And I believe there is a requirement to do a funny little dance…


If you make a reservation at the Marriott, they have a long list of titles you can claim. You can say you are a "Lord" there. Also, an example that "Established Titles" gives is that you can have your credit card say "Lord So-and-so".


For the Marriott, sure. But for credit cards, you normally can't. It's not part of your name, and it's not a legal title. Most KYC processes will stop you if you try.


There are sufficient people who value items that give them (real or imagined) prestige with other people (especially those of a similar mindset) to sustain these industries.


It's bought as a fun gag gift.


I think it’s a holdover from when airlines and other service industries would treat their customers with titles more favorably.


Or the square foot grant when you buy a bottle of Laphroaig... I have a few, though I'd strongly doubt they're contiguous.


The only way to know for sure is to go check it out :)

And while you're there, take the Water to Whiskey tour: https://whiskystories.com/2017/02/18/laphroaig-water-to-whis...


They don't do water to whiskey anymore, the new version is called "uisge". I believe the primary difference is that they don't take you out to dig peat on the new tour (because all the peat is dug mechanically now).

Still a nice day out with a picnic by the water source, as long as weather cooperates.


> They don't do water to whiskey anymore

Ah, so I see. It has been a while.

The stop to dig peat was fun, but it's still a really good tour without that.


The Distillers Wares tour at Laphroaig is much better IMO: https://www.laphroaig.com/gb/book-islay-distillery-tour#id=d...

Went a few years ago.


I think the uisge tour (formerly known as w2w but now without peat digging) is basically just distillers wares with a picnic first.


> I've not encountered these ads though, I guess it would be pretty dumb to geo-target it to people living in Scotland.

In-video "sponsored content" on YouTube is probably only made in only one version and thus not geo-targetted, right? I've seen this in several videos on history and adjacent subjects (for instance in some by Max Miller, whose "Tasting History" channel on old recipes seems rather popular here). So if you want to see it, that's where to look.


Could be targeting tourists.


I doubt this is what the author is alluding to, but a few years ago[^1] I read a proposal about CPU cooling using small-scale fluid tubes, much like a car engine is cooled. This would allow cores to be stacked vertically, much like HBM memory is stacked today. Perhaps this new development could allow better implementations?

[^1] https://arstechnica.com/science/2020/09/researchers-demonstr...


Yeah I think so, too! If we can solve vertical stacking that would make an absolutely huge difference!


This looks really cool, but the author... is... replacing their laptop every year? Like, I'm writing this on an 8-year-old MBP that has survived as my round-the-house driver because it still does everything well. My daily driver is getting on a bit now (3 years) and my desktop only just got replaced after 5 years.

Forget the cost, but the waste!


As the other comments have noted, Cory addresses this further down in the linked post. He further expanded on this in the post he wrote when he quit smoking[0]:

> That was my homework: go away and think of an immediate reason not to smoke. When I came back, I had my answer ready: “I spend two laptops per year on smokes. That money goes directly to the dirtiest companies on Earth, the literal inventors of the science-denial playbook that is responsible for our inaction on climate change. Those companies’ sole mission is to murder me and all my friends. I’m going to quit smoking and I’m going to buy a laptop this year and every year hereafter, and I’ll still be up one laptop per year.”

[0]: https://doctorow.medium.com/i-quit-9ae7b6010c99


Or he could just save the money. But I guess he has to constantly reward himself for quitting? Man, addiction sure is a pain.


Well, maybe he's happier this way? There's a classic joke about a lifelong smoker talking to a stop-smoking counsellor:

"With all the money you've spent on cigarettes in your lifetime, you could have bought a Ferrari."

"Do you smoke?"

"No."

"Then where's your Ferrari?"

It's a good question. Most of us have the financial capability to be extremely extravagant with a few select areas of our life, but instead we average everything down to boring mediocrity.


Sounds like giving up one addiction for another. But I guess buying laptops yearly is better for your health than smoking.


The labor and physical footprint needed to produce modern electronics is completely insane. You're comparing little league basketball to major league baseball, and it's not like a player like Framework is going to change this at all.

There is a severe ecological impact to the wider environment that comes from electronics, let's not kid ourselves. That doesn't mean buying electronics makes you, like, a terrible person, but if you're sitting around proselytizing on blogs, like Doctorow, about how these companies are killing you, it's a bit funny to essentially go from a thing that kills people you know in the first world to one that only kills people in the third world you never cared for. Modern comforts like cutting-edge electronics have extreme externalities. Like, okay, let me just throw the "murders people I care about" problem over the fence, where it will surely not be an issue for all those people halfway across the planet from me (that I coincidentally do not care or think about).

In general I'm not trying to be too hard. It's not like anyone else deals with this level of cognitive dissonance much better, and I say that as someone who mostly quit cold turkey over a year ago...


>it's a bit funny to essentially go from a thing that kills people you know in the first world to one that only kills people in the third world you never cared for

I feel like you're not really representing his argument for why he quit fairly. He does talk about the effects of tobacco on the developing world, for one, and his overall reason seems to relate more to the wider idea of tobacco companies being pioneers in the misinformation industry.


While acutely better for the individual, surely the e-waste and resource sequestration outweigh that over time


From the smokers in my life, it's apparent to me people generally need a fairly concrete reason or goal to successfully quit.

Otherwise it's always "Sure, I'll quit - tomorrow."

Willpower is a muscle. It fatigues. So simply willing your way out of an addiction is not effective for many people.


My mom smoked for about 45 years and stopped the day she found out she was having a granddaughter. She didn't want to smell like smoke around her. Hasn't touched a cig in years. The whole family is better for it.


People justify buying things they don't technically need in many different ways.


Can't take your money to the grave man. Cory is well set up and isn't hurting for cash.


> Or he could just save the money.

What a stupid comment. He saved the money and spent it on what he wanted. The hell?

He should save the money and you should call your grandmother.


Kudos to him for quitting. I quit, oh, about a dozen years ago. When I decided to quit, every time I smoked, I told myself they taste like shit; every drag off the cigarette, I told myself that. Eventually (about 2 or 3 months, as I recall) it worked, and I could no longer stand the taste and haven't touched one since.


I don't care enough, but I want someone to fact-check him on the environmental impact of a MacBook's worth of cigarettes versus the MacBook itself. It'd be funny if the MacBook is ultimately worse for Nature.


If he's giving his old one to someone else that presumably needs it, I don't see why it would be a negative.

The cigarettes are probably better for Nature since you'll live a shorter life.


Back when I was making crap wages, I would get the cheapest laptops I could afford that would more or less give me decent performance (on the order of ~$500-600). It's not too hard to find a new laptop that performs well at a reasonable price, but you always run the risk of them reclaiming those costs by cheaping out on all the mechanics of it, and it's not like I was able to afford paying ~$2000 for a high-quality machine. Usually within 2 years, the laptop would just start falling apart, I would get sad, and then I would repeat the pattern.

After the fourth or so time of doing this, and after getting higher-paying jobs, I ended up biting the bullet for a more expensive computer, and it lasted me five years, and I only replaced it because I wanted more RAM.

Point is, if you're lower-income, it's fairly easy to get stuck in the "one laptop a year" trend, because, while a pricier machine is probably a better deal in the long term, it's really hard for lower-income folks to justify a multi-thousand-dollar expense. I'm a proper tech bro now so buying a good computer isn't the worst thing in the world for me, but that wasn't always the case.


I bought a low-end laptop back in 2005, and I used it for about 2-3 years until it started to fall apart. It just didn't hold up (the hinges started to disintegrate). Its performance was terrible, too, and it couldn't be upgraded.

I got a business class laptop in 2007 for probably 3 times as much. That laptop lasted me until last month. I maxed out the RAM and replaced the HD with an SSD about 7 years ago, but it was ultimately the now-anemic CPU and graphics that got me to buy a replacement. I'd have replaced it last fall but laptop stocks were too low.


That's the Vimes "Boots" theory of socioeconomic unfairness.


Looking at your trend, you've got a $500 laptop per 2 years, or a $2000 laptop per 5 years, which reduces to $250/year vs $400/year. Getting low-cost laptops isn't necessarily a worse financial outcome, although it depends on how fast the processor updates are moving; when a 2020 Intel CPU is about the same as a 2015 Intel CPU, it would probably have been better to pay a little more in 2015 for a faster one; when a 2015 Intel CPU smokes a 2010 Intel CPU, incremental updates every year or two mean a low-cost 2015 CPU is probably better than a high-cost 2010 CPU. Plus, you get a battery refresh (even if it's small).

I think there's more junk at the low end to avoid, but it's not as if the high end doesn't have a lot of junk to avoid. Either way, you have to do careful shopping.

It's like just my opinion, but a lot of higher end laptop spending seems to be on increasing the screen's DPI, which is then run with scaling, at the cost of more CPU, more RAM, more GPU, and more software BS. Buying a cheaper laptop with fewer pixels that just runs 1:1 saves all that extra computation and BS, and maybe looks a bit less nice. Sometimes glossy screens are reserved for the high cost laptops, which is like wait, I want a matte screen, so I have to save money to get one, great!


Yeah, I've actually done this math too, though I don't think it's quite this simple. When a laptop started falling apart, I usually tried to just put up with it until I couldn't.

For example, I used to have an Asus computer where the plastic surrounding the screen decided to start coming detached from the lid. This made the laptop substantially more fragile and annoying to use, and after a certain point I tried to remedy it with Gorilla Glue, which led to an ugly mess on the bottom left corner. The laptop still "worked" in the sense that it still did computation, but it was crappier. Then the 7 key broke off the keyboard, I was unable to put it back on, so I just decided I didn't need the 7 key, since I didn't type 7 that often, and when I did I could still hit the little switch. Again, the laptop still "worked" in the sense that it still did computation, but it was crappier. A bunch of other stuff ended up happening (e.g. the LED backlight started to go out and become a flickery mess, the connector to the battery didn't always seem to make contact, etc.).

Stuff like that starts to add up, and "experience" is substantially more difficult to quantify. I bought an expensive MacBook, and I never had any issues outside of the inevitable "Moore's law" depreciation.


> I bought an expensive MacBook, and I never had any issues outside of the inevitable "Moore's law" depreciation.

I hope that keeps going. I used a MacBook for work for almost 8 years, and they did OK, but I had one that decided not to take external power, and the hard drive wasn't removable; thankfully I noticed it wasn't charging while it was near full, so I could pull a backup to a spare work hand. And then there was the year where iTunes would have a 25% chance of spewing high-volume digital noise at me instead of playing music. I guess that was a software problem because it went away with the next major OS X release, but there was no useful forum content. I think there was something else bothersome too, but I'm not sure anymore.


Maybe next time your 7 key breaks you can set up a macro so every time you type "6+1" it will replace it with "7"!


I have an Apple MacBook Air from mid-2012 that I paid $1200 for. If it survives 6 more months, then I'll have spent $120/year on laptops over the last 10 years.


I have a 2015 Air that was $1k. I expect to get down to $120 per year in a couple of years, but I would have to add $10/year for replacing the battery every few years.


Instead of buying a brand-new potato, consider buying a used or refurbished laptop. The performance you can get doing this is much better now compared to... any other time in PC history, because PC hardware performance gains have flattened out. Plus, corporations get rid of perfectly good PCs like, every year because they want the latest model for their staff and especially their executives/management. Know where to shop and you'll find a glut of cheap and even free computers. I've been poor myself; used machines are how I got by. That and building my own.

Either way, you'll pay about as much for a used ThinkPad in good condition with good specs as you would for a new HP Stream or other cheaptop.


> Plus, corporations get rid of perfectly good PCs like, every year because they want the latest model for their staff and especially their executives/management.

Tangential, but a bit of a lifehack I figured out a while ago is that corporations dump old servers on eBay for basically nothing, and most servers allow you to install a regular desktop graphics card. Servers usually have a lot of CPUs and a lot of RAM, so 9 years ago when a broke me needed enough power to do cool stuff on the computer, I would go buy a used server on eBay, and it was good enough for video processing and editing and gaming and distributed computing experiments... as long as I remembered to turn it off when I wasn't using it. Whenever I would accidentally leave it on for a few days, I would end up increasing my power bill by ~$40, a lot of money when you don't have much.

Still, it's a trick I use occasionally, even now that I make decent money. I semi-recently bought a 48-core, 128GB RAM server for around $400, which I use for any big computing experiments. Could I just spin up an AWS box with these specs? Probably, but I think there is value in being able to have the hardware locally.


I once scavenged an HP workstation from behind a dumpster. It was just sitting there in the rain. I brought it in, dried it off, and checked the innards for rust or damage. All looked nearly brand new, so I let it dry out for a couple of days, and powered it on -- it worked. Put a hard disk in and it was ready to go. It's a fairly powerful machine, with four cores and 12 GiB of RAM, a real powerhouse for 2012 when it was new. Probably chewed through many a spreadsheet back in the day. Now I'm making it into a build server.


That's awesome. I think my wife would punch me if I got into the habit of dumpster diving, but there have been multiple times where I've seen what looks like awesome equipment (monitors, computers, surge protectors, etc.) being thrown away near universities and office buildings, and I always have to resist my hoarding nature to take them.

Four cores and 12GB of RAM would make a pretty solid build server, with enough room left for a Minecraft and video streaming server to boot! Sounds like a pretty awesome find.


I found it's better to buy a second hand top model, or even last year's best on sale, than brand new low quality stuff.

It's a little less visible for laptops than for, say, kitchen appliances, but even there: my ThinkPad X220 was bought and upgraded for €400 in 2015, and it did its job well until halfway through this year.


I usually buy top-quality laptops second-hand from shops that give at least 6 months' warranty. Best strategy. You get a $2000 laptop for $500. And honestly, Intel did not do too much in the last decade, so these are of great value.


I was going to say the same - you are often better off buying a quality, not-too-old used machine than a crappy new low-end one. Better for the world, too. However, I tend to keep my gear for a very long time (hello, my well-loved 2007 MacBook Pro), so I can justify buying new (w/warranty).


Yeah, I got my fancy MacBook Pro now because I used to work for Apple and had a pretty substantial discount on it as a result. When I need to replace this one, I'll probably get something decent on the used market and just install Linux on there.


In my experience, it's the laptop case that always fails first. So it's disappointing to see the trend toward ultra-thin cases.


The author does address that

> The environmental consequences of that system weren't lost on me, even given my very good track-record of re-homing my old computers with people who needed them.


That's not really addressing it. It's just acknowledging it.


If it causes other people not to buy new laptops, it kinda is addressing it. (As long as we assume the people getting the old laptop would have bought a new laptop, which might or might not be the case.)


And apparently it made sense for him to pay $150/year to get his laptop fixed within 24h if needed, and to buy two PowerBooks at once... I guess what he really should have bought is a Toughbook instead of a ThinkPad?


Buying a new ThinkPad every year is especially confusing to me, given that Lenovo's switch from mobile to ultrabook processors in the x40 series meant that, for around 5 years, buying a newer ThinkPad than the x30 series meant getting a speed downgrade.


"the author" here is well known writer Cory Doctorow. No one here seems to be making this connection, but pluralistic.net is his blog.


Ah, hehe, I got confused opening the tweet. I wonder why I had it in my head that this was a woman writing the story; must be the monica-byrne in the URL :)


The author, Cory, links to a previous explanation: when he quit smoking, he converted the cost into getting a new laptop annually. As he mentioned in the article, he typically finds a new home for the used device. The laptop appears to be his primary device and critical to his work, so updating annually makes sense, though needing a whole new device is partly down to the kind of construction Framework eliminates (i.e. riveted or glued components).


Also, just because the link is buried, see https://doctorow.medium.com/i-quit-9ae7b6010c99

He traded smoking for buying a new laptop every year. Now that it's been years, I guess he could quit that and not buy a new laptop. But people do more wasteful things. I do understand, though; I drive laptops into the ground over many years, but still 4-5 years per laptop.


2013 MacBook Air daily driver here. MagSafe? USB-A? SD card reader? User-replaceable battery? Runs Linux? All checks. It's light on RAM (which, for just Chrome and light app use, is honestly fine).


It just has a horrible screen resolution.


Yeah, 1440x900 does kinda stink, but the screen is so small it doesn't really bother me too much. Plasma does a good job shrinking itself down enough, and virtual desktops help. 99% of the time it's a full-screen Chromium window, so who cares.


I would guess that MacBook Airs, especially the current ones, are sufficient for the needs of 80%, maybe even 90%, of the entire laptop market, who I presume just need to be able to use a browser and spreadsheets.

And they last for years and years, and I doubt the cost:performance:longevity ratios can be beat.


What you're supposed to do, apparently, is buy a brand-new laptop, use it for a few months, then flip it on eBay before it gets too old so you can recover most of what you spent on it and buy the next new laptop.

I worked with a guy who practiced this with all his personal hardware.


It's very, very normal for wealthy people to replace their daily-use tools every year, or even more often.

I replace my phone and laptop and iPad every year. I know people who replace their car and wardrobe and luggage every year, too.

In laptops and mobile devices in general, annual updates make a lot of sense, as power efficiency is still regularly increasing. The M1 Air is, for example, a fucking marvel. It's been out for way less than a year. I have an M1 Air, and will upgrade it again in less than a year when the Mx (where x > 1) MacBook Pro comes out.


Further down the article:

> The environmental consequences of that system weren't lost on me, even given my very good track-record of re-homing my old computers with people who needed them.


>Forget the cost, but the waste!

They can just sell the laptop and someone else will use it. For example, I almost never buy new laptops, as perfect Linux support generally lags behind.


>> replacing their laptop every year?

Not really that uncommon, especially with a MacBook Pro, where a new one is released... every year.

How much sense it makes, that's another story.


Ignoring the (potentially substantial) environmental costs, if you do it "correctly" the total cost of ownership is about the same.

If you buy a $2000 Mac and use it for as long as reasonably possible, it's going to depreciate by several hundred dollars (let's say roughly $300) a year. At a certain point it's worth nearly zero, and you must buy a new laptop. After 6-7 years your total outlay is $2000.

Alternatively, every year or two you can sell the old one for a few hundred dollars less than the new model, and buy the new model. You always have a new laptop. And your total outlay is still only about $2000. Plus you are covered by free AppleCare every time you buy the new one.

Plenty of people do this with mobile phones and automobiles and other things as well.

Please note that I am not advocating it. I was still using my 2015 laptop until very recently. But economically it is not necessarily insane.

(Assuming you are selling the old laptops, that is. It's not clear to me that the author is doing that. He says he's donating/rehoming them. Not sure if that includes selling)


The downtime of setting up a new laptop, migrating data, etc. is going to be worth several hundred dollars to many people.


Agreed.

Though, with Macs, it's trivial - their migration tool is peerless (as one would certainly expect and demand, since they control the whole software/hardware stack).


Many products are released on a yearly basis, but very few people are upgrading every year.

