I think your second point is interesting, and it has actually already happened a couple of times.
It used to be a lot easier to find devs that knew assembly and could navigate call stacks through memory by hand because a lot of folks had to learn that to get their job done. Now higher level languages have mostly eliminated that level of operation.
The same applies to infosec roles. It is 10x harder for junior infosec folks than it was 20 years ago, because there are a bunch of skills you need in infosec that today's mainline dev experience doesn't require, but that were more common a while ago.
Case in point: I remember working with a partner company's junior engineer on some integration. They needed some hard-coded constant changed, and time was of the essence. I told them to change a couple of bytes in the ELF binary directly. They looked at me like I was a wizard. I thought it was a fairly pedestrian skill, having grown up reversing computer game save files.
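For anyone who hasn't done this kind of edit: it really is just overwriting bytes at a file offset. A minimal Python sketch, where the offset and values are hypothetical (in practice you'd find the real file offset first with a hex editor, `readelf`, or `objdump`):

```python
# Patch a hard-coded constant in a binary by overwriting bytes in place.
# The offset and constant below are made-up for illustration; locating the
# real offset is the actual reversing work.
import struct

def patch_bytes(path, offset, new_bytes):
    """Overwrite len(new_bytes) bytes at `offset` in the file at `path`.

    Returns the old bytes so you can revert the patch if needed.
    """
    with open(path, "r+b") as f:
        f.seek(offset)
        old = f.read(len(new_bytes))
        f.seek(offset)
        f.write(new_bytes)
    return old

# Hypothetical usage: change a little-endian 32-bit constant 1000 -> 2000
# at file offset 0x1234 in ./app:
#   patch_bytes("./app", 0x1234, struct.pack("<I", 2000))
```

The file length never changes, which is why this is safe for fixed-size constants but not for strings of a different length.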
Google, Meta, Deepseek, Alibaba, and Mistral all offer models you can run yourself. The market can decide whether privacy should be the highest priority.
Talk about timely. I was just experimenting with a data provider's new MCP server, and I was able to use up my entire Claude Max token limit in under a minute.
It is happening in embedded as well. I noticed just the upgrade from Gemini 2.5 to 3.0 Pro went from "I can get the assembly syntax mostly right but I don't understand register lifetimes" to "I can generate perfect assembly by hand".
I just saw a Reddit post yesterday about somebody who, with only the board's documentation as input, got Gemini 2.5 to one-shot the bare-metal boot code for a particular board.
I am firmly in both camps. On one hand, getting stuff working has its own thrill.
On the other hand, I step back, look at the progress made in just the last year, and realize that not only is my job soon to be gone, but so is pretty much every job that primarily consists of knowledge work.
I feel there's now an egg timer set on my career, and I better make the best of the couple of minutes I have left.
Exactly. I don't want to wade through a whole session log just to get to reasoning, and more importantly, I don't want to taint my current agent context with a bunch of old context.
Context management is still an important human skill in working with an agent, and this makes it harder.
A lot of people see how good recent agents are at coding and wonder if you could just give all your data to an agent and have it be a universal assistant. Plus some folks just want "Her".
I think that's absolutely crazy town but I understand the motivation. Information overload is the default state now. Anything that can help stem the tide is going to attract attention.
I've been dismayed by how quickly the "we should own our hardware" crowd has radicalized into "all security features are evil" and "no security features should exist for anyone".
Not just "there should be some phone brands that cater to me", but "all phone brands, including the most mainstream, should cater to me, because everyone on earth cares more about 'owning their hardware' than evil maid attack prevention, Cellebrite government surveillance, theft deterrence, accessing their family photos if they forget their password, revocable code-signing with malware checks so they don't get RATs spying on their webcam, etc, and if they don't care about 'owning their hardware' more than that, they are wrong".
"No security features should exist for anyone" is itself a fanatically hyperbolic narrative. The primary reason this event has elicited such a reaction is that OnePlus has historically been perceived as one of the brands specifically catering to people who wanted ultimate sovereignty over their devices.
As time goes on, the options available for those who require such sovereignty seem to be thinning to the point that [at least absent significant disposable wealth] the remaining choices will demand lifestyle changes comparable to high-cost religious practices and social withdrawal, and likely without the legal protections afforded those protected classes. Given big tech's general hostility to user agency and contempt for values that don't consent to being subservient to its influence peddling, an intense emotional reaction to the loss of already diminished traditional allies seems like something that would reasonably be viewed with compassion rather than hostility.
I’ve posted about this on HN before; I think that there’s a dangerous second-order enshittification going on where people are so jaded by a few bad corporate actions that they believe that everyone is out to get them and hardware is evil. The most disappointing thing to me is that this has led to a complete demolition of curiosity; rather than learning that OTP is an ancient and essential concept in hardware, the brain-enshittification has led to “I see hardware anti-*, I click It’s Evil” with absolutely no thought or research applied.
Given how the opposition has radicalized into "you should own nothing and be happy", it's not surprising.
None of the situations you mentioned are realistic or even worth thinking about for the vast majority of the population. They're just an excuse to put even more control into the manufacturer's hands.
The attack is simple: the attacker downgrades the phone to a version of firmware that has a vulnerability. The attacker then uses the vulnerability to get at your data. Your data is PIN-protected? The attacker uses the vulnerability to disable the PIN lockout and tries all of them.
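The arithmetic on why the lockout matters is stark: without throttling, the PIN keyspace is tiny. A back-of-envelope sketch, where the tries-per-second rate is a made-up assumption:

```python
# Back-of-envelope: time to try every PIN once the lockout is disabled.
# The tries-per-second figure is an illustrative assumption, not a
# measurement of any real device.
def worst_case_seconds(pin_digits, tries_per_second):
    keyspace = 10 ** pin_digits  # e.g. 10,000 possible 4-digit PINs
    return keyspace / tries_per_second

# Even at a slow 10 tries/sec, a 4-digit PIN is exhausted in under
# 17 minutes, and a 6-digit PIN in roughly 28 hours.
four_digit = worst_case_seconds(4, 10)  # 1000.0 seconds
six_digit = worst_case_seconds(6, 10)   # 100000.0 seconds
```

The lockout (and hardware-enforced throttling) is the only thing standing between a short PIN and this exhaustive search, which is why disabling it via a downgraded, vulnerable firmware is the whole game.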
There's over a 10x difference in fence price between a locked and unlocked phone. That's a significant incentive/deterrent.
The whole point of this is that when someone steals your phone, they can't install an older, vulnerable version of the firmware that can be used to set it back to factory settings, which would make it far more valuable for resale.
Phone thieves aren't checking which phone brand I have before they nick my phone. Your scenario is not improved by making OnePlus phones impossible to use once they're stolen.
> It reduces the expected value of stealing a phone, which reduces the demand for stolen phones.
It's not at all obvious that this is what happens. To begin with, do you regard the average phone thief as someone who even knows what expected value is?
They want drugs so they steal phones until they get enough money to buy drugs. If half the phones can't be resold then they need to steal twice as many phones to get enough money to buy drugs; does that make phone thefts go down or up?
On top of that, the premise is ridiculous. You don't need to lock the boot loader or prevent people from installing third party software to prevent stolen phones from being used. Just establish a registry for the IMEI of stolen phones so that carriers can consult the registry and refuse to provide service to stolen phones.
It's entirely unrelated to whether or not you can install a custom ROM and is merely being used as an excuse because "prevent theft somehow" sounds vaguely like a legitimate reason when the actual reason of "prevent competition" does not.
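To show how little machinery such a registry needs, here is a toy sketch; every name in it is hypothetical illustration, not a description of any real system like the GSMA blacklist:

```python
# Toy sketch of a stolen-IMEI registry: carriers consult the set before
# provisioning service. All names here are hypothetical.
class StolenImeiRegistry:
    def __init__(self):
        self._stolen = set()

    def report_stolen(self, imei):
        # A real system would require an identity-verified theft report
        # here, to deter people from falsely reporting others' phones.
        self._stolen.add(imei)

    def report_recovered(self, imei):
        self._stolen.discard(imei)

    def is_stolen(self, imei):
        return imei in self._stolen

def carrier_allows_service(registry, imei):
    """A carrier checks the registry and refuses stolen devices."""
    return not registry.is_stolen(imei)
```

Nothing in this design depends on a locked bootloader or on controlling what software the owner runs; the enforcement point is the carrier, not the device.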
> It's not at all obvious that this is what happens.
This is what we've empirically seen as Apple went from having devices which could trivially be reflashed and resold without much impediment to now most iPhones being locked and their hardware parts cryptographically tied together.
There is a lot of "how to lie with statistics" going on with correlations like that. To begin with, property crime rates have been declining year over year in general, so "it was lower the year after X" is the expected result whether or not X actually did any good. This is especially true in years -- like the one in question -- that follow an epidemic of thefts, and then subsequent years see large declines as a result of reversion to the mean.
Then clickbait headline authors do their favorite thing and find a table of numbers, sort by size and choose the biggest one. 50% in London! That's probably not an outlier, right? But down to 25% by the time they get to city number 3, and no other cities are listed.
Likewise, when there are a lot of thefts then everyone tries a lot of solutions, and then some subset of them do something (or just reversion to the mean again) and everybody wants to claim it was their thing that solved it.
But if it was their thing, and their thing is still in place, then the theft rate shouldn't be going back up again, right? Yet it is:
> It's not at all obvious that this is what happens. To begin with, do you regard the average phone thief as someone who even knows what expected value is?
They know if their fence went from offering them $20/phone to offering $5/phone, it's not worth their time to steal phones any more.
> Just establish a registry for the IMEI of stolen phones so that carriers can consult the registry and refuse to provide service to stolen phones.
This seems like something that the average HNer is going to get equally riled up about as a surveillance and user freedom issue.
> They know if their fence went from offering them $20/phone to offering $5/phone, it's not worth their time to steal phones any more.
Except that phones are worth significantly more than both of those numbers or nobody would be stealing them to begin with, and they have a value floor in what they're worth if disassembled for parts which is above what many people would be willing to steal in order to get. And then we're back to, if you need X amount of money to buy drugs, and the amount of phones you have to steal to get X amount of money doubles, how many phones are they going to steal now?
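The volume point is easy to make concrete. A toy model of the argument, with made-up numbers:

```python
# Toy model: a thief needs a fixed amount of cash; if the fraction of
# stolen phones that can actually be resold halves, the number of thefts
# needed doubles. All figures are made-up for illustration.
import math

def phones_needed(target_cash, value_per_phone, resellable_fraction=1.0):
    # The thief can't tell in advance which phones are locked, so
    # unsellable phones still get stolen; only the resellable fraction
    # turns into cash.
    return math.ceil(target_cash / (value_per_phone * resellable_fraction))

before = phones_needed(200, 20)        # 10 phones stolen
after = phones_needed(200, 20, 0.5)    # 20 phones stolen: thefts go UP
```

Under this (admittedly simplistic) fixed-demand assumption, lowering resale value increases theft volume rather than decreasing it; the opposing view assumes thieves respond to the lower per-phone payoff by exiting instead.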
> This seems like something that the average HNer is going to get equally riled up about as a surveillance and user freedom issue.
The only thing on the list is stolen phones. The phone carrier consulting the list would have your IMEI regardless. The only information anyone would get from the list is that the owner of a phone with a particular IMEI has reported it as stolen.
The main thing you need to make sure and do is to have a good way to prevent someone from reporting someone else's phone as stolen, and "make that a crime and make people who want to file a theft report show a valid ID so they can be prosecuted if they're committing that crime" is probably a pretty good way to do that.
Thieves don't always get the news right away, but when you work hard to steal a bunch of phones and can't sell them for anything, you don't get your fix and you find something else to steal and sell.
Regulations have made it pretty hard to sell catalytic converters. There are still thefts, because some thieves are really out of the loop, but I think it's been reduced by a lot. Still a few people who want to fill up their stolen trailer with cats before they go to the scrap yard, though.
A strong lock system that prevents stolen phones from being used is better than a global IMEI denylist: phones that can't connect to a cell network but are otherwise usable still have value, some networks won't participate in a global list, and some phones can have their IMEI changed if you can run arbitrary software on them (which is maybe a bigger issue in itself, but steal phone -> wipe -> change IMEI -> resell is still stopped if you can't wipe the stolen phone).
> Thieves don't always get the news right away, but when you work hard to steal a bunch of phones and can't sell them for anything, you don't get your fix and you find something else to steal and sell.
Thieves figure that out pretty quick, and they still seem to be stealing plenty of phones.
> Regulations have made it pretty hard to sell catalytic converters
This is the equivalent of having a list of stolen phones.
> A strong lock system that prevents stolen phones from being used is better than a global IMEI denylist because phones that can't be connected to a cell network but are otherwise usable still have value
It's pretty likely that this value is lower than, or approximately the same as, the value of the phone as individual parts.
> some networks won't participate in a global list
Thieves want to sell phones in rich countries where people can afford to buy them. Get the rich countries to use the list and nobody is going to be stealing iPhones so they can pay $10 to ship them to sell in Somalia for $5. For that matter it's going to make a huge dent even if yours is the only country using the list, because most thieves are not going to use an international fence.
> some phones can have their IMEI changed if you can run arbitrary software on them
So the manufacturers who want to do something like this should prevent that rather than preventing people from running arbitrary software in general.
It seems like you're trying too hard to defend the premise. Having a list of stolen IMEIs would be significantly effective. "What about this marginal edge case?" is like, preventing the thieves from selling stolen catalytic converters would be significantly effective, but they could hypothetically ship them to Somalia and sell them there, so we need OEMs to lock down everyone's cars instead.
That seems more like an excuse to lock down everyone's devices than an actual concern about the marginal edge case which itself could be addressed in various ways without doing something with such high costs to competition. Assuming the edge case was even significant, which it probably isn't.
I find it hard to believe that OnePlus is spending engineering and business resources, upsetting a portion of their own user base, and creating more e-waste because they want to reduce the global demand for stolen phones. They only have something like 3% of the total market; they can't realistically move that needle.
I don't understand what business incentives they would have to make "reduce global demand for stolen phones" a goal they want to invest in.
It'd be ideal if the phone manufacturer had a way to delegate trust and say "you take the risk, you deal with the consequences" - unlocking the bootloader used to be this. Now we're moving to platforms treating any unlocked device as uniformly untrusted, because of all of the security problems your untrusted device can cause if they allow it inside their trust boundary.
We can't have nice things because bad people abused it :(.
Realistically, we're moving to a model where you'll have to have a locked down iPhone or Android device to act as a trusted device to access anything that needs security (like banking), and then a second device if you want to play.
The really evil part is things that don't need security (like say, reading a website without a log in - just establishing a TLS session) might go away for untrusted devices as well.
> We can't have nice things because bad people abused it :(.
You've fallen for their propaganda. It's a bit off topic from the Oneplus headline but as far as bootloaders go we can't have nice things because the vendors and app developers want control over end users. The android security model is explicit that the user, vendor, and app developer are each party to the process and can veto anything. That's fundamentally incompatible with my worldview and I explicitly think it should be legislated out of existence.
The user is the only legitimate party to what happens on a privately owned device. App developers are to be viewed as potential adversaries that might attempt to take advantage of you. To the extent that you are forced to trust the vendor they have the equivalent of a fiduciary duty to you - they are ethically bound to see your best interests carried out to the best of their ability.
> That's fundamentally incompatible with my worldview and I explicitly think it should be legislated out of existence.
The model that makes sense to me personally is that private companies should be legislated to be absolutely clear about what they are selling you. If a company wants to make a locked down device, that should be their right. If you don't want to buy it, that's your absolute right too.
As a consumer, you should be given the information you need to make the choices that are aligned with your values.
If a company says "I'm selling you a device you can root", and people buy the device because it has that advertised, they should be on the hook to uphold that promise. The nasty thing in this thread is the potential rug pull by OnePlus, especially as they have kind of marketed themselves as the alternative to companies that lock their devices down.
I don't entirely agree, but neither would I be dead set against such an arrangement. Consider that (for example) while private banks are free not to do business with you, at least in civilized countries there is a government-associated bank that will always do business with anyone. Mobile devices occupy a similar space; there would always need to be a vendor offering user-controllable devices. And we would also need legal protections against app authors, given that (for example) banking apps are currently picking and choosing which device configurations they will run on.
I think it would be far simpler and more effective to outlaw vendor controlled devices. Note that wouldn't prevent the existence of some sort of opt-in key escrow service where users voluntarily turn over control of the root of trust to a third party (possibly the vendor themselves).
You can already basically do this on Google Pixel devices today. Flash a custom ROM, relock the bootloader, and disable bootloader unlocking in settings. Control of the device is then held by whoever controls the keys at the root of the flashed ROM with the caveat that if you can log in to the phone you can re-enable bootloader unlocking.
How is that supposed to fix anything if I don't trust the hypervisor?
It's funny, GP framed it as "work" vs "play" but for me it's "untrusted software that spies on me that I'm forced to use" vs "software stack that I mostly trust (except the firmware) but BigCorp doesn't approve of".
Well I don't entirely, but in that case there's even less of a choice and also (it seems to me) less risk. The OEM software stack on the phone is expected to phone home. On the other hand there is a strong expectation that a CPU or southbridge or whatever other chip will not do that on its own. Not only would it be much more technically complex to pull off, it should also be easy to confirm once suspected by going around and auditing other identical hardware.
As you progress down the stack from userspace to OS to firmware to hardware there is progressively less opportunity to interact directly with the network in a non-surreptitious manner, more expectation of isolation, and it becomes increasingly difficult to hide something after the fact. On the extreme end a hardware backdoor is permanently built into the chip as a sort of physical artifact. It's literally impossible to cover it up after the fact. That's incredibly high risk for the manufacturer.
The above is why the Intel ME and AMD PSP solutions are so nefarious. They normalize the expectation that the hardware vendor maintains unauditable, network capable, remotely patchable black box software that sits at the bottom of the stack at the root of trust. It's literally something out of a dystopian sci-fi flick.