Software verification has gotten some use for smart contracts. The code is fairly simple, yet it's certain to be attacked by sophisticated hackers who know the source, and the consequence of failure is theft of funds, possibly in large amounts. 100% test coverage is no guarantee that an attack can't be found; the sketch below shows why.
People spend gobs of money on human security auditors who don't necessarily catch everything either, so verification easily fits in the budget. And once deployed, the code can't be changed.
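A toy illustration of that coverage point (plain Python, not real contract code; the Wallet class and the bug are invented for the example): a test suite can cover every line of a function and still never try the one input an attacker will, whereas a verifier asked to prove "the balance never increases during a withdrawal" would reject this code immediately.

    class Wallet:
        """Stand-in for a contract holding funds (hypothetical example)."""
        def __init__(self, balance):
            self.balance = balance

        def withdraw(self, amount):
            # Bug: nothing rejects a negative amount.
            if amount <= self.balance:
                self.balance -= amount
                return True
            return False

    def test_suite():
        w = Wallet(100)
        assert w.withdraw(40) is True     # exercises the success branch
        assert w.withdraw(1000) is False  # exercises the failure branch
        # Every line of withdraw() is now covered and all tests pass.

    test_suite()

    # The input the tests never try: a negative "withdrawal" credits the account.
    attacker = Wallet(0)
    attacker.withdraw(-1_000_000)
    print(attacker.balance)  # 1000000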
Verification has also been used in embedded safety-critical code.
If the requirements you have to satisfy arise from a fixed, deterministic contract (as opposed to a human being), I can see how verification is feasible in that case.
I think the root problem may be that most software has to adapt to a constantly changing reality. There aren't many businesses which can stay afloat without ever changing anything.
I agree, but for some reason there are people who enjoy doing that. I think they should be allowed to do as they like.
In any case, Amazon claims this feature is spoiler-free, and that would be easy to implement. It likely works by feeding the book into an LLM context, and they could simply feed in only the portion you've already read.
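A rough sketch of what that could look like, purely as speculation (the function name and usage are invented; this is not a description of Amazon's actual implementation). The spoiler-safety comes entirely from truncating the context at the reader's current position:

    def build_recap_prompt(book_text: str, last_read_offset: int) -> str:
        """Assemble an LLM prompt from only the portion of the book read so far."""
        read_so_far = book_text[:last_read_offset]  # nothing past the reader's position
        return (
            "Summarize the story so far for a reader resuming this book. "
            "Do not reveal or speculate about anything beyond this excerpt.\n\n"
            + read_so_far
        )

    # Usage: hand the prompt to whatever LLM endpoint is available, e.g.
    #   recap = llm_client.complete(build_recap_prompt(book, reader_offset))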
So your claim is that this massive data collection, done at massive public expense, is not used at all? That seems unlikely. And given how good computers are at natural language processing these days, the data is more usable than ever.
Of course it is used. But unless you're a target of interest to intelligence analysts, the metadata generated by your online activities will be of no interest whatsoever. It won't even be looked at.
The whole point of mass data collection is that you can check everyone to see if they should be targets of interest. And as societies get more totalitarian, what qualifies you to be a target becomes less and less dramatic.
Doing this at scale is easy these days. You keep using phrases like "looked at" as if humans had to manually read through the records.
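Purely as an illustration of that (the record fields, watchlist, and thresholds here are invented; this doesn't describe any real system), mechanically screening metadata is a few lines of code, and nobody "reads" anything:

    records = [
        {"user": "A", "contacts": {"X", "Y"}, "late_night_msgs": 2},
        {"user": "B", "contacts": {"Z"},      "late_night_msgs": 40},
    ]

    WATCHLIST = {"Z"}  # hypothetical set of already-flagged identifiers

    def flag(record):
        """Score one metadata record against simple, automatable criteria."""
        return bool(record["contacts"] & WATCHLIST) or record["late_night_msgs"] > 30

    flagged = [r["user"] for r in records if flag(r)]
    print(flagged)  # ['B'] -- selected for follow-up without a human reading anything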
Just because it's in the training data doesn't mean the model can remember it. The parameters total 60 gigabytes; there's only so much trivia that can fit in there, so it has to do lossy compression.
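Rough arithmetic on that point (the corpus size below is an assumed round number for illustration, not a figure for any particular model):

    param_bytes  = 60e9    # 60 GB of weights, as stated above
    bytes_per_w  = 2       # roughly 2 bytes per parameter at 16-bit precision
    params       = param_bytes / bytes_per_w   # about 3e10 parameters

    corpus_bytes = 10e12   # assume ~10 TB of training text (illustrative)

    print(f"parameters: {params:.1e}")
    print(f"corpus / weights: {corpus_bytes / param_bytes:.0f}x")
    # Around two orders of magnitude more text than weight storage, so the
    # weights can't hold the corpus verbatim; rare trivia gets averaged away.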