My favorite quote:
[[[
Greg Sullivan, lead product manager for Microsoft's Windows client division, says that the computing community as a whole has benefited from the Web's standardization around Internet Explorer. Competing platforms would have meant that developers would have had to duplicate their efforts more often, he said.
"There is benefit to everybody who's involved," Sullivan said. "In general, a standard is very useful, whether it's de facto or du jour. It enables a level of consistency and a level of investment and minimizes some of the redundancy that can occur."
]]]
The actual speed "limit" on those streets is definitely no higher than 30 km/h, regardless of what the signs say. If I drove 60 there, I hope I'd lose my license for reckless driving.
In fact, a .zip file contains each file's header information twice: before every compressed file (the local file header) and at the end of the archive (the central directory). The format allows omitting the sizes from the local file header, but most compression utilities don't use this option. So most .zip files can be extracted sequentially, without seeking to the end.
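A minimal sketch of that point, using Python's standard `zipfile` and `struct` modules: we build a tiny archive in memory and parse its first local file header front-to-back, never touching the central directory. The field layout follows the zip local-file-header format; the flag check is bit 3, which signals that sizes were deferred to a trailing data descriptor.

```python
import io
import struct
import zipfile

# Build a small .zip in memory (default zipfile settings, which write
# the sizes directly into the local file header).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("hello.txt", "hello, world\n")
data = buf.getvalue()

# Parse the first local file header sequentially -- no seeking to the end.
# Layout: signature, version needed, flags, method, mod time, mod date,
# crc32, compressed size, uncompressed size, name length, extra length.
(sig, ver, flags, method, mtime, mdate, crc,
 csize, usize, nlen, xlen) = struct.unpack_from("<IHHHHHIIIHH", data, 0)

assert sig == 0x04034B50            # local file header signature "PK\x03\x04"
streamable = not (flags & 0x08)     # bit 3 set => sizes only in data descriptor
name = data[30:30 + nlen].decode()
print(name, streamable, usize)
```

If a writer had set bit 3 (common when streaming output of unknown length), `streamable` would be false and an extractor would indeed need the central directory to know where each entry ends.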
Sure, package maintainers may backport fixes to their old versions. But to do that reliably they need to understand the entire upstream source and follow every commit. Otherwise they can miss important fixes: following only the security fixes for supported branches is not enough. Sometimes upstream developers fix a bug or security problem in the code but don't flag it as a CVE, because current code usage doesn't trigger it. But the code in an old branch might.
That's the current reality, and it has been for years. CentOS and other RH-based distributions in particular would much rather patch than upgrade. That's how we got the kernel 2.6.32-573 situation, where the distro carried a huge number of patches (over a hundred?).
>Sure, package maintainers may backport fixes to their old versions. But they need to fully understand all upstream source code and follow all commits.
Only if they need to do a perfect job. But as history tells us, they are just as content to do a ho-hum one.
In general this approach should work fine, but the devil is in the details:
1. You have to flush (fsync) the temporary file before the move, otherwise you may end up with an empty file: the OS may reorder the rename against the data writes.
2. After the move, on POSIX you have to flush the parent directory of the destination file. On Windows, MoveFileEx() has a flag (MOVEFILE_WRITE_THROUGH) to ensure the operation has reached disk, or you can call FlushFileBuffers() on the destination file.
The linked paper mentions that many popular programs forget about (1).
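The two steps above can be sketched on POSIX like this; `atomic_write` is a hypothetical helper name of mine, not something from the thread, and the directory-fsync part is POSIX-only (it has no direct Windows equivalent).

```python
import os
import tempfile

def atomic_write(path, data):
    """Atomically replace `path` with `data` (POSIX sketch)."""
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=dirname)
    try:
        os.write(fd, data)
        os.fsync(fd)              # (1) flush the temp file before the move
    finally:
        os.close(fd)
    os.rename(tmp, path)          # atomic within one filesystem on POSIX
    dfd = os.open(dirname, os.O_RDONLY | os.O_DIRECTORY)
    try:
        os.fsync(dfd)             # (2) flush the parent directory entry
    finally:
        os.close(dfd)
```

The temp file is created in the destination's own directory on purpose: rename() is only atomic within a single filesystem, so a temp file in /tmp could land on a different mount and silently turn the move into a non-atomic copy.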
"There is benefit to everybody who's involved," Sullivan said. "In general, a standard is very useful, whether it's de facto or du jour. It enables a level of consistency and a level of investment and minimizes some of the redundancy that can occur." ]]]