
> e.g. your utility will need to be recompiled and updated if a security vulnerability is discovered in one of those libraries. You also miss out on free bugfixes without recompiling.

This was the biggest pain point in deploying *application software* on Linux, though. Distributions with different release cycles provide different versions of various libraries and expect your program to work with all of those combinations. The big famous libraries like Qt and GTK might follow proper versioning, but for the smaller libraries from distro packages there's no guarantee. Half of them don't even use semantic versioning.

Imagine distros swapping out the libraries you've actually tested your code with for their own, for "security fixes" or whatever the reason. That causes more problems than it fixes.

Our custom startup script's job was to find the same XML library I'd tested with inside the tarball I packaged the application in. Users could then extract that tarball wherever they needed, including /opt, and run the script to start my application, and it ran as it should. IIRC we even used rpath for this.



> Half of them don't even use semantic versioning.

This is a red herring. Distros existed before semantic versioning was defined and had to deal with those issues for ages. When packaging, you check for the behaviour changes in the package and its dependencies. The version numbers are a tiny indicator, but mostly meaningless.


I think semantic versioning actually predates distributions. It just was not called "semantic versioning." It was called Unix shared library versioning.
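For context, the Unix shared-library scheme encodes compatibility in the soname; a minimal sketch, with the library name and code made up:

```shell
# Sketch of soname-based shared library versioning (names illustrative).
cat > foo.c <<'EOF'
int foo(void) { return 1; }
EOF
gcc -c -fPIC foo.c -o foo.o
# The soname (libfoo.so.1) encodes the ABI major version and changes only
# on incompatible breaks; the full filename carries minor/patch revisions.
gcc -shared -Wl,-soname,libfoo.so.1 -o libfoo.so.1.2.3 foo.o
ln -sf libfoo.so.1.2.3 libfoo.so.1   # what the runtime linker resolves
ln -sf libfoo.so.1 libfoo.so         # what `-lfoo` finds at link time
readelf -d libfoo.so.1.2.3 | grep SONAME
```

Same idea as semver's major/minor/patch split: bump the soname on breaking changes, ship compatible fixes under the same soname.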


Imagine a world where every library and every package has their release date as their version. You'd instantly know which software lacks maintenance or updates (bitrot).

To me it seems more attractive than how Nix does it, but I guess they considered this, saw conflicts, and therefore went with hashes.


Would you also instantly know if 20250110 is a drop-in replacement for 20240930 or will it require changes in your code to make it work?


How do you know if 12345 is a patch for 432 released years ago or a major upgrade from 12344 released a moment ago? Pure time versioning doesn't work with multiple release streams.


Thanks for proving me wrong!


IIRC GNU Parallel versions by date.

Recently, in the Python ecosystem, the `uv` package manager lets you install a package as it was on a certain date. Additionally, you can "freeze" your dependencies to a certain date in pyproject.toml, so when someone clones the project and installs dependencies, they get them as of the date you chose to freeze at.
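A sketch of that pyproject.toml freeze, assuming uv's `exclude-newer` setting (project name and date are just examples):

```toml
[project]
name = "myapp"
version = "0.1.0"
dependencies = ["requests"]

[tool.uv]
# Resolve only package versions published on or before this date,
# so anyone who clones and installs gets the same snapshot.
exclude-newer = "2024-09-30T00:00:00Z"
```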

Personally I love this method much more than versioning.

I think versioning is mostly useful just for talking about software, and maybe for major additions/changes, e.g. io_uring shipping with Linux mainline.


On the opposite side of the world in Gentoo, we compile updates to libraries and applications together on a rolling basis all the time, and it generally works out while letting us have everything as bleeding edge as we want it.


There's software that is done and doesn't need constant updates.


Yes, but if statically linked, this excludes all software that relies on security-relevant libraries (e.g. cryptography) or receives data from the network. I struggle to think of a lot of software that would qualify beyond coreutils and friends.


The interesting case here is dynamically linked software that doesn't need updates.


>When packaging, you check for the behaviour changes in the package and its dependencies

Yeah, but the package maintainer for a widely used library doesn't actually have the resources to do this (heck, the maintainer of a non-trivial application likely doesn't have the resources to do this either). Basically they update and hope to get some bug reports from users.


>Imagine distros swapping out the libraries you've actually tested out your code with with their libraries for "security fixes" or whatever the reason. That causes more problems than it fixes.

I don't believe that it causes more problems than it fixes. It's just that you didn't notice the problems being silently fixed!

There are issues related to different distros packaging different versions of libraries. But that's just an issue with trying to support different distros and/or their updates. There are tradeoffs with everything. Dynamic linking is more appropriate for things that are part of a distro, because it creates less turnover of packages when things update.


I often refer to semantic versioning as "semanticless versioning": everyone disagrees about what kind of change warrants incrementing each version number.


The fun part is that this is actually true: for different use cases of the same library, the same change might mean something different.

So it's complicated, there is no solution that fits every context, and we settle for the best approximation.


This is never a problem unless a developer insists on always using the most cutting-edge version of a library. There's no law that says you have to use the bleeding edge of every library when you write a program. Another issue these days is that library maintainers often add new features or delete old ones without incrementing the major version number. In the olden days it was assumed that minor versions were for bug fixes that don't break compatibility, and when you wanted to change how the library works in a major way, you incremented the major number.

Now a lot of stuff is continuously buggified, so there is no concept of stable vs. in-progress.



