
Perhaps not everything should be packaged the same way, but one key advantage of the "one package system to rule them all" approach is that it really simplifies THINKING about your package installation needs. For systems composed of many pieces, being able to "apt-get install <everything>" beats having to use npm for one item, pip for another, and gem for a third.

Given that there is generally a common set of requirements for package management, one possible solution could be virtual repositories. I almost exclusively use Ubuntu/Debian: you could have a repo that LOOKED like a standard apt endpoint but translated all the apt requests into requests against npm/pip/gem/<your language's package manager> without having to manually bake the .deb files. Updates to the backing repo would be picked up seamlessly, so an 'apt-get -y upgrade' would update everything. Integration with virtualenv-style local packages would involve some magic, but having only one set of "package management primitives" would be pretty awesome.
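
A rough sketch of the metadata half of that idea, assuming a hypothetical proxy that renders PyPI's JSON API as apt "Packages" stanzas (the function name and package names are illustrative; a real repo would also need Release files, signatures, and .deb payloads generated on the fly):

    # Translate PyPI project metadata into the stanza format apt expects
    # in a Packages index. Only the metadata side is shown here.
    import json
    import urllib.request

    PYPI_JSON = "https://pypi.org/pypi/{name}/json"  # PyPI's JSON metadata API

    def pypi_to_apt_stanza(name: str) -> str:
        """Fetch one PyPI project and render the fields apt needs
        to resolve and display the package."""
        with urllib.request.urlopen(PYPI_JSON.format(name=name)) as resp:
            info = json.load(resp)["info"]
        return "\n".join([
            f"Package: python3-{info['name'].lower()}",
            f"Version: {info['version']}",
            "Architecture: all",
            f"Maintainer: {info.get('author') or 'unknown'}",
            f"Description: {info.get('summary') or '(no summary)'}",
            "",  # blank line terminates the stanza
        ])

    if __name__ == "__main__":
        # What an apt client hitting the virtual repo might receive:
        for project in ("requests", "flask"):
            print(pypi_to_apt_stanza(project))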



I fear the problem is that distributions can't agree on how to do it. If there were a common "virtual repository" standard, the npm/rubygems/etc. maintainers probably wouldn't mind implementing it - but in practice they'd have to expose an apt endpoint, and also a yum one, and a portage one, and...

Debian seems uninterested in working with standards it doesn't like (see their "support" for installing LSB RPMs), and other distributions won't want to introduce the complexity of APT, so I unfortunately see little way forward for this.



