Keep in mind that there's a bug if you put `uv run` in your script's shebang line: uv will recursively call itself if the script's name doesn't end in .py.
Other than that, though, automatically managing a single script's dependencies and running it in a venv with just ./script.py is magical.
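If I remember the docs right, the workaround for the shebang issue is env's `-S` flag plus uv's `--script` flag, which tells uv to treat the argument as a script even without a .py extension. A minimal sketch (the requests dependency and URL are just placeholders):

```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///

# uv reads the PEP 723 block above, provisions a cached throwaway venv
# with the declared dependencies, and runs the script inside it.
import requests

print(requests.get("https://example.com").status_code)
```

Note that `-S` (split the shebang string into separate arguments) needs GNU coreutils 8.30+ or a BSD-style env.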
Please god, let uv be the final iteration of Python packaging best practices. I can't handle another easy_install -> requirements.txt/pip -> pyproject.toml migration. It will break me.
That approach seems far more awkward: it doesn't let the script itself declare its own dependency specs, and it doesn't manage temporary environments. So uv run looks superior for this use case; pipx is great for CLI tool management.
Neat. For tool packages, I use pipx. There's little need for global system or user packages anymore. Many other major platforms lack this kind of essential package-isolation tooling, or make it awkward or fragmented when they do have it.
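For anyone unfamiliar, the isolation both tools provide is one venv per CLI tool, with only the entry points exposed on PATH (httpie here is just an example package):

```sh
# Each tool lives in its own venv; only its console scripts land on PATH.
pipx install httpie

# uv's equivalent:
uv tool install httpie
```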
Am I right in understanding that these tools all expect the packages to be available from PyPI?
I have one third-party Python component which isn't on PyPI; it's either downloaded from the vendor via a password-protected web page or installed via conda. I believe that scenario still calls for a user package.
Another package is installable with pip via the vendor's Simple Repository API server, but PEP 708 ("Extending the Repository API to Mitigate Dependency Confusion Attacks") makes me think that the current requirements system might not handle many dependencies spread across multiple index servers without risking dependency confusion.
That's not quite right! You can use `[tool.uv.sources]` in scripts to declare dependencies on local packages, e.g., by path[1]. Additionally, you can install tools from local packages with `uv tool install <path>`.
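A sketch of what that looks like in a script's metadata block; the package name and path are hypothetical, and the `[tool.uv.sources]` table syntax here is from memory:

```python
#!/usr/bin/env -S uv run --script
# /// script
# dependencies = ["vendor-sdk"]
#
# [tool.uv.sources]
# vendor-sdk = { path = "/opt/vendor/vendor-sdk" }
# ///

# "vendor-sdk" resolves to the local path above instead of to PyPI.
import vendor_sdk
```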
Dependency confusion attacks are concerning. That's why uv will not check for versions of a package across multiple indexes unless you opt in[2]. People complain about this all the time, but it's a safer default.
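The opt-in is the `index-strategy` setting; sketching the project-level form from memory:

```toml
# pyproject.toml
[tool.uv]
# The default, "first-index", never resolves a package from a second
# index once one index has claimed its name - that's the confusion guard.
index-strategy = "unsafe-best-match"
```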
How do I specify that a package is installed from conda?
It says "A dependency source can be a Git repository, a URL, a local path, or an alternative registry."
Conda is a registry, but I couldn't find information about how that works. I'm assuming it specifically means a PyPI-like registry?
(Even worse would be if I had a package that uses ctypes to load a shared library installed by a C package from a distro package index.)
Reading your [2], I see uv will have problems with a package of mine. I have an old version on PyPI, but years ago I switched to hosting newer releases on my own server. (I believe PyPI's intermediation makes my sales pipeline worse.)
This works with pip, because it sees my version is newer than the one on PyPI, but for uv it will require 'unsafe-best-match', or 'unsafe-first-match' with my server before PyPI.
And if that value is set through the `UV_INDEX_STRATEGY` environment variable instead, people will need to unset it for other uses.
Meh. I've documented that my server is not meant for high availability, and anyone who wants that should purchase a source license and host it themselves locally. That's what I would have to do if I were to switch away from my current persistent venv.
BTW, pip's constant check for updates means that this year 88 people have trusted me not to update their pip installs. For that matter, I see some Dependabot checks for certifi, aiohttp, and more - what an excellent way to raise false alarms.