That single command creates a throwaway virtual environment, installs an alpha release of my LLM tool, adds an alpha release of my llm-claude-3 plugin, runs that command with options, returns the results and then throws everything away again.
And because of the way uv uses symlinks and caching, running the commands a second time is almost instant.
You don't need to think about the virtual environment that might have been created to run the tool; that thing is entirely ephemeral. The caching is shared with all other virtual environments on your machine, so if some other project uses the same version of a dependency, that cached dependency will be shared.
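If you're curious where that shared cache lives, uv exposes it directly (subcommand names as of current uv releases; the exact location varies by OS):

```shell
uv cache dir    # print the cache directory shared across environments
uv cache clean  # reclaim the space if you ever need to
```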
Contrast that with other Python virtual environment tools where you can end up with the same exact dependency copied dozens of times across your system.
"pipx run" for example can result in identical copies of dependencies that are used by multiple tools.
Ah, so by "everything" you just meant some configuration explaining what's in that environment. (In a broad sense, including e.g. the use of hard links to the cache.)
Yes, it's quite a clever system. I have plans to follow a similar strategy, implemented in Python (self-installing into its own isolated environment, similar to the one that pipx makes for its pip copy).
I have no relation with the Astral team but have been an early and aggressive adopter of everything they've released and it's all been easy, fast, and ergonomic. Over the past eighteen months I've:
1. Replaced `flake8` with `ruff`;
2. Replaced `black` with `ruff`;
3. Replaced `pip` with `uv`;
4. Replaced `poetry` with `uv`.
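At the command level, those swaps look roughly like this (assuming default configurations; the flags shown are the common ones, not exhaustive):

```shell
ruff check .                          # was: flake8 .
ruff format .                         # was: black .
uv pip install -r requirements.txt    # was: pip install -r requirements.txt
uv add httpx                          # was: poetry add httpx
```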
Their next project appears to be exactly what I _wish_ their next project to be — a replacement for pyright et al (https://github.com/astral-sh/ruff/discussions/10581). Type checking is my least favorite part of my Python toolchain at the moment; I can't wait to take it for a spin.
In addition to what everybody else has mentioned, uv plays nice with other people's build tools. I use scikit-build-core for building most of my Python packages, and making scikit-build-core play nicely with poetry was somewhere between very hard and impossible. uv trivially lets me combine the two: I can let uv do what it does best (manage and install Python versions and dependencies) and then have it get out of the way and let scikit-build-core take over and do what it does best.
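A minimal sketch of that combination, with an illustrative project name and version pins (these are assumptions, not from the original comment):

```shell
# Write a pyproject.toml where uv manages dependencies but
# scikit-build-core is the build backend driving compilation
cat > pyproject.toml <<'EOF'
[build-system]
requires = ["scikit-build-core>=0.10"]
build-backend = "scikit_build_core.build"

[project]
name = "example-ext"
version = "0.1.0"
EOF
# `uv sync` / `uv build` then resolve and install dependencies
# while deferring the actual compilation to scikit-build-core
```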
uv manages the Python version as well, and it doesn't require jumping through hoops and hacking to get it to play nicely with Docker. From my experience moving to uv over the last few weeks, it is much more of a production-ready setup than pip or poetry.
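A sketch of what that looks like in a Dockerfile, assuming the official uv image and a checked-in uv.lock (image tag and flags per uv's Docker guide; verify current names before copying):

```dockerfile
FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim
WORKDIR /app
# Install dependencies from the lockfile before copying the source,
# so this layer stays cached across code-only changes
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-dev
COPY . .
CMD ["uv", "run", "python", "-m", "app"]
```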
I really haven't been able to understand why "managing the Python version" is such a big value add for so many people. I just have each version built from source and make venvs from them as needed.
Because most people don't build their toolchain from source anymore. When I am trying to help a junior programmer get up and going, it's a lot easier to say "OK, just do `brew install uv` then `uv sync` in the project folder and you are all set" than to try to walk them through building from source.
Not to mention being able to have reproducible builds for deployments, CI/CD, etc. The fact that the Python version is picked up from the project's package configuration is nice because then you don't have to remember to update the source build to match when you update pyproject.toml for a new Python release.
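In uv's model that pinning is explicit and lives in the repo (subcommands as in uv's CLI; 3.12 is just an example):

```shell
uv python pin 3.12   # writes .python-version into the project
uv sync              # downloads CPython 3.12 if missing, then builds the venv with it
```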
I happen to have built the interpreters (not an entire toolchain) from source. But I could equally well have obtained them from a PPA or whatever else. On Windows I could have used the official installers. The point is that once I have base environments, creating venvs is trivial.
(And really, the experience of building Python from source on Mint was about as smooth as I could have ever hoped for.)
> when you update pyproject.toml for a new Python release.
? I don't upper-cap the Python version (or anything else without a well-defined reason - https://iscinumpy.dev/post/bound-version-constraints/ ) so I wouldn't be updating anyway. If I want to test under a new Python version then I need to write that out explicitly somewhere regardless; might as well do it on the command line.
they are working on it! the internal name is currently red-knot (you can see it in their GH history and on their Discord). First thing is a type checker, I think, then the renaming capabilities and such.
Astral's tools (Ruff, uv) are awesome, I'm a big fan. I'm using them every time I can and recommending them everywhere. I use them for my open source projects (FastAPI, Typer, SQLModel, etc.) and also for private things, small and big, even small scripts, everything. Also, they have great docs, and I'm kinda picky with docs. I'm looking forward to whatever they build next, I'm pretty sure I'll like it. By now, it's not just by chance that they built one great thing, it's a trend, they build great stuff.
Ruff is great because you need to lint your code all the time, and it can save you maybe a minute per CI run.
As for Python package management, my team is migrating to Bazel which has its own way of locking in the Python dependencies and then pulling them from a remote cache. Under Bazel, we are only re-examining the dependencies when someone proposes a change to produce the lock. It's so rare, that having a new+faster thing that does this part would not present a meaningful benefit.
> As for Python package management, my team is migrating to Bazel which has its own way of locking in the Python dependencies and then pulling them from a remote cache. Under Bazel, we are only re-examining the dependencies when someone proposes a change to produce the lock. It's so rare, that having a new+faster thing that does this part would not present a meaningful benefit.
Have you considered Pants[0], Buck[1] or Waf[2]? What ultimately made you decide to go for Bazel?
ruff has been a fantastic discovery for me, along with pre-commit, commitizen, and flit.
In about the length of a weekend I was able to find a reasonable group of settings for my pyproject.toml and roll that out to many of my projects.
The tooling is maturing and converging on more common ground (eg pyproject.toml) while adopting existing, accepted practices and standards.
The documentation is awesome and integrates well within my IDE of choice (pycharm)
Myself, I run ruff with most of the linters turned on. It is quite strict and opinionated (to the point of pedantic) -- a good learning path for me because I would never read PEPs anyway.
But once it is running I just stay within bounds and take the enforced rigor around doc-strings, syntactic sugar, and other little things.
I'm not on the uv train yet because I have no issues with venv and pip right now.
For now their decisions align with the community's needs and everyone is happy.
But what will happen when their investors' needs diverge from the community's, or worse, they fail to meet their investors' financial expectations and shut down?
I was actually thinking of creating something like python-pipx, but for uv, on Arch Linux (because Arch requires some workarounds with pip; you can't just do `pip install something`, you have to go through pacman).
I really hope a developer who reads this and is interested takes the project on, because I don't feel like doing it myself, but I honestly feel its absence: I wanted to try nomadnet once on an Arch machine, tried pipx, and it was so slow.
> Tool management: `uv tool install` and `uv tool run` (aliased to `uvx`). uv can now install command-line tools in isolated virtual environments and execute one-off commands without explicit installation (e.g., `uvx ruff check`), making it a high-performance, unified alternative to tools like pipx.
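For example, the two modes side by side (using ruff as the tool; commands as named in the release notes above):

```shell
uvx ruff check .        # one-off run in a cached, ephemeral environment
uv tool install ruff    # persistent install, exposed on your PATH
uv tool list            # show which tools are installed
```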
Coming from the Poetry world, I'm loving ruff and uv. It's awesome having one single binary to handle linting for both convention and format, and uv itself is amazingly fast.
I do miss being able to shell into an env, but it hasn't been a big deal because you can still run `uv run -- code .` to launch VSCode with the right Python interpreter.
It's surprising to me that the future toolchain for Python productivity is Rust, but the results are awesome.
Just FYI: since uv generates a virtual environment in .venv, you can just point VSCode at the .venv folder and things should work.
(I highly recommend using direnv and sourcing the virtual environment in a .envrc file, though… when your editor has support for that, it works so well compared to bespoke shell tools.)
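A minimal .envrc for that setup can be just one line (assuming the venv lives in .venv, which is where uv creates it by default):

```shell
# .envrc — direnv re-exports the resulting env changes whenever you cd in
source .venv/bin/activate
```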
Seems like the only ecosystem that moves as fast and is as fragmented as JavaScript is Python, so I'll wait before jumping on the hype train. Looks interesting though.
Really enjoy using pdm. Might make the move when there’s more maturity on uv, but for now it does the job and I just switched from pipenv to pdm. Too many changes too often.
What I would be excited for is an actual LSP alternative to Pylance. Pylance is basically a proprietary monopoly. Kind of crazy for Python not to have its own high-performance LSP.
Problem: python is very hard to use consistently across various environments
Solution: make python execution as disposable as paper plates :)
uv works very well for me, but it does still sometimes run into weird issues when the underlying Python installation is bad (e.g. installing a library that depends on a dev build).
The speed of uv relative to other dependency management tools I've tried is incredible. My biggest hangup so far is that there isn't as great support for automated dependency updates as with more mature tools.
We're on poetry here too, but I always had problems with pyenv, so I switched to asdf, which has been rock solid. I tried uv a while back and it was stupidly fast; I just need to find the time to switch a ton of projects over.
I've migrated a couple moderately large projects from pyenv & poetry to uv, and it has been great. The switch was easier than I expected. I uninstalled pyenv after a few weeks, as I realized that uv makes it irrelevant. I've also replaced pipx (which manages Python tools/programs each with their own virtual environment) with uv's tool feature. I used uv for every new Python project. The only thing it's missing that's important for me is support by Dependabot on GitHub. Once Dependabot can update its lock files, I'll be able to uninstall poetry.
I had some trouble migrating from Poetry to uv because the dependency constraints are different. With Poetry, you can use caret ^ and tilde ~ version constraints. Apparently this is outside the PEP spec so uv doesn't implement it. Understandable, but it does make the migration a bit troublesome at times.
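For anyone doing the same migration, the Poetry-style constraints translate into standard PEP 440 ranges, which uv does accept (a rough mapping; note that caret treats zero-major versions differently):

```
^1.2.3  →  >=1.2.3,<2.0.0
~1.2.3  →  >=1.2.3,<1.3.0
^0.2.3  →  >=0.2.3,<0.3.0
```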
That’s a great combination that I’ve been using up until very recently. If you have it up and running, I don’t see huge benefits in switching to uv today.
In contrast, I had to use pipenv recently and I’d forgotten how terrible its UX was. I wanted to upgrade one of the deps in a smallish project and it took 43 seconds to resolve the change. If anyone reading this is using pipenv, switch to uv right now and thank me later.
We've migrated our pyenv poetry application - that's pretty complex with data pipeline flows and apis. The only issue we had was loading .env files - we had done some custom env var scripting as a workaround to an AWS issue and that was hard to migrate over. However, once that was done (and was due to bad implementation initially outside of poetry), moving from poetry to uv was rock solid. No issues and it just worked. I was surprised.
That used to be my stack, I’ve found that asdf and poetry fit me better (I dreaded having to set up pyenv and/or it working some of the time). The time saver has also been that I no longer have to keep any other language’s bespoke language version environment switcher in my head as much (goodbye nvm too)
Did you see the performance gains? Even in the marginal case (with caching), resolution drops from 4.6s to 0.15s compared to poetry, and it's even better when there are cache misses.
Respectfully, your dogmatism doesn't seem to serve you with numbers like that
The other day I got to share a demo by pasting this command in a Discord channel:
That single command creates a throwaway virtual environment, installs an alpha release of my LLM tool, adds an alpha release of my llm-claude-3 plugin, runs that command with options, returns the results and then throws everything away again. And because of the way uv uses symlinks and caching, running the commands a second time is almost instant.
(That stuff is no longer alpha, see https://simonwillison.net/2024/Oct/29/llm-multi-modal/)