> Perhaps we need a common community effort to create a “distro” of curated and safe dependencies one can install safely, by analyzing the most popular packages and checking what’s common and small enough to be worth being included/forked.
Debian is a common community effort to create a “distro” of curated and safe dependencies one can install safely.
If you want stable, tested versions of software, only getting new versions every few years:
https://packages.debian.org/stable/javascript/
If you want the newer versions of software, less tested, getting new versions continuously:
https://packages.debian.org/unstable/javascript/
> sort of a "delayed" mode to updating my own dependencies. The idea is that when I want to update my dependencies, instead of updating to the absolute latest version available of everything, it updates to versions that were released no more than some configurable amount of time ago.
For Python's uv, you can do something like:
```bash
uv lock --exclude-newer $(date --iso -d "2 days ago")
```
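If you want that cutoff applied every time the project resolves, rather than retyping the flag, uv can also read it from project configuration. A minimal sketch, assuming the `exclude-newer` setting under `[tool.uv]` (uv accepts an RFC 3339 timestamp or a plain date; the value here is just an example):

```toml
[tool.uv]
# Ignore any package versions published after this cutoff when resolving.
exclude-newer = "2025-01-01T00:00:00Z"
```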
It would ask a root nameserver for the IP address of the .com nameserver, then ask the .com nameserver for the IP address of the example.com nameserver, then ask the example.com nameserver for further records (and may continue to recurse).
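For the curious, here is a rough sketch of that iterative walk in Python using the dnspython library (assumed installed via `pip install dnspython`); CNAME chasing, IPv6, glue-less referrals and error handling are all glossed over:

```python
# Rough sketch of an iterative DNS lookup, starting from a root server.
import dns.message
import dns.query
import dns.rdatatype

def iterative_resolve(name: str, server: str = "198.41.0.4") -> str:
    # 198.41.0.4 is a.root-servers.net
    while True:
        query = dns.message.make_query(name, dns.rdatatype.A)
        response = dns.query.udp(query, server, timeout=5)

        # Authoritative answer: return the first A record we find.
        for rrset in response.answer:
            if rrset.rdtype == dns.rdatatype.A:
                return rrset[0].address

        # Otherwise this is a referral: the additional section usually
        # carries glue A records for the next nameserver to ask.
        glue = [rr for rrset in response.additional
                for rr in rrset if rrset.rdtype == dns.rdatatype.A]
        if not glue:
            raise RuntimeError("referral without glue; needs a separate lookup")
        server = glue[0].address

if __name__ == "__main__":
    print(iterative_resolve("example.com"))
```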
2:40 "I do like how the pelican's feet are on the pedals." "That's a rare detail that most of the other models I've tried this on have missed."
4:12 "The bicycle was flawless."
5:30 Re generating documentation: "It nailed it. It gave me the exact information I needed. It gave me full architectural overview. It was clearly very good at consuming a quarter million tokens of rust." "My trust issues are beginning to fall away"
I feel like we need to move on from using the same test on models. As time goes on, information about these specific tests ends up in the training data, and while I'm not saying that's happened in this case, there's nothing stopping model developers from adding extra data for these tests directly to the training data to make their models seem better than they are.
Honestly, I have mixed feelings about him appearing there. His blog posts are a nice way to be updated about what's going on, and he deserves the recognition, but he's now part of their marketing content. I hope that doesn't make him afraid of speaking his mind when talking about OpenAI's models. I still trust his opinions, though.
Their spec sheet shows 208 V and 3-phase as options. 3-phase means smaller wiring, and with 15 kW per rack I could see how that would quickly become a problem.
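To put rough numbers on that (a back-of-the-envelope sketch, assuming 208 V and unity power factor):

```python
import math

power_w = 15_000   # 15 kW per rack
volts = 208

single_phase_amps = power_w / volts                    # ~72 A
three_phase_amps = power_w / (math.sqrt(3) * volts)    # ~42 A per phase

print(f"single-phase: {single_phase_amps:.0f} A")
print(f"three-phase:  {three_phase_amps:.0f} A per phase")
```

Roughly 72 A on a single 208 V feed versus about 42 A per phase on three-phase, which is why the three-phase option can get away with smaller conductors.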
Worth scrolling through util.py to see lots of hand-implemented code: MultipartParser, read_header(), read_socket_chunked(), html_escape(), atomic_move(), killtree ("still racy but i tried"), termsize(), etc
src layout _should_ work. The pyproject.toml needs to live next to src, not inside of it. You might need to `uv run python` in order to pick up the right PYTHONPATH?
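For reference, a minimal sketch of the layout being described (the project and package names are made up):

```
myproject/
├── pyproject.toml      # lives next to src/, not inside it
└── src/
    └── mypkg/
        ├── __init__.py
        └── main.py
```

With that in place, `uv run python -c "import mypkg"` should resolve the package without setting PYTHONPATH by hand.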
You don't know what you're missing if you haven't spent 5 minutes trying out uv.
If you're intentionally not trying it simply because you don't want to get addicted like everyone else clearly is, I could see that as a valid reason to never try it in the first place.
I usually avoid jumping on bandwagons, so I've always stuck with vanilla pip/venv, but at this point it's clear to me that uv really is the "One True (tm) Python package management solution", and probably will be for the next 10 years.