(Comments)

Original link: https://news.ycombinator.com/item?id=43386357

Hacker News discussed the growing adoption of uv (a Python package manager) over Poetry among Wagtail users. Commenters praised uv's speed and efficiency, saying it makes the Python ecosystem finally feel mature. Users highlighted its ability to manage dependencies effectively, create lock files, handle virtual environments, and set up projects quickly (even managing different Python versions).

Some expressed concern that uv's governance is driven by a for-profit company, and about its potential impact on Python packaging standardization efforts. Others wondered how it handles C libraries, and whether it fully replaces tools such as conda, pyenv, pipx, and tox. While PDM's feature set drew praise, uv's performance appears superior.

Despite some reservations, many users reported a seamless transition to uv and appreciated its speed and reduced failure modes. The general consensus leans strongly toward uv being a major improvement for Python dependency management, and possibly the new standard.


Original text
uv downloads overtake Poetry for Wagtail users (wagtail.org)
132 points by ThibWeb 2 hours ago | 112 comments










I recently checked out UV, and it's impressively fast. However, one challenge that keeps coming up is handling anything related to CUDA and Torch.

Last week, I started developing directly in PyTorch containers using just pip and Docker. With GPU forwarding on Windows no longer being such a hassle, I'm really enjoying the setup. Still, I can’t shake the feeling that I might be overlooking something critical.

I’d love to hear what the HN crowd thinks about this type of env.



I recently switched to uv, and I cannot praise it enough. With uv, the Python ecosystem finally feels mature and polished rather than like a collection of brittle hacks.

Kudos to the uv developers for creating such an amazing piece of software!



Yeah, switched to writing python professionally ~4 years ago, and been low key hating the ecosystem. From a java and javascript background, it's mostly been npm/mvn install and it "just works". With python, there's always someone being onboarded that can't get it to work. So many small issues. Have to have the correct version per project, then have to get the venv running. And then installing it needs to build stuff because there's no wheel, so need to set up a complete c++ and rust toolchain etc., just to pull a small project and run it.

uv doesn't solve all this, but it's reduced the amount of ways things can go wrong by a lot. And it being fast means that the feedback-loop is much quicker.



Python has been mostly working okay for me since I switched to Poetry. (“Mostly” because I think I’ve run into some weird issue once but I’ve tried to recall what it was and I just can’t.)

uv felt a bit immature at the time, but sounds like it’s way better now. I really want to try it out... but Poetry just works, so I don’t really have an incentive to switch just yet. (Though I’ve switched from FlakeHeaven or something to Ruff and the difference was heaven and hell! Pun in’tended.)



I can't say I share the same experience. mvn is a buggy mess, randomly forgetting dependencies and constantly needing a full clean to not die on itself. npm and the entire js ecosystem feel so immature, with constant breaking changes and circular dependency hell when trying to upgrade stuff.


That's an issue with the packages themselves though, not with package management as a whole. You and the comment above you are talking about different things. While there's plenty of pain to be had with npm, if you have a project that used to work years ago, you can generally just clone, install and be done, even if on older versions. On Python this used to mean a lot of hurt, often even if it was a fresh project that you just wanted to share with a colleague.


For values of "years" greater than 1?

Node/NPM was a poster child of an ecosystem where projects break three times a week, due to having too many transitive dependencies that are being updated too often.



This argument makes no sense. Your dependencies don't change unless you change them, npm doesn't magically update things underneath you. Things can break when you try to update one thing or another, yes, but if you just take an old project and try and run it, it will work.


Assuming the downloads still exist? Does NPM cache all versions it ever distributed?

That's always one major thing I saw breaking old builds: old binaries stop being hosted, forcing you to rebuild them from old source, which no longer builds under current toolchains - making you either downgrade the toolchain that itself may be tricky to set up, or upgrade the library, which starts a cascade of dependency upgrades.

It's not like Node projects are distributed with their deps vendored; there's too much stuff in node_modules.



You're using a different, not hosted anymore package, three times a week? That's somewhere between very unusual and downright absurd.

Yes you can find edge cases with problems. Using this as an argument for "breaks 3 times per week" does not hold.



No, I was using this as an argument for why I don't expect Node projects older than a year or two to be buildable without significant hassle.

(Also note that outside the web/mobile space, projects that weren't updated in a year are still young, not old. "Old" is more like 5+ years.)

The two things are related. If your typical project has a dependency DAG of 1000+ projects, a bug or CVE fix somewhere will typically cause a cascade of potentially breaking updates to play out over multiple days, before everything stabilizes. This creates pressure for everyone to always stay on the bleeding edge; with a version churn like this, there's only so many old (in the calendar sense) package dists that people are willing to cache.

This used to be a common experience some years back. Like many others, I gave up on the ecosystem because of the extreme fragility of it. If it's not like that anymore, I'd love to be corrected.



Projects breaking so frequently on npm and node is simply not the case unless you are trying to upgrade an old project, one dependency per day…


But how does it work with components that require libraries written in C?

And what if there are no binaries yet for my architecture, will it compile them, including all the dependencies written in C?



IMO if you require libraries in other languages, then a pure python package manager like uv, pip, poetry, whatever, is simply the wrong tool for the job. There is _some_ support for this through wheels, and I'd expect uv to support them just as much as pip does, but they feel like a hack to me.

Instead there is pixi, which is similar in concept to uv but for the conda-forge packaging ecosystem. Nix and guix are also language-agnostic package managers that can do the job.



But for example, if I install the Python package "shapely", it will need a C package named GEOS as a shared library. How do I ensure that the version of GEOS on my system is the one shapely wants? By trial and error? And how does that work with environments, where I have different versions of packages in different places? It sounds a bit messy to me, compared to a solution where everything is managed by a single package manager.


You are describing two different problems. Do you want a shapely package that runs on your system, or do you want to compile shapely against the GEOS on your system? In case 1, it is up to the package maintainer to package and ship a version of GEOS that works with your OS, python version, and library version. If you look at the shapely page on pypi you'll see something like 40 packages for each version, covering most popular permutations of OS, python version and architecture. If a pre-built package exists that works on your system, then uv will find and install it into your virtualenv and everything should just work. This does mean you get a copy of the compiled libraries in each venv.

If you want to build shapely against your own version of GEOS, then you fall outside of what uv does. What it does in that case is download all the build tool(s) specified by shapely (setuptools and cython in this case) and then hand over control to those tools to handle the actual compiling and building of the library. In that case it is up to the creator of the library to make sure the build is correctly defined, and up to you to make sure all the necessary compilers, headers etc. are set up correctly.
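To make the two cases concrete, here is a minimal sketch of what each looks like with uv. The package name is real, but the source build assumes shapely's build backend can locate GEOS (e.g. via geos-config on PATH or the GEOS include/library path variables its docs describe):

    # Case 1: take the prebuilt wheel; it bundles its own copy of GEOS
    uv pip install shapely

    # Case 2: force a from-source build against the system GEOS; uv just
    # downloads the declared build backend and hands control over to it
    uv pip install --no-binary shapely shapely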



That's what I mean, in this case pip, uv, etc. are the wrong tool to use. You could e.g. use pixi and install all python and non-python dependencies through that, the conda-forge package of shapely will pull in geos as a dependency. Pixi also interoperates with uv as a library to be able to combine PyPI and conda-forge packages using one tool.

But conda-forge packages (just like PyPI packages, or anything that does install-time dependency resolution really) are untestable by design, so if you care about reliably tested packages you can take a look at nix or guix and install everything through that. The tradeoff with those is that they usually have fewer libraries available, and often only in one version (since every version has to be tested with every possible version of its dependencies, including transitive ones and the interpreter).

All of these tools have a concept similar to environments, so you can get the right version of GEOS for each of your projects.



Indeed, I'd want something where I have more control over how the binaries are built. I had some segfaults with conda in the past, and couldn't find where the problem was until I rebuilt everything from scratch manually and the problems went away.

Nix/guix sound interesting. But one of my systems is an nVidia Jetson system, where I'm tied to the system's libc version (because of CUDA libraries etc.) and so building things is a bit trickier.



You could use a package manager that packages C, C++, Fortran and Python packages, such as Spack: here's the py-shapely recipe [1] and here is geos [2]. Probably nix does similar.

[1]: https://github.com/spack/spack/blob/develop/var/spack/repos/... [2]: https://github.com/spack/spack/blob/develop/var/spack/repos/...



What exactly prevents you from creating your own packages if you want to use your system package manager?

On Alpine and Arch Linux? Exactly nothing.

On Debian/Ubuntu? Maybe the convoluted packaging process, but that's on you for choosing those distributions.



On Nvidia/Jetson systems, Ubuntu is dictated by the vendor.


UV is not (yet) a build system and does not get involved with compiling code, but it easily lets you plug in any build system you want. So it will let you keep using whatever system you are currently using for building your C libraries. For example, I use scikit-build-core for building all of my libraries' C and C++ components with cmake, and it works fine with uv.
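For anyone curious what that plug-in point looks like: it's just the standard build-system table in pyproject.toml. A rough sketch for scikit-build-core (the version pin is illustrative), after which `uv build` delegates the actual compilation to that backend:

    [build-system]
    requires = ["scikit-build-core>=0.10"]
    build-backend = "scikit_build_core.build"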


    uv build
    Building source distribution...
    running egg_info
    writing venv.egg-info/PKG-INFO
    Successfully built dist/venv-0.1.0.tar.gz
    Successfully built dist/venv-0.1.0-py3-none-any.whl


I guess it depends on what you mean by a build system. From my understanding uv build basically just bundles up all the source code it finds, and packages it into a .whl with the correct metadata. It cannot actually do any build steps like running commands to compile or transform code or data in any way. For that you need something like setuptools or scikit-build or similar. All of which integrate seamlessly with uv.


Ok, you convinced me to give it a try. Tbh, I am a casual user of python and I don't want to touch it unless I have a damn good reason to use it.


You do not need a damn good reason for this. Just try it out on a simple hello world. Then try it out on a project already using Poetry, for example:

    uv init
    uv sync

and you're done.

I'd say that if you don't run into the pitfalls of a large python codebase with hundreds of dependencies, you won't see the bigger argument people are making.



I don't think you need to sync, do you? It always just does it when running.

That said, I do wish uv had `uv activate`. I like just working in the virtualenv without having to `uv run` everything.



I do usually include instructions in our READMEs to run `uv sync` as the install command, in order to separate error causes, and also to bootstrap the venv so that it's available for IDEs.


That makes sense, thanks.


You can still `source .venv/bin/activate(.fish)` and skip the uv run bit. I have Fish shell configured to automatically activate a .venv if it finds one in a directory I switch to.
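Roughly like this in ~/.config/fish/config.fish (a sketch of the idea, not the commenter's actual config; the function name is made up):

    function __auto_activate_venv --on-variable PWD
        # source the venv's fish activation script whenever we cd into
        # a directory that contains one
        if test -f .venv/bin/activate.fish
            source .venv/bin/activate.fish
        end
    end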




Unlike uv this tool is unlikely to solve problems for the average Python user and most likely will create new ones.


Agreed; however, for users who want faster speed out of Python, wouldn't RustPython just work? It can also run in the browser then.


I've read so much positive feedback about uv, that I'd really like to use it, but I'm unsure if it fits my needs.

I was heavily invested into virtualenv until I had to upgrade OS versions, which upgraded the Python versions and therefore broke the venvs.

I tried to solve this by using pyenv, but needing to recompile Python on every patch release wasn't something I would accept, especially on boards like Raspberry Pis.

Then I tried miniconda which I initially only liked because of the precompiled Python binaries, and ultimately ended up using pyenv-managed miniforge so that I could run multiple "instances" of miniforge and therefore upgrade miniforge gradually.

Pyenv also has a plugin which allows setting suffixes on environments, so I can have multiple miniforges of the same version in different locations, like miniforge-home and miniforge-media. -home keeps all files in the home dir, while -media keeps all files on a mounted nvme; that's where I put projects with huge dependencies like CUDA, so they don't clutter home, which is contained in a VM image.

It works really great, Jupyter and vscode can use them as kernels/interpreters, and it is fully independent of the OS's Python, so that OS upgrades (22.04 -> 24.04) are no longer an issue.

But I'm reading about all these benefits of uv and wish I could use it, but somehow my setup seems to have tied my hands. I think I can't use uv in my projects.

Any recommendations?

Edit: Many of my projects share the same environment, this is absolutely normal for me. I only create a new environment if I know that it will be so complex that it might break things in existing environments.



I’m a bit confused why uv is not an option for you. You don’t need to compile Python, it manages virtualenvs for you, you can use them with Jupyter and vscode. What are you missing?


So the only difference is that Conda also isolates "system" libraries (like libcublasLt.so), or does uv also do this?

It's not that uv is not an option for me, I made this move to miniforge before uv was on my radar because it wasn't popular, but I'm still at a point where I'm not sure if uv can do what I need.



According to these docs

https://docs.astral.sh/uv/pip/environments/

I think uv supports conda envs



uv does not ship system libraries because pypi does not have them. There is a philosophical difference between pypi and conda today. I believe over time pypi will likely ship some system libraries but we will see.


So uv is restricted to pypi, but does offer isolated Python installations with precompiled Python binaries?


I moved to uv a few months back and never looked back. I use it with venv and it works very well. uv has a new way of handling environments:

    uv init new-py-env
    cd new-py-env
    uv add jupyter
    uv build

These execute super fast. Not sure if this helps your situation, but it's worth being aware of.



The python ecosystem has become a disaster. Even reading your post gave me a headache.


I keep reading praise about uv, and every single time I never really understand what the problems are that it addresses.

I've got a couple quite big Django projects for which I've used venv for years, and not once have I had any significant issues with it. Speed at times could have been better and I would have liked to have a full dependency list lock file, but that never caused me issues.

The only thing that comes to mind is those random fails to build of C/C++ dependencies. Does uv address this? I've always seen people rave about other benefits.



What makes it so great for me is the effortlessness.

I often use Python for quick one off scripts. With UV I can just do `uv init`, `uv add` to add dependencies, and `uv run` whatever script I am working on. I am up and running in under a minute. I also feel confident that the setup isn't going to randomly break in a few weeks.
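As a concrete example, the whole lifecycle of a throwaway script looks something like this (project and dependency names are placeholders):

    uv init scratch && cd scratch   # scaffolds pyproject.toml plus a starter main.py
    uv add requests                 # records the dependency and syncs the venv
    uv run main.py                  # runs inside the managed environment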

With most other solutions I have tried in the Python ecosystem, it always seemed significantly more brittle. It felt more like a collection of hacks than anything else.



You can even inline the dependencies:

https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...

That plus this: https://news.ycombinator.com/item?id=42855258

Makes it pretty seamless for one-off scripts.
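For reference, the inline form from that guide is PEP 723 metadata in a comment block at the top of the script; `uv run script.py` then resolves and runs it in its own environment (the dependency here is just an example):

    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests"]
    # ///
    import requests

    print(requests.get("https://example.com").status_code)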



Not a surprise. I said it before and I'll say it again, all the competing projects should just shut up shop for the good of Python. uv is so much better it's like pushing penny farthings after the safety bike has been invented.


That's rough for all the creators of poetry, pdm, pipenv, etc. to hear. They put in a ton of great work over the last decade, but I fear you may be right.


I really quite like pdm! I can see why maybe poetry, and especially pipenv, might be replaced with uv, but what's the value of uv over pdm beyond performance? It ticks all my boxes otherwise.


Beyond performance? Performance!


I am feeling the same way about PDM: it works very well, is easy to configure, and checks all the boxes feature-wise.


PyCharm also added uv support in their latest versions.

We recently switched to PDM in our company because it worked very well in our tests of different package/dependency managers. Now I am rethinking whether we should switch to uv while PDM usage is still not very widespread in our company. But PDM works very well, so I am not sure whether to just keep using it.



With the caveat that I only have package installer usage data for Wagtail downloads – pdm usage has fallen off a cliff, from 0.2% of downloads in January 2024 to 0.01% in January 2025. That roughly matches the uptake of uv.

Doesn’t make pdm bad in itself but that means there’ll be fewer pdm users around to report bugs, potentially fewer contributors to it too, fewer resources, etc.



Indeed, on one hand PDM works great, but on the other hand we wouldn't want to choose a package manager which might not be maintained anymore after a few years because there are just not many users of it.


For me, at least, one nice thing about poetry over uv is that if I have an issue or want a feature extension, I can just write my own plugin in pure Python. With uv, I'd need to learn Rust in addition to python/c/c++/etc.

I wonder what it would take to get poetry on par with uv for those who are already switching. Poetry is definitely very slow at downloading multiple versions of packages to determine dependencies (not sure how uv works around this?). Does uv have a better dependency resolution algorithm?



> I wonder what it would take to get poetry on par with uv

Different laws of physics, to start with.



For the uninitiated, what is the benefit of UV over pip?

I've been working with pip for so long now that I barely notice it unless something goes very wrong.



- uv is aware of your dependencies: you can add/remove development dependencies, create groups of development dependencies (test, lint, dev, etc.) and add or remove those, and only those, at will. You can add dependencies and optional dependencies for a project as well, think my_app[cli,standard]. You don't need a different requirements.txt for each case, nor do you need to remove things by hand as you would with pip, since pip doesn't remove transitive deps when you remove a package, for example. As a result, you can remove {conda,poetry,...} from your workflows. (See the pyproject.toml sketch below.)

- uv can install python and a virtualenv for you. Any command you run with `uv run` from the root of a repo will be aware of its environment; you don't even need to activate a virtualenv anymore. This replaces {pyenv, pyenv-virtualenv, virtualenvwrapper,...}.

- uv follows the PEPs for project config (dependencies, optional dependencies, tool configs) in the pyproject.toml, so in case uv dies, it's possible to migrate away because the features are defined in the PEPs. Which is not the case for, say, poetry.

- uv has a lock file, and it's possible to make deps platform-specific (Windows, Linux, macOS, etc). This is in compliance with a PEP but not supported by all tools.

- uv supports custom indexes for packages, so you can prefer a certain index, for example your company's package index or pytorch's own index (for ML work).

- very fast, which makes local dev seamless and is really helpful in CI/CD where you might set up and tear down python envs a lot.

Also, the team is responsive on Github so it's easy to get help.
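A sketch of how a few of those features look in pyproject.toml — the project name, group contents, and index entry are illustrative, and the exact semantics of `explicit` are worth checking against the uv docs:

    [project]
    name = "my-app"
    version = "0.1.0"
    dependencies = ["django>=5.0"]

    [dependency-groups]
    test = ["pytest"]
    lint = ["ruff"]

    # only consulted for packages explicitly pinned to this index
    [[tool.uv.index]]
    name = "pytorch-cpu"
    url = "https://download.pytorch.org/whl/cpu"
    explicit = true

Then something like `uv sync --group test` installs the project plus just that group.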



Does this also replace, or work well with tox? We currently use it to run basic CI/local workflows (`tox -e lint` for all linters, `tox -e py310`, `tox -e py312` to run tests suites on chosen interpreters' environments), and to set up a local environment with package installed in-place (so that we can run `myprogram -arg1 -arg2` as if it was installed via `pip`, but still have it be editable by directly editing the repo).

With how much the ecosystem is moving, I don't know whether the way we're doing it is unusual (Django and some other big projects still have a tox.ini), obsolete (I can't find how uv obsoletes this), or perfectly fine and I just can't find how to replace pip with uv for this use case.



I'm not personally releasing a ton of internal packages where I work, but I know of https://github.com/tox-dev/tox-uv. I haven't tried it yet, but it seems to do what you want. I also saw that nox (tox but in python instead of a tox.ini file, https://nox.thea.codes/en/stable/config.html) supports uv, from what I understand.

I don't think there's a definite answer yet.



Not only is it faster, it also provides a lock file, `uvx tool_name` just like `npx`, and a comprehensive set of tools to manage your Python version, your venv and your project.

You don't need `pyenv`, `poetry` and `pipx` anymore, `uv` does all of that for you.
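For example (the tool choice here is just illustrative):

    uvx ruff check .      # run a tool in a throwaway env, like npx
    uv tool install ruff  # or install it on PATH persistently, like pipx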



> over pip

It's a much more complete tool than pip. If you've used poetry, or (in other languages) cargo, bundler, maven, then it's like that (and faster than poetry).

If you haven't, in addition to installing dependencies it will manage and lock their versions (no requirements.txt, and much more robust), look after the environment (no venv step), hold your hand creating projects, and probably other things.

Edit to add: the one thing it won't do is replace conda et al, nor is it intended to.



It brings way more to the table than just being fast, like people are commenting. E.g. it manages Python for your projects: if you say you want Python 3.12 in your project and then do 'uv run python myscript.py', it will fetch and run the version of Python you specified, which pip can't do. It also creates lock files, so you know the exact set of Python package dependencies that worked, while you specify them more loosely. Plus a bunch of other stuff.
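That version pinning looks roughly like this (assuming a file named myscript.py):

    uv python pin 3.12           # writes .python-version for the project
    uv run python myscript.py    # fetches CPython 3.12 if missing, then runs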


The whole explanation is here: https://www.bitecode.dev/p/a-year-of-uv-pros-cons-and-should

The tl;dr is that it has a lot fewer failure modes.



Faster.


Ok, and what's the advantage for the people who don't have "my pip is too slow" problem?




Wait times are on the order of tens of milliseconds instead of seconds. That makes a massive difference in how nice uv is to use vs pip.


That’s just the same “my pip is too slow” problem which some people don’t have.

I work in a place with 200 developers, and 99% of pip usage is in automated jobs that last an hour. Shaving a couple seconds off that will not provide any tangible benefit. However moving 200 people from a tool they know to one they don’t comes at a rather significant cost.



I can't stress enough how fast it is when used in resource-constrained envs like a Pi Zero.

I intend to use the system python there, but previously poetry would simply crash the whole Pi while installing itself.



The only advantage over pip is it's faster. But the downside is it's not written in Python.

The real point of uv is to be more than pip, though. It can manage projects, so basically CLI commands to edit your `pyproject.toml`, update a lockfile, and your venv all in one go. Unlike earlier tools it implements a pretty natural workflow on top of existing standards where possible, but for some things there are no standards, the most obvious being lockfiles. Earlier tools used "requirements.txt" for this which was quite lacking. uv's lockfile is cross-platform, although, admittedly does produce noisier diffs than requirements.txt, which is a shame.



The problems start as soon as your scripts should run on more than your own computer.

If you pip install something, you install it on the system python (the python binary located at sys.executable). This can break systems if the wrong combination of dependencies comes together. This is why you should never install things via pip for other people, unless you asked them first.

Now how else would you install them? There is a thing called virtual environments, which basically allow you to install pip dependencies in such a way that they are only there within the context of the virtual environment. This is what you should do when you distribute python programs.

Now the problem is: how do you ensure that this install into the virtual environment uses specific versions? What happens when one library depends on package A at version 1.0 and another library depends on A at version 2.0? And what happens if you deploy that to an old debian with an older python version? Before uv I had to spend literal days resolving such conflicts.

uv solves most of these problems in one unified place, is extremely performant, just works and when it does not, it tells you precisely why.
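The lockfile workflow that makes the old-Debian scenario reproducible looks roughly like this (a sketch; worth checking the flags against the uv docs):

    uv lock             # resolve once, write the cross-platform uv.lock
    uv sync --frozen    # on the target machine: install exactly what is
                        # locked, failing loudly instead of re-resolving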



I switched from Poetry to uv last year. I like the speed, and how it stores virtual envs in a .venv directory inside the project by default, whereas Poetry stores them in a separate directory under your home directory by default, which makes it hard to work with tools that only discover virtual envs in the project root.

uv tool is also a great replacement for pipx.

I think it's the way to go for Python dependency management in 2025.



May uv please keep on eating the python world. It's so good.


Man, I’m so jealous of the insane praise that uv (and most other astral tools) gets. I don’t think I’ve ever seen anything so unanimously lauded around here.


With good reason, honestly. They take all the best practices from the existing tooling we had, discard the bad, and make it run blazingly fast.

Ruff for me meant I could turn 4 pre-commit hooks (which you have to configure to be compatible with each other, too) into just 1, and I no longer dread the "run Pylint and take a coffee break" moment.

I jumped ship to UV recently. Though I was skeptical at first, I don't regret it. It makes dependency management less of a chore, just something I can quickly do now. Switching from Poetry was easy for me too; the only package I had issues with was pytorch, but that just required some different toml syntax.



Guess people here don't talk much about cargo. I wouldn't be surprised to learn that cargo inspired uv. Rust with cargo showed for the first time that tooling _can_ be good, even for systems programming languages.


uv has been introduced as cargo for python: https://astral.sh/blog/uv-unified-python-packaging


Like with any social media site, you also have to consider the possibility that not all comments are 100% organic.


I’ve seen fishy looking engagement in hn before, but I’m inclined to think uv’s praise is genuine. It reflects the collective relief of seeing an extremely long and painful journey finally come to an end (hopefully).


Tailscale?


It’s not really on my radar, but I’d be curious to know what other pieces of software get similar respect from their communities.


Uv should replace pip for all I care.


Well, seems like 100% what’s going to happen (for the majority of Wagtail users at least) if the current trend continues. I’m not sure if that’s a good thing to be frank. But we’ll have to adjust regardless.


As a semi-casual user of python who had to battle with dependency management recently, can you elaborate on why that would not be a good thing? I thought about switching our project to uv but could not find the time necessary.


Sure – and I think it’s certainly proving to be a good thing so far! My concerns are more longer-term. I see two primarily:

(1) As uv’s governance is driven by a for-profit company, I see incentives that will eventually compromise its benefits.

(2) Python packaging has historically been very fragmented, and more recently there’s been lots of work on standardization. That work will be impacted when users massively shift to one package installer.

Neither of those things are clear negatives, but they’re worth being aware of.



You may be overestimating the amount of time it takes to switch to uv.


Took me all of about 10 seconds after I decided to switch from Poetry and PipX. Been just learning it bit by bit as I go along and been really pleased with it thus far.




What about it? RustPython is an alternative interpreter; it's not in the same category of thing as pip or uv.


As much as I am glad that it looks like one solution is being more and more accepted as the gold standard, I'm a little disappointed that PDM [0], which has been offering pretty much everything uv does for quite some time now, has been completely overlooked. :(

[0] https://pdm-project.org



Plenty of packages still fail when trying to spawn cmake, gcc et al.

UV does not solve all the hard problems.

Maybe switch to Pixi?



I'm extremely satisfied with Pixi. It fixes almost all the issues I had with conda and mamba. It supports both conda and pypi (via uv) packages. I don't know if uv fixes pip's dependency management hell. I settled on conda packages because pip was such a mess.


Switch to who?


Google's not that broken yet: https://pixi.sh/latest/


Well deserved. I wish more cloud providers had it preinstalled.


If an app or tool has a huge speed advantage, then I'd choose it no matter what.


It's the integrated python version management which sold me.

Took a huge chunk of complexity out of bootstrapping projects which I was otherwise handling myself.



This is the one thing that took a while for me to get to grips with. I tend to use `asdf` for my python version management, which I want to continue to use since I use it for most other languages.

It'd be nice if we could have a shim for `uv` that allows it to play nice with `asdf` (there is an `asdf-uv`, but it seems to be about installing different versions of `uv`, when instead I want to install different versions of python _via_ uv).



I can't see any reason really to keep using asdf for python when a better alternative now exists, unless you just don't want to learn the new syntax?


If all you do is write Python then sure, but for the rest of us that have to run code written in 7 different languages within our project, written by 7 different teams, playing nicely with asdf is non-negotiable.

I've had it with version managers that only target a single language or tool, the cognitive load is too high if there's more than a couple of languages in the mix.

What would be really nice is an asdf-like single package manager with language-specific plugins. That would save me a bunch of headaches.



I’m still using pip, what am I missing?


1) Deterministic environments, 2) not having to manage python installations, 3) 100x speed boost.


Poetry and uv both offer better dependency management, and project isolation out of the box. If you work in a team, or on more than one python project, then it's worth spending a day to install & adopt either one of these systems.


Both dependency management and project isolation are available in a standard Python 3 installation, also out of the box, without any 3rd-party tool dependency: `pip` and `python3 -m venv`. Admittedly, they work slower, but fast enough for me, especially since I do not run these commands every hour.
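That standard-library workflow, for comparison (the requirements.txt name is just the usual convention):

    python3 -m venv .venv              # create the project-local venv
    . .venv/bin/activate               # activate it in the current shell
    pip install -r requirements.txt    # install pinned dependencies into it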




absolutely nothing


I don't understand a word of the headline, I guess I am not the intended audience.


uv sounds great! For those still using Python v2, how well does it work? pip used to be a pain when having to manage both Python v2 and v3 projects and tools.


If you are still on 2.7, packaging is the least of your problems


Unfortunately, I don't think many things nowadays are tested with a 15-year-old version of a language.

I was one of the last holdouts, preferring to keep 2.7 support if it wasn't too much hassle, but we have to move on at some point. Fifteen years is long enough support.



Been using Python for 20 years and tried just about every tool related to packaging over the years. The only ones that worked well (IMO) were pip, pip-tools and venv. uv finally replaces all of them.

But uv being written in Rust means I'm also finally having to get somewhat proficient in Rust. Can any Rust developers comment on the quality of the uv codebase? I find it surprisingly procedural in style, with stuff like hundreds of `if dry_run` type checks scattered throughout the codebase, for example. Is this normal Rust style?



Is it me, or is there a new python dependency manager every year?


Nah, other than uv it's just poetry, pdm, and pipenv over the last decade, and uv is so dominant I don't think anyone else will try making another one for a while.










