I agree; I like some of the directions the fork would go and dislike others. The apparent fork, publish on HN, then change sequence (with the changes showing not a lot of understanding) makes me thoroughly question its legitimacy and long-term stability.
> 2. Centralized venv storage — keep .venvs out of your project dirs
I do not like this. Virtual environments have always been associated with projects and colocated with them. Moving .venv to centralized storage recreates the conda philosophy, which is very different from the pip/uv approach.
In any case, I am using pixi now and like it a lot.
I like it. I enjoyed having it with Conda and was sorry when it was lost with uv. It's been a pain to search my projects and get irrelevant results that I then have to filter, or to remember to filter in the first place. The venvs may be associated with the projects, but they're just extraneous clutter unless there's actually something to be done directly on them, which is very rare.
One problem I have on my work machine is that it does a blind backup of project directories. A useless .venv tree with thousands of files completely trashes the backup process. Having at least the flexibility to push the .venv to a cache location is useful. There was (is?) a uv issue about a similar use case (e.g. having a Dropbox/OneDrive-monitored folder).
It's been open for two years but it looks like there's a PR in active development for it right now: https://github.com/astral-sh/uv/pull/18214
That's my biggest problem with uv; I liked the way pipenv did it much better. I want to be able to use find and recursive grep without worrying that library code is sitting in my project directory.
uv is just so fast that I deal with it.
…but I don't have everything in a git repo. Some of my “projects” are just local scraps for trying things out.
And it doesn't account for any other tooling that may not respect gitignore by default.
It's my biggest problem with npm too. I've worked around it long enough; it's just annoying.
Pip doesn’t have any philosophy here. It doesn’t manage your virtualenv at all, and definitely doesn’t suggest installing dependencies into a working directory.
Putting the venv in the project repository is a mess; it mixes a bunch of third party code and artifacts into the current workspace. It also makes cleaning disk space a pain, since virtualenvs end up littered all over the place. And every time you “git clean” you have to bootstrap all over again.
Perhaps a flag to control this might be a good fit, but honestly, I always found uv’s workflow here super annoying.
Disagree: better to have the space allocated in each project, where the venvs can easily be deleted all at once, rather than half-hidden somewhere in your home folder with random names and forgotten about.
If for some rare reason you wanted to delete all venvs, a find command is easy enough to write.
UV_PROJECT_ENVIRONMENT=$HOME/.virtualenvs/{env-name} uv {command}
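To make that the default rather than a one-off, a small shell wrapper works. This is only a sketch: the wrapper name uvc and the ~/.virtualenvs layout are made up, but UV_PROJECT_ENVIRONMENT is uv's real setting for relocating the project environment.

    # in ~/.bashrc or ~/.zshrc: key each shared venv on the project folder name
    uvc() {
      UV_PROJECT_ENVIRONMENT="$HOME/.virtualenvs/$(basename "$PWD")" uv "$@"
    }
    # usage: uvc sync, uvc run pytest, etc.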
Virtual environments have always been associated with projects in your use case, I guess.
In my use case, they almost never are. Most people in my industry have 1-2 venvs that they use across all their projects, and uv forcing one into each project directory made things quite inconvenient and caused unnecessary duplication of the same sets of libraries.
I dislike conda not because of the centralized venvs, but because it's bloated, poorly engineered, slow and inconvenient to use.
At the end of the day, this gives us choice. People can use uv or they can use fyn and have both use cases covered.
> and uv forcing one into each project directory made things quite inconvenient and caused unnecessary duplication of the same sets of libraries.
Actually, uv intelligently uses hardlinks or reflinks to avoid file duplication. On the surface, the venvs in different projects look like duplicates, but in reality they reference the same files in uv's cache (you can verify this yourself; see the snippet after this comment).
BTW, pixi does the same. And pixi global allows you to create global environments in a central location if you prefer that workflow.
EDIT: I forgot to mention the elephant in the room. With agentic AI coding you do want all your dependencies under your project root. AI agents run in sandboxes, and I do not want to give them extra permissions to poke around my entire storage. I start an agent in the project root, and all my code and the .venv are there. This gives the agent a sense of locality: it only needs to poke around under the project root and nowhere else.
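A quick way to verify the hardlinking claim on Linux or macOS, as a sketch (run it from a uv-managed project; reflink-based copies won't show up this way, since reflinks don't raise the link count):

    # files with a link count above 1 are hardlinked, i.e. shared
    # with uv's cache rather than copied into the venv
    find .venv -type f -links +1 | wc -l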
This is actually the feature that initially drew me towards uv. I never have to worry about where venvs live while suffering literally zero downsides. It's blazing fast, uses minimal storage, and version conflicts are virtually impossible.
Do you only work on projects individually? Without project-specific environments I don’t know how you could share code with someone else without frequent breakages.
> How is pixi better than uv?
pixi is a general multi-language, multi-platform package manager. I am using it now on my new MacBook as a homebrew _replacement_. Yes, it goes beyond Python and lets you install git, jj, fzf, cmake, compilers, pandoc, and many more tools.
For Python, pixi uses conda-forge and PyPI as package repos, relying on rattler for conda packages and on uv's resolver for PyPI dependencies. pixi is as fast as uv (it reuses uv's fast code path) but goes beyond Python wheels.
For details see [0] or google it :-)
[0] https://pixi.prefix.dev/latest/
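For the homebrew-style workflow described above, that looks roughly like this (a sketch; pixi global install is the real subcommand, and the package list just mirrors the comment):

    # install CLI tools into centrally managed global environments
    pixi global install git jj fzf pandoc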
There is a good chunk of overlap, but mise predominantly pulls from GitHub release artifacts/assets while pixi uses conda packages. Mise can use conda packages too, but the mise-conda backend is still experimental. I don't think either GitHub releases or conda packages are strictly better; they both have tradeoffs.
Pixi is very Python-focused: it's both a tool manager and a library dependency manager (like uv/pip). Mise considered library dependency management an anti-goal for a long time; while I don't see that on the website anymore, I haven't seen any movement into that space either.
Given the telemetry, how did uv ever get approved/adopted by the open source community to begin with, or did it creep in? Why isn't it currently burning in a fire?
It's providing platform information to PyPI to help track which operating systems and platforms are being used by different packages.
The result is useful graphs like these: https://pypistats.org/packages/sqlite-utils and https://pepy.tech/projects/sqlite-utils?timeRange=threeMonth...
The field that guesses if something is running in a CI environment is particularly useful, because it helps package authors tell if their package is genuinely popular or if it's just being installed in CI thousands of times a day by one heavy user who doesn't cache their requirements.
Honestly, stripping this data and then implying that it was collected by Astral/OpenAI in a creepy way is a bad look for this new fork. They should at least clarify in their documentation what the "telemetry" does so as not to make people think Astral were acting in a negative way.
Personally I think stripping the telemetry damages the Python community's ability to understand the demographics of package consumption while not having any meaningful impact on end-user privacy at all.
Here's the original issue against uv, where the feature was requested by a PyPI volunteer: https://github.com/astral-sh/uv/issues/1958
Update: I filed an issue against fyn suggesting they improve their documentation of this: https://github.com/duriantaco/fyn/issues/1
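For context, this data travels in the installer's User-Agent header on requests to PyPI, which PyPI's log pipeline (the linehaul project) parses. The line below is an illustrative pip-style example of the format, not the exact string uv sends:

    pip/24.0 {"ci":true,"cpu":"x86_64","distro":{"name":"Ubuntu","version":"22.04"},"implementation":{"name":"CPython","version":"3.12.3"},"installer":{"name":"pip","version":"24.0"},"python":"3.12.3","system":{"name":"Linux","release":"6.8.0"}}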
I suspect that my normal workflows might just have evolved to route around the pain that package management can be in python (or any other ecosystem really).
In what situations is uv most useful? Is it once you install machine learning packages and it pulls in more native stuff, i.e. is it more popular in some circles? Is there a killer feature that I'm missing?
If you have hundreds of different Python projects on your machine (as I do) the speed and developer experience improvements of uv make a big difference.
I love being able to cd into any folder and run "uv run pytest" without even having to think about virtual environments or package versions.
https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...
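The scripts feature is a good illustration of that workflow: dependencies are declared inline in the file (PEP 723), and uv resolves them into an ephemeral environment at run time. A minimal sketch (the file name and dependency are made up for the example):

    cat > fetch.py <<'EOF'
    # /// script
    # dependencies = ["requests"]
    # ///
    import requests
    print(requests.get("https://example.com").status_code)
    EOF
    uv run fetch.py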
And the first two commits are "new fork" and "fork", where "new fork" is a nice (+28204 -39206) commit and "fork" is a cheeky (+23971 -23921) commit.
I think I'm good. And I would question the judgement of anyone jumping on this fork.
"fix: updated readme. sorry was so tired i accidentally mass replaced uv with fyn for all"
Why prefix UV_CACHE_MAX_SIZE and UV_LOCKFILE with UV_ if they're new features? Makes no sense.
Crazy that there is no way in uv to limit the cache size. I have loved using uv though; it is a breath of fresh air.
I assume mainstream uv development will go into maintenance mode now, so it’s great to see a quality lineage like this.