I was skeptical as fuck about uv at first. Another Rust tool claiming to "fix" Python? Sounded like bullshit. But then I tried installing the usual Django crap and pip took forever like always while uv was already done. I literally thought it failed at first because it finished so fast.
uv is fast because it's written in Rust and downloads packages in parallel instead of pip's stupid one-at-a-time approach from 2008. The official benchmarks show it's 8-10x faster than pip without caching.
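If you want to sanity-check the speed claim on your own machine, a rough timing harness looks like this (a minimal sketch: fresh environments each run so neither side benefits from a warm cache, and swap in whatever packages you actually use):
# Time pip in a throwaway venv
python -m venv /tmp/pip-env
time /tmp/pip-env/bin/pip install django==4.2.7
# Time uv against its own throwaway venv
uv venv /tmp/uv-env
time uv pip install --python /tmp/uv-env/bin/python django==4.2.7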
Here's what actually happened when I ran uv pip install django==4.2.7 psycopg2-binary==2.9.7 pillow==10.0.1:
Resolved 8 packages in 47ms
Downloaded 8 packages in 1.8s
Installed 8 packages in 92ms
+ asgiref==3.7.2
+ django==4.2.7
+ pillow==10.0.1
+ psycopg2-binary==2.9.7
+ pytz==2023.3
+ sqlparse==0.4.4
+ typing-extensions==4.8.0
Compare that to pip's usual horseshit (timed this on my M1 MacBook):
Collecting django==4.2.7
Downloading Django-4.2.7-py3-none-any.whl (8.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.0/8.0 MB 2.1 MB/s eta 0:00:00
[waits 8 more seconds doing absolutely nothing]
Collecting psycopg2-binary==2.9.7
Downloading psycopg2_binary-2.9.7-cp311-cp311-macosx_11_0_arm64.whl (2.1 MB)
[another 6 second pause because pip is single-threaded garbage]
That dead air is the whole story: pip fetches and installs one package at a time, while uv resolves, downloads, and unpacks everything in parallel.
Memory Usage and Caching
uv uses less RAM than pip - maybe 50-60MB vs pip's 100+ MB from what I've seen. The global cache is smarter too. When I install the same packages across different projects, uv reuses cached wheels instead of downloading everything again.
Unlike pip's limited caching, uv's global cache works across virtual environments and projects. This means faster installs for Docker builds, CI/CD pipelines, and local development. Check out the Real Python tutorial for more cache optimization strategies.
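The cache is also inspectable from the CLI, which pip never really gave you (these subcommands exist in current uv releases; run uv cache --help if yours differs):
uv cache dir    # print where the global cache lives
uv cache clean  # wipe it, e.g. before a cold-cache benchmark
uv cache prune  # drop unused entries without nuking everything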
When Dependency Resolution Actually Works
Here's where pip drives me insane. Try this with pip:
pip install tensorflow==2.13.0 scikit-learn==1.3.0
You'll get some variation of:
ERROR: pip's dependency resolver does not currently take into account
all the packages that are installed. This behaviour is the source of
the following dependency conflicts.
tensorflow 2.13.0 requires numpy<=1.24.3,>=1.21.2
scikit-learn 1.3.0 requires numpy>=1.17.3
Then pip just... installs whatever numpy version it feels like. Your environment is broken but pip doesn't care.
uv actually resolves this shit properly:
uv pip install tensorflow==2.13.0 scikit-learn==1.3.0
Resolved 15 packages in 12ms
✓ Found compatible numpy==1.24.3
Poetry gets this resolution right too, but uv doesn't spend 30 seconds thinking about it.
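And if you'd rather solve once and reuse the answer, uv bolts on the pip-tools workflow too (a sketch; requirements.in is a hypothetical file holding just your top-level pins):
# requirements.in contains:
#   tensorflow==2.13.0
#   scikit-learn==1.3.0
uv pip compile requirements.in -o requirements.txt  # resolve once, write every transitive pin
uv pip sync requirements.txt                        # force the venv to match the pinned set exactly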
Where uv Actually Saves Time
Our CI was taking 6 minutes per build, and 4 of those were just installing the same 73 dependencies over and over. Switched our GitHub Actions from pip install -r requirements.txt to uv pip install -r requirements.txt and installs dropped to 45 seconds. We run this pipeline 50+ times daily, so that's close to 3 hours of machine time saved every day. Check the uv CI guide for setup instructions.
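The actual change in the workflow was small; the install step ended up looking roughly like this (a sketch of our setup, not a drop-in config; runner image and paths may differ):
# Inside the GitHub Actions step's shell script:
curl -LsSf https://astral.sh/uv/install.sh | sh
export PATH="$HOME/.local/bin:$PATH"
uv pip install --system -r requirements.txt  # --system targets the runner's Python, no venv needed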
Docker builds got faster too. uv is a single static binary that doesn't itself depend on Python, so the bootstrap story in the image gets a lot simpler:
FROM alpine:3.18
# the base image has no curl, and uv still needs an interpreter to install into
RUN apk add --no-cache curl python3
RUN curl -LsSf https://astral.sh/uv/install.sh | sh
COPY requirements.txt .
RUN /root/.local/bin/uv pip install --system -r requirements.txt
No more waiting for pip to bootstrap itself or stacking pip, setuptools, and wheel into the image just to install dependencies.
When uv Breaks (And It Does)
uv isn't magic. I've hit these specific failures:
Packages with weird C extensions: mysqlclient==2.2.0 completely shat the bed with some 200-line Rust panic about malloc failures. Spent 2 hours going down rabbit holes on GitHub issues before saying fuck it and using pip for that one package. Check the compatibility guide for known package issues.
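The escape hatch was letting pip handle just the holdout inside the same venv (a sketch; the other packages here are placeholders for whatever your project actually needs):
uv pip install django==4.2.7 pillow==10.0.1  # everything uv handles fine
pip install mysqlclient==2.2.0               # let pip drive the C build for the problem child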
Corporate proxy bullshit: Our BlueCoat proxy (10.0.0.1:8080) killed uv's parallel downloads with ConnectTimeout after exactly 30 seconds. Had to set UV_HTTP_TIMEOUT=300 and UV_CONCURRENT_DOWNLOADS=1 to make it behave like pip.
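For anyone stuck behind similar middleware, the whole workaround is two environment variables (values from our setup; tune the timeout to your proxy):
export UV_HTTP_TIMEOUT=300          # seconds before uv gives up on a stalled connection
export UV_CONCURRENT_DOWNLOADS=1    # serialize downloads so the proxy stops killing them
uv pip install -r requirements.txt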
ARM Mac weirdness: Installing psycopg2-binary==2.9.7 on an Apple M1 threw "no matching distribution found" even though the wheel exists on PyPI. pip grabbed psycopg2_binary-2.9.7-cp311-cp311-macosx_11_0_arm64.whl just fine. Still makes no fucking sense.
The error messages are complete trash compared to pip. When uv breaks, you get a Rust panic dump that tells you nothing useful. When pip breaks, at least it fails in English and tells you which package fucked up.