Corporate Authentication Configuration

Time to fight your company's authentication system. This shit breaks constantly, usually at 3am during deployments when nobody from IT is awake to help.

Private PyPI Repository Setup

Your company probably runs Nexus, Artifactory, or GitLab Package Registry. Here's how to point uv at whatever overly complicated setup IT created:

## pyproject.toml
[tool.uv.pip]
extra-index-url = ["https://pypi.company.com/simple/"]

[tool.uv]
## uv has no pip-style trusted-host; allowlist HTTP/self-signed hosts explicitly
allow-insecure-host = ["pypi.company.com"]

[tool.uv.sources]
company-lib = { index = "company-pypi" }

[[tool.uv.index]]
name = "company-pypi"
url = "https://pypi.company.com/simple/"
explicit = true  # Only use for specifically marked packages

That explicit = true line is crucial - without it, uv will spam your private repo asking for requests and numpy like an idiot. Our Jenkins builds started randomly shitting themselves because uv was making 500+ requests to Nexus for packages that obviously aren't there. IT was not pleased.

AWS CodeArtifact Integration

AWS CodeArtifact tokens expire every 12 hours, which guarantees your weekend deploy will fail. Here's how to make this work (until AWS decides to change the auth flow again):

## Generate auth token (expires in 12 hours)
export CODEARTIFACT_AUTH_TOKEN=$(aws codeartifact get-authorization-token \
    --domain my-domain --domain-owner 123456789012 \
    --query authorizationToken --output text)

## Configure uv to use the token
export UV_EXTRA_INDEX_URL="https://aws:${CODEARTIFACT_AUTH_TOKEN}@my-domain-123456789012.d.codeartifact.us-east-1.amazonaws.com/pypi/my-repo/simple/"

For CI/CD environments, this integrates with IAM roles and doesn't require hardcoded credentials.
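In GitHub Actions, that token refresh typically becomes a step like the sketch below (domain, account ID, and region are placeholders; it assumes the job already has AWS credentials, e.g. via an OIDC role). Note the token lands in the job environment, so mask it or scope the job accordingly:

```yaml
# Sketch of a CI step - substitute your own domain, account ID, and region
- name: Authenticate uv to CodeArtifact
  run: |
    export CODEARTIFACT_AUTH_TOKEN=$(aws codeartifact get-authorization-token \
        --domain my-domain --domain-owner 123456789012 \
        --query authorizationToken --output text)
    echo "UV_EXTRA_INDEX_URL=https://aws:${CODEARTIFACT_AUTH_TOKEN}@my-domain-123456789012.d.codeartifact.us-east-1.amazonaws.com/pypi/my-repo/simple/" >> "$GITHUB_ENV"
```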

Azure DevOps Artifacts Authentication

Azure Artifacts requires personal access tokens (PATs) or Azure AD authentication:

## Using PAT authentication
export UV_EXTRA_INDEX_URL="https://username:${AZURE_DEVOPS_PAT}@pkgs.dev.azure.com/organization/_packaging/feed/pypi/simple/"

## Or, for Azure Artifacts universal packages (not the PyPI feed), use the Azure CLI
az artifacts universal download \
    --organization https://dev.azure.com/myorg \
    --feed myfeed \
    --name mypackage \
    --version 1.0.0 \
    --path .

GitLab Package Registry Integration

GitLab's built-in PyPI registry works well with uv through deploy tokens or CI/CD job tokens:

## For GitLab private packages
[[tool.uv.index]]
name = "gitlab"
url = "https://gitlab.company.com/api/v4/projects/PROJECT_ID/packages/pypi/simple"
explicit = true

## TOML won't expand ${CI_JOB_TOKEN} - supply credentials via environment variables:
##   UV_INDEX_GITLAB_USERNAME=gitlab-ci-token
##   UV_INDEX_GITLAB_PASSWORD=$CI_JOB_TOKEN

Security Best Practices for Enterprise

Never hardcode credentials in pyproject.toml. Use environment variables or external credential management:

  • Keyring integration: uv supports system keyrings through UV_KEYRING_PROVIDER=subprocess
  • Docker secrets: Mount credentials as files rather than environment variables
  • CI/CD secrets: Use GitHub Actions secrets, GitLab CI variables, or Jenkins credentials
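A sketch of the keyring route (service name and username are hypothetical). The catch per uv's docs: uv only consults the keyring when the index URL carries a username.

```shell
## Tell uv to shell out to the `keyring` CLI for passwords
export UV_KEYRING_PROVIDER=subprocess

## The index URL must include a username, or the keyring is never consulted
export UV_INDEX_URL="https://ci-bot@pypi.company.com/simple/"

## Store the password once; uv looks it up on demand
keyring set https://pypi.company.com/simple/ ci-bot
```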

Network security considerations:

## Corporate proxy support
## uv honors the standard proxy environment variables (there are no UV_-prefixed ones)
export HTTP_PROXY="http://proxy.company.com:8080"
export HTTPS_PROXY="http://proxy.company.com:8080"
export NO_PROXY="localhost,127.0.0.1,.company.com"

## Custom CA certificates (uv reads SSL_CERT_FILE; UV_NATIVE_TLS=1 uses the OS trust store)
export SSL_CERT_FILE="/etc/ssl/certs/company-ca.pem"
export REQUESTS_CA_BUNDLE="/etc/ssl/certs/company-ca.pem"  # For tools built on requests

Multi-Repository Configuration

Large organizations often have multiple private repositories. uv handles this through index priority and explicit sourcing:

[tool.uv.pip]
index-url = "https://pypi.org/simple/"  # Default for public packages
extra-index-url = [
    "https://ml-packages.company.com/simple/",      # ML team packages
    "https://backend-packages.company.com/simple/", # Backend team packages
]

## Explicit package routing prevents conflicts
[tool.uv.sources]
ml-models = { index = "ml-packages" }
api-client = { index = "backend-packages" }

[[tool.uv.index]]
name = "ml-packages"
url = "https://ml-packages.company.com/simple/"
explicit = true

[[tool.uv.index]]
name = "backend-packages"
url = "https://backend-packages.company.com/simple/"
explicit = true

Authentication Troubleshooting (aka 90% of Your Time)

Real authentication failures that will ruin your day:

  1. 401 Unauthorized: Token expired 5 minutes ago, or IT rotated keys without telling anyone
  2. SSL Certificate errors: Corporate proxy injects its own cert, breaking everything
  3. 403 Forbidden: Your service account lost permissions overnight for no reason
  4. Network timeouts: Corporate network is slower than dial-up on Mondays
  5. CERTIFICATE_VERIFY_FAILED: The classic - your proxy hates Python
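
Those failure modes boil down to a handful of HTTP status codes. A throwaway triage helper — hypothetical, not part of uv — for turning a status into a first guess:

```python
# Map HTTP status codes from a package index to likely causes.
# Hypothetical triage helper for debugging, not part of uv itself.

HINTS = {
    401: "Unauthorized: token expired or credentials missing - refresh your auth token",
    403: "Forbidden: the account lacks read permission on this feed",
    404: "Not found: wrong index URL, or the package genuinely isn't there",
    407: "Proxy authentication required: check corporate proxy credentials",
}

def triage(status: int) -> str:
    """Return a human hint for an index response status code."""
    if status in HINTS:
        return HINTS[status]
    if 500 <= status < 600:
        return "Server error: the index itself is down - page whoever runs it"
    return "OK" if status < 400 else f"Unexpected status {status}"

print(triage(401))
print(triage(502))
```

Feed it whatever status curl or the verbose uv output reports.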

## Debug authentication issues (prepare for spam)
uv pip install -v your-private-package

## Test if your private index is reachable
curl -I https://pypi.company.com/simple/

## Nuclear option: skip TLS verification for one host (don't do this in prod)
## PYTHONHTTPSVERIFY=0 does nothing here - uv isn't Python
uv pip install --allow-insecure-host pypi.company.com your-private-package

Keep pip installed as backup - when uv craps out on auth, pip usually still works for some reason.

uv 0.8.17 finally shows actual HTTP status codes instead of the useless "connection failed" bullshit from 0.7.x. But SSL: CERTIFICATE_VERIFY_FAILED errors are still garbage - it won't tell you which cert is fucked, so good luck debugging that.

Large Project Workspace Management

uv's workspace system manages multiple packages within a single repository, similar to Cargo workspaces in Rust. This approach works well for monorepos with shared dependencies, though Python's packaging ecosystem adds complexity that doesn't exist in Rust.

Workspace dependency resolution can be unpredictable when packages have conflicting version requirements or circular dependencies.

Workspace Architecture Patterns

Microservices monorepo - Multiple services sharing common libraries:

company-platform/
├── pyproject.toml              # Workspace root
├── uv.lock                     # Shared lockfile
├── services/
│   ├── api-gateway/
│   │   └── pyproject.toml
│   ├── user-service/
│   │   └── pyproject.toml
│   └── payment-service/
│       └── pyproject.toml
└── shared/
    ├── auth-lib/
    │   └── pyproject.toml
    ├── database-models/
    │   └── pyproject.toml
    └── common-utils/
        └── pyproject.toml

Root workspace configuration:

## company-platform/pyproject.toml
[project]
name = "company-platform"
version = "0.1.0"
requires-python = ">=3.11"
## Shared runtime dependencies are a PEP 621 array, not a table
dependencies = [
    "structlog>=23.1.0",
    "pydantic>=2.0",
]

[tool.uv.workspace]
members = [
    "services/*",
    "shared/*"
]
exclude = [
    "services/legacy-*",  # Exclude legacy services
    "shared/experimental" # Exclude experimental packages
]

## Development tools go in a dependency group, not project.dependencies
[dependency-groups]
dev = [
    "pytest>=7.0",
    "black>=23.0",
    "ruff>=0.1.0",
]

Workspace Dependency Management

Workspace-local dependencies are the key advantage. Services can depend on shared libraries without publication to PyPI:

## services/api-gateway/pyproject.toml
[project]
name = "api-gateway"
dependencies = [
    "auth-lib",           # Workspace dependency
    "database-models",    # Workspace dependency
    "fastapi >= 0.100.0", # External dependency
]

[tool.uv.sources]
auth-lib = { workspace = true }
database-models = { workspace = true }

The { workspace = true } syntax tells uv to resolve these from the workspace rather than PyPI. This eliminates the need for development installs or complex path dependencies.
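You usually don't write that block by hand. Per uv's workspace docs, uv add records the workspace source automatically when the dependency is a member; a sketch using the package names from the layout above:

```shell
## From the workspace root: add auth-lib as a dependency of api-gateway
uv add --package api-gateway auth-lib
```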

CI/CD Integration for Workspaces

Changed package detection is crucial for efficient CI/CD. While uv doesn't have built-in change detection, teams typically combine it with Git:

#!/bin/bash
## detect-changes.sh - Run tests only for changed packages

## Find changed Python files
changed_files=$(git diff --name-only HEAD~1 HEAD | grep '\.py$')

## Determine affected workspace members (deduplicated)
affected_packages=$(for file in $changed_files; do
    echo "$file" | cut -d'/' -f1-2
done | sort -u)

## Run tests for affected packages
for package in $affected_packages; do
    if [[ -f "$package/pyproject.toml" ]]; then
        echo "Testing $package..."
        uv run --package "$(basename "$package")" pytest
    fi
done

GitHub Actions workflow for workspace builds:

name: Workspace CI
on: [push, pull_request]

jobs:
  test-workspace:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v4

    - name: Install uv
      uses: astral-sh/setup-uv@v3
      with:
        version: "latest"

    - name: Install dependencies
      run: uv sync --all-extras

    - name: Run workspace-wide linting
      run: |
        uv run ruff check .
        uv run black --check .

    - name: Test each package
      run: |
        for pkg in services/* shared/*; do
          if [[ -f "$pkg/pyproject.toml" ]]; then
            echo "Testing $pkg"
            uv run --package "$(basename "$pkg")" pytest
          fi
        done

Performance Optimization for Large Workspaces

Selective synchronization prevents unnecessary package installs:

## Sync only specific packages and their dependencies
uv sync --package api-gateway --package auth-lib

## Sync without development dependencies for production
uv sync --no-dev

## Sync with specific extra groups
uv sync --extra postgres --extra redis

Build caching strategies for Docker deployments:

## Multi-stage build optimized for workspaces
FROM python:3.11-slim AS deps

COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
ENV UV_LINK_MODE=copy UV_COMPILE_BYTECODE=1

## Copy workspace configuration
WORKDIR /workspace
COPY uv.lock pyproject.toml ./
COPY services/api-gateway/pyproject.toml services/api-gateway/
## NB: Docker COPY flattens globs - list each shared package explicitly
COPY shared/auth-lib/pyproject.toml shared/auth-lib/
COPY shared/database-models/pyproject.toml shared/database-models/
COPY shared/common-utils/pyproject.toml shared/common-utils/

## Install only api-gateway dependencies
RUN uv sync --package api-gateway --no-dev --no-install-project

## Copy source and install project
COPY . .
RUN uv sync --package api-gateway --no-dev --no-editable

Team Development Workflows

Shared configuration keeps teams in sync:

## uv.toml (project-root configuration - uv reads uv.toml, not .uv/config.toml)
python-downloads = "never"  # Use system Python only
compile-bytecode = true     # Faster startup
link-mode = "copy"          # Better cross-platform compatibility

[pip]
index-url = "https://pypi.company.com/simple/"
extra-index-url = ["https://pypi.org/simple/"]

Pre-commit hooks for workspace consistency:

## .pre-commit-config.yaml
repos:
  - repo: local
    hooks:
      - id: uv-lock-check
        name: Check uv lockfile is up to date
        entry: uv lock --check
        language: system
        pass_filenames: false

      - id: workspace-format
        name: Format workspace code
        entry: uv run ruff format .
        language: system
        types: [python]

      - id: workspace-lint
        name: Lint workspace code
        entry: uv run ruff check .
        language: system
        types: [python]

Common Workspace Issues

Version conflicts occur when packages have incompatible dependency requirements. Package A needs requests >= 2.28 but package B requires requests == 2.25.1 due to historical constraints:

## Force consistent versions across the workspace
[project]
dependencies = [
    "pydantic==2.5.0",  # Exact version prevents conflicts
]

## Override problematic transitive dependencies
[tool.uv]
override-dependencies = [
    "urllib3==1.26.18",  # Fix security vulnerability
]

Development vs. production dependencies require careful separation to avoid installing dev tools in production or missing production requirements in development:

## Separate development tools from application dependencies
[project.optional-dependencies]
dev = [
    "pytest >= 7.0",
    "black >= 23.0",
    "ruff >= 0.1.0",
    "mypy >= 1.0",
    "ipdb",
]

## Install with: uv sync --extra dev

Circular dependencies between workspace packages create dependency resolution loops that can take 20+ minutes to fail with unclear error messages. Resolution usually requires restructuring package dependencies.

Path issues occur when imports work locally but fail in CI due to different Python path configurations. Local development typically runs from the repository root while CI may run from subdirectories, causing import failures.
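When imports diverge between local and CI, print where Python actually resolves things from before blaming uv. A quick stdlib diagnostic (the module names are stand-ins):

```python
# Print where Python resolves a module from - imports that "work locally but
# fail in CI" usually resolve from an unexpected path in one environment.
import importlib.util
import sys

def where(module_name: str):
    """Return the file a module would load from, or None if unresolvable."""
    spec = importlib.util.find_spec(module_name)
    return spec.origin if spec else None

print("first sys.path entry:", sys.path[0])
print("json resolves to:", where("json"))
print("missing module:", where("zzz_not_a_module"))  # None
```

Run it in both environments and diff the output.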

The workspace system has improved significantly from early versions that had unreliable dependency resolution. Recent versions provide more deterministic lockfiles and better cross-platform dependency resolution, though edge cases still require debugging. Keep pip install -e . available as a fallback for troubleshooting.

Enterprise Python Package Manager Comparison

| Feature | uv | Poetry | pip + pip-tools | Rye |
|---|---|---|---|---|
| Workspace Support | ✅ Cargo-style workspaces | ❌ Limited monorepo support | ❌ Manual coordination required | ✅ Good workspace support |
| Private Repository Auth | ✅ Multiple auth methods | ✅ Keyring integration | ✅ Basic auth support | ✅ URL-based auth |
| Dependency Resolution Speed | ⚡ 10-100x faster | 🐌 Slow on large projects | 🐌 Slow resolution | ⚡ Fast (uses uv backend) |
| Lock File Format | ✅ Universal lock format | ✅ poetry.lock | ❌ Manual requirements.txt | ✅ Rust-style lock |
| CI/CD Integration | ✅ GitHub Actions support | ✅ Well documented | ❌ Complex multi-tool setup | ✅ Good CI support |
| Python Version Management | ✅ Built-in Python installs | ❌ Requires pyenv/asdf | ❌ Requires pyenv/asdf | ✅ Built-in management |
| Corporate Proxy Support | ✅ Usually works with proxies | ✅ Standard proxy support | ✅ Standard proxy support | ✅ Proxy support |
| Performance at Scale | ⚡ Fast (when it doesn't eat all your RAM) | 🐌 Degrades with size | 🐌 Poor with many deps | ⚡ Fast (uv-based) |
| Tool Ecosystem | ✅ uvx for isolated tools | ❌ No tool management | ❌ pipx separate tool | ✅ Tool management |
| Development Maturity | 🆕 Rapidly evolving (2024) | 🏢 Mature ecosystem | 🏛️ Legacy standard | 🆕 Early but stable |

Configuration Troubleshooting FAQ

Q

How do I configure uv for multiple private PyPI repositories?

A

Use the [[tool.uv.index]] table in pyproject.toml with explicit package routing:

[[tool.uv.index]]
name = "company-ml"
url = "https://ml-pypi.company.com/simple/"
explicit = true

[[tool.uv.index]]
name = "company-backend"
url = "https://backend-pypi.company.com/simple/"
explicit = true

[tool.uv.sources]
ml-models = { index = "company-ml" }
api-client = { index = "company-backend" }

Without this routing, uv will spam every repo asking for every package. Your build will take forever and IT will yell at you for "suspicious network activity."

Q

What's the best practice for sharing uv configuration across teams?

A

Create a uv.toml file in your repository root that gets committed with your code:

compile-bytecode = true
link-mode = "copy"

[pip]
index-url = "https://pypi.company.com/simple/"

Commit this config so everyone gets the same settings. Otherwise you'll get the classic "works on my machine" bullshit when authentication breaks for half the team.

Q

How do I handle AWS CodeArtifact token expiration in CI?

A

Use AWS CLI token refresh in your CI pipeline:

## In GitHub Actions or similar
export CODEARTIFACT_AUTH_TOKEN=$(aws codeartifact get-authorization-token \
    --domain my-domain --domain-owner 123456789012 \
    --query authorizationToken --output text)
export UV_EXTRA_INDEX_URL="https://aws:${CODEARTIFACT_AUTH_TOKEN}@domain.d.codeartifact.region.amazonaws.com/pypi/repo/simple/"
uv sync

Do this at the start of every CI job, because AWS tokens expire every 12 hours and will absolutely break your weekend deploy.

Q

Can uv workspaces handle different Python versions per package?

A

Kind of, but it's a pain. You can specify different Python versions per package:

## services/legacy-app/pyproject.toml
[project]
requires-python = ">=3.8"

## services/new-api/pyproject.toml
[project]
requires-python = ">=3.11"

But the lockfile still picks one Python version for everything. You'll need uv run --python 3.8 --package legacy-app every damn time you want to run the old stuff.

Q

How do I optimize uv performance for large monorepos?

A

Stop uv from doing stupid shit:

  1. Selective sync: uv sync --package api-gateway - don't install 500 packages when you need 5
  2. Fix cache location: UV_CACHE_DIR on SSD, not whatever slow-ass network drive IT mounted
  3. Throttle downloads: UV_CONCURRENT_DOWNLOADS=4 - 50 concurrent connections will get you rate limited
  4. Bytecode compilation: UV_COMPILE_BYTECODE=1 - faster startup, slightly longer builds
  5. Explicit routing: Tell uv exactly where packages live instead of letting it guess

Q

What happens when workspace dependencies have conflicting versions?

A

uv tries to make everyone happy, usually fails. When dependency hell strikes:

  1. Force it: override-dependencies (under [tool.uv]) to beat packages into submission
  2. Give up: Split incompatible crap into separate workspaces
  3. Selective updates: uv lock --upgrade-package package-name - update one thing at a time

Error messages got better in recent versions, but "dependency conflict" errors are still a nightmare to debug. Good luck figuring out why package A version 2.1 conflicts with package B version 1.3 through some transitive dependency chain.

Q

How do I migrate from Poetry to uv workspaces?

A

Migration steps for existing Poetry projects:

  1. Convert pyproject.toml: Most Poetry configuration translates directly to uv
  2. Create workspace structure: Add [tool.uv.workspace] to the root project
  3. Update CI/CD: Replace poetry install with uv sync
  4. Handle poetry.lock: Run uv lock to generate uv.lock from pyproject.toml

The process typically takes 1-2 days for most projects, with the main effort in CI/CD pipeline updates.
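A community tool, migrate-to-uv, automates most of step 1. Treat this as a sketch and review the converted pyproject.toml afterwards:

```shell
## Convert Poetry metadata to uv's format (community tool, run in the project root)
uvx migrate-to-uv

## Regenerate the lockfile and verify the environment still works
uv lock
uv sync
uv run pytest
```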

Q

Can I use uv with corporate proxy servers?

A

Yes, uv honors the standard proxy environment variables:

export HTTP_PROXY="http://proxy.company.com:8080"
export HTTPS_PROXY="http://proxy.company.com:8080"
export NO_PROXY="localhost,127.0.0.1,.company.com"

## For authenticated proxies
export HTTP_PROXY="http://username:password@proxy.company.com:8080"

Corporate CA certificates should be installed system-wide or specified via SSL_CERT_FILE.

Q

How do I handle dependency updates across a large workspace?

A

Use selective update strategies:

## Update all dependencies
uv lock --upgrade

## Update a specific package only
uv lock --upgrade-package requests

For large workspaces, test updates on a subset of packages first using uv sync --package package-name.

Q

What's the recommended approach for environment-specific configurations?

A

Layer configurations from most general to most specific:

  1. Global config: /home/user/.config/uv/uv.toml (user defaults)
  2. Project config: uv.toml in the repository root (team shared)
  3. Environment variables: UV_* (deployment specific)
  4. CLI flags: Override everything for specific commands

This allows teams to share common settings while allowing deployment-specific overrides.
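The precedence is simply "later layer wins." A toy sketch of how the layers combine (keys are illustrative; this is not uv's real config parser):

```python
# Merge config layers, most general first - later layers win per key.
def merge_layers(*layers: dict) -> dict:
    merged: dict = {}
    for layer in layers:
        merged.update(layer)
    return merged

global_cfg = {"index-url": "https://pypi.org/simple/", "compile-bytecode": False}
project_cfg = {"index-url": "https://pypi.company.com/simple/"}
env_cfg = {"compile-bytecode": True}  # e.g. UV_COMPILE_BYTECODE=1

effective = merge_layers(global_cfg, project_cfg, env_cfg)
print(effective)
# {'index-url': 'https://pypi.company.com/simple/', 'compile-bytecode': True}
```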

Q

How do I troubleshoot authentication failures with private repositories?

A

Enable verbose logging and check common issues:

uv pip install -v private-package

Common authentication problems:

  1. Token expiration - refresh authentication tokens
  2. Network issues - check proxy and DNS configuration
  3. Certificate problems - verify CA certificate installation
  4. Permission issues - check that the token has package read permissions

The uv auth login command (if your uv version has it) can help with interactive authentication setup.

Q

Can uv replace both pip and Poetry in enterprise environments?

A

Mostly, but keep pip around for when shit hits the fan:

  • Package management: uv pip is faster than pip, usually works the same way
  • Project stuff: uv init, uv add, uv sync do what Poetry did, but faster
  • Python versions: uv python install works if IT doesn't block downloads
  • Tool isolation: uvx replaces pipx and doesn't suck as much

The problem is uv is new. When something weird breaks, Stack Overflow has 500 answers for pip/Poetry problems and maybe 3 for uv. You'll be debugging alone.

Q

Why does my workspace build fail with import errors?

A

Python imports are fucked, as usual. First check if it's uv's fault by testing with pip install -e . - if that works, uv is doing something stupid.

The usual suspects:

  • Missing __init__.py files (because Python is still stuck in 2001)
  • Your imports don't match what's actually in pyproject.toml
  • Nuclear option: delete .venv and uv.lock, start over
  • Workspace config is wrong - check the root pyproject.toml
  • Circular imports (good luck with that nightmare)

Performance Optimization Strategies

After configuring authentication and workspaces, performance optimization becomes the next priority. uv provides significant speed improvements over Poetry and pip, with additional tuning opportunities for specific environments. Some optimizations provide measurable benefits while others offer marginal improvements.

Cache Architecture & Optimization

uv maintains several cache layers that can be tuned for different environments:

Global package cache (~/.cache/uv/ on Unix, %LOCALAPPDATA%\uv\cache\ on Windows) stores downloaded packages and built wheels. Without management, the cache grows indefinitely and can consume significant disk space:

## Configure cache on faster storage (if you have it)
export UV_CACHE_DIR="/fast-ssd/uv-cache"

## There's no built-in size cap - prune periodically (or your disk will cry)
uv cache prune

## Network storage? Good luck with this
export UV_LINK_MODE="copy"  # Avoid hard links across filesystems

Cache corruption can cause uv sync failures with unclear error messages. uv cache clean resolves most cache-related issues.
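To see whether the cache really is the disk hog before blaming it, a stdlib-only walk (the default Unix path is an assumption — adjust for your platform):

```python
# Sum the size of every file under a directory tree.
from pathlib import Path

def dir_size_bytes(root: Path) -> int:
    """Total bytes of regular files under root; 0 if the path doesn't exist."""
    if not root.exists():
        return 0
    return sum(p.stat().st_size for p in root.rglob("*") if p.is_file())

cache = Path.home() / ".cache" / "uv"  # default location on Unix
print(f"uv cache: {dir_size_bytes(cache) / 1e9:.2f} GB")
```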

Build cache optimization for packages with native extensions:

## Use more build workers for packages with C extensions
export UV_CONCURRENT_BUILDS=8

## Pre-compile bytecode for common packages in CI
uv pip install --compile-bytecode numpy scipy pandas  # Pre-build common data science stack

CI-specific cache strategies use the uv cache prune --ci command:

## GitHub Actions optimization
- name: Restore uv cache
  uses: actions/cache@v4
  with:
    path: ~/.cache/uv
    key: uv-${{ runner.os }}-${{ hashFiles('uv.lock') }}

- name: Install dependencies
  run: uv sync

- name: Optimize cache for CI
  run: uv cache prune --ci  # Removes unnecessary cache entries

Network Performance Optimization

Concurrent download tuning based on network capacity and server limits:

## For high-bandwidth, low-latency networks
export UV_CONCURRENT_DOWNLOADS=16

## For corporate networks with bandwidth limits
export UV_CONCURRENT_DOWNLOADS=4

## For unreliable networks
export UV_CONCURRENT_DOWNLOADS=2
export UV_HTTP_TIMEOUT=300

Package index optimization reduces latency through strategic mirror usage:

[tool.uv.pip]
## Use geographically closer mirrors
index-url = "https://pypi-mirror.company.com/simple/"
extra-index-url = [
    "https://pypi.org/simple/",  # Fallback to official PyPI
]

Memory Usage Optimization

Dependency resolution can be memory-hungry, and uv has no hard memory-limit setting - in constrained environments, dial down concurrency instead:

## Reduce concurrent operations in memory-constrained environments
export UV_CONCURRENT_DOWNLOADS=2
export UV_CONCURRENT_BUILDS=2
export UV_CONCURRENT_INSTALLS=2

Large workspace optimization through selective syncing (uv lock always resolves the whole workspace):

## Only install dependencies for specific packages
uv sync --package api-gateway

## Skip development dependencies in production builds
uv sync --no-dev --package api-gateway

Build Performance for Native Packages

Wheel cache pre-population for common packages with C extensions:

## Pre-build wheels for common data science packages (classic pip still does this)
pip wheel numpy scipy pandas matplotlib scikit-learn \
    --wheel-dir /shared/wheel-cache

## Point uv at the pre-built wheels
export UV_FIND_LINKS="/shared/wheel-cache"

Platform-specific source builds for teams that need them:

## Build a wheel of your own package
uv build --wheel

## Force an optimized source build of a dependency
export CFLAGS="-march=native -O3"  # Optimize for the build machine
uv pip install --no-binary :all: some-native-package

Advanced Configuration Patterns

Environment-specific performance profiles:

## uv.toml - development profile
compile-bytecode = false  # Faster installs during development
link-mode = "hardlink"    # Fastest installs

## uv.toml has no per-profile sections; override for production via env vars:
## UV_COMPILE_BYTECODE=1 UV_LINK_MODE=copy UV_CACHE_DIR=/opt/uv-cache uv sync

Falling back across multiple PyPI mirrors (uv indexes have no priority field - they're consulted in the order defined):

[[tool.uv.index]]
name = "primary-mirror"
url = "https://pypi-1.company.com/simple/"

[[tool.uv.index]]
name = "secondary-mirror"
url = "https://pypi-2.company.com/simple/"

[[tool.uv.index]]
name = "fallback"
url = "https://pypi.org/simple/"
default = true  # The default index is checked last

Monitoring & Observability

Performance metrics collection for optimization analysis:

## Enable detailed output for timing analysis
uv sync -v 2>&1 | tee uv-performance.log

## Extract key metrics
grep -E "(Resolved|Downloaded|Installed|Built)" uv-performance.log

Build time analysis identifies optimization opportunities:

## Profile dependency resolution time
time uv lock --verbose

## Profile installation time
time uv sync --verbose

## Profile individual package build times
uv pip install -v difficult-package

Production Deployment Optimization

Container image optimization builds on uv's speed advantages:

## Multi-stage build with optimized uv usage
FROM python:3.11-slim AS builder

## Install uv with specific optimizations for build environment
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
ENV UV_COMPILE_BYTECODE=1 \
    UV_LINK_MODE=copy \
    UV_CACHE_DIR=/tmp/uv-cache \
    UV_CONCURRENT_DOWNLOADS=8

## Cache mount for faster subsequent builds
RUN --mount=type=cache,target=/tmp/uv-cache \
    uv pip install --system --compile-bytecode numpy pandas

## Production runtime stage
FROM python:3.11-slim AS runtime
COPY --from=builder /usr/local/lib/python3.11/site-packages /usr/local/lib/python3.11/site-packages
ENV PYTHONUNBUFFERED=1 PYTHONDONTWRITEBYTECODE=1

Deployment pipeline optimization leverages uv's deterministic builds:

## Export pinned requirements from the lockfile
uv export --format requirements-txt --no-hashes > requirements-prod.txt

## Rebuild an environment from it and compare what actually got installed
uv pip install -r requirements-prod.txt
uv pip freeze > installed.txt
## Spot-check the versions match - the two formats differ, so a raw diff won't be empty
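
Since a raw diff of those two files is noisy, compare parsed pins instead — a minimal sketch:

```python
# Parse "name==version" pins and report drift between two requirement sets.
def parse_pins(text: str) -> dict[str, str]:
    pins: dict[str, str] = {}
    for line in text.splitlines():
        line = line.split("#")[0].strip()      # drop comments
        if "==" in line:
            name, _, version = line.partition("==")
            # Normalize case, strip environment markers after ';'
            pins[name.strip().lower()] = version.split(";")[0].strip()
    return pins

exported = "requests==2.31.0\nurllib3==2.1.0  # pinned\n"
frozen = "Requests==2.31.0\nurllib3==2.2.0\n"

want, got = parse_pins(exported), parse_pins(frozen)
drift = {name: (v, got.get(name)) for name, v in want.items() if got.get(name) != v}
print(drift)  # {'urllib3': ('2.1.0', '2.2.0')}
```

Swap the sample strings for the contents of requirements-prod.txt and installed.txt.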

Advanced Troubleshooting

Performance regression analysis when builds become slow:

## Compare lock file resolution times
time uv lock --resolution=highest  # Try more aggressive resolution
time uv lock --resolution=lowest   # Try conservative resolution

## Identify problematic packages
uv tree                          # Display the dependency tree
uv pip show problematic-package  # Check package metadata

Most optimizations provide incremental improvements - seconds saved on builds that take minutes. The primary performance benefit comes from reliable dependency resolution and reduced debugging time, not micro-optimization of cache settings.

uv performs well out of the box for most use cases. Focus on reliable configuration - authentication, workspaces, CI integration - before pursuing performance optimizations. Measure actual performance problems before implementing complex tuning. A consistent, working build is more valuable than marginal speed improvements that introduce reliability issues.