Your typical Node.js database setup requires 4+ packages just for PostgreSQL: pg, @types/pg, pg-pool, and pg-format. When Node.js 20 dropped, half my projects broke because node-gyp couldn't rebuild the native modules. Spent 3 hours fixing compilation errors.
Bun v1.2.21 ships with database drivers built-in. Database connections are noticeably faster - my API responses went from ~300ms to ~120ms on my usual test setup. Not scientific, but real enough.
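The zero-config path is the nicest part. A minimal sketch, assuming you've exported DATABASE_URL (Bun's default client reads connection details from environment variables) and have a users table lying around:
import { sql } from 'bun';
// Default client - no connection string in code, Bun picks it up from env vars like DATABASE_URL
const rows = await sql`SELECT * FROM users LIMIT ${10}`;
console.log(rows);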
Single API for Multiple Databases
Instead of learning separate APIs for pg, mysql2, and better-sqlite3, Bun uses one consistent interface:
import { SQL } from 'bun';
// PostgreSQL - same syntax as before, just faster
const postgres = new SQL('postgres://user:pass@localhost/db');
// MySQL - works with your existing Docker setup
const mysql = new SQL('mysql://user:pass@localhost/db');
// SQLite - perfect for local dev and testing
const sqlite = new SQL('sqlite:///path/to/database.db');
// Same query for all three (finally!)
const users = await postgres`SELECT * FROM users WHERE active = ${true}`;
Connections feel snappier. Complex queries with joins don't show much improvement, but simple CRUD operations are noticeably faster. Your mileage will vary depending on network latency and query complexity.
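If you want to check that on your own stack, a dumb timing loop is enough. A sketch, not a benchmark harness - it assumes the same users table and times a simple primary-key lookup:
import { SQL } from 'bun';
const db = new SQL('postgres://user:pass@localhost/db');
await db`SELECT 1`; // warm up so you're not timing the connection handshake
const start = performance.now();
for (let i = 0; i < 100; i++) {
  await db`SELECT * FROM users WHERE id = ${1}`; // simple CRUD-style lookup
}
console.log(`avg ${(performance.now() - start) / 100} ms per query`);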
SQLite: No More Compilation Hell
SQLite with Node.js is a fucking nightmare. better-sqlite3 breaks every time Node updates. On M1 Macs, rebuilding takes 5+ minutes if it works at all. Docker builds fail with:
gyp ERR! build error
gyp ERR! stack Error: `make` failed with exit code: 2
I've wasted entire afternoons debugging these compilation issues. Alpine Linux containers are especially painful.
Bun's built-in SQLite requires no compilation or native dependencies:
import { Database } from 'bun:sqlite';
const db = new Database('app.db'); // Creates the file if it doesn't exist
// db.query() prepares and caches the statement automatically (no manual optimization)
const getUser = db.query('SELECT * FROM users WHERE id = ?');
const user = getUser.get(123);
// Synchronous transactions that don't suck
const logLogin = db.transaction(() => {
  db.query('INSERT INTO logs (message) VALUES (?)').run('User logged in');
  db.query('UPDATE users SET last_seen = ? WHERE id = ?').run(Date.now(), 123);
});
logLogin(); // Either all succeed or all fail
SQLite is fast enough for most apps unless you need heavy concurrent writes. For local dev and testing, it's perfect - no Docker PostgreSQL setup bullshit.
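For tests specifically, you don't even need a file on disk. A minimal sketch with bun:test and an in-memory database - the table and data are made up for illustration:
import { test, expect } from 'bun:test';
import { Database } from 'bun:sqlite';

test('stores and reads back a user', () => {
  const db = new Database(':memory:'); // throwaway database per test, nothing to clean up
  db.run('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)');
  db.query('INSERT INTO users (name) VALUES (?)').run('Alice');
  const row = db.query('SELECT name FROM users WHERE id = ?').get(1);
  expect(row.name).toBe('Alice');
});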
ORM Reality Check
Prisma still needs Node.js for prisma generate, which defeats the point of switching to Bun. Works fine, but you're running two runtimes. The setup guide explains the pain.
Drizzle is what you actually want. Works directly with Bun.SQL, generates readable SQL, and doesn't make you hate ORMs. The benchmarks show it's barely slower than raw SQL.
import { drizzle } from 'drizzle-orm/bun-sql';
import { eq } from 'drizzle-orm';
import { SQL } from 'bun';
import { usersTable } from './schema'; // your Drizzle table definitions
const db = drizzle({ client: new SQL('postgres://localhost/mydb') });
const users = await db.select().from(usersTable).where(eq(usersTable.active, true));
// Generates proper SQL, doesn't do weird magic
TypeORM is slow everywhere. Decorators and reflection make everything sluggish. Avoid unless you're masochistic.
Migrations Don't Totally Suck
Drizzle Kit generates migrations that make sense instead of cryptic bullshit. File I/O is fast enough that reading raw .sql files works fine for simple setups.
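If you go the raw-files route, the runner is about ten lines. This is a sketch, not a real migration tool - the ./migrations folder is an assumption, nothing tracks which files already ran, and db.unsafe() runs the file contents as raw SQL (fine here because I wrote the files):
import { SQL } from 'bun';
import { readdir } from 'node:fs/promises';

const db = new SQL('postgres://localhost/mydb');
// Apply every .sql file in ./migrations in filename order
const files = (await readdir('./migrations')).filter(f => f.endsWith('.sql')).sort();
for (const file of files) {
  const statements = await Bun.file(`./migrations/${file}`).text();
  await db.unsafe(statements); // raw SQL string, no parameter binding
  console.log(`applied ${file}`);
}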
When shit breaks, you're debugging one thing (Bun) instead of figuring out which of your 6 database packages is incompatible with the others.