Built-in Database Drivers

Your typical Node.js database setup requires 4+ packages just for PostgreSQL: pg, @types/pg, pg-pool, and pg-format. When Node.js 20 dropped, half my projects broke because node-gyp couldn't rebuild the native modules. Spent 3 hours fixing compilation errors.

Bun v1.2.21 ships with database drivers built-in. Database connections are noticeably faster - my API responses went from ~300ms to ~120ms on my usual test setup. Not scientific, but real enough.

Single API for Multiple Databases

Instead of learning separate APIs for pg, mysql2, and better-sqlite3, Bun uses one consistent interface:


import { SQL } from 'bun';

// PostgreSQL - same syntax as before, just faster
const postgres = new SQL('postgres://user:pass@localhost/db');

// MySQL - works with your existing Docker setup
const mysql = new SQL('mysql://user:pass@localhost/db');

// SQLite - perfect for local dev and testing
const sqlite = new SQL('sqlite:///path/to/database.db');

// Same query for all three (finally!)
const users = await postgres`SELECT * FROM users WHERE active = ${true}`;

Connections feel snappier. Complex queries with joins don't show much improvement, but simple CRUD operations are noticeably faster. Your mileage will vary depending on network latency and query complexity.

SQLite: No More Compilation Hell

SQLite with Node.js is a fucking nightmare. better-sqlite3 breaks every time Node updates. On M1 Macs, rebuilding takes 5+ minutes if it works at all. Docker builds fail with:

gyp ERR! build error
gyp ERR! stack Error: `make` failed with exit code: 2

I've wasted entire afternoons debugging these compilation issues. Alpine Linux containers are especially painful.

Bun's built-in SQLite requires no compilation or native dependencies:

import { Database } from 'bun:sqlite';

const db = new Database('app.db'); // Creates file if missing

// db.query() returns a cached prepared statement - no manual optimization
const getUser = db.query('SELECT * FROM users WHERE id = ?');
const user = getUser.get(123);

// Synchronous transactions that don't suck
db.transaction(() => {
  db.run('INSERT INTO logs (message) VALUES (?)', ['User logged in']);
  db.run('UPDATE users SET last_seen = ? WHERE id = ?', [Date.now(), 123]);
})(); // Either all succeed or all fail

SQLite is fast enough for most apps unless you need heavy concurrent writes. For local dev and testing, it's perfect - no Docker PostgreSQL setup bullshit.

ORM Reality Check

Prisma still needs Node.js for prisma generate, which defeats the point of switching to Bun. Works fine but you're running two runtimes. The setup guide explains the pain.
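The dual-runtime setup tends to look something like this in package.json - script names here are made up, but the split is real: codegen goes through the Node-based Prisma CLI, everything else runs on Bun:

```json
{
  "scripts": {
    "generate": "npx prisma generate",
    "dev": "bun run --watch src/index.ts",
    "start": "bun run src/index.ts"
  }
}
```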

Drizzle is what you actually want. Works directly with Bun.SQL, generates readable SQL, and doesn't make you hate ORMs. The benchmarks show it's barely slower than raw SQL.

import { drizzle } from 'drizzle-orm/bun-sql';
import { eq } from 'drizzle-orm';
import { usersTable } from './schema'; // your Drizzle table definitions

const db = drizzle('postgres://localhost/mydb');
const users = await db.select().from(usersTable).where(eq(usersTable.active, true));
// Generates proper SQL, doesn't do weird magic

TypeORM is slow everywhere. Decorators and reflection make everything sluggish. Avoid unless you're masochistic.

Migrations Don't Totally Suck

Drizzle Kit generates migrations that make sense instead of cryptic bullshit. File I/O is fast enough that reading raw .sql files works fine for simple setups.

When shit breaks, you're debugging one thing (Bun) instead of figuring out which of your 6 database packages is incompatible with the others.

Additional Resources

Bun Database Options Comparison

| Database Option | Performance | Setup | Types | What Breaks |
|---|---|---|---|---|
| Raw Bun.SQL + PostgreSQL | Fast | Easy | Decent | Bun bugs (rare) |
| Raw Bun.SQL + MySQL | Fast | Easy | Decent | Connection string issues |
| bun:sqlite | Very fast | Just works | Good | Docker file permissions |
| Prisma + Bun | Slow but fine | Annoying | Excellent | Code generation |
| Drizzle + Bun.SQL | Fast | Clean | Great | Your schema mistakes |
| TypeORM + Bun | Slow | Pain in ass | Meh | Everything |

Production Shit That Actually Matters

Database connections will fail at 3am when you're oncall. Here's what breaks and how to fix it without losing your mind.

Connection Pooling

Pooling happens automatically, but you'll hit connection limits under load. PostgreSQL defaults to 100 connections, and you'll get this lovely error:

Error: sorry, too many clients already

This killed our Black Friday traffic last year. Set explicit limits before you find out the hard way:

import { SQL } from 'bun';

// Basic setup - pooling handled automatically
const db = new SQL('postgres://user:pass@localhost/db');

// Production setup - cap the pool before you hit server limits
const pooled = new SQL({
  url: 'postgres://user:pass@localhost/db',
  max: 20,         // connections per process
  idleTimeout: 30, // seconds before idle connections close
});

// Health check for load balancers
async function healthCheck() {
  try {
    await db`SELECT 1`; // Fails fast on connection issues
    return { database: 'healthy' };
  } catch (error) {
    console.error('Database connection failed:', error.message);
    return { database: 'down', error: error.message };
  }
}

Bun uses less memory per connection than Node.js, so you can run more concurrent connections. We handle 50+ connections per process without issues, but this varies by app.

Error Handling: Shit Breaks

Networks drop. Servers restart. Your retry logic needs to handle this or you're fucked:

async function queryWithRetry(query, values, maxRetries = 3) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await db.unsafe(query, values); // raw query string + positional params
    } catch (error) {
      console.warn(`Attempt ${attempt} failed: ${error.message}`);
      
      // Don't retry stupid mistakes
      if (error.code === '42P01' || error.code === '23505') {
        throw error; // Table doesn't exist, constraint violation = your fault
      }
      
      // Handle specific PostgreSQL error codes correctly
      // Reference: https://www.enterprisedb.com/blog/application-high-availability-and-resiliency-steps-improve-transaction-retry
      
      // Exponential backoff because hammering won't help
      if (attempt < maxRetries) {
        const delay = Math.min(1000 * Math.pow(2, attempt - 1), 5000);
        await Bun.sleep(delay);
      } else {
        throw new Error(`Gave up after ${maxRetries} attempts: ${error.message}`);
      }
    }
  }
}

Transactions: Don't Lose People's Money

Transactions are faster in Bun, but they'll still fuck you if you don't handle concurrency properly. This cost us 2 hours when a race condition let someone withdraw money twice:

// Don't lose people's money
async function transferMoney(fromAccount, toAccount, amount) {
  return await db.begin(async (tx) => {
    // FOR UPDATE locks the row - critical for money stuff
    const balance = await tx`
      SELECT balance FROM accounts WHERE id = ${fromAccount} FOR UPDATE
    `;
    
    if (!balance[0]) {
      throw new Error('Account not found'); // Auto-rollback
    }
    if (balance[0].balance < amount) {
      throw new Error('Insufficient funds'); // Auto-rollback
    }
    
    await tx`UPDATE accounts SET balance = balance - ${amount} WHERE id = ${fromAccount}`;
    await tx`UPDATE accounts SET balance = balance + ${amount} WHERE id = ${toAccount}`;
    
    return { success: true, transferAmount: amount };
  });
}

// For SQLite - handle write concurrency or it will bite you
async function atomicUpdate(userId, data) {
  return db.transaction(() => {
    // SQLite transactions are sync - faster but different from PostgreSQL
    const user = db.query('SELECT * FROM users WHERE id = ?').get(userId);
    if (!user) throw new Error('User not found');
    
    return db.query('UPDATE users SET data = ?, updated_at = ? WHERE id = ?')
      .run(JSON.stringify(data), Date.now(), userId);
  })(); // Don't forget the () at the end
}

Monitoring: Don't Overthink It

Start with basic query timing. Log slow shit. Fix it. Don't build a fucking NASA monitoring system on day one:

// Simple query timing that actually works
async function timedQuery(query, ...values) {
  const start = performance.now();
  
  try {
    const result = await db.unsafe(query, values);
    const duration = performance.now() - start;
    
    // Log slow queries - adjust threshold based on your app
    // Database-level monitoring: https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/USER_DatabaseInsights.SlowSQL.html
    if (duration > 1000) {
      console.warn(`Slow query (${duration.toFixed(0)}ms): ${query.toString().slice(0, 50)}...`);
    }
    
    return result;
  } catch (error) {
    const duration = performance.now() - start;
    console.error(`Query failed (${duration.toFixed(0)}ms): ${error.message}`);
    throw error;
  }
}

Loading a Million Rows Will Kill Your Server

Don't do it. Your server will run out of memory and die. Page that shit:

// Process large datasets without dying
async function processBigTable(callback, batchSize = 1000) {
  let offset = 0;
  let hasMore = true;
  
  while (hasMore) {
    const batch = await db`
      SELECT * FROM huge_table 
      ORDER BY id 
      LIMIT ${batchSize} OFFSET ${offset}
    `;
    
    if (batch.length === 0) {
      hasMore = false;
    } else {
      for (const row of batch) {
        await callback(row);
      }
      offset += batchSize;
      
      // Give garbage collector a chance to breathe
      if (offset % 10000 === 0) {
        await new Promise(resolve => setImmediate(resolve));
      }
    }
  }
}

Migrations: Don't Break Production

Test your migrations. Use transactions. Have a rollback plan or you'll have a bad time:

// Simple migration runner that won't fuck up your data
async function runMigrations() {
  // Prevent multiple deployments from running migrations simultaneously
  const lockAcquired = await db`
    INSERT INTO migration_lock (locked_at) VALUES (NOW())
    ON CONFLICT DO NOTHING
    RETURNING locked_at
  `;
  
  if (lockAcquired.length === 0) {
    console.warn('Another migration is running, skipping...');
    return;
  }
  
  try {
    const pending = await getPendingMigrations();
    console.log(`Running ${pending.length} migrations...`);
    
    for (const migration of pending) {
      console.log(`Running: ${migration.name}`);
      
      await db.begin(async (tx) => {
        await tx.unsafe(migration.sql); // DDL operations
        await tx`INSERT INTO migrations (name) VALUES (${migration.name})`;
      });
    }
  } finally {
    await db`DELETE FROM migration_lock`;
  }
}

Connection overhead was killing our API response times. Switching to Bun made pages load noticeably faster. Users stopped bitching about performance, which was nice.


Frequently Asked Questions

Q: Will my existing PostgreSQL setup work?

A: Yeah. Your database doesn't change. Just swap import pg from 'pg' to import { sql } from 'bun' and change $1, $2 parameters to template literals. That's it.

Q: Does connection pooling actually work?

A: Works fine. We run 50+ connections per process without issues. Cap the pool size explicitly (for example, max: 20 in the SQL constructor options) or you'll hit PostgreSQL's default limit of 100.

Q: Which ORM won't make me want to quit?

A: Drizzle. Good types, fast queries, readable SQL. Prisma works but you still need Node.js for codegen. TypeORM is slow and painful.

Q: Is SQLite actually faster or just hype?

A: Yeah, it's faster. No compilation bullshit means your tests start immediately instead of waiting for better-sqlite3 to rebuild every Node update:

Error: Cannot find module '/node_modules/better-sqlite3/build/Release/better_sqlite3.node'

That error cost me 3 hours last Tuesday.

Q: Can I drop Bun into my Prisma setup?

A: Yeah. Your schema doesn't change. Keep your schema.prisma and generated client. You still need Node.js for prisma generate though, which sucks.

Q: Does MySQL work or is it broken?

A: Works fine since v1.2.21, and faster than mysql2. Point Bun.SQL at a mysql://user:pass@host/database connection string and you're done.

Q: How do I not screw up database migrations?

A: Use Drizzle Kit for type-safe migrations, or write a simple migration runner that reads .sql files in transactions. Always use a migration lock table to prevent multiple deployments from running migrations at the same time. Test your rollback plan.

Q: Do transactions work differently in Bun?

A: Same SQL syntax, just faster execution. For SQLite, transactions are synchronous by default, which is actually better for performance. PostgreSQL transactions work exactly like you expect. Always wrap money operations in transactions or you'll have a bad time.

Q: Does this work in Docker or will it break weirdly?

A: Works fine in Docker. Use the --init flag to prevent signal handling weirdness with database connections. For SQLite, mount your database file as a volume or it disappears when the container restarts. PostgreSQL/MySQL connections work the same as Node.js.
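The SQLite volume setup described above looks roughly like this - the image tag, paths, and DB_PATH variable are illustrative, not prescribed:

```dockerfile
# Hypothetical Dockerfile for a Bun app using bun:sqlite
FROM oven/bun:1.2
WORKDIR /app
COPY . .
RUN bun install --frozen-lockfile
# The app reads its database path from DB_PATH; /data gets mounted at runtime
ENV DB_PATH=/data/app.db
CMD ["bun", "run", "src/index.ts"]
```

Run it with something like docker run --init -v "$(pwd)/data:/data" my-app, so the database file survives container restarts and signals get forwarded properly.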

Q: Is TypeScript support actually good or half-assed?

A: Actually good. Bun.SQL has proper types for query results. Drizzle gives you compile-time type safety. Prisma generates comprehensive types. Even raw SQL template literals get TypeScript support. No more any types for database results.

Q: How much slower are ORMs compared to raw SQL?

A: Drizzle barely slows things down. Prisma adds noticeable overhead, though Prisma on Bun is still faster than Prisma on Node.js. TypeORM is slow everywhere. Raw Bun.SQL is fastest but you're writing SQL by hand. Pick your poison based on team preference.

Q: How do I know when queries are being slow?

A: Start simple: wrap queries in performance.now() and log anything over 1 second. For advanced monitoring, enable PostgreSQL's slow query log or use your existing APM tools. Don't overcomplicate it early.
Q: Does PgBouncer work or will it conflict with Bun's pooling?

A: PgBouncer works fine and doesn't conflict with Bun's built-in pooling. For high-traffic apps, use both: PgBouncer for database-level pooling, Bun.SQL for application-level management. It's actually the recommended setup.
Q: What about serverless? Does this work in Lambda?

A: SQLite works great in serverless: single-file databases are perfect for edge functions. For PostgreSQL/MySQL, you need connection pooling services like Supabase Pooler or PlanetScale to handle cold starts, otherwise you'll hit connection limits.

Q: I have a ton of Node.js database code. How painful is migration?

A: Not bad. Change import pg from 'pg' to import { sql } from 'bun' and swap parameterized queries for template literals. Your SQL stays the same, just different syntax for parameters. Both approaches prevent SQL injection fine. Took me about 2 hours to migrate a medium-sized API, most of which was find-and-replace.
