The most common Gatsby build failure isn't complexity - it's running out of memory. The Node.js documentation explains heap limits, but Gatsby pushes past them. If you're seeing this error, you're not alone - Stack Overflow has hundreds of related issues:
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
This happens because Gatsby loads everything into memory at once instead of streaming like modern frameworks such as Next.js. The V8 JavaScript engine wasn't designed for this pattern. Here's how to survive until you can escape.
The Memory Limit Bandaid
Your first line of defense is increasing Node's memory limit. Most developers try 4GB and watch it still crash. Skip the baby steps:
NODE_OPTIONS=\"--max-old-space-size=8192\"
That gives Node 8GB of heap space. Still crashing? Try 16GB:
NODE_OPTIONS=\"--max-old-space-size=16384\"
If you need more than 16GB to build a website, your framework is fundamentally broken and you need to migrate immediately. Standard GitHub-hosted Actions runners only give you 7GB of RAM, so this becomes a real blocker in CI.
For GitHub Actions, add this to your workflow:
- name: Build site
  env:
    NODE_OPTIONS: "--max-old-space-size=8192"
  run: npm run build
Memory Profiling: Find What's Actually Leaking
Want to see exactly where your memory goes? Run this and watch the carnage:
node --inspect node_modules/.bin/gatsby build
Open Chrome DevTools (chrome://inspect), click your Node target, open the Memory tab, and take heap snapshots during the build. The memory profiling guide explains the process. You'll see memory climbing from 500MB to 6GB+ and never coming down.
The leak is usually in the source-and-transform-nodes phase. Gatsby's GraphQL layer loads all your data at once and holds references even after transforming. Nothing you can do about it - just increase memory and pray.
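Still, if you want rough numbers without clicking through DevTools, a few lines in gatsby-node.js will print heap usage at a handful of lifecycle points. The hooks below are standard Gatsby Node APIs; the logging itself is just a sketch, not something Gatsby ships:
// gatsby-node.js - minimal heap logging sketch
const logHeap = (label) => {
  const { heapUsed, heapTotal } = process.memoryUsage();
  const mb = (bytes) => Math.round(bytes / 1024 / 1024);
  console.log(`[heap] ${label}: ${mb(heapUsed)}MB used / ${mb(heapTotal)}MB allocated`);
};

exports.onPreBootstrap = () => logHeap('onPreBootstrap');   // before sourcing starts
exports.onPostBootstrap = () => logHeap('onPostBootstrap'); // after source and transform
exports.onPreBuild = () => logHeap('onPreBuild');           // before pages are built
exports.onPostBuild = () => logHeap('onPostBuild');         // after HTML is written
A big jump between onPreBootstrap and onPostBootstrap confirms the data layer is where the memory goes.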
Plugin Memory Bombs
Some plugins are memory grenades waiting to explode:
gatsby-plugin-sharp is the worst offender. If you have 500+ images, it'll eat 6GB+ during processing. Each image transformation loads the full image into memory, processes all sizes, then sometimes forgets to clean up.
gatsby-source-contentful with 10k+ entries will consume 2GB+ just parsing the GraphQL responses. The plugin loads every single entry into memory before creating nodes.
gatsby-transformer-remark with syntax highlighting can leak memory on large codebases. Each code block gets processed through Prism.js and the AST nodes stick around.
Check your memory usage by plugin:
gatsby build --verbose 2>&1 | grep "source and transform"
If you see one plugin taking 90% of the time, that's your memory hog.
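Another rough signal is which node types pile up in the data layer. This is a sketch using Gatsby's onCreateNode and onPostBootstrap hooks; the tally is my own addition, not a Gatsby feature, but tens of thousands of ImageSharp or Contentful nodes point straight at the hungry plugin:
// gatsby-node.js - sketch: tally created nodes by type
const nodeCounts = {};

exports.onCreateNode = ({ node }) => {
  const type = node.internal.type;
  nodeCounts[type] = (nodeCounts[type] || 0) + 1;
};

exports.onPostBootstrap = () => {
  const top = Object.entries(nodeCounts)
    .sort((a, b) => b[1] - a[1])
    .slice(0, 10);
  console.log('Top node types:', top);
};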
The Cache Savior (When It Works)
Keep your .cache and public folders between builds. This is the difference between 47 minutes and 6 minutes:
## DON'T do this in CI/CD
gatsby clean && gatsby build
## DO this instead
gatsby build
For Netlify, install the Essential Gatsby Plugin:
[[plugins]]
package = \"@netlify/plugin-gatsby\"
For GitHub Actions, cache these directories:
- name: Cache Gatsby
  uses: actions/cache@v3
  with:
    path: |
      .cache
      public
    key: gatsby-${{ hashFiles('package-lock.json') }}
Warning: The cache sometimes gets corrupted and makes builds slower. If builds suddenly take 2x longer, delete .cache and start over. Usually happens after Gatsby version updates or plugin changes.
Image Optimization: Stop Processing Giants
Your 8MB photographer images don't need to be 8MB in git. Resize them before committing:
## Install sharp-cli globally
npm install -g sharp-cli
## Resize all JPEGs to a max width of 2000px, writing output to ./resized/
find ./src/images -name "*.jpg" -exec sharp -i {} -o ./resized/ resize 2000 \;
Or use squoosh.app to manually optimize before committing. A 5MB image that gets resized to 1200px for web display should be 200KB max.
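If you'd rather script the resize with Node, the sharp library itself (already in node_modules as a Gatsby dependency) can batch-process a folder. This is a sketch, not an official tool: the directory, width, and quality are assumptions, and it needs Node 20+ for the recursive readdir option.
// scripts/resize-images.js - shrink oversized JPEGs in place (sketch)
const fs = require('fs/promises');
const path = require('path');
const sharp = require('sharp');

const IMAGE_DIR = './src/images'; // adjust to wherever your images live
const MAX_WIDTH = 2000;

async function run() {
  const entries = await fs.readdir(IMAGE_DIR, { recursive: true });
  for (const entry of entries.filter((f) => /\.jpe?g$/i.test(f))) {
    const file = path.join(IMAGE_DIR, entry);
    const { width } = await sharp(file).metadata();
    if (!width || width <= MAX_WIDTH) continue;
    // Resize into a buffer first, then overwrite: sharp can't write over the file it's reading.
    const resized = await sharp(file)
      .resize({ width: MAX_WIDTH, withoutEnlargement: true })
      .jpeg({ quality: 80, mozjpeg: true })
      .toBuffer();
    await fs.writeFile(file, resized);
    console.log(`resized ${file}: ${width}px -> ${MAX_WIDTH}px`);
  }
}

run().catch((err) => { console.error(err); process.exit(1); });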
Pro tip: Set up a pre-commit hook to resize images automatically:
#!/bin/sh
# Block the commit if any JPEG over 1MB has been added
large=$(find . -name "*.jpg" -size +1M)
[ -z "$large" ] || { printf 'Images over 1MB:\n%s\n' "$large"; exit 1; }
The Nuclear Option: Skip Image Processing
If you're using an external image service like Cloudinary or Imgix, you can skip Gatsby's image processing entirely:
// gatsby-config.js
module.exports = {
  plugins: [
    // Remove gatsby-plugin-sharp and gatsby-transformer-sharp
    // Use gatsby-transformer-cloudinary instead
    {
      resolve: 'gatsby-transformer-cloudinary',
      options: {
        cloudName: 'your-cloud-name',
        apiKey: 'your-api-key',
        apiSecret: 'your-api-secret',
      },
    },
  ],
}
Build time drops from 30 minutes to 3 minutes because you're not processing thousands of images locally. Images load from Cloudinary's CDN with automatic optimization. The Gatsby image docs don't mention this workaround, but performance case studies show external image services can be faster than local processing.
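The reason the build gets cheap is that the transformation moves into the image URL itself. Here's a minimal sketch of what that looks like if you build the URL by hand (cloud name and public ID are placeholders; w_, q_auto, and f_auto are standard Cloudinary URL transformations):
// Build a Cloudinary delivery URL that resizes and optimizes on the CDN,
// so nothing is processed during `gatsby build`.
function cloudinaryUrl(cloudName, publicId, width = 1200) {
  const transforms = `w_${width},q_auto,f_auto`; // resize, auto quality, auto format
  return `https://res.cloudinary.com/${cloudName}/image/upload/${transforms}/${publicId}`;
}

// e.g. <img src={cloudinaryUrl('your-cloud-name', 'hero.jpg', 800)} alt="..." />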
This is how you survive until migration. It's not pretty, but it works. Migration guides are available when you're ready to escape.