Why AI Coders Need to Know This
There's a specific error message that stops vibe coders in their tracks. You've asked Claude or Cursor to build an API route. The code looks clean. You test it locally — everything works. You deploy to Vercel. And then:
```
Error: The edge runtime does not support Node.js 'fs' module.
```
Or this one:
```
Error: The edge runtime does not support Node.js 'net' module.

Import trace for requested module:
./node_modules/pg/lib/connection.js
```
You paste the error back into Claude. It generates a fix. You deploy again. Different error, same pattern. You paste that back in. Rinse and repeat until you've burned forty-five minutes and you're still not sure what's happening.
What's happening is edge computing — and specifically, the gap between what your laptop can do and what the edge runtime allows.
Every major deployment platform is moving toward edge-first. Vercel defaults new Next.js middleware to the edge runtime. Cloudflare Workers runs exclusively at the edge. Netlify Edge Functions are on by default for certain features. This means more and more of your AI-generated code runs in an environment your AI wasn't optimizing for when it wrote the code.
Understanding edge computing doesn't mean you need to become an infrastructure engineer. It means you need to know:
- What the edge runtime is — and what it can't do
- What specific errors to expect — and how to read them
- How to steer your AI — so it writes code that actually runs where you're deploying
That's what this article covers. No computer science degree required — just the mental model you need to stop chasing these errors in circles.
Real Scenario: You Deployed to Vercel and Your Code Ran "at the Edge"
"Build me a Next.js API route that checks if a user is authenticated and, if they are, fetches their profile from my PostgreSQL database and returns it as JSON. Deploy to Vercel."
Claude generates clean code. It imports pg (the standard PostgreSQL client), creates a connection pool, queries the database, and returns the data. You've seen this pattern a hundred times. You push to GitHub, Vercel auto-deploys, you wait for the green checkmark.
Then you hit the API endpoint and the request fails. You check the Vercel function logs:
```
Error: The edge runtime does not support Node.js 'net' module.
You are using a module that is not supported in the Edge Runtime.

Import trace for requested module:
./node_modules/pg/lib/connection.js
./node_modules/pg/lib/client.js
./app/api/profile/route.js
```
Here's what happened step by step:
- Your API route file — which Claude also wrote — had export const runtime = 'edge' at the top
- Vercel saw that line and ran your code in the edge runtime instead of the regular Node.js runtime
- The edge runtime tried to load pg, which internally uses Node.js's net module to make TCP connections
- The edge runtime doesn't have net — so it threw an error and refused to run
Your code isn't wrong. It's in the wrong environment. And that's the entire premise of edge computing: the environment where your code runs is fundamentally different from your laptop.
Let's look at exactly what Claude generated and what each piece does.
What AI Generated
When AI builds apps that deploy to Vercel or Cloudflare, it generates two patterns you need to recognize: edge middleware and edge functions. Here's what each one looks like — and what that critical runtime line actually does.
Edge Middleware
Middleware runs on every request before your page or API route responds. It's the bouncer at the door — it can check credentials, redirect users, set headers, or modify requests before anything else happens. In Next.js, middleware always runs at the edge. There's no opt-out.
```javascript
// middleware.js — this file always runs at the edge in Next.js
import { NextResponse } from 'next/server';

export function middleware(request) {
  // Check for an auth token in cookies
  const token = request.cookies.get('auth-token');

  // If no token and user is trying to access /dashboard, redirect
  if (!token && request.nextUrl.pathname.startsWith('/dashboard')) {
    return NextResponse.redirect(new URL('/login', request.url));
  }

  // Otherwise, let the request continue
  return NextResponse.next();
}

// Which paths this middleware applies to
export const config = {
  matcher: ['/dashboard/:path*', '/settings/:path*']
};
```
This middleware runs on an edge server near the user. If someone in Berlin tries to visit /dashboard without a token, the Berlin edge server handles the redirect — not a server in the US. The user gets a response in milliseconds instead of hundreds of milliseconds.
Notice what this middleware does not do: it doesn't touch a database, read a file, or use any Node.js module. It only reads cookies and returns a redirect. That's exactly the kind of logic that belongs at the edge.
Edge Functions
An edge function is an API route or server function that's been explicitly told to run at the edge. In Next.js, you do this with one line at the top of the file:
```javascript
// app/api/geo/route.js
export const runtime = 'edge'; // ← This is the magic line

export async function GET(request) {
  // Edge functions have access to geolocation headers
  // Vercel automatically adds these to every request
  const country = request.headers.get('x-vercel-ip-country') || 'Unknown';
  const city = request.headers.get('x-vercel-ip-city') || 'Unknown';

  return new Response(
    JSON.stringify({
      message: `Hello from ${city}, ${country}!`,
      timestamp: new Date().toISOString()
    }),
    {
      headers: { 'Content-Type': 'application/json' }
    }
  );
}
```
This edge function uses only Web APIs — Request, Response, JSON, and Date. It reads headers and returns a response. No Node.js modules, no database, no file system. It will run at the edge without any issues.
The Edge-Compatible Database Pattern
When AI knows it's writing code for the edge, it should generate HTTP-based database calls instead of TCP-based ones. Here's the difference:
```javascript
// ❌ Breaks at the edge — pg uses TCP connections (Node.js 'net' module)
import { Pool } from 'pg';
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

export const runtime = 'edge';

export async function GET() {
  const result = await pool.query('SELECT * FROM users LIMIT 10');
  return Response.json(result.rows);
}
```

```javascript
// ✅ Works at the edge — Neon's serverless driver uses HTTP, not TCP
import { neon } from '@neondatabase/serverless';

export const runtime = 'edge';

export async function GET() {
  const sql = neon(process.env.DATABASE_URL);
  const users = await sql`SELECT * FROM users LIMIT 10`;
  return Response.json(users);
}
```
Both do the same thing: query a PostgreSQL database and return results. But the first one breaks at the edge because it uses TCP, and the second works because it uses HTTP. The only difference is which library you import.
When you ask Claude to build a database-connected edge function, include: "Use Neon's serverless driver (@neondatabase/serverless) — it's HTTP-based and works at the edge." That single instruction prevents the most common edge error.
Understanding Each Part
Now let's build the mental model. You don't need to understand how edge servers work under the hood — you need to understand what they can and can't do, so you can write better prompts and fix errors faster.
The Warehouse Metaphor
Imagine your app is a product fulfillment business. Traditional deployment is like having one giant warehouse in Kansas City. When someone in Tokyo orders something, their request travels all the way to Kansas City, gets processed, and the response travels all the way back. The warehouse has everything — full equipment, all the staff, every tool you could ever need — but that round trip takes time.
Edge computing is like having mini-warehouses in every major city. Someone in Tokyo hits a warehouse in Tokyo. Someone in London hits one in London. Responses come back in a fraction of the time because the travel distance is dramatically shorter.
But here's the trade-off: those mini-warehouses are lightweight. They have a forklift and a few shelves — not the full inventory and heavy machinery of the main warehouse. They can handle simple requests instantly, but anything that requires the full warehouse setup needs to go back to headquarters.
That's the edge. Fast, distributed, close to users — but limited in what it can do.
The Edge Runtime: What It Actually Is
The edge runtime is not Node.js. This is the single most important thing to understand.
Your laptop runs Node.js — a JavaScript environment built on Chrome's V8 engine, with decades of added capabilities: file system access, TCP networking, native modules, child processes, and a huge standard library. Node.js can do almost anything.
The edge runtime is built on V8 isolates — tiny, fast, isolated JavaScript environments that start in under 10 milliseconds. They support Web APIs (the same APIs that browsers use): fetch, Request, Response, URL, Headers, TextEncoder, crypto.subtle, and a handful of others. What they do not have is everything that makes Node.js powerful beyond the browser.
Think of the edge runtime as a browser that can run server-side code. If the API you're using exists in a browser, it probably works at the edge. If it's a Node.js-specific thing, it probably doesn't.
What You Can and Cannot Do at the Edge
Works at the edge:
- fetch() — make HTTP requests to external APIs
- Request / Response — the standard Web API request/response objects
- URL / URLSearchParams — parse and construct URLs
- Headers / cookies — read and set HTTP headers and cookies
- crypto.subtle — Web Crypto API for hashing, signing, encryption
- TextEncoder / TextDecoder — encode and decode text
- setTimeout / setInterval — timers
- JSON.parse / JSON.stringify — JSON handling
- HTTP-based database clients (Neon, PlanetScale, Supabase REST, Turso)
- Most pure JavaScript libraries that don't depend on Node.js internals
Does NOT work at the edge:
- fs — no file system access. You cannot read or write files.
- net / tls — no TCP connections. This kills traditional database drivers.
- child_process — no spawning processes or shell commands.
- path — no path utilities (some platforms polyfill this, but don't rely on it).
- crypto (Node.js version) — the Web Crypto API works, but not Node's crypto module.
- Buffer — Node.js's Buffer class is unavailable (use Uint8Array instead).
- Native modules — anything compiled from C/C++ won't run (includes bcrypt, sharp, etc.).
- Traditional database drivers — pg, mysql2, mongoose, prisma (unless using edge-compatible adapters).
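The Buffer entry trips people up constantly, because so many snippets use it for base64. A minimal sketch of an edge-safe replacement for the common case, built from Web APIs only (TextEncoder, Uint8Array, and btoa):

```javascript
// Edge-safe stand-in for Buffer.from(str).toString('base64'),
// using only Web APIs available in the edge runtime
function toBase64(str) {
  const bytes = new TextEncoder().encode(str); // string → UTF-8 bytes
  let binary = '';
  for (const b of bytes) binary += String.fromCharCode(b);
  return btoa(binary); // binary string → base64
}

console.log(toBase64('hello')); // "aGVsbG8="
```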
The CDN Connection
If you're familiar with how CDNs work, you already have half the mental model. A CDN serves static files — pre-built HTML, CSS, images, JavaScript — from servers near users. The files are made once and cached everywhere.
Edge computing takes that one step further: instead of serving pre-made files, the edge server runs your code to generate responses on the fly. A CDN delivers a pre-made sandwich. Edge computing has a cook in every city who can make your sandwich fresh — but that cook only has a limited menu and a small kitchen.
Why Edge Functions Have Near-Zero Cold Starts
Traditional serverless functions (like AWS Lambda) run in containers — fully isolated environments with their own operating system layer. Spinning up a container from scratch takes time: anywhere from 200 milliseconds to several seconds. This delay is called a "cold start" and it's why users sometimes see slow initial responses from serverless apps.
Edge functions use V8 isolates instead of containers. An isolate is just a sandboxed JavaScript context — no OS layer, no container overhead. Starting one takes about 5 milliseconds. Your code is always "warm" from the user's perspective, even if they're the first person to hit that function in hours. This is one of the genuine advantages of the edge runtime over traditional serverless.
The trade-off: isolates are also why the edge runtime is so limited. You can't install a real operating system inside a V8 isolate. You can't open TCP sockets. You can't access a filesystem. The speed comes directly from the simplicity.
What AI Gets Wrong About Edge Computing
AI coding tools are excellent at generating working code for standard Node.js environments. The trouble is that most of the training data was written before edge runtimes became widespread — and AI doesn't always know which runtime your platform targets. Here are the most common mistakes, in order of how often you'll encounter them.
1. Using Node.js Modules in Edge Functions
This is the most frequent mistake. AI generates code that imports fs, path, crypto, or any npm package that internally depends on Node.js built-ins. The code is syntactically correct and runs fine locally. It blows up at the edge.
Common culprits AI reaches for without thinking:
```javascript
// ❌ All of these break at the edge

// Reading a config file
import fs from 'fs';
const config = JSON.parse(fs.readFileSync('./config.json', 'utf8'));

// Hashing a password with bcrypt (uses native C++ addon)
import bcrypt from 'bcrypt';
const hash = await bcrypt.hash(password, 10);

// Using Node's crypto module
import crypto from 'crypto';
const hash = crypto.createHash('sha256').update(data).digest('hex');

// Connecting to PostgreSQL with pg
import { Pool } from 'pg';
const pool = new Pool({ connectionString: process.env.DATABASE_URL });
```
When you see this pattern in AI-generated code destined for the edge, the fix is usually one of two things:
- Switch the runtime: add export const runtime = 'nodejs' to move the function off the edge and onto full Node.js
- Switch the library: replace the Node.js-specific library with a Web API equivalent
```javascript
// ✅ Edge-compatible replacements

// Hashing with Web Crypto (works at the edge)
const msgBuffer = new TextEncoder().encode(data);
const hashBuffer = await crypto.subtle.digest('SHA-256', msgBuffer);
const hashArray = Array.from(new Uint8Array(hashBuffer));
const hashHex = hashArray.map(b => b.toString(16).padStart(2, '0')).join('');

// Password hashing: use a library built for edge (no native deps)
// Ask your AI: "Use bcryptjs instead of bcrypt — it's pure JavaScript and works at the edge"

// Database: switch to HTTP-based driver
import { neon } from '@neondatabase/serverless';
const sql = neon(process.env.DATABASE_URL);
```
2. Confusing Edge Functions with Serverless Functions
AI (and even some official documentation) uses "edge functions" and "serverless functions" interchangeably. They are not the same thing. This matters because the fix for a broken edge function often involves switching to serverless — but AI sometimes does the opposite, pushing code to the edge when it should stay on the Node.js runtime.
The clearest indicator of which runtime your function is targeting is one line:
```javascript
export const runtime = 'edge';   // → runs in V8 isolate, Web APIs only
export const runtime = 'nodejs'; // → runs in full Node.js, everything works
// no line at all → depends on platform defaults (usually Node.js)
```
If your function needs to connect to a database with a traditional driver, process files, use native modules, or do anything that requires full Node.js — it should not have export const runtime = 'edge'. Remove that line or change it to 'nodejs'.
The common pattern where AI goes wrong: it generates an authentication system that verifies JWTs with jose (an edge-compatible, pure-JavaScript library) but then also reads the user from a database using pg (which isn't edge-compatible). Half the code is edge-compatible, half isn't — and the error messages don't make it obvious which half is the problem.
3. Data Locality Problems
Edge computing adds a subtler problem that even experienced developers miss: data locality. Your edge server is close to your user, but your database is in one fixed location (usually a single region). If your edge function in Tokyo has to make a database request to a server in US-East, you've actually made things slower in some cases — not faster.
The round trip looks like this:
- User (Tokyo) → Edge Server (Tokyo): ~5ms ✅
- Edge Server (Tokyo) → Database (US-East): ~150ms ⚠️
- Database (US-East) → Edge Server (Tokyo): ~150ms ⚠️
- Edge Server (Tokyo) → User (Tokyo): ~5ms ✅
Total: ~310ms. Slower than just using a regional serverless function close to your database.
AI doesn't know where your database lives. When it generates edge functions that make database calls, it's not accounting for the fact that your database isn't distributed — it's in one place. The fix: either use a globally distributed database (like PlanetScale, Turso, or Neon with read replicas), or run your database-heavy routes as regional serverless functions instead of edge functions.
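If a database-heavy route should run near the data instead of at the edge, Next.js on Vercel lets you pin it with the preferredRegion route segment config. A minimal sketch — the file path is hypothetical, and 'iad1' is just an example region code; use the code for the region closest to your database:

```javascript
// app/api/reports/route.js (hypothetical route)
// Run this database-heavy route in one region near the database,
// rather than at the edge far away from it
export const runtime = 'nodejs';       // full Node.js, so TCP database drivers work
export const preferredRegion = 'iad1'; // example region code; match your database's region
```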
4. Putting Everything at the Edge
Once vibe coders learn about edge computing and how fast it is, AI sometimes over-applies it. The recommendation becomes: "Put everything at the edge for maximum performance." This is wrong.
Heavy computation, file processing, PDF generation, image manipulation, complex database queries — none of these belong at the edge. The edge runtime has tight CPU time limits (usually a few hundred milliseconds of CPU time), no file system, and no native modules. Trying to run a Sharp image processor or a PDF generation library at the edge won't just be slow — it'll fail entirely.
Edge computing is fast for the right things. Use it for: auth checks, redirects, geolocation logic, request modification, and responses that don't require heavy lifting. Keep everything else in a standard serverless function or on a proper server.
5. Using Prisma Without the Edge Adapter
Prisma is one of the most popular database ORMs in the JavaScript ecosystem, and AI loves to generate Prisma code. Standard Prisma does not work at the edge — it uses TCP connections internally. But Prisma has an edge-compatible mode using Prisma Accelerate or the Prisma Data Proxy, and AI sometimes generates the standard Prisma setup and tells you it'll "work fine" on Vercel.
If you're using Prisma and deploying edge functions, you specifically need:
```javascript
// For Prisma at the edge, you need the edge client + Prisma Accelerate
// (ask your AI to generate this specifically)
import { PrismaClient } from '@prisma/client/edge';
import { withAccelerate } from '@prisma/extension-accelerate';

const prisma = new PrismaClient().$extends(withAccelerate());

export const runtime = 'edge';
```
Any time AI generates Prisma code for an edge function without mentioning Prisma Accelerate or the edge client, prompt it to fix that specifically.
How to Debug Edge Errors with AI
Edge runtime errors follow a consistent pattern. Once you know how to read them, you can diagnose the problem in seconds and give your AI the context it needs to fix it correctly the first time.
Reading the Error Message
Edge runtime errors are unusually helpful — they almost always name the exact module that's incompatible and show you the import chain that led to it:
```
Error: The edge runtime does not support Node.js 'net' module.
You are using a module that is not supported in the Edge Runtime.
Learn more: https://nextjs.org/docs/app/building-your-application/rendering/edge-and-nodejs-runtimes

Import trace for requested module:
./node_modules/pg/lib/connection.js
./node_modules/pg/lib/client.js
./node_modules/pg/lib/index.js
./app/api/users/route.js
```
Read this bottom to top. Your file (route.js) imported pg. pg's index.js loaded client.js. client.js loaded connection.js. connection.js tried to use the net module. The edge runtime doesn't have net. Boom.
The fix is clear: either remove pg and switch to an HTTP-based driver, or change export const runtime = 'edge' to export const runtime = 'nodejs'.
The Decision Tree for Edge Errors
When you hit an edge runtime error, work through this:
- Does this function need to run at the edge? If it's doing heavy database work, file processing, or uses native modules — probably not. Remove export const runtime = 'edge' and you're done.
- Is it middleware? Middleware always runs at the edge in Next.js. You can't change this. You have to make the middleware code edge-compatible or move that logic out of middleware.
- Can I swap the library? If you want to keep the function at the edge, find an edge-compatible version of the library that's failing. The import trace tells you exactly which library to replace.
What to Paste Into Your AI
Give your AI three things when you're debugging an edge error:
"I'm getting an edge runtime error on Vercel. Here's the full error message and import trace: [paste error]. Here's the route file that's failing: [paste your route file]. I'm using Next.js 15 on Vercel. Either switch this route to the Node.js runtime (export const runtime = 'nodejs') if it doesn't need to be at the edge, or replace any incompatible libraries with edge-compatible versions. My database is PostgreSQL on Neon."
Three things made that prompt effective: (1) the full error with the import trace, (2) the actual file code, and (3) a clear decision — either switch runtimes or replace libraries. Without the last part, AI tends to try to keep the code at the edge and patch around the problem, which often creates new errors.
Common Error Messages and Their Fixes
The edge runtime does not support Node.js 'fs' module
Your code is reading or writing files at the edge. The edge runtime has no filesystem. Fix: remove the file operation (read from a database or API instead), or switch to export const runtime = 'nodejs'.
The edge runtime does not support Node.js 'net' module
A library (usually a database driver) is trying to open a TCP connection. Fix: switch to an HTTP-based database client like @neondatabase/serverless, or move to the Node.js runtime.
Dynamic server usage: headers
This one's different: it means you're using headers() or cookies() from Next.js in a way that makes a page or component dynamic. It isn't an edge error. Fix: add export const dynamic = 'force-dynamic' to your route, or restructure so the dynamic parts are separated from the static parts.
Module not found: Can't resolve 'bcrypt' (on Vercel)
bcrypt is a native module — it compiles C code when installed. Native modules don't work at the edge. Switch to bcryptjs, which is a pure JavaScript implementation of the same algorithm. Tell your AI: "Replace bcrypt with bcryptjs — same API, no native module, works at the edge."
Quick Reference: Edge vs. Serverless vs. Traditional Server
| Feature | Edge Functions | Serverless Functions | Traditional Server (VPS) |
|---|---|---|---|
| Where it runs | 30–300+ locations worldwide | 1–3 specific regions | One fixed server |
| Startup time | <10ms (near-instant) | 200ms–5s (cold start) | No startup — always on |
| Runtime | V8 isolates (Web APIs only) | Full Node.js | Full Node.js (or any language) |
| File system | No | Yes (temporary) | Yes (persistent) |
| TCP database drivers | No — HTTP-only | Yes | Yes |
| Native modules | No | Yes | Yes |
| Max execution time | ~30 seconds (wall clock) | Up to 5–15 minutes | No limit |
| Memory limit | 128MB typical | 1–3GB typical | Whatever the server has |
| Scaling | Automatic, unlimited | Automatic, unlimited | Manual — you manage it |
| Best for | Auth checks, redirects, geolocation, fast API responses | Database queries, file ops, heavy computation | Long-running processes, persistent connections, full control |
| Examples | Vercel Edge, Cloudflare Workers, Netlify Edge, Deno Deploy | Vercel Serverless, AWS Lambda, Netlify Functions | DigitalOcean Droplet, Hetzner VPS, EC2 |
The practical rule of thumb: if your logic is lightweight and needs to be fast for everyone globally, use the edge. If your logic needs full Node.js capabilities, use serverless. If your logic needs to be persistent, stateful, or always-on, use a VPS.
What to Learn Next
Edge computing connects to several other concepts that come up constantly when you're deploying AI-built apps. Here's where to go next:
Frequently Asked Questions
What is edge computing in simple terms?
Edge computing means running your code on servers that are physically close to your users — not in one central data center. When someone in Tokyo visits your app, an edge server in Tokyo responds instead of a server in Virginia. The response time drops dramatically because the data travels a much shorter distance. The trade-off is that edge servers run in a limited environment that doesn't support everything Node.js can do.
What is the difference between edge computing and cloud computing?
Cloud computing runs your code in one or a few centralized data centers — like one massive warehouse that ships to the entire country. Edge computing distributes your code to dozens or hundreds of smaller servers worldwide — like mini-warehouses in every major city. Cloud gives you maximum power and flexibility. Edge gives you speed and proximity to users, but with stricter limits on what your code can do. Most modern apps use both: static files and fast responses from the edge, heavy processing on cloud servers.
Why does my code work locally but fail on Vercel's edge runtime?
Your local machine runs full Node.js, which has access to the filesystem, TCP networking, native modules, and everything else. Vercel's edge runtime is a stripped-down environment based on Web APIs — the same APIs browsers use. Features like fs (file system), net (TCP connections), path, and native npm modules are not available. If your code or any npm package it depends on uses these features, it will fail at the edge even though it runs perfectly on your laptop. The fix is either to switch the route to the Node.js runtime or to replace incompatible libraries with Web API equivalents.
Can I connect to a database from an edge function?
Not with traditional database drivers that use TCP connections. Standard PostgreSQL clients like pg, MySQL clients like mysql2, and ORMs like Prisma (in default mode) all rely on TCP networking — which the edge runtime doesn't support. You need HTTP-based database clients specifically designed for edge environments: Neon's serverless driver (@neondatabase/serverless), PlanetScale's fetch API, Supabase's REST client, Turso's HTTP driver, or Prisma with Prisma Accelerate. These communicate over HTTP instead of raw TCP and work within edge runtime constraints.
Do edge functions have cold starts?
Edge functions have near-zero cold starts — typically under 10 milliseconds. Traditional serverless functions like AWS Lambda can have cold starts of 200ms to several seconds because they run in full containers that take time to spin up. Edge functions use lightweight V8 isolates that start almost instantly. This is one of the main advantages of edge computing: your code is always effectively "warm" because starting an isolate costs almost nothing. The flip side is that this speed is achieved by stripping out everything that makes containers (and Node.js) powerful — which is why the edge runtime has so many limitations.
Last updated: March 21, 2026. Tested with Next.js 15, Vercel Edge Runtime, and Cloudflare Workers with V8 isolates.