Reusing DB instance with process.env

Hi everyone, now that accessing process.env is possible, when I create a db file with a connection to the db:
export const queryClient = postgres(process.env.DATABASE_URL);
export const db = drizzle({
  client: queryClient,
  schema,
  logger: false,
  casing: "snake_case",
});
I get this error:
Error: Cannot perform I/O on behalf of a different request. I/O objects (such as streams, request/response bodies, and others) created in the context of one request handler cannot be accessed from a different request’s handler. This is a limitation of Cloudflare Workers which allows us to improve overall performance.
What's the explanation here? I thought that since every request is different, the module would be initiated per request.
12 Replies
Hard@Work (2w ago)
You should be building the DB client on request, rather than exporting it. So, something like:
export const getDbClient = () => {
  const queryClient = postgres(process.env.DATABASE_URL);
  const db = drizzle({
    client: queryClient,
    schema,
    logger: false,
    casing: "snake_case",
  });
  return { queryClient, db };
};
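As a sketch of how that helper might be called per request in a Workers module handler (the getDbClient below is a hypothetical stand-in so this snippet runs without postgres/drizzle; in the real file you would import the helper above):

```typescript
// Hypothetical stand-in for getDbClient() so this sketch is self-contained.
// The real helper builds the postgres/drizzle client as shown above.
const getDbClient = () => {
  const db = {
    query: async (sql: string) => [{ ok: true, sql }], // fake result set
  };
  return { db };
};

const worker = {
  async fetch(request: Request): Promise<Response> {
    // Build the client inside the handler, never at module scope:
    // the connection's I/O is tied to this request's context.
    const { db } = getDbClient();
    const rows = await db.query("select 1");
    return new Response(JSON.stringify(rows), {
      headers: { "content-type": "application/json" },
    });
  },
};

export default worker;
```

The key point is that nothing I/O-bearing outlives the fetch call; each invocation constructs and uses its own client.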
Franco Romano Losada
But does that mean I have to pass it around, or is calling the function enough? Thanks
Hard@Work (2w ago)
Yeah, when you need a DB connection, call the function; don't pass the connection around.
Franco Romano Losada
Thanks a ton. I'm wondering, do you do something under the hood with the connection? Or should I memoize it?
Hard@Work (2w ago)
You should be creating it anew on every request. You can also use https://developers.cloudflare.com/hyperdrive/, which handles connection pooling and caching for you.
Franco Romano Losada
Mmm yes, I'm using it. Maybe I didn't explain myself well: I want to create the connection once and then import the db anywhere, as cleanly as possible, without having to pass the created instance all over the place. I'm not sure this solves that without caching the output of the function.
Hard@Work (2w ago)
You probably want AsyncLocalStorage then. You would build the database connection when you start processing a request, and then the DB connection is automatically available to everything downstream without passing it around explicitly.
Cloudflare Docs
AsyncLocalStorage · Cloudflare Workers docs
Cloudflare Workers provides an implementation of a subset of the Node.js AsyncLocalStorage API for creating in-memory stores that remain coherent through asynchronous operations.
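A minimal sketch of that pattern, assuming a stand-in db object in place of the real Drizzle client (dbStorage, getDb, and handleRequest are hypothetical names, not Cloudflare APIs):

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Stand-in for the Drizzle client type; the real store would hold
// the value returned by drizzle(...).
type Db = { query: (sql: string) => string };

const dbStorage = new AsyncLocalStorage<Db>();

// Any module can import getDb() instead of a shared `db` binding;
// it reads the instance scoped to the current request.
export function getDb(): Db {
  const db = dbStorage.getStore();
  if (!db) throw new Error("getDb() called outside a request context");
  return db;
}

// Middleware/handler: build the connection per request, then run the
// rest of the pipeline inside this store's context.
async function handleRequest(requestId: number): Promise<string> {
  const db: Db = { query: (sql) => `req ${requestId}: ${sql}` }; // stand-in for drizzle(...)
  return dbStorage.run(db, async () => {
    // Code called from here, even in other modules, can use getDb().
    return getDb().query("select 1");
  });
}
```

Because each request calls dbStorage.run with its own freshly built client, no I/O object ever crosses request contexts, while downstream code still gets the clean "just import it" ergonomics.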
Franco Romano Losada
That's better, thank you. Do you mind explaining to me why this doesn't work?
import { drizzle } from "drizzle-orm/postgres-js";
import postgres from "postgres";

import * as schema from "./schema";

export type DBClient = ReturnType<typeof drizzle<typeof schema>>;

let db: DBClient;

export function initDb(connectionString: string) {
  if (!connectionString) {
    throw new Error("No connection string provided");
  }

  const queryClient = postgres(connectionString);
  db = drizzle({
    client: queryClient,
    schema,
    logger: false,
    casing: "snake_case",
  });

  return db;
}

export { db, schema };
Here initDb is called in a middleware, before anything imports the db. Why does the console yield:
Error: Cannot perform I/O on behalf of a different request. I/O objects (such as streams, request/response bodies, and others) created in the context of one request handler cannot be accessed from a different request’s handler. This is a limitation of Cloudflare Workers which allows us to improve overall performance.
From my POV, I'm building the db client once in the middleware and then importing it where it's used. This would make working with Workers much better. I mean, it works, but sometimes it breaks when two requests hit the server almost at the same time. I thought that a Worker is like a traditional serverless instance, where the files are initialized per request. So the first request runs the middleware and initializes the db for that specific worker, then different files import it, and the next requests do the same thing, like isolated code:

req -> middleware creates db connection -> file imports it -> res
req -> middleware creates db connection -> file imports it -> res (same)
Hard@Work (2w ago)
Because the db variable is shared across requests. It should be inside the function.
Franco Romano Losada
But that means memory is shared across requests?
Hard@Work (2w ago)
No, it means the Drizzle Instance and DB Connection are rebuilt for each request, and not reused
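A minimal sketch of that fix applied to the initDb snippet above, with the postgres/drizzle calls stubbed out (hypothetical stand-ins) so it stays self-contained: the module-level `db` is gone, and the client is returned to the caller instead.

```typescript
// Reworked initDb with no module-level `db`, so nothing created in one
// request's context can leak into another's. The stub type below replaces
// the real Drizzle client; swap the real calls back in.
type DBClient = { select: (sql: string) => string };

export function initDb(connectionString: string): DBClient {
  if (!connectionString) {
    throw new Error("No connection string provided");
  }
  // Real version:
  //   const queryClient = postgres(connectionString);
  //   return drizzle({ client: queryClient, schema, logger: false, casing: "snake_case" });
  return { select: (sql) => `${connectionString} -> ${sql}` }; // stand-in client
}

// Each request handler calls initDb() itself (or retrieves the result from
// request-scoped storage), instead of importing a shared `db` binding.
```

The difference from the broken version is purely where the instance lives: as a return value it dies with the request, whereas as a module-level binding it survives into the next request's context.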
Franco Romano Losada
Cloudflare Workers are designed to be super fast and lightweight. So when possible, they reuse the same isolate (kind of like a lightweight V8 sandbox) between requests. That means top-level variables can persist between invocations, but you should not rely on them for request-specific logic or state, especially for I/O objects.

🧠 What's Really Going On in Cloudflare Workers

When you do something like:

let db;
export function initDb() {
  db = drizzle(...);
}

you're creating a shared global variable. And while it may seem isolated per request in theory, Cloudflare does reuse the execution context between requests when it can.
Okay, if this is what you mean, I get it now 😄
