Stop db connection usage from rising while avoiding "Cannot perform I/O of a different request"

I'm trying to create a db connection function that doesn't create a new db connection for every request. I'm using Kysely and MySQL:
import { Kysely, MysqlDialect } from 'kysely'
import { createPool } from 'mysql2'

export function useDb() {
  return new Kysely({
    dialect: new MysqlDialect({
      pool: createPool({
        host: env('NITRO_DB_LEGACY_HOST'),
        user: env('NITRO_DB_LEGACY_USER'),
        password: env('NITRO_DB_LEGACY_PASSWORD'),
        port: 3306,
        connectionLimit: 1,
        disableEval: true,
        typeCast: (field, next) => {
          if (field.type === 'DECIMAL' || field.type === 'NEWDECIMAL') {
            return parseFloat(field.string())
          }

          return next()
        },
      }),
    }),
  })
}
I've tried:
let db

export function useDb() {
  if (db) return db

  db = new Kysely({
However, this leads to fatal errors in Workers:
Cannot perform I/O on behalf of a different request. I/O objects (such as streams, request/response bodies, and others) created in the context of one request handler cannot be accessed from a different request's handler. This is a limitation of Cloudflare Workers which allows us to improve overall performance. (I/O type: Writable)
How do people handle db re-use?
5 Replies
James
James6mo ago
There are only two real options for DB connection reuse in Workers:
- manually store and manage the connection in a Durable Object and run all queries there
- or, more easily and what I'd recommend, just use Hyperdrive: https://developers.cloudflare.com/hyperdrive/
Cloudflare Docs
Hyperdrive
Hyperdrive is a service that accelerates queries you make to existing databases, making it faster to access your data from across the globe from Cloudflare Workers, irrespective of your users' location.
Titan
TitanOP6mo ago
thanks, I did give that a go but hit a wall with "Connection lost: The server closed the connection."
Titan
TitanOP6mo ago
did console log to ensure the envs were being read correctly
Titan
TitanOP6mo ago
"mysql2": "3.14.1"
I'm connecting to the Hyperdrive using mysql2's createPool, is that an issue? It seems like Kysely only supports connecting with a pool
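One hedged note on the pool question: a sketch of wiring this up per request, assuming Kysely's MysqlDialect config also accepts a lazy pool factory (a function returning a Promise of a pool — verify against your installed Kysely version), and that the Hyperdrive binding exposes `host`, `user`, `password`, `database`, and `port` fields as described in the Cloudflare docs. The `useDb` signature here is illustrative, not from the thread:

  import { Kysely, MysqlDialect } from 'kysely'
  import { createPool } from 'mysql2'

  // `hyperdrive` would be the binding from the Worker env, e.g. env.HYPERDRIVE
  export function useDb(hyperdrive: {
    host: string
    user: string
    password: string
    database: string
    port: number
  }) {
    return new Kysely<any>({
      dialect: new MysqlDialect({
        // Lazy factory: the pool is only created on the first query,
        // inside the current request's handler.
        pool: async () =>
          createPool({
            host: hyperdrive.host,
            user: hyperdrive.user,
            password: hyperdrive.password,
            database: hyperdrive.database,
            port: hyperdrive.port,
            disableEval: true, // mysql2 option needed in Workers (no eval())
          }),
      }),
    })
  }

Call `useDb(env.HYPERDRIVE)` inside the request handler and let the instance die with the request; Hyperdrive handles the cross-request connection reuse.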
James
James6mo ago
Hmm I wouldn't expect it to be an issue. You might try posting in #hyperdrive - the team there are very active and receptive to questions and feedback