Say you need to delete a few rows of data from a D1 DB - do you use the dashboard UI, dashboard console (SQL), wrangler (SQL) or something else?
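For ad-hoc changes like that, one option is wrangler's execute command; a minimal sketch, where my-db and the DELETE statement are placeholders and --remote targets the deployed database rather than a local copy:

npx wrangler d1 execute my-db --remote --command "DELETE FROM services WHERE id = '1'"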

The "Edit Cloudflare Workers" API token template doesn't include D1. This caught me out a little bit running D1 commands in CI, as the template does include KV and R2, so it seems a little inconsistent that it doesn't include D1, but perhaps there is a reason for that.
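A workaround sketch for CI, assuming you create a custom API token in the dashboard that starts from that template and adds D1 edit permissions (my-db is a placeholder database name):

# wrangler picks up CLOUDFLARE_API_TOKEN from the environment,
# so the custom token never needs to live in wrangler.toml.
export CLOUDFLARE_API_TOKEN="***"   # from your CI secret store
npx wrangler d1 migrations apply my-db --remote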
Does PRAGMA defer_foreign_keys=ON not disable ON UPDATE and ON DELETE actions? Kinda weird, when SQLite often requires recreating tables (and thus triggering those actions):

select * from pragma_table_info('TABLE_NAME_HERE');

Should I avoid the ON DELETE and ON UPDATE features because of this? Or should I dump my current db, create a new one, and upload the data?
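As I understand it, that pragma only postpones foreign key constraint checks until the transaction commits; the ON DELETE and ON UPDATE actions themselves still fire as rows change. A minimal sketch of the recreate-table pattern under a deferred check, assuming db is a D1Database binding and a simplified services schema (both placeholders):

// db.batch() runs the statements in one transaction, so the deferred
// foreign key check happens at commit, after the rename.
await db.batch([
  db.prepare("PRAGMA defer_foreign_keys = on"),
  db.prepare("CREATE TABLE services_new (id TEXT PRIMARY KEY, name TEXT, type TEXT, created_at TEXT)"),
  db.prepare("INSERT INTO services_new SELECT id, name, type, created_at FROM services"),
  db.prepare("DROP TABLE services"),
  db.prepare("ALTER TABLE services_new RENAME TO services"),
]);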
This statement does not work:

INSERT INTO services (id,name,type,created_at) VALUES ("1", "2", "3", "4") ON CONFLICT (id) UPDATE SET name = excluded.name, type = excluded.type WHERE name != excluded.name OR type != excluded.type

while this one does:

INSERT INTO services (id,name,type,created_at) VALUES ("1", "2", "3", "4") ON CONFLICT DO NOTHING

The failing statement throws:

[cause]: Error: parser stack overflow
    at D1Database._sendOrThrow (cloudflare-internal:d1-api:67:24)
    at async D1PreparedStatement.raw (cloudflare-internal:d1-api:184:32) {
  [cause]: undefined
}

The query is built and executed like this:

const query = "INSERT INTO services(id,name,type,created_at)"
  + " VALUES (?,?,?,?)"
  + " ON CONFLICT (id) UPDATE"
  + " SET name = excluded.name, type = excluded.type"
  + " WHERE name != excluded.name OR type != excluded.type";

let r: D1Response = await DATABASE.prepare(query).bind(id, name, type, created_at).run();
console.log("Response success: " + r.success);
console.log("Response error: " + r.error);
+ " WHERE name != excluded.name OR type != excluded.type";INSERT INTO services (id,name,type,created_at) VALUES ("1", "2", "3", "4") ON CONFLICT (id) UPDATE SET name = excluded.name, type = excluded.type WHERE name != excluded.name OR type != excluded.typeINSERT INTO services (id,name,type,created_at) VALUES ("1", "2", "3", "4") ON CONFLICT DO NOTHINGconst transcriptPrepared = db.prepare(`
  INSERT INTO transcripts
    (
      feed_id, archive_date, archive_date_time,
      object_key, segment_start, segment_end,
      segment
    )
  VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7)
`);

const transcriptQueriesToRun = allSegments.map(segmentObj => transcriptPrepared.bind(
  segmentObj.feed_id,
  segmentObj.archive_date,
  segmentObj.archive_date_time,
  segmentObj.object_key,
  segmentObj.segment_start,
  segmentObj.segment_end,
  segmentObj.segment
));

console.log("transcriptQueriesToRun.length: ", transcriptQueriesToRun.length);
console.log("transcriptQueriesToRun[0] ", JSON.stringify(transcriptQueriesToRun[0]));

if (transcriptQueriesToRun.length > 0) {
  try {
    await db.batch(transcriptQueriesToRun);
  } catch (e: any) {
    console.error({
      message: e.message
    });
    if (e instanceof Error) {
      return new Response(JSON.stringify({
        "error": e.message,
        "traceback": e.stack
      }), {
        status: 500,
        headers: getCorsHeaders()
      });
    }
  }
}
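On the limit question: I don't know of a documented hard cap on the number of statements in a batch, but D1 does enforce per-request limits, and chunking a large batch is an easy way to rule them out. A sketch, where CHUNK_SIZE is an arbitrary guess rather than a documented constant:

// Issue the batch in smaller transactions to stay under any per-call
// limits. Note each chunk commits separately, so this trades the
// all-or-nothing atomicity of a single batch for smaller RPCs.
const CHUNK_SIZE = 50;
for (let i = 0; i < transcriptQueriesToRun.length; i += CHUNK_SIZE) {
  const chunk = transcriptQueriesToRun.slice(i, i + CHUNK_SIZE);
  await db.batch(chunk);
}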