d1-database
I am trying to follow the instructions...

What are your thoughts on separate...

Hi all, I'm developing a worker that has...

Any recommendations for using D1 data...
Does `PRAGMA defer_foreign_keys=ON` not disable `ON UPDATE` and `ON DELETE` actions? Kinda weird when SQLite often requires recreating tables (and thus triggering those actions).

Yes, since those virtual tables reside...
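For anyone hitting this during a table rebuild, here is a minimal sketch of the recreate-and-rename pattern the question describes, assuming a D1 binding and an illustrative `users` table (all names here are hypothetical, not from the thread). As the question notes, `PRAGMA defer_foreign_keys=ON` only defers foreign key violation checks until the transaction commits; it does not suppress declared `ON UPDATE` / `ON DELETE` actions.

```ts
import type { D1Database } from "@cloudflare/workers-types";

// Recreate a table inside the implicit transaction of batch().
// defer_foreign_keys postpones FK violation checks to commit time,
// but ON UPDATE / ON DELETE actions on referencing tables still fire.
export async function recreateUsersTable(db: D1Database): Promise<void> {
  await db.batch([
    db.prepare("PRAGMA defer_foreign_keys = ON"),
    db.prepare("CREATE TABLE users_new (id INTEGER PRIMARY KEY, name TEXT)"),
    db.prepare("INSERT INTO users_new (id, name) SELECT id, name FROM users"),
    db.prepare("DROP TABLE users"),
    db.prepare("ALTER TABLE users_new RENAME TO users"),
  ]);
}
```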
That seems like the current bookmark id...

Alpha database migration guide · Cloudfl...
Internal error: failed to run backup: Failed to query D1: D1 API returned: (500 Internal Server Error) {"error":"Error 9000: something went wrong","success":false} [code: 7500]

This happens when I run the `wrangler d1 backup` command, any idea how to resolve it?...

Can prepared statements bind arrays for `IN (?)`? E.g. `db.prepare('SELECT * FROM table WHERE status IN (?)').bind(['active','expired']).all()`, would that work?
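On the `IN (?)` question: SQLite binds exactly one value per `?`, so passing an array to a single placeholder will not expand it into multiple values. The usual workaround is to build one placeholder per element and spread the array into `bind()`. A minimal sketch, with `table_name` and the `statuses` parameter as illustrative stand-ins:

```ts
import type { D1Database } from "@cloudflare/workers-types";

// Expand `IN (?)` into one placeholder per element, e.g. "?, ?".
export async function selectByStatus(db: D1Database, statuses: string[]) {
  const placeholders = statuses.map(() => "?").join(", ");
  return db
    .prepare(`SELECT * FROM table_name WHERE status IN (${placeholders})`)
    .bind(...statuses)
    .all();
}
```

Called as `selectByStatus(env.DB, ['active', 'expired'])`, this runs `... IN (?, ?)` with both values bound positionally.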
Hey all, is something up with `wrangler d1 info` + `wrangler d1 insights`? `info` is reporting 0 read queries in the past 24h (not the case, unless it's somehow using some local version? The ID is the same as in my CF dash). `insights` is returning an empty array regardless of configuration...

Hello, I have had success with the D1 in...
I'm getting `Error: D1_ERROR: D1 is overloaded. Too many requests queued.` for the last while. It's a tiny little D1 database with no traffic.

Are older D1 databases still 2GB? Mine...
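Not an official fix, but since the message reads like a transient queueing error, one common mitigation while it persists is retrying with exponential backoff. A sketch under that assumption; the matched error text and the backoff numbers are illustrative, not documented behavior:

```ts
import type { D1Database, D1Result } from "@cloudflare/workers-types";

// Retry a query a few times when D1 reports it is overloaded.
export async function queryWithRetry(
  db: D1Database,
  sql: string,
  maxAttempts = 4,
): Promise<D1Result> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await db.prepare(sql).all();
    } catch (err) {
      const transient =
        err instanceof Error && err.message.includes("D1 is overloaded");
      if (!transient || attempt >= maxAttempts) throw err;
      // Back off 100ms, 200ms, 400ms, ... before the next attempt.
      await new Promise((resolve) => setTimeout(resolve, 100 * 2 ** (attempt - 1)));
    }
  }
}
```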
Load testing w/ weird results (1k req, 40s resp times):

SELECT * FROM table_name WHERE updated_at >= ? AND name > ? ORDER BY updated_at ASC, name LIMIT 1000

- 100k row table, 17mb...

So I think I found a bug where simply...
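One possible angle on the load-test numbers above: without an index covering both the filter and the sort, SQLite scans and sorts the whole table on every request, which over a 100k-row table under 1k concurrent requests could plausibly produce response times like those reported. A sketch of a matching composite index, with the table and index names assumed from the query:

```ts
import type { D1Database } from "@cloudflare/workers-types";

// One-time migration: an index on (updated_at, name) covers the WHERE
// columns and the ORDER BY, so the query can walk the index in order
// instead of sorting ~100k rows per request.
export async function addPaginationIndex(db: D1Database): Promise<void> {
  await db
    .prepare(
      "CREATE INDEX IF NOT EXISTS idx_table_updated_name ON table_name (updated_at, name)",
    )
    .run();
}
```

Checking the statement with `EXPLAIN QUERY PLAN` against a local copy of the database should confirm whether the full scan turns into an index search.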

Importing a 13MB SQL File

Only alpha DBs shouldn't have a usage...