You need to provide us database IDs to
This is from the HKG PoP with a D1 instance in APAC. APAC is a huge region, and you could unfortunately be 100-200ms away from your database. Having said that, anything more than a few hundred ms even in those cases points to a bigger, unexpected issue, either with the responses getting back to your worker or something else. ...
was anyone able to use Prisma + D1? My
status dashboard says no, but just for
This is good advice, but it’s not
how do you make this work, are you
D1 connection string
Low latency D1 for hourly config updates
Migration timeout
What is your database ID? I've just sent you a DM.
Are other queries working, and only that specific migration failing?...
Durable Objects SQLite
Is anyone aware of an existing tool that
How would I get just the `results` from the `return Response.json(results)` area? I'm getting this:
```
{"success":true,"meta":{"served_by":"v3-prod","duration":0.1977,"changes":0,"last_row_id":0,"changed_db":false,"size_after":16384,"rows_read":1,"rows_written":0},"results":[{"test":"a"}]}
```
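The payload above is the full D1 result object; the rows live under its `results` key, so destructuring that field before serializing returns just the rows. A minimal sketch, with the values copied from the payload above and `Response.json` (the Workers API) simulated via `JSON.stringify`:

```typescript
// Shape of a D1 .all() result: rows are under `results`,
// query statistics under `meta`.
interface D1Result<T> {
  success: boolean;
  meta: Record<string, unknown>;
  results: T[];
}

// The payload from the question, as returned by stmt.all():
const raw: D1Result<{ test: string }> = {
  success: true,
  meta: { served_by: "v3-prod", duration: 0.1977, rows_read: 1, rows_written: 0 },
  results: [{ test: "a" }],
};

// Destructure to keep only the rows, then serialize just those.
// In a Worker this would be: return Response.json(results);
const { results } = raw;
console.log(JSON.stringify(results)); // [{"test":"a"}]
```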
yes i did this query
No, it first fails on parsing the SQL
Hmm, unless I'm missing something, `.all()` and `.batch()` do return those metrics, but it's very intrusive to modify your code....

I'm trying to import data from an SQL file and I'm getting a `statement too long: SQLITE_TOOBIG` error. The whole SQL file is 24 MB and has one insert statement per line (each statement ends with a semicolon followed by a line break). The table has 13 columns I'm inserting into. The longest insert statement line is 751 bytes. Are there any limitations of the D1 HTTP API I'm not aware of? Could this be a bug? Or am I maybe doing something wrong?
Update: The problem was quotes not being escaped. I escaped those by doubling them and the problem was solved!...

Unexplained row reads in billable usage
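For reference, SQLite escapes a single quote inside a string literal by doubling it, which is the fix described in the update above. A hypothetical helper (the name and signature are my own; inside a Worker, prepared statements with `.bind()` avoid the issue entirely):

```typescript
// Quote a value as a SQLite string literal, doubling any
// embedded single quotes ('' is SQLite's escape for ').
function toSqliteLiteral(value: string): string {
  return "'" + value.replace(/'/g, "''") + "'";
}

console.log(toSqliteLiteral("O'Brien")); // 'O''Brien'
```

This only covers string literals; for imports built programmatically, parameter binding is safer than string concatenation.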
Hello. I'm getting this error when running the `npx drizzle-kit push` command. Anyone know what's going on?
```
[✓] Pulling schema from database...
Error: 7500: not authorized: SQLITE_AUTH...
```