It has some built-in ability to queue writes. When you exceed that, you get back a 429.
Original question: This might be a dumb question, but if 1,000 users try to write to the database or run a query at the same time, it won't fault, correct?
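As a rough illustration of handling that overload, here is a minimal Worker sketch that retries a D1 write with backoff and surfaces a 429 to the caller once retries are exhausted. The DB binding, the events table, and the retry parameters are all assumptions for the example, and the catch here is broader than a strict overload check:

```ts
// Sketch: retry a D1 write under load, then report 429 to the caller.
// Binding name `DB`, table `events`, and retry counts are illustrative.
export interface Env {
  DB: D1Database;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const payload = await request.text(); // read once; the body is single-use
    const maxAttempts = 3;
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
      try {
        await env.DB.prepare("INSERT INTO events (payload) VALUES (?)")
          .bind(payload)
          .run();
        return new Response("ok");
      } catch (err) {
        if (attempt === maxAttempts) break;
        // brief exponential backoff before retrying the write
        await new Promise((resolve) => setTimeout(resolve, 100 * 2 ** attempt));
      }
    }
    // retries exhausted: surface the overload to the caller
    return new Response("database busy, try again", { status: 429 });
  },
};
```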
Use hash(id) % N to pick the DB to use, and then do the normal thing you would do with that DB.

Original question: We are scared of the 10GB limit, though we might not reach it until after two years.
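A minimal sketch of that routing, assuming three D1 bindings named SHARD_0 through SHARD_2 and an FNV-1a hash (both illustrative choices, not part of the original answer). Note that N must stay fixed once data is written, or the same id will start mapping to a different shard:

```ts
// Sketch: route each id to one of N D1 databases via hash(id) % N.
// Binding names and the hash function are assumptions for illustration.
export interface Env {
  SHARD_0: D1Database;
  SHARD_1: D1Database;
  SHARD_2: D1Database;
}

// FNV-1a: a cheap, stable string hash, so the same id always
// lands on the same shard.
function fnv1a(id: string): number {
  let h = 0x811c9dc5;
  for (let i = 0; i < id.length; i++) {
    h ^= id.charCodeAt(i);
    h = Math.imul(h, 0x01000193);
  }
  return h >>> 0;
}

function pickShard(env: Env, id: string): D1Database {
  const shards = [env.SHARD_0, env.SHARD_1, env.SHARD_2];
  return shards[fnv1a(id) % shards.length];
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const id = new URL(request.url).searchParams.get("id") ?? "";
    const db = pickShard(env, id);
    // From here on, use the chosen DB exactly as you would a single D1.
    const row = await db
      .prepare("SELECT * FROM users WHERE id = ?")
      .bind(id)
      .first();
    return Response.json(row);
  },
};
```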




So the export blocks CREATE, INSERT, and DELETE requests, correct? If so, it would have helped to reassure me that READ would not be blocked during an export (if that's true). Thanks
Run /link and fill in the info, so that we can figure out your account from your profile.
I ran d1 export for one of my databases and now it says Error in worker: Error: D1_ERROR: Currently processing a long-running export. I'm not sure it's actually still processing, because it errored out on my terminal. Is there a way to cancel this process?