hi qq -- I'm trying to migrate a largely KV-style lookup metadata store, currently ~7 GB of JSON and expected to grow past 10 GB, that serves parallel reads and updates. The metadata also needs persistent storage for consistency and availability.
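For context, here's a minimal sketch of the access pattern I have in mind on Workers KV (the `METADATA` binding name and routing are just placeholders, not my actual setup):

```ts
// Types like KVNamespace come from @cloudflare/workers-types.
export default {
  async fetch(request: Request, env: { METADATA: KVNamespace }): Promise<Response> {
    const key = new URL(request.url).pathname.slice(1);
    if (request.method === "PUT") {
      // Update: overwrite the whole JSON entry (KV has no partial updates).
      await env.METADATA.put(key, await request.text());
      return new Response("ok");
    }
    // Read: fetch and parse one JSON entry by key.
    const entry = await env.METADATA.get(key, { type: "json" });
    return Response.json(entry);
  },
};
```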

My metadata has 10M+ entries, where each entry is a small dict of mostly nested strings with a loose schema, mixed with the occasional float.
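A representative entry looks roughly like this (field names are made up to show the shape, not the real schema):

```ts
// Illustrative entry shape: loosely nested strings plus occasional floats.
interface Entry {
  name: string;
  attrs: { [key: string]: string | { [key: string]: string } };
  score: number;
}

const example: Entry = {
  name: "item-123",
  attrs: { region: "us", tags: { color: "red", size: "m" } },
  score: 0.87,
};
```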

Is this size a good use case for Cloudflare D1 / KV? Thanks!