Oh, there really is such a thing. I thought something like that would be implemented with some proprietary packages.
But the migrations here are implemented in SQL, right? Back when I was still working a lot with Java, there was Liquibase, where you could define such migrations database-agnostically.
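For anyone curious how SQL-file migrations work conceptually: each numbered file is applied once, in order, and a bookkeeping table remembers what already ran. Here is a minimal sketch using Python's `sqlite3` (D1 is SQLite-based; the file names and the `_migrations` table are illustrative, not D1's actual internals):

```python
import sqlite3

# Ordered (name, sql) pairs standing in for numbered files like
# migrations/0001_create_products.sql -- names are illustrative.
MIGRATIONS = [
    ("0001_create_products", """
        CREATE TABLE products (
            id INTEGER PRIMARY KEY,
            slug TEXT NOT NULL UNIQUE,
            name TEXT NOT NULL
        );
    """),
    ("0002_add_price", "ALTER TABLE products ADD COLUMN price_cents INTEGER;"),
]

def apply_migrations(conn: sqlite3.Connection) -> list[str]:
    """Apply each pending migration once, recording applied names."""
    conn.execute("CREATE TABLE IF NOT EXISTS _migrations (name TEXT PRIMARY KEY)")
    done = {row[0] for row in conn.execute("SELECT name FROM _migrations")}
    applied = []
    for name, sql in MIGRATIONS:
        if name in done:
            continue  # already applied on an earlier run
        conn.executescript(sql)
        conn.execute("INSERT INTO _migrations (name) VALUES (?)", (name,))
        conn.commit()
        applied.append(name)
    return applied

conn = sqlite3.connect(":memory:")
print(apply_migrations(conn))  # -> ['0001_create_products', '0002_add_price']
print(apply_migrations(conn))  # second run is a no-op -> []
```

Liquibase adds a database-agnostic changelog layer on top of the same idea; D1 keeps it as plain SQL files.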
I think that would be really cool. I personally like D1 and Cloudflare. However, I write my applications for customers who sometimes have strict compliance requirements, such as having to host everything themselves or only being allowed to use certain cloud providers.
High performance around the globe is my main requirement. That's why I was thinking about D1.
Of course there are other larger cloud providers that offer something like geo-replicated storage - but I'm not sure how well they perform.
In fact, all data within these 2 TB is intended for public use. A web app that I would have implemented would then retrieve and display this data, for example.
It's basically a huge product database. Images separately in the CDN.
I would still recommend a non-SQL database for this; it's much more easily horizontally scalable than SQL. Of course, PlanetScale has achieved horizontal scalability with MySQL using Vitess, but I don't think that's enough for 2 TB of data.
Personally, I am a big fan of relational databases. I always assumed that they could be optimised well with the appropriate data structure.
Do you think NoSQL would make more sense for my use case? A few years ago when I looked into it, NoSQL was usually still behind relational databases. But that might have changed in the meantime ...
There are big updates 1-2 times a day because the data changes regularly. However, this would not be performance-critical. Only the reads are performance-critical.
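If writes come in one or two big batches per day, wrapping the whole batch in a single transaction (the equivalent of D1's batch API) keeps the write cost down and means readers never see a half-applied update. A sketch with Python's `sqlite3` as a stand-in for D1 (schema and data are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE products (
        id INTEGER PRIMARY KEY,
        slug TEXT NOT NULL UNIQUE,
        name TEXT NOT NULL,
        price_cents INTEGER
    )
""")

# One day's aggregated provider records (illustrative data);
# the last row is a price update for an existing product.
batch = [
    (1, "wireless-mouse", "Wireless Mouse", 1999),
    (2, "usb-c-hub", "USB-C Hub", 3499),
    (1, "wireless-mouse", "Wireless Mouse", 1799),
]

# One transaction for the whole batch: readers see either the old or
# the new state, and the database syncs once instead of once per row.
with conn:
    conn.executemany(
        """INSERT INTO products (id, slug, name, price_cents)
           VALUES (?, ?, ?, ?)
           ON CONFLICT(id) DO UPDATE SET
               slug = excluded.slug,
               name = excluded.name,
               price_cents = excluded.price_cents""",
        batch,
    )

print(conn.execute("SELECT price_cents FROM products WHERE id = 1").fetchone()[0])
# -> 1799
```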
Basically, I aggregate product data from different providers. As the descriptions are often in different languages, they are run through the DeepL API and then written to a database. There are also a few more preprocessing steps, e.g. to check how reputable the provider is. The product images are uploaded to blob storage or a CDN to avoid relying on the provider's image service.
In the end, this data should be displayed on a website where you can view and compare the active products.
Basically, a user will not be able to retrieve large amounts of data in one go. However, I would like every single data record to be retrievable with very good performance.
A search function is also planned, but not necessarily the most important thing. I have already tried it with Meilisearch, but I had problems searching for a single ID, which is sometimes useful for my use case.
Just imagine a huge "products" table or bucket. An end user either clicks on a link that contains an ID or a "slug" and only wants the one product to be displayed.
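That access pattern (one row by ID or slug) is exactly the point lookup a B-tree index is built for, so a relational database handles it well even at 2 TB. A sketch with Python's `sqlite3` standing in for D1 (D1 is SQLite under the hood; schema is illustrative), including an `EXPLAIN QUERY PLAN` check that the slug lookup really uses the index instead of scanning:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE products (
        id INTEGER PRIMARY KEY,   -- primary-key lookup is already indexed
        slug TEXT NOT NULL,
        name TEXT NOT NULL
    )
""")
conn.execute("CREATE UNIQUE INDEX idx_products_slug ON products(slug)")
conn.executemany("INSERT INTO products VALUES (?, ?, ?)",
                 [(i, f"product-{i}", f"Product {i}") for i in range(1000)])

# Both access paths are single-row point lookups.
by_id = conn.execute("SELECT name FROM products WHERE id = ?", (42,)).fetchone()
by_slug = conn.execute("SELECT name FROM products WHERE slug = ?",
                       ("product-42",)).fetchone()
print(by_id[0], by_slug[0])  # -> Product 42 Product 42

# Confirm the slug query uses the index rather than scanning the table.
plan = conn.execute("EXPLAIN QUERY PLAN SELECT name FROM products WHERE slug = ?",
                    ("product-42",)).fetchone()
print(plan[-1])  # e.g. "SEARCH products USING INDEX idx_products_slug (slug=?)"
```

Lookup cost grows logarithmically with table size, which is why a huge but well-indexed products table stays fast for this kind of single-record read.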
Hello everyone, I have a question. I am sharing my Cloudflare account with some friends. As super administrator it lets me use Cloudflare Images, but it asks my friends to pay again. Do I have to give them some permission besides Cloudflare Images?
Does Cloudflare still allow increasing the storage limit on a database? I've already submitted a request via the form, but I'm not sure if they still allow it.
I also have a cron trigger which runs every 30 minutes, but it only pokes a table that has no more than 2 records, and those records are well indexed. I also haven't enabled any cron triggers on my dev site.