Prisma · 3w ago
xhulz

Backup does not work with large tables

Hi, I'm encountering an issue when trying to create a backup.

1. The error occurs with only one table, which has around 1.5 million records (Users).
2. I've already tried requesting a backup for just that specific table, but the error still persists.
3. The tunnel returns only this message: Connection closed { code: 1006, reason: '' }

Could you please help me with this? Thank you!

Tunnel:
npx @prisma/ppg-tunnel@latest --host 127.0.0.1 --port 5432

Command:
PGSSLMODE=disable \
pg_dump \
  -h 127.0.0.1 \
  -p 52604 \
  -Fc \
  -v \
  -d postgres \
  -f ./mydatabase.bak \
  -t 'core."Users"' \
  && echo "-complete-"

Response:
pg_dump: dumping contents of table "core.Users"
pg_dump: error: Dumping the contents of table "Users" failed: PQgetCopyData() failed.
pg_dump: detail: Error message from server: server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
pg_dump: detail: Command was: COPY core."Users" (id, "walletAddress", "referralCode", active, "createdAt", "emailAddress", banned, "bannedAt", "bannedReason", "firstName", "isEmailVerified", "lastName", "userName", "chainId", "tokenId") TO stdout;
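(Editor's note: one workaround sometimes used when a single long COPY gets cut off mid-stream is to export the table in smaller slices with psql's \copy, so each statement finishes quickly. The sketch below is an assumption, not something from the thread: it reuses the tunnel port and database name from the command above, and assumes the id column of core."Users" can be used to order the rows. Slice size and range should be adjusted, and the slices should ideally be taken while the table is not being written to, since OFFSET-based paging is not a consistent snapshot.)

    # Export ~1.5M rows in 100k-row CSV slices; each slice is a short-lived COPY.
    for start in $(seq 0 100000 1500000); do
      PGSSLMODE=disable psql -h 127.0.0.1 -p 52604 -d postgres -c \
        "\copy (SELECT * FROM core.\"Users\" ORDER BY id LIMIT 100000 OFFSET ${start}) TO 'users_${start}.csv' WITH (FORMAT csv, HEADER)"
    done

Note that this produces plain CSV files rather than a pg_dump custom-format archive, so restoring them would go through COPY/\copy rather than pg_restore.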
15 Replies
Prisma AI Help
Skipping coffee for instant knowledge! Let's brew that AI answer now.

It appears you are encountering a connection closure when attempting to back up a large table (1.5 million records) using pg_dump over a Prisma Postgres tunnel. The error message indicates that the server closed the connection unexpectedly during the COPY operation, which is often a sign of resource exhaustion or a timeout on the database or tunnel side. Based on the knowledge sources, here are some key points and suggestions:

1. Resource Limits: Large tables can cause the database server to run out of memory or hit other resource limits, especially if running in a constrained environment (e.g., Kubernetes, small cloud instances). One user resolved a similar issue by increasing the memory allocated to their PostgreSQL instance, from 512Mi to 1024Mi, which stopped the server from crashing during heavy operations (Upsert of big JSON throws the error "P1017 Server has closed the connection").
2. Tunnel Stability: The Prisma tunnel (@prisma/ppg-tunnel) is required for secure connections, but if the tunnel process is interrupted or the connection is unstable, it can cause the backup to fail. Ensure the tunnel process remains open and stable throughout the backup. If possible, try running the backup from a machine with a reliable network connection and minimal background processes (Backups prerequisites).
3. PostgreSQL Version: Prisma Postgres is based on PostgreSQL 17. Using other versions of the CLI tools may cause errors. Make sure you are using pg_dump from PostgreSQL 17 as recommended (Backups prerequisites).
4. Support for Large Backups: If you continue to experience issues, and especially if you are on a Pro, Business, or Enterprise plan, you can contact Prisma Support for assistance with large backups or to request a snapshot (Backups: https://www.prisma.io/docs/postgres/database/backups).

Summary of Actions:
- Increase the memory/resources available to your PostgreSQL server if possible.
- Ensure you are using the correct version of pg_dump (PostgreSQL 17).
- Keep the tunnel process running and stable during the backup.
- If the problem persists, contact Prisma Support for direct assistance.

If you need more detailed troubleshooting, please provide information about your database environment (e.g., cloud provider, resource limits, etc.).

🤔 Alright, thanks for letting us know that there's still some help needed. A team member will reply to you shortly. In the meantime, please let us know any additional info, questions, or which parts of the answer may need clarification.
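(Editor's note: a minimal sketch of points 2 and 3 above, assuming the PostgreSQL 17 client tools are installed locally and that pg_dump is pointed at whatever local port the tunnel actually reports; the port, database name, and output file below are taken from the original post and may differ in practice.)

    # 1. Confirm the client matches Prisma Postgres (PostgreSQL 17)
    pg_dump --version   # should print something like: pg_dump (PostgreSQL) 17.x

    # 2. Start the tunnel in its own terminal and leave it running for the whole backup
    npx @prisma/ppg-tunnel@latest --host 127.0.0.1 --port 5432

    # 3. In a second terminal, run the dump against the port the tunnel is listening on
    PGSSLMODE=disable pg_dump -h 127.0.0.1 -p 5432 -Fc -v -d postgres -f ./mydatabase.bak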
xhulz (OP) · 3w ago
@Nurul please, could you help us with this?
Nurul · 3w ago
Hi @xhulz! Can you please share the email or the GitHub handle with which you signed up? I would need your project details so that I can escalate this to our Prisma Postgres team.
xhulz (OP) · 3w ago
Sure thing @Nurul, thank you for your response! It's: [email protected]. Project name: G7 Postgres. Environment: Development. Thank you again!
Nurul · 3w ago
Thank you @xhulz! I see your workspace "Game7 workspace" has several projects and environments. Can you provide the name of the project and environment whose backup you want to create?
xhulz (OP) · 3w ago
Project name: G7 Postgres. Environment: Development.
Nurul · 3w ago
Thank you! Can you let me know your use case? Are you trying to restore your database with a backup file or do you just want your raw backup file? If you want to restore the database, we can create a new environment for you from the backup file.
xhulz (OP) · 3w ago
Yes, of course! The idea is to make sure the STG and PRD databases have the same data as DEV before we start modifying anything in DEV. We're migrating from MongoDB Atlas to Prisma with PostgreSQL, and we've only moved the tables that actually matter to us at this point, with the most relevant data already included. If it's possible to restore DEV to STG and PRD, I'd really appreciate it, as the plan was to back up DEV and then restore it to the other environments. Thank you!
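(Editor's note: in this thread the Prisma team performed the restore internally, so the following is only an illustrative sketch of what a self-service restore of a custom-format (-Fc) dump could look like. It assumes a tunnel to the target environment, e.g. Staging, is running locally on 127.0.0.1:5432 and that the dump file was produced with the pg_dump command shown earlier.)

    # Restore the custom-format dump into the target environment over the tunnel
    PGSSLMODE=disable pg_restore \
      -h 127.0.0.1 -p 5432 \
      -d postgres \
      --no-owner --no-privileges \
      -v ./mydatabase.bak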
Nurul · 3w ago
@xhulz Sure, we can do that. So to confirm, we should take the latest backup from the Development environment and restore it to the Production and Staging environments of project "G7 Postgres", correct?
xhulz (OP) · 3w ago
Yes, exactly. STG and PRD are empty 🙂 Thank you!
Nurul · 3w ago
Awesome! I have shared this with our engineering team. I'll keep you posted and get back to you as soon as I have an update.
xhulz (OP) · 3w ago
Thank you! 🙂
Nurul · 3w ago
@xhulz Done! Prod and Staging should be identical to Dev. Can you check?
xhulz (OP) · 3w ago
Confirmed @Nurul! Everything is identical right now 🙂 Thank you so much again!
Nurul · 3w ago
No worries at all! Please let us know if we can assist in any way 🙂
