Unable to upload large files to a network volume using aws s3 commands
I have been trying to upload a model directory that consists of several small files and one big model file.
❌ I used the aws sync command to upload the whole directory, but it always fails when uploading the big model file.
❌ I also tried to upload the single model file using the aws cp command, but got the same error response.
❌ I also tried the upload_large_file.py script from the GitHub page, but got the same error.
The error is generally Bad Gateway, but the specific error changes every time. I have attached some examples.
I would also like to point out that the aws ls command sometimes takes a long time to return a response, and sometimes just fails. I have used a network volume the same way before and had no issues. I'd appreciate anyone's help with this. Thank you so much.
💾 Network volume info:
Size: 12 GB
Datacenter: EU-RO-1
Sample commands:
1) aws s3 sync --region EU-RO-1 --endpoint-url https://s3api-eu-ro-1.runpod.io/ whisper-model/ s3://123abcdef/models/tts/whisper-model/
2) aws s3 cp --region EU-RO-1 --endpoint-url https://s3api-eu-ro-1.runpod.io/ model.safetensors s3://123abcdef/models/tts/whisper-model/
3) ./upload_large_file.py --file whisper-model/model.safetensors \
--bucket 123abcdef \
--key models/stt/model.safetensors \
--access_key user_key12345678890 \
--secret_key rps_pass987654320 \
--endpoint https://s3api-eu-ro-1.runpod.io/ \
--region EU-RO-1
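For reference, the AWS CLI's multipart behaviour for large files is controlled by its s3 transfer settings. The values below are only illustrative defaults I could tweak, not something specific to RunPod, and I'm not sure the part size matters for this S3 API at all:

# force multipart for anything over 64 MB, uploaded in 64 MB parts
aws configure set default.s3.multipart_threshold 64MB
aws configure set default.s3.multipart_chunksize 64MB
# limit parallel part uploads
aws configure set default.s3.max_concurrent_requests 4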
3 Replies
We had the same problem. S3 is currently not working on EU-RO-1. We switched to EU-IS-1 and it works there. We contacted RunPod Support yesterday; they said the engineers are working on it, but it seems they haven't solved it yet.
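If it helps, the only things we changed were the region and the endpoint (plus creating a new volume in EU-IS-1, so the bucket ID is different). The endpoint name below just follows the same pattern as the EU-RO-1 one, so double-check it against the RunPod docs:

aws s3 sync --region EU-IS-1 --endpoint-url https://s3api-eu-is-1.runpod.io/ whisper-model/ s3://<your-eu-is-1-volume-id>/models/tts/whisper-model/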
I see. I thought I was doing something wrong. Thank you so much for the info, @JanE!
I second that, @JanE, saved my sanity! Thanks a mil.