best way to handle image upload in t3 stack

I am trying to upload images to a DigitalOcean Space, which is similar to an S3 bucket, but I would like to do it from the backend. Here is how I approached it before: I created a function on the frontend, uploaded the files straight from the frontend, then returned the URL location to the backend. But I'm scared of my keys being public, so I want to move the whole upload to the backend. Now I don't know how to implement this. How should the images be passed from the frontend to the backend?
31 Replies
arete · 2y ago
I came across this video, it should be helpful
arete · 2y ago
The video uses S3, but I think the upload process on the backend should be the same
Lopen · 2y ago
Thanks. I found it hard to follow his explanation, so I tried to DIY it on my own. The problem is I can't send the file object to the backend (type `File`, which is the default returned from a file input). I tried using useMutation, but when I send it over to the backend, nothing is received.
JulieCezar · 2y ago
You should generally not send any files to your backend! Instead, generate a presigned URL, which basically goes like this:
1. On a tRPC endpoint, create a function that, when called, sends a request to AWS to get a presigned URL (you must send your API key to get it, which is why you want this on the backend).
2. Client side, when the user clicks upload or whatever, get the presigned URL with the tRPC query.
3. With the presigned URL, which has a limited duration, you can upload the file directly to AWS using that URL.
4. If the upload is successful, you get the actual URL of that file, which you can then save in your DB.
With this you ensure all your API keys stay secret. And the reason you don't want to send files to your backend is that it results in inbound traffic to your server, which you will probably have to pay for.
I recommend using the Uppy uploader for this. From what I've used, it's the best one 👍
Fun fact: I've found a CDN storage better than AWS called Bunny.net. It's extremely cheap, fast, and has beautiful UX and features. However, they don't have presigned URL uploads, which means I would have to send the files to my server first. That would cost too much because of the extra traffic, which is why I have to stick to AWS...
Lopen · 2y ago
So after the presigned URL is generated, the files can be uploaded from the frontend directly, without the server being involved? I'm using DigitalOcean Spaces, though, but they are S3-compatible. Right now, this is how I'm doing it: I'm using a Next.js API route to upload the images and return the URL, parsing the request with formidable and uploading the file with @aws-sdk/lib-storage.
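Yes, that is the point of the flow: once the backend hands out the presigned URL, the browser PUTs the file to the Space directly. A minimal client-side sketch, assuming a hypothetical `/api/presign` endpoint (the real route name is whatever you create):

```javascript
// Strip the signature query string: what's left is the permanent object URL
// (assuming the object is readable, e.g. uploaded with public-read access).
const permanentUrl = (presigned) => presigned.split("?")[0];

// Fetch a presigned URL from your own backend, then PUT the file straight
// to the Space — no file bytes ever touch your server.
async function uploadFile(file) {
  const res = await fetch(`/api/presign?name=${encodeURIComponent(file.name)}`);
  const { url } = await res.json();

  const put = await fetch(url, {
    method: "PUT",
    body: file,
    headers: { "Content-Type": file.type },
  });
  if (!put.ok) throw new Error(`upload failed: ${put.status}`);

  return permanentUrl(url); // save this in your DB
}
```

This replaces the formidable + lib-storage route entirely; the backend's only job becomes signing URLs.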
Unknown User · 2y ago
(Message not public)
Lopen · 2y ago
I tried this, but his implementation didn't work for me. I'm using DigitalOcean Spaces, and they support everything S3 has and use the S3 SDK.
JulieCezar · 2y ago
Sorry, was out of town for a while... But generally yes, it should work like that. Let me try.
JulieCezar · 2y ago
I will post the repo after I upload it, but here's the short version: I didn't use tRPC for this, but created a simple API endpoint to get the presigned URL. I used the Uppy uploader with its AWSS3 plugin, which calls the /api/get-presigned-url endpoint when the user clicks upload, then automatically takes that URL and uploads the file. What you need to do to get the repo to work:
1. Go to Spaces > Manage keys > Generate new key and secret > copy it into .env.example (note: the endpoint should look like this: https://fra1.digitaloceanspaces.com, not the full one).
2. Rename .env.example to .env.
3. In /api/get-presigned-url, change your region in s3Client, the bucketName, and the path to your bucket/folder in the key variable.
4. IMPORTANT! Go to your Spaces bucket > Settings > Add CORS > Allowed headers and Origin should be *, and select all methods for now.
5. It should work now.
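For reference, step 4 maps onto an S3-style CORS configuration like the one below (the Spaces UI fields correspond to these rules). The wildcard origins/headers are fine for testing, but in production you would tighten `AllowedOrigins` to your own domain and drop the methods you don't use:

```json
{
  "CORSRules": [
    {
      "AllowedOrigins": ["*"],
      "AllowedHeaders": ["*"],
      "AllowedMethods": ["GET", "PUT", "POST", "DELETE", "HEAD"],
      "MaxAgeSeconds": 3000
    }
  ]
}
```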
JulieCezar · 2y ago
GitHub: zanzlender/dos-upload-test · DigitalOcean Spaces - upload test
Lopen · 2y ago
Thank you
Unknown User · 2y ago
(Message not public)
JulieCezar · 2y ago
Sadly no... I don't know if they've implemented it since I last checked, but they don't have a good implementation for uploading from the frontend. To upload, you have to pass your API key as a header... meaning you basically can't upload from the frontend without exposing your keys, which means you SHOULD NOT do it like that, and you need to proxy it through your server... which then counts as traffic on your server and can be potentially expensive, depending on which platform you use. They could have changed it by now, and if they did, that would be excellent 👍
JulieCezar · 2y ago
They didn't
Лев Нефедов
Hey guys! I'm trying to solve the same problem, but I save images in the public folder. What's your advice on how to do that properly?
barry · 2y ago
don't save images in the public folder
zendev · 2y ago
Yeah I have a related issue too but I think it’s just a lack of understanding. I’m using Cloudflare R2 and uploading is all fine but then I don’t know how to get the permanent URL to the object stored in the S3 bucket after successful upload. I get an unauthorized error when trying to visit the url that R2 calls “S3 url”. Anyone know what I’m doing wrong?
Jaaneek · 2y ago
Kinda off topic, but Supabase has an amazing storage permission system that is connected to auth
JulieCezar · 2y ago
The images folder is created as a static folder when you build your project. You should not upload images to it after the project is already built.

Sounds like you made the folder where the images are private, meaning you would need to get an authorization token each time you want to view an image. Now, you can do that if you want, but think about this: do the images really have to be private, or can they be public so that anyone can view them, but only the author can edit? If it's the latter, you can change the permissions on that folder to public for viewing and it should work. If not, then you have to implement getting a token/authorization each time you want to view them.
zendev · 2y ago
Yeah I just figured this out, thanks for confirming my assumption! Appreciate your help with all this
Лев Нефедов
so what folder do you recommend?
barry · 2y ago
by not saving it in a folder in your app 🤣 what
Лев Нефедов
but what if I don't want to store images in the cloud?
Лев Нефедов
You either answer normally or leave the conversation
Amit · 2y ago
@Lev the main reason tRPC can't do this is that JSON is a textual format, so sending binary data requires additional encoding (base64 can be used). This works for small files, but for large files it wastes a lot of bandwidth and processing power. Also, tRPC does not want to fix this as far as I know (if the serialization format were changed to something like msgpack, you could send images as-is). Since tRPC is out of the question, the alternative is creating a custom API endpoint if you want to avoid cloud storage. Hypothetically, if you used zodios or ts-rest this wouldn't be a problem, since they allow multipart form data. But in that case you are depending on less mature libraries and lose the abstraction that tRPC provides (zodios/ts-rest are less abstract, so knowing how HTTP status codes and error handling work is required). IMO, since there are very few cases where you need large file uploads, I would just keep using tRPC and handle the rare cases with API endpoints.
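The base64 overhead mentioned above is easy to quantify: base64 turns every 3 raw bytes into 4 ASCII characters, so the payload grows by a third before JSON quoting even starts. A quick Node check:

```javascript
// Quantifying the cost of tunneling binary data through JSON via base64.
const raw = Buffer.alloc(3 * 1024 * 1024); // stand-in for a 3 MiB image
const asJson = JSON.stringify({ image: raw.toString("base64") });

const ratio = asJson.length / raw.length;
console.log(ratio.toFixed(2)); // 1.33 — a third more bytes on the wire
```

And the server still has to base64-decode the string back into bytes, which is the extra processing cost.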
barry · 2y ago
he is talking about where to store the images, but yeah, maybe this will deter him from committing something dumb