Error when uploading to Storage (Swift)
I keep getting this error when trying to upload an image file to my Supabase Storage bucket: "StorageError(statusCode: Optional(400), message: Optional("Error"))". I am trying to figure out if there is an issue with my code or with the way I have set up my Storage policies. I have attached an image of my code, any help would be greatly appreciated!
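Since the attached image isn't visible here: a typical upload with the supabase-swift client looks roughly like the sketch below. The project URL, key, bucket name, and path are placeholders, and the exact method signatures vary between supabase-swift versions, so treat this as an illustration of the shape of the call, not the poster's actual code.

```swift
import Foundation
import Supabase

// Hypothetical project values; substitute your own.
let client = SupabaseClient(
    supabaseURL: URL(string: "https://your-project.supabase.co")!,
    supabaseKey: "your-anon-key"
)

func uploadPicture(_ data: Data) async throws {
    // The path is relative to the bucket; "user.pictures" is the
    // bucket name from this thread.
    try await client.storage
        .from("user.pictures")
        .upload("folder1/avatar.png", data: data)
}
```

A 400 from this call usually points at the server side (bucket name, path, or RLS policies) rather than the Swift code itself.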

52 Replies
I don't know the Swift client, but in general, from looking at the test code in the repository, that looks correct if your bucket is really `user.pictures`. What is your insert WITH CHECK policy?
Thanks for the reply, my bucket is in fact called "user.pictures". I do not currently have any policies written for the bucket, so maybe that is the issue.
Yes, you have to have policies (at least `true`).
I've added a new policy for insert: `bucket_id = 'user.pictures'`, but am still getting this error. That should have done the job, right?
The docs say you just need an insert policy (it needs to be WITH CHECK, not USING), but sometimes you also need a select policy for inserts that return data.
I added the bucket_id = 'user.pictures' constraint to every operation on my storage bucket, but this error still seems to be occurring. I appreciate the help. So if I wanted anyone to be able to insert, delete, update, and select from the Storage bucket, I should just be able to add a policy for each operation with the bucket_id = 'user.pictures' constraint, and this should work? Or am I wrong in thinking that? (This is the way I currently have my policies set up.)
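For reference, the set of policies described above would look roughly like this in SQL (the policy names are made up; the `bucket_id` condition is the one from this thread). Note that insert takes WITH CHECK while the read-side operations take USING:

```sql
-- Anyone may upload into the bucket (insert uses WITH CHECK, not USING).
create policy "anyone can insert into user.pictures"
on storage.objects for insert
with check (bucket_id = 'user.pictures');

-- Anyone may read objects in the bucket.
create policy "anyone can select from user.pictures"
on storage.objects for select
using (bucket_id = 'user.pictures');

-- Anyone may update objects in the bucket (both clauses apply to update).
create policy "anyone can update user.pictures"
on storage.objects for update
using (bucket_id = 'user.pictures')
with check (bucket_id = 'user.pictures');

-- Anyone may delete objects in the bucket.
create policy "anyone can delete from user.pictures"
on storage.objects for delete
using (bucket_id = 'user.pictures');
```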
That should work. RLS just has to evaluate to true. Can you show your insert policy in a screenshot?
yea sure

Did you create the bucket?
yea, here's a screenshot. the image currently in there I uploaded in Supabase and 'folder1' is currently empty

You might look at the Edge logs in the dashboard and see if you can see the storage request and if the path looks correct.
Also try uploading with a different folder name than `public` in the path. That is ringing a bell for me, but it could be a really old issue. https://github.com/supabase/supabase/issues/10328 He seems to be saying public did not work for him as a path. I thought this was an old issue that was resolved somehow.
OK the issue I was thinking of was https://github.com/supabase/storage-api/issues/119 which was a bucket named public. I've never heard of a restriction on public in the file path.
With the way the path is written above I get this URL: supabase.co/storage/v1/object/user.pictures/public/README.md
However, if I move public to the beginning of the bucket id I instead get: supabase.co/storage/v1/object/public/user.pictures/README.md
But sadly neither of these works :/ I'll take a look at the posts you sent.
You certainly don't want to move public to the bucket id. I assume you wanted a folder called public. Putting public in the path has nothing to do with a public bucket.
I just created a folder called public in the UI and was able to upload a file to it thru the UI, so I'm skeptical it is public in the path, but it is odd the other user mentioned it a few days ago.
That clears up some confusion. I saw public in the path in a few different posts and thought I needed it to reference the fact that my bucket was public.
Your first URL looks correct.
https://ugcxxxxq.supabase.co/storage/v1/object/public/public/sbcheck.jpg
worked for me. That uploads to a bucket called public and a folder called public in the path. Neither has anything to do with a bucket marked as public.
I'll make a folder called public within my bucket and try that URL again.
still not working for some reason, is there any way the 400 error could be due to an issue with the actual file being uploaded?
Should not matter. I just double-checked by uploading `public\filename.png` with no public folder and it worked. I don't think there is an issue with public in the pathname, but it is still odd that the last issue mentioned that in the path.
I don't think so. Usually on upload it is a permission issue or RLS issue.
You could try and download a file that is already there, to see if it is a permission issue.
Hmm, I've tried with and without the public folder and same issue, this is really weird.
Oh that (downloading) might not work as a test for RLS if it is a public bucket.
Yea that makes sense
I am allowed to download
should I make it private and see if I still can download with the policies I mentioned earlier?
That would be a good test. Everything acts like permission, but your insert policy looked OK to me.
I have never seen a bucket with a period in it, but the UI you show seems to be OK with that, and a period is a valid S3 character.
I'll try that and will change the name also cause why not lol
nvm, I forgot you can't change bucket names
Just confirming 400 is an RLS response... https://github.com/supabase/supabase/discussions/5191
even with the bucket private I am able to download the image but still cannot upload anything
I'm out of ideas at the moment. I don't think Swift is involved as your URL looks good. You might verify you actually have file data, I don't know what happens if nothing is provided.
So you pass RLS for the download.
I should have file data, I changed the upload to a text file where I'm converting the text directly to data. I'll make sure though
Yeah I do, but I still can't insert for some reason, even though the policy is pretty much the same besides the WITH CHECK on my insert
thanks for all the help btw
Just to clarify: in that picture of the RLS it says Update. I don't know if that means an update policy, or if it is some message that it is updating a policy. The insert policy is the one needing the WITH CHECK.
I was just entering an update so I could screenshot the whole SQL policy
here is what it looks like in the actual UI

Something weird: I am still able to download from the bucket even when it is private and I have removed the select policy...
You could have a second policy somewhere on storage.objects.
I just deleted every one of my policies, but am still able to access it
If I was using my service key would that affect what permissions I have?
Yes that would bypass RLS.
So if i use the service key I should be able to upload data regardless of RLS right?
I would think so.
Im going to see if I can upload a file with the service key
now I cannot download anything, I've run out of ideas, thanks for all of your help
I'll let you know if I find a solution
Are you sure you are on the same instance for everything you are doing? This has happened to others with similar instances.
the same supabase instance?
Yes. It is not unheard of to be changing one instance thru the UI and having the code access another by mistake.
I don't think I completely understand. By instance are you referring to the whole supabase project? Like maybe I could be referencing a separate project? If so, that shouldn't be an issue since I only have one supabase project. Or do you mean something else
No, that was it. It has happened several times here with people with multiple instances, running the client on a different instance than the one they are making RLS (or whatever) changes to.
ah i see
they were using the wrong project url?
Yes to a similar instance like a dev or early version. It is a dumb move, but when you are moving code around it happens.
I wish that was my problem, I just copied and pasted my project url to make sure and it doesn't seem to be the case
I think my RLS actually was not working for the select
Yeah, just checking. I've spent hours working with a user, and seen others spend hours, only to discover the mixup. With one instance it can't happen.
Oh man, that would be frustrating
I think there may have been some lag time after I turned the bucket private to it actually being private
because all of a sudden I could not download anything, even with the select policy being active
but then I just switched it back to public and was able to download again
If that is your only bucket try clearing all RLS and starting over. Or maybe create a new bucket and start from scratch. It acts like RLS. But if somehow the storage.objects table grants are messed up that might do the same thing. They should not be though unless you messed with SQL directly on that table.
I dont think I messed with it directly, but I think you're right about starting over
Im just going to trash everything in storage and restart
going to grab some lunch and try that
Ill let you know how it goes
So I deleted everything and created a bucket that defaulted to private. Then I added policies that allowed all operations with the same condition as earlier (the bucket_id = "blah blah"). Then, when I initialized the Supabase client within the app, I chained off of the Supabase client rather than using the storage client directly (i.e. my code looked like supabaseClient.storage.from()... rather than storageClient.from()...). I was then able to upload and download data, and everything seems to be working as expected. I am not sure whether deleting my existing storage schema and creating a new private-by-default bucket was what fixed the issue, or whether using the Supabase client directly within my code was what fixed it, but it is working now.
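If the direct storage client was the culprit, the difference would look something like this sketch (URL and key are placeholders, and initializer names vary by supabase-swift version):

```swift
import Foundation
import Supabase

let url = URL(string: "https://your-project.supabase.co")!  // hypothetical
let anonKey = "your-anon-key"                               // hypothetical

// What ended up working: one SupabaseClient, with storage accessed
// through it, so the auth headers RLS needs are attached automatically.
let supabase = SupabaseClient(supabaseURL: url, supabaseKey: anonKey)
// try await supabase.storage.from("user.pictures").upload(...)

// The earlier approach constructed a storage client directly. If that
// client is not given an Authorization header, RLS checks on
// storage.objects can fail with a 400 even when the policies are correct.
```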
I really appreciate all of the help you gave me and the time you spent working through the issue with me, as someone who is always learning, it means a ton
Could very well be the storage client directly. I did not know you were doing that and it is possible it did not have the authentication header passed to it.
Although, I'm not going back thru to see if anything rules that out as far as the tests we did. Anyway, glad it is working.