I am using a FileUpload field in a Filament panel resource. I can get it to upload to a local folder, but I am striking out with S3. The disk is set correctly, and the folder exists on S3.
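For context, the S3 disk is configured along the usual lines in config/filesystems.php (a minimal sketch; the bucket name and credentials are assumptions pulled from .env):

```php
// config/filesystems.php
's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'throw' => false, // set true while debugging to surface S3 errors
],
```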
What I want to do is:
- Users can upload multiple files (mostly PDFs). Works now.
- Files go to an S3 bucket. Currently there is no error message, but the file is not uploaded.
- The original titles are shown. Works now.
- After saving, the user (or others) can download from S3 using a generated timed URL (see the sketch after this list). The code below works when the disk is local.
- Some users can be authorized to delete files.
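For the timed URL and the authorized delete, I expect to use something like the following; the function names and the 'delete-documents' ability are hypothetical placeholders for illustration:

```php
use Illuminate\Support\Facades\Gate;
use Illuminate\Support\Facades\Storage;

// Generate a short-lived signed URL for a stored file (the path comes from
// the 'additional_files_random' JSON column). 15 minutes is an arbitrary TTL.
function downloadUrl(string $path): string
{
    return Storage::disk('s3')->temporaryUrl($path, now()->addMinutes(15));
}

// Delete a file only when the current user is authorized; 'delete-documents'
// is an assumed gate/policy ability, not something Filament provides.
function deleteDocument(string $path): void
{
    Gate::authorize('delete-documents');

    Storage::disk('s3')->delete($path);
}
```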
Most examples out there are for images, and I haven't seen any that don't require a publicly accessible folder.
Here is my existing code. It works when the disk is local, and stores both the original and generated file names in JSON array fields in my table.
```php
FileUpload::make('additional_files_random')
    ->label('Upload PDF Files')
    ->multiple()
    ->directory('mm33_documents')
    ->disk('local')
    ->storeFileNamesIn('additional_files_original')
    ->downloadable()
```
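For reference, the S3 variant I'm attempting is essentially the same component with the disk swapped; ->visibility('private') is my assumption for keeping files out of any public folder:

```php
FileUpload::make('additional_files_random')
    ->label('Upload PDF Files')
    ->multiple()
    ->directory('mm33_documents')
    ->disk('s3')                // the S3 disk from config/filesystems.php
    ->visibility('private')     // keep uploads non-public on the bucket
    ->storeFileNamesIn('additional_files_original')
    ->downloadable()
```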