I personally am using two S3 buckets: the default data bucket is public, and the internal one is restricted.
But that's not necessary; as far as I know, the system restricts internal files so that they are not visible to the public.
So, that looks correct, yes. But check whether the internal data is really protected from the public.
Sorry for the German screenshot, but only the top entry should have access.
View attachment 253080
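The split described above (files under data/ public, files under internal_data/ private) can be sketched as a simple rule over object keys. This is only an illustration: the folder prefixes are the XenForo defaults, and the `acl_for` helper is made up for this sketch, not part of XenForo or s3cmd.

```shell
# Illustration of the ACL split described above: keys under data/
# should be public, everything else (internal_data/ included) private.
# acl_for is a hypothetical helper, not a real XenForo or s3cmd command.
acl_for() {
  case "$1" in
    data/*) echo "public-read" ;;   # avatars, attachments, etc.
    *)      echo "private"     ;;   # internal_data/ and anything else
  esac
}

acl_for "data/avatars/l/0/1.jpg"       # -> public-read
acl_for "internal_data/some_file.php"  # -> private
```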
My bucket has two folders, data and internal_data. Even though I un-blocked public access for the whole bucket, I still can not see them.

Can I run it against s3://xf-bucket/data/ and make all the folders/files public that way? Otherwise I'd have to do it for every file in the avatars folder, and that will take a long time.
You need to change this. The easiest and fastest way is to use the command line tool.

Note: When copying your existing data files across, they will need to be made public. You can do this by setting the ACL to public while copying:

s3cmd put * s3://yourfolder --acl-public --recursive

"7. You now need to go to the "IAM" console."
Where is this? Literally lost already, nothing says IAM.

It's the AWS IAM console.

If I don't have confidence using s3cmd, can I FTP in binary mode to my computer and then to AWS? It seems to be possible, but I just want to check it's okay to do.

Heads-up before you try that: you'd need an S3-compatible client instead of just an FTP client. The free version of FileZilla, for example, doesn't support S3 connections, but the paid "Pro" version does.
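For files that are already in the bucket, nothing has to be re-uploaded: s3cmd's setacl command can flip the ACL in place. A hedged sketch, with the bucket path a placeholder and the actual run commented out so it can be reviewed before applying:

```shell
# Make everything already under the bucket path public in one pass.
# s3://yourfolder is a placeholder -- substitute your own bucket/prefix.
cmd="s3cmd setacl s3://yourfolder --acl-public --recursive"
echo "about to run: $cmd"
# Uncomment the next line to actually apply the ACL change:
# $cmd
```

Because --recursive walks every object under the prefix, this avoids setting the ACL file by file (e.g. for each avatar).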