Using DigitalOcean Spaces or Amazon S3 for file storage in XF 2.1+

Is the bucket name really attachments.mydomain.info? Above you mentioned it without the plural s.

Normally that works out of the box. Use the desired subdomain as the bucket name and add a CNAME record. I have done this many times and never had issues.
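As a rough sketch of the DNS side (assuming a plain Amazon S3 bucket, with attachments.mydomain.info from the post above as the hypothetical name): the bucket name has to match the full subdomain exactly, and the CNAME points at the bucket's S3 endpoint.

attachments.mydomain.info.  CNAME  attachments.mydomain.info.s3.amazonaws.com.

For HTTPS you would normally front the bucket with CloudFront or the Spaces CDN rather than a bare CNAME.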
 
I personally am using two S3 buckets: the default data bucket is public, and the internal one is restricted.
But that's not necessary; as far as I know, the system sets the restriction on internal files so that they are not visible to the public.
So that looks correct, yes. But check whether internal_data is really protected from the public.
Sorry for the German screenshot, but only the top entry should have access.
View attachment 253080
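A quick way to verify that (a sketch; your-bucket and some-file are placeholders): request an internal_data object directly and make sure it is refused.

curl -I https://your-bucket.s3.amazonaws.com/internal_data/some-file

That should come back 403 Forbidden, while the same check against a file under data/ should return 200.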

That is a good idea, but I already placed both folders into the same bucket. The problem I have is that all my folders and files are restricted, both data and internal_data, even though I unblocked public access for the whole bucket. I still cannot see them.

I think it may be the way I imported the folder from XenForo. All the individual files (objects) are set to private, so I guess my question is:
  1. Do I have to go to my s3://xf-bucket/data/ and make all the folders/files public that way?
  2. Because there are 1 million files in the avatars folder, and that will take a long time.
  3. Or is there another global setting I can use?
Thanks.
 
I personally am using two S3 buckets: the default data bucket is public, and the internal one is restricted.
But that's not necessary; as far as I know, the system sets the restriction on internal files so that they are not visible to the public.
So that looks correct, yes. But check whether internal_data is really protected from the public.
Sorry for the German screenshot, but only the top entry should have access.
View attachment 253080

Or would I change something here instead?

(screenshot: ALC.webp)
 
These are the settings I use (an AWS CLI equivalent is sketched after the list):

Block all public access
Off
  • Block public access to buckets and objects granted through new access control lists (ACLs)
    Off
  • Block public access to buckets and objects granted through any access control lists (ACLs)
    Off
  • Block public access to buckets and objects granted through new public bucket or access point policies
    On
  • Block public and cross-account access to buckets and objects through any public bucket or access point policies
    On
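For reference, the same four toggles can also be set from the AWS CLI; a sketch, with my-xf-bucket standing in for your bucket name:

aws s3api put-public-access-block --bucket my-xf-bucket --public-access-block-configuration BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=true,RestrictPublicBuckets=true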
 
I assume you copied both folders the same way? That may be the reason. You need to copy the data folder with public access enabled.
Note: When copying your existing data files across, they will need to be made public. You can do this by setting the ACL to public while copying.
You need to change this. The easiest and fastest way is to use the command line tool.
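Something along these lines should do it (a sketch, with xf-bucket as a placeholder bucket name). For files that are already in the bucket you can re-apply the ACL in place rather than re-uploading:

s3cmd setacl s3://xf-bucket/data/ --acl-public --recursive

When copying fresh files across, passing --acl-public to s3cmd put sets the ACL at upload time instead.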
 
So I've implemented this tutorial on my forums, only to find out that attachments are served through PHP at a URL like attachments/name.123 instead of as a static file, thus negating the benefits of a CDN.

Is there a way to allow users to attach images and videos on the forums on remote storage and have it served as a static file instead of PHP?
 
It doesn't exactly negate the benefits of a CDN if cheaper disk storage is your main concern.

Attachments are served like that so we can apply the permission checks before streaming the content from the CDN. There's no way to change that as it stands but it is a feature we have discussed for inclusion in the software at some point.
 
Cheaper disk storage is a benefit of course, but if you are using something like S3 that bills you per download (each one counts as an API request), then having the webserver re-download the attachment from remote storage to serve every page view is going to rack up the bill.

I suppose I didn't think about the use case of private attachments, but there really needs to be something for serving public static assets so users don't have to upload to Imgur and then paste the link.
 
"7. You now need to go to the "IAM" console."

Where is this?

Literally lost already, nothing says IAM 😭
 
Thanks! I'm now lost here:

Go back to the previous "Add user" page, click the "Refresh" button and search for the policy you just created.
Click "Next", followed by "Create user".
This will give you a key and a secret. Note them down.


I can see the policy under Policies but no Add User on that page. And the Add User page does not show the policy.

EDIT: Figured it out. It would have been less confusing to create the policy from the Policies section first and then go to Add user. Starting in Users is confusing.
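In hindsight, the whole user/policy setup can also be done from the AWS CLI, which sidesteps the console ordering entirely. A rough sketch; policy.json, xf-s3-policy, xf-s3-user and the account ID are all placeholders:

aws iam create-policy --policy-name xf-s3-policy --policy-document file://policy.json
aws iam create-user --user-name xf-s3-user
aws iam attach-user-policy --user-name xf-s3-user --policy-arn arn:aws:iam::123456789012:policy/xf-s3-policy
aws iam create-access-key --user-name xf-s3-user

The last command prints the access key ID and secret that the guide tells you to note down.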
 
I'm getting forbidden errors when trying to upload an avatar.

EDIT:
Caused by pasting in the code blocks from the tutorial with the "xftest" data in them :ROFLMAO: not entirely my fault, it doesn't say to plug in your own info. But my admin did have a strange orange smilie avatar for a sec.

I think it's good now! Uploaded an avatar.
 
If I don't have confidence using S3cmd, can I FTP in binary mode to my computer and then to AWS? It seems possible, but I just want to check that it's okay to do.
 
If I don't have confidence using S3cmd, can I FTP in binary mode to my computer and then to AWS? It seems possible, but I just want to check that it's okay to do.
Heads-up before you try that: you'd need an S3-compatible client rather than just an FTP client. The free version of FileZilla, for example, doesn't support S3 connections, but the paid FileZilla Pro version does.

Depending on the amount of files you need to transfer, s3cmd may prove to be the most efficient method (since then it'd be a server-to-server transfer instead of a server-to-client-to-server transfer).
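If you do go the s3cmd route, getting it talking to your bucket is only a couple of commands (a sketch; the key and secret are the ones from the IAM step earlier, and your-bucket is a placeholder):

pip install s3cmd
s3cmd --configure
s3cmd ls s3://your-bucket

The --configure step walks you through entering the access key, secret and endpoint interactively.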
 
But binary mode FTP would be okay, just checking. I believe I can use Cyberduck.

EDIT: I have made a successful connection with Cyberduck; I think I will use that.
 
Using S3cmd and connected; now trying to figure out the command in Linux.

So I cd to the data directory, correct?

Done.

s3cmd put * s3://yourfolder --acl-public --recursive

And I run this command?
For "yourfolder" do I put data?

I want to make sure I don't end up with data/data.
 
No, that's the full path to your bucket and data folder. Copy it from the config file; basically, swap https:// with s3://.
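As a concrete sketch, assuming the config file points at a hypothetical bucket URL of https://xf-bucket.s3.amazonaws.com/data, the upload would look like:

cd /path/to/xenforo/data
s3cmd put * s3://xf-bucket/data --acl-public --recursive

i.e. you end up at s3://xf-bucket/data, not s3://data or s3://xf-bucket/data/data.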
 