Using DigitalOcean Spaces or Amazon S3 for file storage in XF 2.1+

Got B2 to work quite well.

Moving quite a few files over with S3CMD is **** though.
Highly recommend S5CMD.

Getting 100x the throughput on my PC compared to S3CMD, and it's not even running optimally.

To add to the above...

If you are using Cloudflare, make use of their features. Even more so if you are using Backblaze: B2-to-Cloudflare egress is free, so no bandwidth charges!
Use a CNAME to proxy your S3 bucket.
https://help.backblaze.com/hc/en-us/articles/217666928-Using-Backblaze-B2-with-the-Cloudflare-CDN
This will work for any S3 host. Saves bandwidth and S3 costs if your host does not already provide some sort of CDN. DigitalOcean does, from the looks of it. Free, too...

As your URL will include the bucket ID, and the bucket is public, hide it and make the URL look pretty with Cloudflare Workers! https://jross.me/free-personal-image-hosting-with-backblaze-b2-and-cloudflare-workers/
Again, this will work with any host.
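
The Worker itself is tiny. A minimal sketch of the pattern from that article (the B2 download host and bucket name below are made-up placeholders; see the link for the full version):

```
export default {
  async fetch(request: Request): Promise<Response> {
    // Placeholders: substitute your own B2 download host and bucket name.
    const B2_HOST = "f002.backblazeb2.com";
    const BUCKET = "my-randomized-bucket-name"; // never appears in public URLs

    // Map the pretty path onto B2's friendly-URL layout:
    // /avatars/1.jpg -> https://f002.backblazeb2.com/file/<bucket>/avatars/1.jpg
    const url = new URL(request.url);
    const upstream = `https://${B2_HOST}/file/${BUCKET}${url.pathname}`;

    // Let Cloudflare cache the object at the edge to save B2 requests.
    return fetch(upstream, { cf: { cacheEverything: true, cacheTtl: 86400 } });
  },
};
```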

AFAIK Backblaze B2 does not have access control, unlike most other S3 providers. With access control, you can have a public bucket but whitelist Cloudflare IP addresses so that no one can crawl it directly. Randomize your bucket names anyway! There is no need to share the bucket name with anyone when you proxy the traffic...
If you use XenForo to proxy the traffic, you can set your bucket to private anyway.
Need to investigate whether XenForo caches these files... If it pulls a file from S3 for each request, not only does that add a few ms to the page render, it also adds S3 requests and bandwidth. Any input on this?
 
Cloudflare just announced their own object storage platform named R2 today. This is going to be very interesting as they handle CDN distribution automatically!

Hopefully, we are able to set where the data is stored. My Cloudflare guy does not work on it, so he won't give me any details :/
 
They do seem to indicate in that blog post that the data would be stored closest to where it is accessed most frequently. That is why I am interested in this product: I do not have to pick a location (Amazon charges extra for Indian servers, IIRC), and I do not have to think about setting up a CDN. R2 takes care of both.
 
Having the webserver re-download the attachment from remote storage to serve every page view is going to rack up the bills that way.
This makes this guide problematic, IMHO. You end up double-dipping with your infrastructure: object-storage egress charges/allowance when the webserver pulls the file from object storage for each browser request, and then webserver bandwidth charges/allowance again when it delivers the attachment to the browser. For example, a 1 MB attachment viewed 1,000 times costs roughly 1 GB of object-storage egress plus another 1 GB of webserver egress.

It is a feature we have discussed for inclusion in the software at some point.
Any progress on this? End-user file delivery directly from object storage, via time-limited signed URLs, would eliminate the majority of the double dipping and, IMHO, make object storage for XF much more viable and cheaper.
 
Cloudflare just announced their own object storage platform named R2 today. This is going to be very interesting as they handle CDN distribution automatically!


I just signed up with Cloudflare and I see a notification/option:
Cloudflare Images is now available. Store, resize and serve images on your website in just a few clicks while eliminating massive egress costs and storage bucket issues.

Resizing: Free
You can create up to 20 variants.
Storage: $5.00 per 100,000 images (prepaid)
You only pay for the original image. If you have 10 original images with 5 configured variants, only the 10 original images count towards your storage limit.
Delivery: $1.00 per 100,000 images served (postpaid)

Is this R2 already? I'm looking to set up a CDN... and if it's ready to go on CF, that'd be great. Says nothing about video though...
 
Going through the instructions, it says to make the files public. I read that XF serves a local URL for images, but the file is pulled from the CDN. Can somebody use dev tools to find that URL? If I got the CDN URL to a pic posted in a private forum, could I share it and anybody pull it up?
 
Going through the instructions, it says to make the files public. I read that XF serves a local URL for images, but the file is pulled from the CDN. Can somebody use dev tools to find that URL? If I got the CDN URL to a pic posted in a private forum, could I share it and anybody pull it up?
Bump for this.

Also... is there a way to keep the pic on the server as a backup while still pushing it to and serving it from a CDN?
 
Bump for this.

Also... is there a way to keep the pic on the server as a backup while still pushing it to and serving it from a CDN?
Don't use your server as a backup.
Backups for S3 should be done onto other S3 services, S3 Glacier, or a dedicated backup drive/service.


As for how images are served, it depends on how you set it up.
The documentation suggests serving avatars and attachment thumbnails via the S3 CDN while serving the larger images via your server.
This is a sort of hybrid between offloading and security, though it is not ideal.

You can actually serve both either way, and you can even get XenForo to serve JS and CSS via S3 with this config if you wish. Check the docs for more info; quite a lot of customization and performance improvement can be had by setting a few things up in there.
It is a balance between security and offloading load onto the CDN rather than your own server.

If you have private forums and you don't want any content to be accessible to non-logged-in users/users without permission, then serve everything via your server, not the CDN. Hopefully, images and other files are cached on the server so it doesn't pull them from S3 on every request...
Is anyone able to confirm if this is the case currently?

If your forum is fully public, then you can serve everything via the S3/CDN and completely offload those requests from your HTTP server.

Ideally, we need to make use of pre-signed URLs with S3. This would allow XenForo to grant permission based on user level while serving content from S3/CDNs that support such authentication features.
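
For anyone curious, generating such a URL is short with the AWS SDK (v3 for JavaScript/TypeScript). The endpoint, bucket, and key below are made up, and Spaces works here too since it speaks the S3 API:

```
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// Hypothetical endpoint/bucket; credentials come from the environment.
const client = new S3Client({
  region: "us-east-1",
  endpoint: "https://nyc3.digitaloceanspaces.com",
});

// XenForo (or an add-on) would run its permission check first, then hand
// the browser a URL that is only valid for a few minutes.
const url = await getSignedUrl(
  client,
  new GetObjectCommand({ Bucket: "my-bucket", Key: "attachments/123-abc.jpg" }),
  { expiresIn: 300 } // seconds; the link dies after 5 minutes
);
```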

This has been requested already as part of an overhaul of the attachment system; hopefully they can work on it soon. It would be a massive improvement!
Would pay for this as an addon, wink wink, nudge nudge...


AFAIK, there is no solution for caching images and attachments outside of XenForo while protecting files.
 
Don't use your server as a backup.
Backups for S3 should be done onto other S3 services, S3 Glacier, or a dedicated backup drive/service.
"Backup" may have been the wrong word, by definition. I was thinking more of a failsafe. If they are on your box and the CDN, and the CDN stops serving, that your box would pick right up serving the files. You know ... just incase the CDN uploads a bad config and takes their servers down for a day while or they lock themselves out of their buildings .... or something...


AFAIK, there is no solution for caching images and attachments outside of XenForo while protecting files.
Well, that's a real bummer. I thought I read that it did, but I can't find it. Damn, I was looking forward to implementing this...
 
Can someone confirm we do not need to upload internal_data/image_cache (the folder where HTTPS-proxied images are stored)?
 
That should be fine, IIRC. If that folder is empty, I believe XenForo will fetch the images fresh automatically. It would only be a problem if you have a long retention period and some of those images are dead and cannot be retrieved fresh.
 
Ideally, we need to make use of pre-signed URLs with S3. This would allow Xenforo to give permission based on user-level while serving content from S3/CDNs that support such authentication features.
Nope. Ideally you'd need an edge node that is capable of authorizing the request via the origin and afterwards serving the content directly.

This should be doable with Cloudflare Workers, Amazon Lambda@Edge, etc. - it just seems nobody has built something like this so far :)
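
A rough sketch of what such a Worker could look like (the auth route and bucket URL are hypothetical; the permission check itself would still need to be built on the XenForo side):

```
export default {
  async fetch(request: Request): Promise<Response> {
    // Hypothetical endpoints: a forum route that answers 200/403 for the
    // current session, and the (private-by-obscurity) storage origin.
    const FORUM_AUTH = "https://forum.example.com/attachment-auth";
    const BUCKET_URL = "https://my-bucket.nyc3.digitaloceanspaces.com";

    const path = new URL(request.url).pathname;

    // Ask the origin whether this visitor may see this file, forwarding
    // the forum session cookie so the forum can check permissions.
    const auth = await fetch(FORUM_AUTH + path, {
      headers: { cookie: request.headers.get("cookie") ?? "" },
    });
    if (auth.status !== 200) {
      return new Response("Forbidden", { status: 403 });
    }

    // Authorized: stream the object straight from storage. Edge caching of
    // private content would need careful thought about cache keys.
    return fetch(BUCKET_URL + path);
  },
};
```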

Using signed URLs is, IMHO, somewhat of a band-aid, as it would allow resources to be accessed by unauthorized 3rd parties (within the validity time of the token) and create usability issues (caching in browser, URL sharing, etc.) in the long run.

But if one wants to go that route, there already is an Add-on that apparently does this:
 
That should be fine, IIRC. If that folder is empty, I believe XenForo will fetch the images fresh automatically. It would only be a problem if you have a long retention period and some of those images are dead and cannot be retrieved fresh.
I'm talking in terms of: when using this remote attachment storage, are image_cache files also stored in and pulled from S3, or are they stored locally?
 
Is anyone cloning their DigitalOcean Spaces to a second Space (maybe even in another region) for "backup" purposes? (Backup in the sense that if one Space were to ever just fail or disappear, at least you'd have a clone of it to fall back on.)
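
In case anyone wants to script it, Spaces speaks the S3 API, so cloning can be done with the AWS SDK. A rough sketch (the regions, endpoints, and bucket name are made up, and credentials are assumed to be in the environment):

```
import {
  S3Client,
  ListObjectsV2Command,
  GetObjectCommand,
  PutObjectCommand,
} from "@aws-sdk/client-s3";

// Hypothetical source and destination Spaces in different regions.
const src = new S3Client({ region: "us-east-1", endpoint: "https://nyc3.digitaloceanspaces.com" });
const dst = new S3Client({ region: "us-east-1", endpoint: "https://ams3.digitaloceanspaces.com" });

async function mirror(bucket: string): Promise<void> {
  let token: string | undefined;
  do {
    // Page through the source Space 1,000 keys at a time.
    const page = await src.send(
      new ListObjectsV2Command({ Bucket: bucket, ContinuationToken: token })
    );
    for (const obj of page.Contents ?? []) {
      // Cross-endpoint copy: read from the source, write to the clone.
      const object = await src.send(new GetObjectCommand({ Bucket: bucket, Key: obj.Key! }));
      const bytes = await object.Body!.transformToByteArray(); // buffers in memory; fine for typical attachments
      await dst.send(new PutObjectCommand({ Bucket: bucket, Key: obj.Key!, Body: bytes }));
    }
    token = page.NextContinuationToken;
  } while (token);
}

await mirror("my-space");
```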
 