Using DigitalOcean Spaces or Amazon S3 for file storage

No permission to download
It was mentioned in one of the "Have you seen...?" posts.

No download needed. To set it up you still need to follow the instructions in the resource guide; you just don't need the add-on.

If it is already set up and working, you can just uninstall the add-on.
Fantastic, thank you. (y)
 
I am not on XF 2.3 but upgraded PHP to 8.2, so now I am getting two errors:
InvalidArgumentException: Must pass valid resource in src/XF/Http/ResponseStream.php:18
and
Aws\S3\Exception\S3Exception: Error executing "HeadObject" on

I was going to upgrade this add-on, but when I download it and click to upgrade it is for 2.3. Is there an edit I need to do, or a file I can download, to fix this for 2.2 on PHP 8.2?
 
The add-on version is 2.3.0, but that's its own version number and has nothing to do with XF 2.3.

When you eventually upgrade to XF 2.3 you can remove the add-on (but leave the config changes in place) as the files within the add-on are no longer needed.
 
Oh, that is confusing, ha! (2.2.1 -> 2.3.0) Thanks!
So, since I was getting those errors for a few days, does that mean those pics are not on AWS, and does something need to be done? Thanks again!
 
Getting ready to update to 2.3 from 2.2 later this week.

Is there anything I should do with this installed? Should I remove the plugin after the update or before?

Trying to have as smooth an upgrade as possible. :)

Thanks
 
It's been a while since I set up an S3 bucket for XenForo on AWS. I'm currently trying to set one up for a new forum and I'm getting plagued by errors related to ACLs.

I'm only able to get image uploads to work by enabling ACLs, which AWS strongly discourages; I had to acknowledge several warnings about enabling them.


1724541752919.webp


Is it correct and required to enable ACLs like this for XenForo to work correctly with S3 on AWS, or am I missing something?

The IAM User and permissions are all set up per the guide.
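For what it's worth, the requirement can be traced through the adapter itself. This is a hedged sketch, not the official guide config (the key, secret, region and bucket names are placeholders): Flysystem v1 turns a file's "public" visibility into an `ACL: public-read` parameter on each upload, and S3 rejects that parameter (with an AccessControlListNotSupported error) when Object Ownership is set to "bucket owner enforced", i.e. ACLs disabled. That would match uploads only starting to work once you enabled ACLs.
PHP:
```php
// Hedged sketch, not the official guide config. With Flysystem v1
// (League\Flysystem\AwsS3v3), writing a file with 'public' visibility adds
// an 'ACL' => 'public-read' parameter to the PutObject call. If the bucket's
// Object Ownership is "bucket owner enforced" (ACLs disabled), S3 rejects
// that call -- consistent with uploads only working once ACLs are enabled.
$s3 = function()
{
   return new \Aws\S3\S3Client([
      'credentials' => [
         'key' => 'YOUR_KEY',       // placeholder
         'secret' => 'YOUR_SECRET'  // placeholder
      ],
      'region' => 'us-east-1',      // placeholder region
      'version' => 'latest'
   ]);
};

// Roughly equivalent to what happens implicitly for public files: the
// adapter's fourth constructor argument is merged into every upload.
$config['fsAdapters']['data'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter(
      $s3(), 'bucket_name', 'data', ['ACL' => 'public-read']
   );
};
```

The alternative AWS nudges you toward (a bucket policy granting public read instead of per-object ACLs) would require suppressing that ACL parameter, which the stock setup doesn't do, so the warnings you clicked through seem hard to avoid here.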
 
How can I use S3 for one folder only, internal_data/attachments? That folder is 25 GB of the 26.5 GB total, and I don't mind keeping the other files (1.5 GB) on the server.

I removed the code related to the data folder from config.php, but the entire internal_data folder is still linked with S3. How can I link internal_data/attachments only?

The current config:
Code:
// Start - Amazon S3 Cloud
$s3 = function()
{
   return new \Aws\S3\S3Client([
      'credentials' => [
         'key' => '',
         'secret' => ''
      ],
      'region' => '',
      'version' => 'latest',
      'endpoint' => ''
   ]);
};

$config['fsAdapters']['internal-data'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter($s3(), 'bucket_name', 'internal_data');
};
// End - Amazon S3 Cloud
 
So it's not going to be the "cleanest" solution, because XenForo doesn't support "sub-adapters" out of the box. However, if you install my Cloudflare add-on (even if you don't use Cloudflare or R2), it tweaks XenForo's abstracted filesystem a little to support sub-adapters, so you can have an abstracted filesystem adapter for just one folder. It's backwards compatible, so it doesn't break anything if you don't use it that way.

It was done for the exact reason you are running into: where you want internal_data/attachments in something like S3, but not the rest of the internal_data folders.

You can see a bit of the format here:


...but it should work if you change the last part of your config above to this (assuming you have the Cloudflare addon installed):
PHP:
$config['fsAdapters']['internal-data/attachments'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter($s3(), 'bucket_name', 'internal_data');
};
 

Thanks, Shawn. I thought it would be straightforward from config, but I'll give it a try.
 
I've been a Vultr user for years now, and I see that XF Cloud uses it too.

I tried creating object storage and using both the S3 and DigitalOcean Spaces tutorials to get Vultr Object Storage to work. However, I just get a 500 error when it's applied, no matter what logins, keys, or paths I use, even though I believe it should support the same method as S3. I tried the DigitalOcean method too, just to see if that would work.

@Chris D, it would be nice if you could find the time to update this resource to show how, and if, it can work with Vultr Object Storage, given the established business partnership between Vultr and XenForo for Cloud.

Thanks!
 
It uses the S3 API, so it should just work like S3.

If you're on 2.3, you need to remove this add-on as it's native now; just the config file edits are required.
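To expand on that, here is a minimal sketch for Vultr, modelled directly on the DigitalOcean Spaces config elsewhere in this thread. The ewr1 region and the *.vultrobjects.com hostname are assumptions based on Vultr's usual endpoint format; substitute whatever hostname your Vultr control panel shows, along with your own keys and bucket name.
PHP:
```php
// Hedged sketch for Vultr Object Storage. The endpoint host
// (ewr1.vultrobjects.com) and bucket name are assumptions; use the
// hostname shown in your Vultr control panel.
$s3 = function()
{
   return new \Aws\S3\S3Client([
      'credentials' => [
         'key' => 'YOUR_ACCESS_KEY',
         'secret' => 'YOUR_SECRET_KEY'
      ],
      'region' => 'ewr1',
      'version' => 'latest',
      'endpoint' => 'https://ewr1.vultrobjects.com'
   ]);
};

$config['fsAdapters']['data'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter($s3(), 'bucket_name', 'data');
};

$config['fsAdapters']['internal-data'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter($s3(), 'bucket_name', 'internal_data');
};

$config['externalDataUrl'] = function($externalPath, $canonical)
{
   // Must point at a publicly readable URL for the data/ prefix
   return 'https://bucket_name.ewr1.vultrobjects.com/data/' . $externalPath;
};
```

If this still returns a 500, the entry in the server error log is usually more telling than the HTTP status itself.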
 
That was it. I missed the part where 2.3 was native.

But I am still having errors displaying uploaded images as previews

I:
  • Removed the add-on
  • Configured config.php
  • Used s3cmd to copy over /data/
  • Used s3cmd to copy over /internal_data/
  • Tried to fix the following issues with s3cmd setacl s3://spacename/ --acl-public
  • Since /internal_data/ was copied over, rebuilt the master data (if that was necessary).
And this is what it looks like when an attachment is made:

1727242387822.webp

But, it'll post the image and display it just fine.

And what admin.php?attachments/ looks like:
1727242588040.webp

And when I hover the image,
1727243522182.webp

No thumbnail previews. But, if I View host content (post) on any of them, the images load just fine.

1727242701510.webp

Also,

1727242794002.webp

I can Choose File and the file name will show, but two seconds later it reverts right back to No file chosen, lets me hit Okay, and serves a broken image for the avatar.

I tried s3cmd setacl s3://spacename/ --acl-public to see if it was a permissions error, and they're set public (should they be?), but it didn't fix any of the issues.
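One pattern worth checking when full-size attachments work but thumbnails and avatars don't: attachments are streamed through XenForo from internal_data/, while thumbnails and avatars sit under data/ and are linked directly via the externalDataUrl callback. A hedged sketch follows; the hostname is a guess at the Vultr endpoint format, not your actual space.
PHP:
```php
// Hedged sketch: thumbnails and avatars live under data/ and are linked
// directly in the page, so this callback must produce a publicly readable
// URL. The hostname below is a placeholder.
$config['externalDataUrl'] = function($externalPath, $canonical)
{
   return 'https://spacename.ewr1.vultrobjects.com/data/' . $externalPath;
};
```

If that URL pattern doesn't match how your space actually serves objects, every data/ file (thumbnails, avatars) breaks while attachment streaming keeps working, which is consistent with the symptoms above.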
 
Very weird that it says "unknown user" too. Sounds like some bigger issue is at play here, as the metadata is not being saved to the database correctly either. Or do you allow guest uploads?
 
Also, something I needed to add was certificates for AWS.
See #5 here:
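If the certificate problem shows up with the AWS SDK for PHP, it can usually be handled in the client options rather than at the system level. A hedged sketch; the CA bundle path is a placeholder for wherever your distribution keeps it.
PHP:
```php
// Hedged sketch: if the S3 client can't verify the endpoint's TLS
// certificate, the AWS SDK for PHP lets you point its HTTP handler at a
// CA bundle explicitly via the 'http' => ['verify' => ...] client option.
$s3 = function()
{
   return new \Aws\S3\S3Client([
      'credentials' => [
         'key' => 'YOUR_KEY',       // placeholder
         'secret' => 'YOUR_SECRET'  // placeholder
      ],
      'region' => 'us-east-1',      // placeholder region
      'version' => 'latest',
      'http' => [
         'verify' => '/etc/ssl/certs/ca-certificates.crt' // path to your CA bundle
      ]
   ]);
};
```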
 
Very weird that it says "unknown user" too. Sounds like some bigger issue is at play here, as the metadata is not being saved to the database correctly either. Or do you allow guest uploads?
Yes, guests can upload. The attachment preview thumbnail shows just the name, but the image posts correctly when a thread or post is made.

Also, logged-in users can't upload an avatar. I tried doing it from the ACP, where the file name remains in the box after "Choose File", but it uploads a seemingly broken image.

I don't know what to do about the SSL side with Vultr, but I'll try their endpoints rather than what you showed in that thread.
 
For information.

The settings for the new Hetzner Object Storage are identical to DigitalOcean's.

PHP:
$s3 = function()
{
   return new \Aws\S3\S3Client([
      'credentials' => [
         'key' => 'ABC',
         'secret' => '123'
      ],
      'region' => 'fsn1',
      'version' => 'latest',
      'endpoint' => 'https://fsn1.your-objectstorage.com'
   ]);
};

$config['fsAdapters']['data'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter($s3(), 'engn', 'community/data');
};

$config['fsAdapters']['internal-data'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter($s3(), 'engn', 'community/internal_data');
};

$config['externalDataUrl'] = function($externalPath, $canonical)
{
   return 'https://engn.fsn1.your-objectstorage.com/community/' . $externalPath;
};

1727964436774.webp

1727964548805.webp
 

The settings for the new Hetzner Object Storage are identical to DigitalOcean's.
They are most likely identical for any S3-compatible object storage.

As pointed out in another thread, I'd recommend using separate buckets for data and internal-data.

Right now Hetzner seems a bit limited to me, as they only offer 10 buckets per account and, as far as I can tell, no custom domain / CNAME support.
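For reference, the separate-bucket recommendation would look roughly like this against the Hetzner config above (reusing its $s3 closure; both bucket names and the prefix layout are placeholders, not values from that post):
PHP:
```php
// Hedged sketch of the separate-bucket layout: one public bucket for data/
// and one private bucket for internal_data/. Assumes the $s3 closure from
// the Hetzner config above; bucket names are placeholders.
$config['fsAdapters']['data'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter($s3(), 'forum-data', 'data');
};

$config['fsAdapters']['internal-data'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter($s3(), 'forum-internal', 'internal_data');
};

$config['externalDataUrl'] = function($externalPath, $canonical)
{
   // Only the data bucket ever needs to be publicly readable
   return 'https://forum-data.fsn1.your-objectstorage.com/data/' . $externalPath;
};
```

The point of the split is that only the data bucket needs public read access; internal_data stays fully private and is served through XenForo's own permission checks.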
 