XF 2.1 Configuring Wasabi S3

Venthas

Member
Hey all,

Brand new to the platform here. I've been following this resource:


And I've hit a bit of a snag.

Code:
InvalidArgumentException: Endpoints must be full URIs and include a scheme and host in src/addons/XFAws/_vendor/aws/aws-sdk-php/src/ClientResolver.php at line 595
Aws\ClientResolver::_apply_endpoint() in src/addons/XFAws/_vendor/aws/aws-sdk-php/src/ClientResolver.php at line 288
Aws\ClientResolver->resolve() in src/addons/XFAws/_vendor/aws/aws-sdk-php/src/AwsClient.php at line 161
Aws\AwsClient->__construct() in src/addons/XFAws/_vendor/aws/aws-sdk-php/src/S3/S3Client.php at line 263
Aws\S3\S3Client->__construct() in src/config.php at line 15
XF\App->{closure}() in src/config.php at line 27
XF\App->{closure}()
call_user_func() in src/XF/FsMounts.php at line 17
XF\FsMounts::loadDefaultMounts() in src/XF/App.php at line 1043
XF\App->XF\{closure}() in src/XF/Container.php at line 28
XF\Container->offsetGet() in src/XF/App.php at line 2423
XF\App->fs() in src/XF/Util/File.php at line 209
XF\Util\File::deleteFromAbstractedPath() in src/XF/Service/Attachment/Preparer.php at line 75
XF\Service\Attachment\Preparer->insertDataFromFile() in src/XF/Service/Attachment/Preparer.php at line 23
XF\Service\Attachment\Preparer->insertAttachment() in src/XF/Attachment/Manipulator.php at line 170
XF\Attachment\Manipulator->insertAttachmentFromUpload() in src/XF/Pub/Controller/Attachment.php at line 89
XF\Pub\Controller\Attachment->actionUpload() in src/XF/Mvc/Dispatcher.php at line 350
XF\Mvc\Dispatcher->dispatchClass() in src/XF/Mvc/Dispatcher.php at line 257
XF\Mvc\Dispatcher->dispatchFromMatch() in src/XF/Mvc/Dispatcher.php at line 113
XF\Mvc\Dispatcher->dispatchLoop() in src/XF/Mvc/Dispatcher.php at line 55
XF\Mvc\Dispatcher->run() in src/XF/App.php at line 2191
XF\App->run() in src/XF.php at line 391
XF::runApp() in index.php at line 20

Not exactly sure what's going wrong. The setup I've got in config.php is:

Code:
$s3 = function()
{
   return new \Aws\S3\S3Client([
      'credentials' => [
         'key' => '{WasabiKey}',
         'secret' => '{WasabiSecret}'
      ],
      'use_path_style_endpoint' => true,
      'region' => 'us-east-1',
      'version' => 'latest',
      'endpoint' => 's3.us-east-1.wasabisys.com'
   ]);
};

$config['fsAdapters']['data'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter($s3(), '{bucketname}', 'data');
};


$config['fsAdapters']['internal-data'] = function() use($s3)
{
   return new \League\Flysystem\AwsS3v3\AwsS3Adapter($s3(), '{bucketname}', 'internal_data');
};

$config['externalDataUrl'] = function($externalPath, $canonical)
{
   return '{cloudflareCDNurl}' . $externalPath;
};

Cloudflare is pointed at the Wasabi bucket, which is in their us-east-1 region.

Any help greatly appreciated!
Ven
 
One further question: I'll be transferring over an Invision 4.x gallery that's hosted on Wasabi. Is there anything special I need to know about this?
 
PHP:
$config['externalDataUrl'] = function($externalPath, $canonical)
{
   return '{cloudflareCDNurl}/data/' . $externalPath;
};

It should have been this. But I'm still having issues with permissions, getting an "access denied" on the keys, even though the proper policy is attached.
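
For what it's worth, the /data/ segment is needed because the data adapter above was constructed with the prefix 'data', so objects land under data/ in the bucket and the public URL has to include that path. A quick illustration (the example path here is hypothetical):

PHP:
// With the 'data' adapter prefix from the earlier config, an abstracted
// path like data://avatars/l/0/1.jpg is stored in the bucket as
// data/avatars/l/0/1.jpg, so the CDN URL must include /data/.
$externalPath = 'avatars/l/0/1.jpg'; // hypothetical example path
echo '{cloudflareCDNurl}' . '/data/' . $externalPath;
// => {cloudflareCDNurl}/data/avatars/l/0/1.jpg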
 
'endpoint' => 's3.us-east-1.wasabisys.com'
I assume it should be:

PHP:
'endpoint' => 'https://s3.us-east-1.wasabisys.com'

I see you may have corrected that now.

If you're getting an access denied error you may need to confirm with the service provider that their API is fully compatible with the Amazon Web Services SDK and that it is configured correctly.
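
Putting that together with the rest of the config from the first post, the corrected client closure would look something like this (a sketch only; the placeholders are the same ones used above):

PHP:
$s3 = function()
{
   return new \Aws\S3\S3Client([
      'credentials' => [
         'key' => '{WasabiKey}',
         'secret' => '{WasabiSecret}'
      ],
      'use_path_style_endpoint' => true,
      'region' => 'us-east-1',
      'version' => 'latest',
      // the scheme is what ClientResolver validates; without it you get
      // "Endpoints must be full URIs and include a scheme and host"
      'endpoint' => 'https://s3.us-east-1.wasabisys.com'
   ]);
};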
 
Yes, that's all corrected now, but we're still having issues uploading and viewing files:

(screenshot of the error attached)
I've confirmed the policy for the user we're using:
Code:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:GetObjectAcl",
        "s3:putObject",
        "s3:putObjectAcl",
        "s3:DeleteObject",
        "s3:ListAllmyBuckets"
      ],
      "Resource": [
        "arn:aws:s3:::{BucketName}",
        "arn:aws:s3:::{BucketName}/*"
      ]
    }
  ]
}

The only permission missing from the policy is s3:ReplicateObject, and uploads are still failing.


Edit: The really confusing part of this is that I can connect to the bucket using this user via Cyberduck, and can upload and download files from it.
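
One way to narrow it down might be a standalone script that uses the same SDK outside of XenForo, taking XF's filesystem mounting out of the picture entirely. A rough sketch (the autoload path is an assumption based on the paths in the stack trace, and the placeholders match the config above):

PHP:
<?php
// Standalone permission test, run from the forum root.
require __DIR__ . '/src/addons/XFAws/_vendor/autoload.php'; // assumed path

$client = new \Aws\S3\S3Client([
   'credentials' => [
      'key' => '{WasabiKey}',
      'secret' => '{WasabiSecret}'
   ],
   'use_path_style_endpoint' => true,
   'region' => 'us-east-1',
   'version' => 'latest',
   'endpoint' => 'https://s3.us-east-1.wasabisys.com'
]);

try {
   // Exercises s3:PutObject, which attachment uploads depend on.
   $client->putObject([
      'Bucket' => '{bucketname}',
      'Key' => 'data/permission-test.txt',
      'Body' => 'test'
   ]);
   // Exercises s3:ListBucket.
   $client->listObjectsV2(['Bucket' => '{bucketname}', 'MaxKeys' => 1]);
   echo "Bucket is writable and listable\n";
} catch (\Aws\Exception\AwsException $e) {
   echo $e->getAwsErrorCode() . ': ' . $e->getMessage() . "\n";
}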
 
Whoa, I had no idea Wasabi uses the same permission model as S3; that was kind of annoying to use. DigitalOcean Spaces doesn't bother with that, so it's much simpler to use. Same with the other services I've used so far. I'll give Wasabi a try some day.
 
Yeah. The only issue with DO Spaces is the cost. We're going to be importing quite a large community that already has over 500GB in image data alone. Wasabi's pricing is the main reason we stick with them ($5.99/TB is really nice). Now if I could only figure out where the permissions have gone wrong.
 
Yeah! Wasabi is the cheapest I've seen so far! StackPath is another service I've considered in the past, but I'm not sure whether they require an S3-style permission model.

Looking at the issues you're facing, I'm tempted to give it a go on my test install to see if I can get it working. DO Spaces and Backblaze B2 were both incredibly simple to use after dumping S3. I'm nowhere near the stage where Wasabi makes sense, but it seems like something worth playing with.
 
Sounds good! We've got Cloudflare pointed at the S3 bucket, which is why we use cdn.cloudflareurl.url/data/. The issue is obviously with the bucket permissions, but I can't figure out why, for the life of me.
 
Got it working with a complete reinstall. The S3 setup has to be the first thing done.

Now I've just got to import our test install of IPS (our beta site) into XF, and I'm already running into issues there. We use Wasabi S3 to host all of our data except themes, through Invision's built-in file system. When running the conversion and specifying the root IPS directory as /, we get a "directory doesn't contain the expected results"; same with /upload.
 
The only permission missing from the policy is s3:ReplicateObject, and uploads are still failing.

Hello there Venthas,
I know it's been a long time, but hopefully you still remember 😅

I'm trying to set up Wasabi on my testing board, but I'm also facing this ReplicateObject error when pasting the JSON code into the policy.

How did you set yours up? Has everything been working reliably since you reinstalled from scratch?

Thanks in advance!
 