Add-on Amazon S3 file uploads

Mike Tougeron

Well-known member
Has anyone written an add-on for XenForo that makes the image/file uploader write to Amazon S3? I tried looking around the resource manager and didn't see anything, but I wanted to check in and make sure I didn't just miss it.
 
I wouldn't know where to begin, unfortunately, Mike, and I don't think there's an existing add-on.

But for the benefit of yourself or anyone else who might want to give it a try, Zend Framework includes some Amazon S3 classes that may help as a starting point.

Check out: http://framework.zend.com/manual/1.12/en/zend.service.amazon.s3.html

The relevant Zend class, which is included with XenForo, is:

Zend_Service_Amazon_S3
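
Basic usage of that class looks something like this (a quick sketch based on the manual; the credentials and bucket name are placeholders):

PHP:
require_once 'Zend/Service/Amazon/S3.php';

$s3 = new Zend_Service_Amazon_S3('your_aws_key', 'your_aws_secret');

// Create a bucket, then write an object and read it back
$s3->createBucket('your-bucket-name');
$s3->putObject('your-bucket-name/example.txt', 'Hello from Zend_Service_Amazon_S3');
echo $s3->getObject('your-bucket-name/example.txt');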
 
Yeah, if there isn't one already written, I'll have to write one myself. I just didn't want to reinvent it if someone already had.
 
So hopefully my changes will get accepted into the core, but this doesn't actually need a full-fledged add-on; yippee!
- Follow the changes mentioned in http://xenforo.com/community/threads/enable-streams-for-internal-external-data-paths.47011/
- Change your config to point to your bucket, e.g.:
PHP:
    'internalDataPath' => 's3://your-bucket-name/internal_data',
    'externalDataPath' => 's3://your-bucket-name/data',
    'externalDataUrl' => 'https://s3.amazonaws.com/your-bucket-name/data',
- Add the following to your index.php and admin.php (I'm not sure, but perhaps this could be done by extending the dependencies...)
PHP:
require_once 'Zend/Service/Amazon/S3.php';
$s3 = new Zend_Service_Amazon_S3('your_aws_key', 'your_aws_secret');
$s3->registerStreamWrapper("s3");
- Write a quick script to pre-create the sub-directories in S3. This is necessary so that XenForo doesn't try to create the directories itself; the Zend_Service_Amazon_S3 stream wrapper doesn't support mkdir().
PHP:
require_once 'Zend/Service/Amazon/S3.php';
$s3 = new Zend_Service_Amazon_S3('your_aws_key', 'your_aws_secret');
$s3->registerStreamWrapper("s3");
file_put_contents('s3://your-bucket-name/data/index.html','');
file_put_contents('s3://your-bucket-name/data/attachments/index.html','');
 
file_put_contents('s3://your-bucket-name/internal_data/.htaccess',"Order deny,allow\nDeny from all");
 
file_put_contents('s3://your-bucket-name/internal_data/index.html','');
file_put_contents('s3://your-bucket-name/internal_data/temp/index.html','');
file_put_contents('s3://your-bucket-name/internal_data/page_cache/index.html','');
file_put_contents('s3://your-bucket-name/internal_data/attachments/index.html','');
for ($i = 0; $i < 1000; $i++) {
    file_put_contents('s3://your-bucket-name/data/attachments/' . $i . '/index.html','');
    file_put_contents('s3://your-bucket-name/internal_data/attachments/' . $i . '/index.html','');
}

That should be it! I'm still testing the edge cases but it feels pretty solid so far. Feel free to hit me up if you have any questions.
 
Very cool script.
How about the existing files? Do they need to be re-uploaded first?
It'd be something like:
PHP:
$baseDir = '/original/path/to/xenforo/data/attachments';
for ($i = 0; $i < 1000; $i++) {
    // Copy every file in each numbered attachment sub-directory up to S3
    $files = scandir($baseDir . '/' . $i . '/');
    foreach ($files as $file) {
        // is_file() also filters out the '.' and '..' entries from scandir()
        if (is_file($baseDir . '/' . $i . '/' . $file)) {
            copy($baseDir . '/' . $i . '/' . $file, 's3://your-bucket-name/data/attachments/' . $i . '/' . $file);
        }
    }
}
 

This could very easily run into a timeout, or die if there's a network issue.
We may need to download and then upload manually instead.
 
Yeah, that'd work too. I just threw that idea out there for you off the top of my head. I haven't written our actual script yet; that's something I'm going to try to do early next week.
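
In the meantime, here's a rough, resumable sketch off the top of my head (it assumes the s3:// wrapper is registered as above, and that file_exists() works through the wrapper; run it from the command line):

PHP:
$baseDir = '/original/path/to/xenforo/data/attachments';

set_time_limit(0); // don't let PHP's max execution time kill a long run

for ($i = 0; $i < 1000; $i++) {
    $dir = $baseDir . '/' . $i;
    if (!is_dir($dir)) {
        continue;
    }
    foreach (scandir($dir) as $file) {
        $source = $dir . '/' . $file;
        $target = 's3://your-bucket-name/data/attachments/' . $i . '/' . $file;

        if (!is_file($source)) {
            continue; // skips the '.' and '..' entries from scandir()
        }
        if (file_exists($target)) {
            continue; // already uploaded by a previous run, so re-running resumes where it left off
        }
        if (!copy($source, $target)) {
            echo 'Failed to copy ' . $source . "\n"; // note the failure and keep going; re-run to retry
        }
    }
}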
 
I received the following error:
Code:
Fatal error: require_once() [function.require]: Failed opening required 'Zend/Service/Amazon/Abstract.php' (include_path='.:/usr/lib/php:/usr/local/lib/php') in /home/xxx/public_html/test/library/Zend/Service/Amazon/S3.php on line 26
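 
That error usually means the XenForo library directory isn't on PHP's include path (note the include_path in the message), so Zend's relative require_once calls can't be resolved. If you're running this from a standalone script, prepending the library directory should fix it; something like this, with the path adjusted for your install:

PHP:
// Prepend the XenForo library directory so that Zend's relative
// require_once calls (e.g. Zend/Service/Amazon/Abstract.php) resolve
set_include_path('/path/to/xenforo/library' . PATH_SEPARATOR . get_include_path());

require_once 'Zend/Service/Amazon/S3.php';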
 
- Add the following to your index.php and admin.php (I'm not sure, but perhaps this could be done by extending the dependencies...)
PHP:
require_once 'Zend/Service/Amazon/S3.php';
$s3 = new Zend_Service_Amazon_S3('your_aws_key', 'your_aws_secret');
$s3->registerStreamWrapper("s3");

Would it be easier to just add this to config.php, or will it not work from there?

Thanks,
-Bill
 
lol... you just went over my head. Is that in a file? Sorry, I'm only moderately experienced with this system.
 
One of the code event listeners (admin.php?code-event-listeners) you can use is init_dependencies. The class method that the listener executes is where you would register the S3 stream wrapper.
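
A minimal sketch of such a listener (the class name here is hypothetical; point the event's callback at whatever class you create):

PHP:
// Hypothetical listener class; register its initDependencies method
// for the init_dependencies code event in the admin control panel
class YourAddon_Listener
{
    public static function initDependencies(XenForo_Dependencies_Abstract $dependencies, array $data)
    {
        require_once 'Zend/Service/Amazon/S3.php';

        // The credentials are placeholders; in a real add-on you'd likely
        // pull them from config.php or the add-on's options
        $s3 = new Zend_Service_Amazon_S3('your_aws_key', 'your_aws_secret');
        $s3->registerStreamWrapper('s3');
    }
}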
 