Chunked Uploads - XF2

So I set the max upload size in PHP to 25600M and can see it reflected in phpinfo, but I'm still getting a generic error from XF: "The uploaded file is too large for the server to process." Is there another setting to change besides upload_max_filesize?
You need to set up the chunk size as well. The maximum chunk size should be at or below the upload_max_filesize value.
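
For reference, upload_max_filesize isn't the only limit that matters: each chunk arrives as a normal POST upload, so post_max_size has to cover the chunk size as well. A throwaway diagnostic script along these lines (just a sketch, not part of the add-on) shows what PHP is actually enforcing:

Code:
<?php
// Print the PHP limits that commonly cause "file too large" upload errors.
// The add-on's chunk size should stay at or below the two size limits;
// the time limits matter for slow uploads and the rebuild step.
$directives = ['upload_max_filesize', 'post_max_size', 'max_execution_time', 'max_input_time'];
foreach ($directives as $directive) {
    printf("%-20s = %s\n", $directive, ini_get($directive));
}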
 
Max chunk size is below upload_max_filesize.

Checking the "Virtually increase maximum attachment file size" option worked, but I'm not sure if I should leave it on :/
 
Yeah, leave it on and run your tests. That's the option that enables the add-on. :)
 
Ah, interesting. So with "Virtually increase maximum attachment file size" checked, uploads are chunked, and with it unchecked it falls back to the default XF behavior?
 
@JulianD is there a way to modify the chunk processing so it runs in batches and avoids timeouts? For large uploads, I keep hitting timeouts:
  • ErrorException: Fatal Error: Maximum execution time of 30 seconds exceeded
  • src/addons/ChunkedUploads/_vendor/flowjs/flow-php-server/src/Flow/File.php:185
 
You need to adjust your web server and/or PHP settings to increase the request timeout. Another option is to decrease the chunk size. Test those options and see what helps.
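
If you go the timeout route, the usual knobs are PHP's max_execution_time and, behind nginx + PHP-FPM, the web server's fastcgi_read_timeout (which PHP cannot override). If you'd rather not raise those globally and maintain a small local modification instead, something along these lines before the rebuild call is a common approach - a hypothetical sketch, not code the add-on ships:

Code:
<?php
// Hypothetical local tweak: relax the PHP execution limit only for the
// request that rebuilds the final file, instead of raising it site-wide.
// Note that this does not change web-server-level timeouts.
@set_time_limit(300);                   // allow up to 5 minutes for this request
@ini_set('max_execution_time', '300');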
 
I can increase the request timeout, but that's a tradeoff that opens up the risk of this being abused for DDoS. If the chunk size is smaller, won't that require more processing after the upload?
 
I agree with you, but that's the tradeoff you need to evaluate if you want large uploads to work properly. I'd start by increasing the chunk size and evaluate whether the final file is properly rebuilt.
 
IMHO chunk size doesn't really matter that much with regard to reconstruction time.

Whether a 20 GB file is split into 2000 x 10 MB chunks or 200 x 100 MB chunks:
the server has to read 20 GB and write 20 GB - the sheer amount of data is the bottleneck, not the overhead of opening / deleting chunk files (that should basically be negligible).

Let's say the storage can provide a sustained read and write performance of 2,000 MB/s - even at that speed it would take ~10 seconds just for the reading & writing.

Real-world performance (on a VPS) is probably much lower, so I'd expect something around 20s+ at least.
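
For context, whatever the chunk size, the rebuild essentially boils down to a loop like the one below (a rough sketch of the idea only - the add-on actually bundles flow-php-server for this), which is why it is bound by disk throughput rather than by the number of chunks:

Code:
<?php
// Sketch of a chunk rebuild pass: the total bytes read and written are the
// same regardless of chunk size, so sequential disk throughput dominates.
function rebuildFromChunks(array $chunkPaths, string $destination): void
{
    $out = fopen($destination, 'wb');
    foreach ($chunkPaths as $path) {
        $in = fopen($path, 'rb');
        stream_copy_to_stream($in, $out); // streamed append, low per-chunk overhead
        fclose($in);
        unlink($path);                    // drop the chunk once appended
    }
    fclose($out);
}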
 
That is correct for the rebuild process. If the timeout is happening while uploading the chunks, then the size of the chunk matters.
 
The timeout is happening during the rebuild. That's why I was hoping that final step could be handled by a background process, with a timeout check after the upload. That's the only way to really handle large file uploads without exposing the server to DDoS through uploads.
 
Not at the current stage, no. I'm sorry. The rebuild process has to be completed in one go.
 
That is correct for the rebuild process. If the timeout is happening while uploading the chunks, then the size of the chunk matters.
The timeout (as posted by @stromb0li in https://xenforo.com/community/threads/chunked-uploads-xf2-paid.163964/post-1668497) is happening when rebuilding the original file :)


That's why I was hoping that final step could be handled by a background process, with a timeout check after the upload.
Most certainly not without entirely changing the approach this add-on uses to reconstruct the original file.
 
It does not seem to work for me. Even with chunked uploads enabled, it still uses a single request to upload the file, causing a Cloudflare 413 error. It's not splitting the upload into multiple requests. I've set the chunk size to 25000 but it still fails.

Make sure to check "Virtually increase maximum attachment file size" in the add-on options. I can confirm it works; I'm using Cloudflare myself.
 
@JulianD I received the following warning in the error log today. PHP 8.2.x; XF 2.2.13

Code:
ErrorException: [E_WARNING] filemtime(): stat failed for /var/www/internal_data/chunked_uploads/temp/a9b5e1e19248c881ea850671ccdd40177b639d37_1
src/addons/ChunkedUploads/_vendor/flowjs/flow-php-server/src/Flow/Uploader.php:34

Stack trace
#0 [internal function]: XF::handlePhpError(2, '[E_WARNING] fil...', '/var/www/...', 34)
#1 src/addons/ChunkedUploads/_vendor/flowjs/flow-php-server/src/Flow/Uploader.php(34): filemtime('/var/www/...')
#2 src/addons/ChunkedUploads/XF/Pub/Controller/Attachment.php(17): Flow\Uploader::pruneChunks('/var/www/...')
#3 src/XF/Mvc/Dispatcher.php(352): ChunkedUploads\XF\Pub\Controller\Attachment->actionUpload(Object(XF\Mvc\ParameterBag))
#4 src/XF/Mvc/Dispatcher.php(258): XF\Mvc\Dispatcher->dispatchClass('XF:Attachment', 'Upload', Object(XF\Mvc\RouteMatch), Object(Snog\Forms\XF\Pub\Controller\Attachment), NULL)
#5 src/XF/Mvc/Dispatcher.php(115): XF\Mvc\Dispatcher->dispatchFromMatch(Object(XF\Mvc\RouteMatch), Object(Snog\Forms\XF\Pub\Controller\Attachment), NULL)
#6 src/XF/Mvc/Dispatcher.php(57): XF\Mvc\Dispatcher->dispatchLoop(Object(XF\Mvc\RouteMatch))
#7 src/XF/App.php(2487): XF\Mvc\Dispatcher->run()
#8 src/XF.php(524): XF\App->run()
#9 index.php(20): XF::runApp('XF\\Pub\\App')
#10 {main}
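
For what it's worth, a failed stat in that spot usually means the chunk's temp file disappeared between the directory scan and the filemtime() call, e.g. when two upload requests prune the same temp directory at once. A defensive prune loop along these lines would avoid the warning - purely a hypothetical sketch, not the bundled flow-php-server code, and $tempDir / $expireSeconds are made-up names:

Code:
<?php
// Hypothetical defensive prune loop: skip entries that vanish between the
// directory scan and the filemtime() call.
foreach (new DirectoryIterator($tempDir) as $entry) {
    if ($entry->isDot() || !$entry->isFile()) {
        continue;
    }
    $mtime = @filemtime($entry->getPathname());
    if ($mtime !== false && $mtime < time() - $expireSeconds) {
        @unlink($entry->getPathname()); // prune stale chunk, ignore races
    }
}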
 