XF 1.5 Resource Mgr - Fatal Error: Allowed memory size .. exhausted.

Aceros

Active member
I have some larger files I'd like to list in the Resource Manager (without raising the general upload size limit). I read a suggestion of uploading a dummy file and then replacing it on the backend to get around upload restrictions. This seemed like a great solution, but I'm running into this error:

PHP:
ErrorException: Fatal Error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 1973093520 bytes) - library/XenForo/FileOutput.php:53
Generated By: Aceros, 21 minutes ago
Stack Trace
#0 [internal function]: XenForo_Application::handleFatalError()
#1 {main}
Request State
array(3) {
  ["url"] => string(81) "https://www.site.com/forums/resources/file.24/download?version=24"
  ["_GET"] => array(2) {
    ["/forums/resources/file_24/download"] => string(0) ""
    ["version"] => string(2) "24"
  }
  ["_POST"] => array(0) {
  }
}

I found that in the xf_attachment table, attachment_id points to a data_id, which is the prefix of the file stored at internal_data/attachments/#/prefix-hash.
I replaced the file, and on accessing it I get a 500 error plus the above entry in the Server Error Log.
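For reference, a sketch of what the backend replacement seems to involve, assuming the usual XF 1.x layout: the trailing hash appears to be an MD5 of the file contents, stored alongside file_size in xf_attachment_data, and the subdirectory is floor(data_id / 1000). The table/column names, paths, and the MD5 assumption here should all be verified against your own install before relying on them:

```shell
# Hypothetical sketch: replace an attachment's backing file and keep the
# xf_attachment_data row consistent with it. DATA_ID, paths, and DB name
# are placeholders; verify the schema on your own install first.
DATA_ID=24
NEW_FILE=/tmp/big-release.zip

HASH=$(md5sum "$NEW_FILE" | awk '{print $1}')   # XF 1.x appears to use an MD5 of the contents
SIZE=$(stat -c%s "$NEW_FILE")                   # file size in bytes
DIR=internal_data/attachments/$((DATA_ID / 1000))

cp "$NEW_FILE" "$DIR/$DATA_ID-$HASH.data"
mysql xenforo -e "UPDATE xf_attachment_data
                  SET file_hash = '$HASH', file_size = $SIZE
                  WHERE data_id = $DATA_ID;"
```

If only the file is swapped and the stored hash/size are left stale, the mismatch is a plausible cause of the 500 on download.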

I'm running centminmod and have raised the following (and restarted) in an effort to lift this limit:
  • client_max_body_size 2000M (nginx conf)
  • client_body_timeout 600s (nginx conf)
  • upload_max_filesize 2000M (php.ini)
  • post_max_size 2000M (php.ini)
  • memory_limit 2000M (php.ini)
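If those values still don't seem to take effect, it may be worth confirming what PHP actually loaded; the CLI and PHP-FPM can read different ini files, and an FPM pool config can override php.ini. A quick check (assuming a standard PHP CLI install):

```shell
# Show which ini file the CLI SAPI loaded, and the effective limits.
php -r 'echo php_ini_loaded_file(), PHP_EOL;'
php -i | grep -E 'memory_limit|upload_max_filesize|post_max_size'

# PHP-FPM may load a different ini; check it separately, and restart
# php-fpm (not just nginx) after any change.
php-fpm -i 2>/dev/null | grep 'Loaded Configuration File' || true
```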
How best do I accomplish the above? Where can I remove this limit, or is there another route altogether? I'm also a bit curious how the hash on the file is calculated, haha.
 
@Aceros are you using nginx?

A very simple solution is my (free) Attachment Improvements add-on together with nginx's X-Accel-Redirect feature. XFRM uses the standard attachment system under the hood, so it should work properly.

Note: you will need to adjust your nginx config a bit.

This way PHP does the authentication and permission handling, and then tells nginx to serve the file directly. This means PHP's memory limit isn't relevant, and you aren't tying up a PHP worker to serve the actual file.
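For context, a minimal sketch of the nginx side of X-Accel-Redirect, assuming the add-on emits a header pointing into an internal-only location; the location name and root path are placeholders to adapt from the add-on's own documentation:

```nginx
# PHP authenticates the request, then responds with a header such as
#   X-Accel-Redirect: /internal_data/attachments/0/24-<hash>.data
# nginx then streams the file itself, so PHP's memory limit never applies.
location /internal_data/ {
    internal;                                  # not reachable by direct request
    root /home/nginx/domains/site.com/private; # placeholder path
}
```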
 
I'm back to this issue. @Xon's add-on is great for end-user downloads, but I am still looking for a way to get a few larger files into the Resource Manager.

I am using Cloudflare which limits uploads through the frontend to 100MB.

I have attempted (above) replacing dummy files via SFTP, but I think there is an issue with that approach. Does anyone else have an idea how I might go about resolving this?
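One way around the Cloudflare limit for a one-off admin upload is to hit the origin server directly, since the 100MB cap only applies to proxied traffic. A hedged sketch using curl's --resolve option to pin the hostname to the origin IP (the IP, cookie file, and endpoint below are placeholders, and a real XenForo upload would also need a valid session and CSRF token):

```shell
# Pin www.site.com to the origin IP so the request skips Cloudflare's proxy.
ORIGIN_IP=203.0.113.10   # placeholder: your server's real IP
curl --resolve "www.site.com:443:$ORIGIN_IP" \
     -b cookies.txt \
     -F "upload=@/tmp/big-release.zip" \
     "https://www.site.com/forums/resources/"   # placeholder endpoint
```

Alternatively, an unproxied ("grey-cloud") subdomain pointing at the same server achieves the same thing without any client-side tricks.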
 