ChicagoCoin
Member
My site has been on shared hosting for the past 19 years. Lately we've noticed sporadic response times: sometimes it loads fine, other times it takes a while, and we also get occasional 503 errors. I contacted the host, and they told me it's due to a procwatch daemon.
I'm wondering if Cloudflare and/or a robots.txt file would help. I should also mention that I have a custom phprc (php.ini) file that I altered a long time ago to allow large file uploads; maybe that also needs adjusting?
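For context, the phprc changes were along these lines (the values below are illustrative, not my exact settings):

```ini
; Illustrative phprc (php.ini) overrides for large uploads -- example values only
upload_max_filesize = 64M   ; max size of a single uploaded file
post_max_size = 72M         ; must be >= upload_max_filesize
memory_limit = 128M         ; worth reviewing: a high limit here lets each PHP process claim more RAM
max_execution_time = 300    ; long uploads can also trip execution-time limits
```

I'm guessing the memory_limit line is the one most likely to matter for a RAM-monitoring daemon, but I'd appreciate confirmation.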
Sorry for the trouble with your site. It turns out that the account's
processes are being killed by our procwatch daemon. Procwatch is a
daemon that runs constantly on shared servers to monitor the usage of
RAM/CPU and execution time so that no single user can use an
inappropriately high percentage of the shared resources and impact the
overall health of the server or the server’s ability to serve all users’
pages.
I was trying to get things back to normal before receiving this response and updated XF to the latest version. The host first suggested using a robots.txt file to limit crawlers; after some research it looks like Cloudflare might be another option? It's available on my account but not activated. They say the issue is memory related and then suggest a VPS or private server. This site has just been a hobby of mine, and given its size, a VPS is not something I'd consider.

The host also added:

Additionally, all processes run by all users on the server from the same account are also counted together. When a process is killed it is generally not using too much memory by itself; it was just the process that tipped the total usage over the limit.
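If I go the robots.txt route, I gather a minimal file to slow or block aggressive crawlers might look something like this (the bot names below are just examples, not ones I've confirmed are hitting my site):

```txt
# Illustrative robots.txt -- bot names are examples only
User-agent: *
Crawl-delay: 10        # non-standard directive; some crawlers honor it, Googlebot ignores it

User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /
```

My understanding is that robots.txt only helps with well-behaved crawlers, so misbehaving bots would still need something like Cloudflare rules or server-side blocking. Is that right?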