We have a forum site with a lot of threads/posts, so the forum listing page has a lot of pages of threads (pagination).
Some crawlers are accessing very deep pages, like the 60,000th or 70,000th page:
"GET /forum/latest-news/page-64099 HTTP/1.1"
"GET /forum/latest-news/page-65000 HTTP/1.1"
This creates heavy load on the database, because MySQL OFFSET queries are expensive at that depth. Eventually the front end goes down for a while until the MySQL server load returns to normal.
I am already applying a rate limit with the following in my Nginx conf:
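For context, the listing query behind each page is presumably the standard offset pagination pattern, something like this (table and column names here are my illustration, not the actual schema):

```sql
-- Assumed shape of the listing query: to serve page 64099 at 25
-- threads per page, MySQL must scan and discard ~1.6M rows before
-- returning the 25 requested ones.
SELECT id, title, last_post_at
FROM threads
WHERE forum_id = 42
ORDER BY last_post_at DESC
LIMIT 25 OFFSET 1602450;  -- (64099 - 1) * 25
```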
limit_req zone=mysitezone burst=20;
limit_conn phpblock 5;
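Those directives reference zones defined in the http block, roughly like the following (the rate and memory values here are placeholders, not my exact config):

```nginx
# Per-IP request rate zone used by "limit_req zone=mysitezone burst=20;"
limit_req_zone $binary_remote_addr zone=mysitezone:10m rate=10r/s;

# Per-IP connection zone used by "limit_conn phpblock 5;"
limit_conn_zone $binary_remote_addr zone=phpblock:10m;
```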
But I'm not sure that's effective. Is there any way to solve this issue?
Thanks in advance!