XF 2.0 Block specific page only to guests?

V3NTUS

Well-known member
Hi,

I'm having some issues: we're getting DDoS attacks, and we noticed they're targeting one precise URL: /whats-new/

Now, I know I can remove public access to the link by making the "What's New" menu viewable only to registered users, but that doesn't solve the problem: it simply hides the link (the page can still be accessed via direct URL), while I'm looking for a way to require users to be logged in to ACCESS this page.

For the moment, since we're still under attack and I haven't found any alternatives, I disabled access to that page via .htaccess, but of course this way regular users can't access the page either, which isn't the ideal solution.

If there were a way to make .htaccess distinguish guests from registered members, that might work, but I think there must be a way to completely disable /whats-new/ for guests via the XenForo admin panel.

Any ideas?

Thanks.
 
Quick and dirty temporary solution until someone comes up with something better.
In src/XF/Pub/Controller/WhatsNew.php, add this function:
PHP:
    // Needs "use XF\Mvc\ParameterBag;" at the top of the file if it isn't there already
    protected function preDispatchController($action, ParameterBag $params)
    {
        // Guests have user_id 0, so this aborts every What's New request from a guest
        if (!\XF::visitor()->user_id)
        {
            throw new \Exception("No permission");
        }
    }

You could ban the IPs as well. Keep in mind that this will block legitimate robots/crawlers too.
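
If you'd rather not edit a core file (the change is lost on every upgrade), the same check can live in an add-on class extension instead. This is only a sketch: the add-on ID Demo is hypothetical, the file would go in src/addons/Demo/XF/Pub/Controller/WhatsNew.php, and XFCP_WhatsNew is the proxy class XenForo generates once the extension is registered under Admin CP > Development > Class extensions:
PHP:
<?php

namespace Demo\XF\Pub\Controller;

use XF\Mvc\ParameterBag;

// Extends XF\Pub\Controller\WhatsNew through XenForo's class-extension proxy
class WhatsNew extends XFCP_WhatsNew
{
    protected function preDispatchController($action, ParameterBag $params)
    {
        parent::preDispatchController($action, $params);

        // Guests have user_id 0; reply with XenForo's standard "no permission" page
        if (!\XF::visitor()->user_id)
        {
            throw $this->exception($this->noPermission());
        }
    }
}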
 

Thank you, I'll try it! If banning were easy, I'd do it, but I have to ban nearly 500 different IPs every day and it's quite a time-consuming thing. For some reason CSF & LFD doesn't detect these multiple connections, although CT_LIMIT is set to a low value, so it must be a different type of attack; but hey, I'm going too off-topic here. I'll end my message simply thanking you once again :)
 
So far it's working, site's up and running :)

(Attachment: Screenshot_10.webp)

The list is waaaaay longer (at least 50 times longer), but with a server with 128 GB of RAM we can "survive" these attacks as long as we block them, or at least mitigate their effects.

Maybe I'll have to tweak Apache somehow to handle more connections, but the requests are endless, so I'm not sure that would fix the problem.
 
In that case you might want to replace the throw statement with a simple die(). That would interrupt the request instead and wouldn't blow up your error log. I'm not sure what the side effects are, though, if any.
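
A minimal sketch of that swap, with an explicit status code added (an assumption; the original suggestion was just die()):
PHP:
    protected function preDispatchController($action, ParameterBag $params)
    {
        if (!\XF::visitor()->user_id)
        {
            // Answer guests with an explicit 403 and stop: no exception, no error-log entry
            http_response_code(403);
            die('No permission');
        }
    }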
 
Thank you, although I think they pile up because of Apache's KeepAlive directive, but I might be wrong.

This attack has been going on non-stop for 6 days already; hopefully I'll find a good way to detect and block all the IPs automatically.
 
Code:
RewriteCond %{HTTP_COOKIE} !xf_user
RewriteRule .* - [F,L]

That will send a 403 Forbidden to any request not carrying the xf_user cookie that logged-in users have.
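
The "second rule" scoped to whats-new that's mentioned a few posts below isn't quoted in the thread; presumably it looks something like this:
Code:
RewriteCond %{HTTP_COOKIE} !xf_user
RewriteRule whats-new - [F,L]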

Doesn't this mean visitors will be blocked, too?

UPDATE

I just checked: guests can still access the site, although it looks different, similar to when the CSS is broken.
 
With the second rule I wrote, scoped to whats-new, logged-out users will only be blocked from whats-new.

Also, do you have access to the raw server logs? If so, paste some of the log entries in this thread; if it looks like they are using a unique user agent, I can redo the ruleset to block only these bots, and from all pages and resources (CSS, JavaScript, etc.).
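
For illustration, blocking a single unique agent site-wide would look something like this ("BadBot/1.0" is a made-up placeholder string):
Code:
RewriteCond %{HTTP_USER_AGENT} "BadBot/1\.0"
RewriteRule .* - [F,L]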
 
Thank you, I truly appreciate all the help I'm getting from you guys; it's amazing. Yes, it's a dedicated server, so I can paste a few lines here, or I can even send you the access_log if you'd like.
 
Sure thing, paste a few lines with hits from the IPs you know are part of the attack. Keep in mind, though, that if the attack is using hijacked browsers I won't be able to write a ruleset blocking via user agent, because the user agent strings will be from real browsers and not unique. I'll have a better idea either way after you post log entries.
 
So it looks like these requests keep being listed in Apache Status, taking up slots even after restarting Apache or PHP-FPM. They seem not to be affecting the site right now, which makes me think it's probably something left over from an earlier attack.

But anyway, attached are the logs showing the attack is still going on.

It looks like they are mixed: some come from *common* user agents, but many of them seem to be coming from this one:

"Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; AS; rv:11.0) like Gecko"


At least 50% of the IPs seem to report this: Windows NT 6.1
 
According to the logs you supplied, it would be difficult to block by user agent since there are so many. The page you referenced in the post above will help, but I suggest this:

None of the fetches from the fake users have a referrer in the log; 99% of legit browsers will. So you can block them this way (the pattern ^-?$ matches an empty Referer header or the literal "-" some clients send):

Code:
RewriteCond %{HTTP_REFERER} ^-?$
RewriteRule whats-new - [F,L]
 
Would this block Google's bots from crawling our pages too? Thank you again for your support!

UPDATE

I think the rule needs some refinement, as many users reach our site by direct URL (so with no referrer), including me and all the staff. It might be a good *temporary* solution, but the problem is we've been under attack for nearly a week already, so we'd cause a lot of inconvenience to our users (which is, after all, the goal of the DDoS).
 
Code:
RewriteEngine on
RewriteCond %{HTTP_COOKIE} !xf_user
RewriteCond %{HTTP_REFERER} ^-?$
RewriteRule whats-new - [F,L]

This should block only guests who don't have a referrer, right? If so, we're getting closer and closer to the ideal solution. Thanks again, I hope we'll keep improving; so far it's a huge change in just a few minutes. I'm forever grateful!
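
One possible refinement of that combined rule, addressing the Googlebot question above (a sketch, untested): exempt known crawlers by user agent so they can still reach /whats-new/ even without a cookie or referrer:
Code:
RewriteEngine on
RewriteCond %{HTTP_COOKIE} !xf_user
RewriteCond %{HTTP_REFERER} ^-?$
RewriteCond %{HTTP_USER_AGENT} !(Googlebot|bingbot) [NC]
RewriteRule whats-new - [F,L]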
 