Cloudflare optimizations for XenForo

There aren't any universal WAF rules you need for XenForo. But this is what it looks like for one of my XenForo sites:

[screenshot: list of WAF rules]

The first one was set up by my Cloudflare add-on; the others are ones I added manually to deal with referrer spam bots and certain spiders (both of which are going to be unique to my site).
 
You might like this one.

[screenshot: WAF rule blocking WordPress URLs]

Blocks all requests to WordPress URLs.

(http.request.uri.path contains "/wp-") or (http.request.uri.path contains "/xmlrpc.php") or (http.request.uri.path contains "/robots.txt")
 
Trying to block the requests of script kiddies probing whether something is installed on your site is a never-ending task. Also, I'm not sure you'd really want to block robots.txt... that's going to mess with normal crawling by legit search engines. Not sure how Googlebot would handle being told it's not allowed to fetch robots.txt (a 403 Forbidden response vs. the normal 404 Not Found). It would probably be okay, but why risk it?
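
If you still want the rule itself, the robots.txt clause can simply be dropped and the WordPress blocking kept:

Code:
(http.request.uri.path contains "/wp-") or (http.request.uri.path contains "/xmlrpc.php")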
 
Yeah, should have mentioned: I am using that one on a completely private site: https://hooplanation.com/

Things become very very simple when you have a USA-only private site!
 
Right, but even then, you should use robots.txt to block legit spiders.

I have a XenForo 2 site that is private, and the robots.txt instructs spiders to not index anything; it's just the standard disallow-all:
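Code:
User-agent: *
Disallow: /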


Blocking spiders from accessing the thing that tells them not to spider/index the site doesn’t seem like a good idea to me.
 
Like I said, it's going to be unique to the site, so it's not going to work right for your site, but here you go:

Referrer spam bots
Code:
(http.request.uri eq "/tools/search?domain=.carrefour.com.br") or (http.request.uri eq "/tools/search?domain=.insight.com") or (http.request.uri eq "/tools/search?domain=.elcorteingles.es") or (http.request.uri eq "/tools/search?domain=.dtgweb.com") or (http.request.uri eq "/tools/search?domain=.thebay.com") or (http.request.uri eq "/tools/search?domain=.homedepot.com") or (http.request.uri eq "/tools/search?domain=.staples.com") or (http.request.uri eq "/tools/search?domain=.macys.com")

Spiders that don't respect robots.txt
Code:
(http.user_agent contains "BrandVerity") or (http.user_agent contains "paloaltonetworks.com") or (http.user_agent contains "DataForSeoBot") or (http.user_agent contains "HeadlessChrome") or (http.user_agent contains "Neevabot")

Again... don't blindly add those without fundamentally understanding why you are adding them; they are specific to my site, not anyone else's.
 
Can definitely use the second one! Thank you.

Question: do you bother to block any nations? I have CSF on the server and use WAF Rules to block China, Russia, North Korea and a host of African and Asian nations on all of my sites. Opinion?
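
For reference, the rule expression for that kind of country block in Cloudflare is just something like the following (with the action set to Block):

Code:
(ip.geoip.country in {"CN" "RU" "KP"})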
 
I don't for any of my sites, no. Not all, but most of my sites tend to have legit users everywhere.
 
 
From the Cloudflare article, it would seem like there was an issue at some point, but it seems to have been squashed in XF 2.0, which is great. This is what caught my attention before finding this thread, though. Well, whatever; I probably overthought it anyway, being so tired, heh. It's only 2 am here.
1. Cloudflare acts as a reverse proxy, meaning that all visitor IP addresses will become Cloudflare-affiliated IP addresses. If you are using services like StopForumSpam or blocking registration by IP address, you need to restore original visitor IPs.
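
For anyone wondering what restoring the original IPs involves: on an nginx origin, for example, it's typically done with the realip module and the CF-Connecting-IP header. A minimal sketch (only two of Cloudflare's published ranges shown for illustration; the full, current list is at cloudflare.com/ips and needs to be kept up to date):

Code:
# Trust Cloudflare's edge ranges so nginx rewrites the client IP
# (partial list for illustration; add all published IPv4/IPv6 ranges)
set_real_ip_from 173.245.48.0/20;
set_real_ip_from 103.21.244.0/22;
# Cloudflare passes the original visitor IP in this header
real_ip_header CF-Connecting-IP;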
 
Gents, I thought it might be best to ask this here rather than start a new thread:

Is there a compelling reason to pay for the Pro plan for XF forums?
The things I use in Pro:
  • TCP Turbo
  • Mobile optimization / Mirage
  • Lossless image optimization (Polish)
  • Image resizing (see the example after this list)
  • Firewall Analytics. Really valuable, especially in combination with the WAF rules: analyse where attacks are coming from and add them to WAF rules.
  • 20 WAF rules. Really valuable to me, because separate rules let you be much more specific about each rule and its place in the sequence of actions, instead of lumping together everything that needs to be blocked, challenged, or whitelisted.
  • Captcha / JS challenge for suspect users. Instead of outright blocking a group of suspect users, it lets you stop the bots and let the valid users in. The solve rates and analytics can be used to further block or whitelist subgroups in WAF rules.
  • DDoS alerts. Get notified when CF detects an attack, so you can see what's happening and block where it's coming from. Not essential, because most hosts let you set bandwidth and CPU warnings that can also alert you to it.
  • Exposed credentials check: protects members from having their accounts breached through reuse of leaked credentials.
  • Site analytics with 15-minute resolution. Not a big one, but 1-hour resolution is annoying.
  • Ticket support can be useful, depending on the issue at hand.
Business:
  • Depending on your website, the 100 MB upload max can be annoying: video uploads without chunked uploads will fail. The Business plan increases that to 200 MB.
  • Security Analytics is nice.
  • Machine Learning Detection for attacks seems interesting, but I've yet to see it in action.
  • Chat support can be helpful depending on the issue.
  • 100% uptime guarantee.
I use Pro or Business for high-traffic websites or websites that need the extra optimizations, and Free for low-traffic websites.
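
On the image resizing point: Cloudflare exposes it through special /cdn-cgi/image/ URLs once the feature is enabled on the zone. The domain and path below are purely illustrative:

Code:
https://example.com/cdn-cgi/image/width=300,quality=75,format=auto/data/attachments/123/photo.jpg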
 
I'm a paid CF user. Got images and attachments in R2.

Should I enable image/unfurl proxies? What should I do? Just toggle them on?

 