Compatibility for CSRF protection & Cloudflare full HTML page caching

eva2000

I'm curious whether there's a better way to do CSRF protection that would work with Cloudflare's or other CDNs' guest full HTML page caching, which uses cookies to differentiate between logged-in users and logged-out guests?

The issue that arises with XenForo 2.x for CSRF and full HTML page caching is similar to the one Cloudflare outlined for Magento, along with the workaround Magento used, at https://blog.cloudflare.com/the-curious-case-of-caching-csrf-tokens/. Easy workarounds can be made for register/login, guest posting, guest search and contact links, but a better solution would be nice, like the one outlined in the Cloudflare blog :)

from https://blog.cloudflare.com/the-curious-case-of-caching-csrf-tokens/

There's quite a lengthy GitHub thread which outlines this issue and references the Pull Requests which fixed this behaviour in the Magento Turpentine plugin. I won't repeat the set-up instructions here, but they can be found in an article I've written on the Cloudflare Knowledge Base: Caching Static HTML with Magento (version 1 & 2)

Effectively what happens here is that the dynamic CSRF token is only injected into the web page the moment that it's needed. This is actually the behaviour that's implemented in other e-commerce platforms and Magento 2.0+, allowing Full Page Caching to be implemented quite easily. We had to recommend this plugin as it wouldn't be practical for the site owner to simply update to Magento 2.

and https://support.cloudflare.com/hc/e...Caching-Static-HTML-with-Magento-version-1-2-

Most Secure: The best alternative is to use AJAX to dynamically fill in the value of the CSRF token in your Magento site. When a user clicks the button to add something to their cart, some JavaScript jumps in to update the CSRF token in the forms to match the user’s session. This can enable most of the page to be served from cache but will still require a request back to the origin to fetch the token.
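
For illustration, the origin-side half of that approach can be as small as a standalone, uncached endpoint that hands out the current session's token. The file name, token scheme and JSON shape below are assumptions for the sketch, not XenForo's actual implementation:

PHP:
<?php
// csrf-token.php - a hypothetical endpoint excluded from the CDN's cache.
// Cached guest pages fetch this via JavaScript on load and copy the token
// into their forms, so the HTML itself can stay static while the token
// stays per-session.

session_start();

// Generate a per-session token on first use (illustrative scheme only).
if (empty($_SESSION['csrf_token']))
{
    $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
}

// Make sure neither the CDN nor the browser caches the token response.
header('Cache-Control: private, no-store');
header('Content-Type: application/json');

echo json_encode(['csrf_token' => $_SESSION['csrf_token']]);

The cached page then only needs a single fetch()/XHR call on load to pull the token and copy it into each hidden token field before any form is submitted.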


Edit: it seems a compromise would be to have an option in the Admin CP which can globally turn off overlays when a guest visitor clicks the login, register, contact, search, and guest post thread links/buttons, so that these aren't handled on the cached HTML page but instead redirect to separate HTML pages which can be excluded from the cache path etc.
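
As a rough sketch of the exclusion side, assuming the CDN is set up to respect origin Cache-Control headers (the path prefixes below are made up for illustration):

PHP:
<?php
// Sketch: mark the guest-facing pages that must stay dynamic (login, register,
// contact, search) as uncacheable, assuming the CDN respects origin
// Cache-Control headers. The path prefixes are illustrative.

$uncachedPrefixes = ['/login/', '/register/', '/misc/contact', '/search/'];

$path = $_SERVER['REQUEST_URI'] ?? '/';

foreach ($uncachedPrefixes as $prefix)
{
    if (strpos($path, $prefix) === 0)
    {
        // Tell the CDN and the browser not to cache this response.
        header('Cache-Control: private, no-store');
        break;
    }
}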
 
Not related to caching, but I started logging invalid CSRF tokens to pinpoint issues and the ones that always pop up are from Googlebot. They are probably fetching pages and then putting associated AJAX requests into a queue for later spidering.

Going to header-based CSRF (at least as one option) would fix the errors Googlebot gets from taking too long to make the AJAX request.
 
Ya, you couldn’t do away with CSRF unfortunately, but you could add support for it to the existing method that does CSRF checks with 1 or 2 lines of code. And that would solve some real-world CSRF issues I see with Googlebot doing delayed AJAX requests at least.
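
As a rough illustration of one form of header-based CSRF - treating a custom header that plain cross-site requests cannot set as proof the request came from the site's own scripts - the server-side check could look like this; the header name and fallback logic are assumptions, not XenForo's actual code:

PHP:
<?php
// Sketch: accept a custom header as CSRF proof for AJAX requests. A plain
// cross-site form or <img> cannot attach custom headers, and a cross-origin
// fetch() with one triggers a CORS preflight the server can refuse, so the
// header implies the request came from the site's own JavaScript - no matter
// how long ago the cached page was fetched.

$hasAjaxHeader = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strcasecmp($_SERVER['HTTP_X_REQUESTED_WITH'], 'XMLHttpRequest') === 0;

if (!$hasAjaxHeader)
{
    // No custom header: fall back to the usual token check. The session
    // lookup here is purely for illustration.
    session_start();

    $expected  = $_SESSION['csrf_token'] ?? '';
    $submitted = $_POST['_xfToken'] ?? '';

    if ($expected === '' || !is_string($submitted) || !hash_equals($expected, $submitted))
    {
        http_response_code(403);
        exit('Invalid or missing CSRF token');
    }
}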
 
What do you think about relying on the "Origin" HTTP header instead of CSRF tokens? Afaik, the last major browser that got reliable support for this was Firefox 70 in 2019. You could even fall back to the "Referer" header for even older browsers (then only really old browsers that for some reason don't even send a Referer header would become unsupported). See: https://security.stackexchange.com/...-headers-enough-to-prevent-csrf-provided-that
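
As a rough sketch of what such a check could look like on the server (the host name and the fail-closed policy are assumptions for illustration):

PHP:
<?php
// Sketch: reject state-changing requests whose Origin header (or, for older
// browsers, Referer) does not point at the site's own host.

$allowedHost = 'forum.example.com';

function hostMatches(?string $url, string $allowedHost): bool
{
    if ($url === null)
    {
        return false;
    }

    $host = parse_url($url, PHP_URL_HOST);

    return is_string($host) && strcasecmp($host, $allowedHost) === 0;
}

$origin  = $_SERVER['HTTP_ORIGIN'] ?? null;
$referer = $_SERVER['HTTP_REFERER'] ?? null;

// Modern browsers send Origin on cross-site POSTs; fall back to Referer
// for older browsers that do not.
$trusted = $origin !== null
    ? hostMatches($origin, $allowedHost)
    : hostMatches($referer, $allowedHost);

if ($_SERVER['REQUEST_METHOD'] === 'POST' && !$trusted)
{
    http_response_code(403);
    exit('Cross-site request blocked');
}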

Alternatively (or in addition), you could modify XenForo to use SameSite cookies which protect against CSRF, too:
Diff:
--- a/src/XF/Http/Response.php
+++ b/src/XF/Http/Response.php
@@ -427,6 +427,16 @@ class Response
                     'samesite'
                 ], $cookie);
 
+                // SameSite-Cookies for everyone except Safari ≤ 15.3
+                // https://bugs.webkit.org/show_bug.cgi?id=219650
+                // https://bugs.webkit.org/show_bug.cgi?id=226386
+                $userAgent = $_SERVER['HTTP_USER_AGENT'] ?? '';
+                $isOldSafari = preg_match('/^((?!chrome|android).)*safari/i', $userAgent) && preg_match('/Version\/(10\.|11\.|12\.|13\.|14\.|15\.[0123])/', $userAgent);
+                if (!$isOldSafari)
+                {
+                    $options['samesite'] = 'Lax';
+                }
+
                 if (empty($options['samesite']))
                 {
                     unset($options['samesite']);

I think CSRF tokens are not necessary anymore.
 
What do you think about relying on the "Origin" HTTP header instead of CSRF tokens?
That would work (for modern browsers) if everything that needs protection is sent via POST / XHR / fetch - but sadly this is not the case:

You could even fall back to the "Referer" header for even older browsers (then only really old browsers that for some reason don't even send a Referer header would become unsupported).
Referer is kinda unreliable - it might very well be turned off by the browser settings, blocked by personal firewalls or anonymized / randomized by browser Add-ons.
This would also require every protected request to be POST.

Alternatively (or in addition), you could modify XenForo to use SameSite cookies which protect against CSRF, too
Still the same issue with GET unfortunately, and just setting all existing XF cookies to SameSite=Lax (which at least in Chrome is the implied behaviour by now anyway) would not prevent CSRF for guests.
 
That would work (for modern browsers) if everything that needs protection is sent via POST / XHR / fetch - but sadly this is not the case:
True. IMHO that should be changed (i.e. turn these into POST requests). In modern browsers (unfortunately not in Safari, as always ;)) there is even a solution for GET requests: https://web.dev/fetch-metadata/ (https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Sec-Fetch-Site)
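
For illustration, a Fetch Metadata check along those lines might look roughly like this; the accepted values and the choice to fall through when the header is absent are assumptions:

PHP:
<?php
// Sketch of a Fetch Metadata (Sec-Fetch-Site) check. Browsers that do not
// send the header (older browsers, Safari at the time of this thread) fall
// through to whatever other protection is in place.

$secFetchSite = $_SERVER['HTTP_SEC_FETCH_SITE'] ?? null;

if ($secFetchSite !== null)
{
    // 'same-origin' and 'none' (direct navigation, bookmark, address bar)
    // are accepted; 'same-site' and 'cross-site' requests are rejected.
    if (!in_array($secFetchSite, ['same-origin', 'none'], true))
    {
        http_response_code(403);
        exit('Cross-site request blocked');
    }
}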

Referer is kinda unreliable - it might very well be turned off by the browser settings, blocked by personal firewalls or anonymized / randomized by browser Add-ons.
I'd guess that not many users have a browser that is both old (doesn't send the "Origin" header) and has an extension that blocks the "Referer" header. I might be wrong but as a fallback (!), checking the "Referer" header would be good enough IMHO.
 
In modern browsers (unfortunately not in Safari, as always ;)) there is even a solution for GET requests: https://web.dev/fetch-metadata/ (https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Sec-Fetch-Site)
Yes ... and no :)


Even if the browser supports Sec-Fetch-Site, the server couldn't determine if a navigational (-> Sec-Fetch-Mode / Sec-Fetch-Dest) request for Logout originated from an intended user action (e.g. clicking Logout in the Account menu) or from the user being tricked into clicking such a link (embedded in a post like here).

I might be wrong but as a fallback (!), checking the "Referer" header would be good enough IMHO.
Yeah, might be good enough as "last resort" - but not primary source.
 
Even if the browser supports Sec-Fetch-Site, the server couldn't determine if a navigational (-> Sec-Fetch-Mode / Sec-Fetch-Dest) request for Logout originated from an intended user action (e.g. clicking Logout in the Account menu) or from the user being tricked into clicking such a link (embedded in a post like here).
If it was relying purely on Sec-Fetch-Site, you could also log someone out just by them viewing a post (they wouldn't even need to click) if you were to put the logout link as an image in a post. There's never going to be a realistic way to completely do away with CSRF tokens when the site has user generated content.
 
If it was relying purely on Sec-Fetch-Site, you could also log someone out just by them viewing a post (they wouldn't even need to click) if you were to put the logout link as an image in a post.
Yes, that's why I said that Sec-Fetch-Site by itself is not enough - combined with POST it should be sufficient, which is why I think that GET requests should not change state.

There's never going to be a realistic way to completely do away with CSRF tokens when the site has user generated content.
I am tempted to disagree :)

IMHO user generated content isn't a problem per se (as long as the UGC can't create a POST; if it can ... that would indeed be a problem).
So as long as UGC can't do that and all protected actions are only performed via POST, tokens wouldn't be required.
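
A rough sketch of that rule, with made-up action names, could be as simple as rejecting state-changing actions that arrive over GET:

PHP:
<?php
// Sketch: refuse to perform state-changing actions over GET, so a link or
// embedded image inside user generated content can never trigger them.
// The action names and routing style are made up for illustration.

$stateChangingActions = ['logout', 'post-reply', 'delete-post'];

$action = $_GET['action'] ?? '';

if (in_array($action, $stateChangingActions, true)
    && $_SERVER['REQUEST_METHOD'] !== 'POST')
{
    // A GET hit on such an action gets a confirmation response instead,
    // whose form then submits the real request via POST.
    http_response_code(405);
    header('Allow: POST');
    exit('This action must be submitted via POST');
}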

In fact, having CSRF tokens in URLs is potentially dangerous as they can leak rather easily.

CSRF tokens in GET requests are potentially leaked at several locations, such as the browser history, log files, network utilities that log the first line of a HTTP request, and Referer headers if the protected site links to an external site.

I've got at least one PoC where CSRF tokens in URLs could be leaked from XenForo via the Referer header.
 