[DigitalPoint] App for Cloudflare®

Sorry for the question, which has probably been answered.

How do I use rclone to move my data to R2? Can someone explain that part to me, please?
It's going to depend on where your existing files are; the command will be different based on the source.

The docs for rclone are here: https://rclone.org/docs/

I don't personally have much experience with rclone because I never had a ton of data to move (I used the CLI command I made to move my stuff). Probably best to search this forum for rclone sync and you should find others doing it.
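
If it helps, here's a minimal sketch of what an rclone copy to R2 can look like. The remote name, bucket name, and paths are just examples (nothing the addon requires); you'd substitute your own account ID and R2 API credentials:

```bash
# Example R2 remote in ~/.config/rclone/rclone.conf ("r2" is an arbitrary name):
# [r2]
# type = s3
# provider = Cloudflare
# access_key_id = <your_r2_access_key_id>
# secret_access_key = <your_r2_secret_access_key>
# endpoint = https://<account_id>.r2.cloudflarestorage.com

# Preview what would be transferred, then do the real copy:
rclone copy /path/to/xenforo/data r2:xf-data --dry-run
rclone copy /path/to/xenforo/data r2:xf-data --progress
```

Note that rclone copy leaves the source files in place; rclone move would delete them from your server as they transfer.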
 
So I installed this plugin, and I've got to say it's fairly easy to get started. I did run into an issue though, and I also have a general question.

I'll start with the question. Does this only copy files to R2, or does it also remove files from my server? I don't want the files on my server, as I'm looking at several hundred gigabytes of storage.

The issue: testing this with profile images and forum images, I only get blank space; the images are not showing. Except in forums (articles), where the image shows in the preview but not in the full article.

[Attached screenshots: noimage.webp, bucket.webp]
 
The addon does not move existing files to R2; it only handles stuff going forward. There's a CLI tool to help move existing files, but if you have a ton of data to move, a tool like rclone is better suited. The bundled tool does not delete the old files either (you need to do that manually after you move them).

A couple of things to note here... you are trying to use the same bucket for both. That will kind of work in theory, but it's a terrible idea. The data bucket needs to be exposed publicly (XenForo's data directory is intended to be public), while internal_data is intended to be private. So if you want to use a single bucket, you need to expose all of that bucket publicly... and then your internal_data is public.

Long story short is I would redo your config and use different buckets for each.
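
If you'd rather create the pair by hand instead of letting the addon do it, here's a sketch with Cloudflare's Wrangler CLI (the bucket names are just examples):

```bash
# Two separate buckets so the exposure can differ per bucket:
wrangler r2 bucket create xf-data            # for the public data directory
wrangler r2 bucket create xf-internal-data   # for internal_data; keep this one private
```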
 
Right, I don't have any data on this site yet, and once/if I get this to work I won't be migrating anything; I'll be uploading everything from scratch, as I'm using IPS right now. So my question is: if I upload a file, say in resources, will it only copy the file to R2, or will it copy it and then delete it from my web host's server? I don't want to use up my limited space on my web host.

So you think if I use separate buckets that will resolve the broken images?
 
I'm also getting this error when trying to have the addon automatically create a bucket for me:

Client error: `GET https://api.cloudflare.com/client/v4/zones/7c20811b002586319439c2b87de411cb/rulesets/phases/http_request_cache_settings/entrypoint` resulted in a `403 Forbidden` response:
{"result":null,"success":false,"errors":[{"message":"missing the permissions required to read zone rulesets in the http_request_cache_settings phase"}],"messages":null}
 
Right, I don't have any data on this site yet, and once/if I get this to work I won't be migrating anything; I'll be uploading everything from scratch, as I'm using IPS right now. So my question is: if I upload a file, say in resources, will it only copy the file to R2, or will it copy it and then delete it from my web host's server? I don't want to use up my limited space on my web host.

So you think if I use separate buckets that will resolve the broken images?
No, it does not store files/objects in the local file system as they are uploaded. So if you have this set up before you have the files, they should only end up on R2.

It won't resolve the broken images, but using two different buckets is the right thing to do, since the permissions for each are different. The broken images are related to your other message.

I'm also getting this error when trying to have the addon automatically create a bucket for me:

Client error: `GET https://api.cloudflare.com/client/v4/zones/7c20811b002586319439c2b87de411cb/rulesets/phases/http_request_cache_settings/entrypoint` resulted in a `403 Forbidden` response:
{"result":null,"success":false,"errors":[{"message":"missing the permissions required to read zone rulesets in the http_request_cache_settings phase"}],"messages":null}
Have you verified you have the right permissions for your Cloudflare API Token? There should be 14 permissions. From the error message, it sounds like you might be missing the Zone.Cache Rules: Edit permission.

You can compare what you have with what you should have by looking at XF Admin -> Options -> External service providers under Cloudflare authentication.

You can check your API Tokens here: https://dash.cloudflare.com/profile/api-tokens
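
If you want to sanity-check the token itself from the command line, Cloudflare's token verify endpoint works too ($CF_API_TOKEN below is a placeholder for your token):

```bash
# Returns "This API Token is valid and active" for a good token; it won't
# list the token's permissions, but it rules out an expired/disabled token.
curl -s "https://api.cloudflare.com/client/v4/user/tokens/verify" \
  -H "Authorization: Bearer $CF_API_TOKEN"
```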

 
Thanks! This was exactly it, and everything seems to be working. One last question, if you know: does R2 have any file size limits on uploading, and if so, can they be bypassed using, for example, the chunked upload plugin or R2 workers (I have no idea how they work)?
 
You can have unlimited objects per bucket, with each object limited to a maximum of 4.995 TB.

You can upload 5GB per request, so yes... if your files are above 5GB each, you would need to do chunked uploads (my addon does not do chunked uploads, so you are effectively limited to 5GB per file using it). Technically, though, you can upload files larger than 5GB with chunked uploads.
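
For what it's worth, if you end up moving big files with rclone rather than through the addon, rclone does S3 multipart ("chunked") uploads automatically, so files over 5GB can still land on R2. A sketch, with example flag values and an example bucket name:

```bash
# Files above --s3-upload-cutoff are sent as multipart uploads in
# --s3-chunk-size pieces, so a 10GB file goes up in one command:
rclone copy ./backup-10gb.tar.gz r2:xf-internal-data \
  --s3-upload-cutoff 200M \
  --s3-chunk-size 100M \
  --progress
```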

 
Unless you're on a Business plan, there is a 100MB per-request limit (when using Cloudflare in proxied mode).
Ya, but if you are using a chunked upload plugin or something, you could do more than that. You also don't technically need to use Cloudflare's proxy mode when using R2. In the context of the question: if you can get files to your server (however you do that), this addon can send them to R2 as long as they are under 5GB.
 
Still having a problem when guest caching is enabled.
I show as logged in on some pages and logged out on others unless I refresh the pages.
I keep getting "The active user has changed..." errors.
 
Free plan. But I have files that are 10GB+ so I might have to test chunked uploads before moving on.

If you are going through proxy mode, then that might be an issue.

The chunked uploader works very well, but you'll be limited to 100MB chunks (which in itself is OK).

However, after the last chunk is uploaded, the script only has 100 seconds (a Cloudflare limit), minus the time it already spent uploading that last chunk, to put all the chunks together and then process the file.

At a minimum of over a hundred chunks for 10GB, that might not be long enough...
 
I'm using S3 with no problems uploading 10GB+ files in 95MB chunks on the IPS platform.
If you are using Cloudflare as a proxy it should be no different. The 100s limitation is in relation to getting the files to your server, not sending them to R2.
 