[TH] Image Optimizer [Deleted]

I've read the majority of thread posts, and my question didn't get covered... is there an option to run this in batches?

2.3 million attachments, over 750 GB. Pretty sure running this in "one shot" will break some things...
 
I bought this today and didn't see one :(
 
I ran the CLI command 2 days ago... after it had run for 39 hours and completed only 42% of the total attachments, I cancelled it to change the batch count.

Now it's been running again for 11 hours, and it seems to skip old attachments that are already optimized.

Problem is /internal_data/temp/ already holds 285 GB.

Can I clear this folder now?
Or should I wait for it to finish?
Estimated 2-3 more days to go.

And I only have 261 GB of free disk now.
My total forum size before running the command was only 313 GB (2 days ago).
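
A quick way to keep an eye on both numbers while it runs could be something like this (untested; assumes the forum lives at /var/www/forum, so adjust the paths):

Code:
watch -n 60 'df -h /; du -sh /var/www/forum/internal_data/temp'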

@Lukas W.

Any ideas for a drive which is already 90% full (with no way to increase the size)? Could I watch the disk space and just empty /internal_data/temp/ periodically (maybe with a time criterion so it doesn't remove files currently being processed)? If I kill or suspend it while it's processing, then re-run it, will it skip the attachments already processed?
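
Something like this is what I have in mind for the time criterion (untested sketch; the path and the 60-minute cutoff are just examples):

Code:
# remove temp files untouched for over 60 minutes, hopefully leaving in-flight files alone
find /var/www/forum/internal_data/temp -type f -mmin +60 -delete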
 
Going to tag @mattrogowski on this as he should be able to assist :)
 
Another solution could be to migrate your data to an S3 bucket, use the S3 adapter, and process the images with effectively unlimited storage.

If you don't want to keep them there, move them back to your local drive when you're done (with overwrite) and it should be a pretty seamless process.
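
Roughly like this with the AWS CLI (bucket name and paths are placeholders, and this assumes your credentials are already configured):

Code:
# push the attachment data up to the bucket
aws s3 sync internal_data/ s3://your-bucket/internal_data/
# ...switch to the S3 adapter, run the optimizer, then pull everything back over the local copies
aws s3 sync s3://your-bucket/internal_data/ internal_data/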
 
Check out my previous posts. Essentially, this thing never cleans up after itself until it's done running. If you have a cloud provider where you can attach block storage (e.g. Linode has it, DO, etc.), I would suggest doing that. I've optimized 1 TB of attachments, but I had to create an ADDITIONAL 2 TB volume that I symlinked internal_data/temp to.
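
The symlink part looks roughly like this (mount point is just an example):

Code:
# with the extra block volume mounted at /mnt/blockstore
mv internal_data/temp /mnt/blockstore/temp
ln -s /mnt/blockstore/temp internal_data/temp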

You'll find a ton of temporary copies of that data while the thing is running, and until it's completely done with all attachments, it won't clean up. Then poof, it wipes them all.

I deleted some as it went and it didn't lose any attachments but I don't like doing that. It did end up completing but the process took about 5 days. From the command line it never crashed... so YMMV.

Also, I gave up even trying to optimize GIFs or WebP -- the external providers are slow as heck and none of them ever compress anyway. It was a waste of time to even try.

In addition, I only use pngquant and jpegoptim. They are going to give you the best bang for the buck in the quality/speed trade-off. Also pay attention to the speed flag on pngquant. It's not lying when it tells you this:

Speed/quality trade-off from 1 (brute-force) to 10 (fastest). The default is 3. Speed 10 has 5% lower quality, but is 8 times faster than the default.

The speed up really is 8 times faster for such a slight difference. If you are doing a ton of attachments, I'd go with 10.
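
For reference, that flag on its own (file name is just an example; pngquant writes a new -fs8.png file by default):

Code:
pngquant --speed 10 image.png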
 
My current settings, not using any external provider.

PNG > Pngquant / OptiPNG / Pngcrush
JPEG > Jpegoptim
GIF > Gifsicle
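
Standalone, those tools look roughly like this (flags are illustrative, not necessarily the add-on's exact settings):

Code:
pngquant --speed 10 image.png
jpegoptim --strip-all --max=85 image.jpg
gifsicle -O3 image.gif -o image-optimized.gif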
 
This resource has been removed and is no longer available.
 
Big and exciting changes are coming as we adjust our trajectory; just know that we'll be with you every step of the way. Take a look at our announcement here for more information. If you have any questions, please contact us here so we can assist.

*People who have purchased this product in the past will continue to have access to updates as long as the product is maintained. The license it was purchased under will remain intact.
 
I'm at 100,000 of around 500,000 attachments being optimized via cmd.php -- the process seems to have slowed down, and internal_data/temp has grown quite a bit, to about 60 GB.

Is there a way to stop this process and restart at a certain number? I don't want to stop it now, lest I have to start over from scratch. But if I can tell it to continue from attachment X, then that would be ideal.

This is about 12 hours into the process by the way.
 
What is the command to optimize via cmd.php?
We have a very large number of attachments and it keeps timing out after some time...
 
Here ya go:

Code:
php cmd.php | grep -i image
  xf-rebuild:thimageopt-optimize-existing-image  Optimizes existing content
 
Thank you!
I also found this:
Yes, you can run php cmd.php xf-rebuild:thimageopt-optimize-existing-image, and optionally pass through a content type with --type (for example --type=attachment - by default will do all).
In case anyone else needs it...
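
So the full command is:

Code:
php cmd.php xf-rebuild:thimageopt-optimize-existing-image --type=attachment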
 
iirc there is an actively supported addon from another developer which offers similar features.

EDIT: and use this coupon: https://xenforo.com/community/threads/nobita-me-offers-15-sales-off.149211/post-1587394
 