Image Management - best practice - discuss

skicomau

Active member
Hello everyone.

I have been running a forum for almost twenty years.

With the rise of sophisticated global social media over the last six years (and obviously this means FB, Instagram, Twitter, Snapchat, etc), first-class image management (uploading, optimising, serving) has become de rigueur -- especially if you expect your community to remain loyal and want to attract millennial members, for whom a brilliant image experience is simply a given.

To that end I am interested in what other forum operators have implemented to enable as frictionless an image experience as possible for their members.

Specifically...

Theme choice
ie the graceful handling of image responsiveness across devices, including delivery for retina screens.

Lazyload techniques
best JS library

Image upload
Default XF flash uploader and fallback - or something else.

Image Optimisation
GD/ImageMagick vs add-ons that hand off file optimisation to coupled services such as Kraken, Optimus, Gifsicle, etc.

Image storage for huge volumes of images
Local vs AWS vs offload to a separate domain.

CDN vs dynamic image servers
eg MaxCDN/Cloudflare vs Imgix/Imagefly/Cloudinary (Cloudinary is expensive)

Linked images
SSL solution: XF Proxy Images vs add-ons like AndyB's Convert Image that prevent image link-rot.


My UX goals are the following;

1. Allow users to upload reasonably large images of up to 10MB with no image dimension (w x h) restrictions.
2. Upload with as little delay as possible (ie defer all processing).
3. Store the master image at original size - but only after it has been file-size optimised (ie to manage my total disk storage overhead).
4. Desktop: deliver a version of the image optimised for the thread column width (905 pixels in my case) - if the image is larger, clicking it pulls in the master image at full size.
5. Mobile: deliver a version of the image optimised for the device using srcset, providing 3 srcset choices. Tapping an image in a thread delivers the best image for the device (but does not deliver the master image if it is stupidly large).
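Goal 5 - three srcset choices built from one master URL - can be sketched as a tiny helper. The widths, quality value, and the `build_srcset` name are purely illustrative (not from any XenForo add-on); the query parameters follow the Imgix-style pattern used later in this thread:

```python
def build_srcset(url, widths, quality=70):
    """Build a srcset attribute value, one entry per width.

    Each entry asks an Imgix-style dynamic image server for a resized
    rendition of the same master image via query parameters.
    """
    sep = "&" if "?" in url else "?"
    return ", ".join(
        f"{url}{sep}w={w}&q={quality}&fit=max {w}w" for w in widths
    )

# Desktop: a single rendition at thread column width.
desktop = build_srcset("https://example.imgix.net/attachments/123", [905])
# Mobile: three srcset choices for the browser to pick from.
mobile = build_srcset("https://example.imgix.net/attachments/123", [415, 830, 1245])
```

The browser then picks the closest rendition to the device's effective viewport width, which is exactly what the `sizes` attribute in a `<picture>`/`srcset` setup is for.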


My stack to achieve all the above at the moment is;

-- UI.X theme (not current release)
-- Default XF uploader with Flash uploader enabled.
-- GFNio with Kraken.io as my Image Handler - not deferred.
-- [bd] Attachment Store using AWS buckets - deferred. Local copy not kept; not using Cloudflare (see next).
-- Imagefly as Dynamic image server for default desktop and mobile serving. (required editing template bb_code_tag_attach)

I have not yet sorted a differentiated retina experience.
I have not yet implemented a proxy/local solution for linked images.
I have not yet implemented lazy load.

My stack as described is still not ideal. Specifically, GFNio and [bd] Attachment Store do not play nice, and I am unable to set GFNio to deferred. This means upload takes 2x to 3x longer than necessary, which is my primary gripe - plus I have been unable to process with Kraken the tens of thousands of legacy images that were already on AWS.


So....

If you have got this far and also consider image management one of the most important UX issues in running a great forum - please share your solution stack and experiences.

thanks.
 
This is a critical topic for me since my sites are photography forums. Eg: https://www.mu-43.com/

Currently I am using this addon to serve images via KeyCDN:

https://xenforo.com/community/threads/tinhte-image-attachment-optimization.34400/page-15

KeyCDN works well and is relatively affordable. However, it doesn't include any image optimization.

I optimize images using addons. I have tried both GFNio and Nobita's addon; currently I use the latter because it is compatible with AndyB's Convert Image addon, which I also use. I previously used Xon's lazy load addon, which works well, but I had to stop using it due to incompatibility with my CDN addon.

The main issue which remains unsolved on my sites and is a huge deal is that of scaling images and converting formats (eg webp) based on the user's device type, browser, internet speed, etc. I tried many services to solve this: ImageEngine, Cloudinary, Imgix, Imagefly, Cloudflare and others. The only one which worked consistently out of the box to resize images based on user was ImageEngine.

Unfortunately I could not afford the ImageEngine pricing. If you have less than 5GB/month image bandwidth, I would highly recommend using ImageEngine Lite (free) along with the Tinhte addon linked above. It works like magic both as CDN and image optimizer (including automatic resize based on user agent detection).

mod_pagespeed and ngx_pagespeed will do some of the above image optimization (also lazy loading), but they require more than one page load to optimize images. They also don't seem to work well on XenForo image attachments unless you move the attachments to data or external storage using the [bd] addon by @xfrocks .

I am worried about the chance of losing my data when using the [bd] addon, so I've yet to make that move. Also, I read that it was not compatible with AndyB's Convert Image addon. But the idea of using the bd addon in conjunction with ngx_pagespeed remains appealing to me.

Tagging @eva2000 and @MattW since they have both done a lot of work on this topic.
 
The main issue which remains unsolved on my sites and is a huge deal is that of scaling images and converting formats (eg webp) based on the user's device type, browser, internet speed, etc. I tried many services to solve this: ImageEngine, Cloudinary, Imgix, Imagefly, Cloudflare and others. The only one which worked consistently out of the box to resize images based on user was ImageEngine.

Since posting this thread I have sorted this issue on my site and it works quite well including lazyload.

I deployed both Imgix and Imagefly.

IMGIX
For attached images I use Imgix with [bd] Attachment Store. I chose Imgix because it has a nice S3 bucket solution and also because the transform string is appended to the URL, which works neatly with the template code.

I configured an S3 Source in Imgix pointing at the S3 bucket that [bd] Attachment Store uses.

I then used the CDN option in [bd] Attachment Store to set my Imgix URL.



Then I edited template bb_code_tag_attach, replacing the 'canview AND full' elseif statement like this:

Code:
<xen:elseif is="{$canView} AND {$full}" />

<picture>
  <!--[if IE 9]><video style="display: none;"><![endif]-->
  <source media="(min-width: 480px) and (-webkit-min-device-pixel-ratio: 1.25), (min-width: 480px) and (min-resolution: 120dpi)" sizes="905px" data-srcset="{xen:link full:attachments, $attachment}?w=905&q=70&fit=max&auto=format 905w" >
  <source media="(min-width: 480px)" sizes="905px" data-srcset="{xen:link full:attachments, $attachment}?w=905&q=70&fit=max&auto=format 905w" >
  <source media="(min-width: 0px) and (-webkit-min-device-pixel-ratio: 1.25), (min-width: 0px) and (min-resolution: 120dpi)" sizes="415px" data-srcset="{xen:link full:attachments, $attachment}?w=415&q=70&fit=max&auto=format 415w" >
  <source media="(min-width: 0px)" sizes="415px" data-srcset="{xen:link full:attachments, $attachment}?w=415&q=75&fit=max&auto=format 415w" >
  <!--[if IE 9]></video><![endif]-->
  <img class="bbCodeImage LbImage lazyload" alt="" src="{xen:link full:attachments, $attachment}?q=70&fit=max&auto=format">
</picture>

In this way I am using <picture> tags to manage my breakpoints. I'm effectively only using one breakpoint, below and above 480px viewports - basically mobile and not mobile.

Then I append my Imgix string to the $attachment placeholder and voila: images get delivered that are highly optimised for file size and correct for the viewport. The addition of '&auto=format' provides a little 'retina'-friendly processing on png images where appropriate.

However, to make this work in IE 9, 10 & 11 you also need to add the 'picturefill.js' polyfill to your site. (Google it - it's a highly recommended polyfill for the <picture> element.)

Lastly I added 'lazysizes.js' to my site. It's not quite as compact as Unveil, but it has more options, is just as robust, and is compact enough. Note the use of 'data-srcset=' instead of 'srcset=' and the addition of the lazyload class in the code above.

IMAGEFLY
For linked images I use Imagefly, because random images hosted elsewhere don't require secure signing in order to be grabbed and processed by Imagefly.

To implement it, all I needed to do was edit ..\library\XenForo\BbCode\Formatter\Base.php

original code;
Code:
protected $_imageTemplate = '<img src="%1$s" class="bbCodeImage%2$s" alt="[&#x200B;IMG]" data-url="%3$s" />';

new code;
Code:
    protected $_imageTemplate = '<picture>
  <!--[if IE 9]><video style="display: none;"><![endif]-->
  <source media="(min-width: 480px) and (-webkit-min-device-pixel-ratio: 1.25), (min-width: 480px) and (min-resolution: 120dpi)" sizes="905px" srcset="http://skicomau.imagefly.io/w_905,n/%1$s 905w" >
  <source media="(min-width: 480px)" sizes="905px" srcset="http://skicomau.imagefly.io/w_905,n/%1$s 905w" >
  <source media="(min-width: 0px) and (-webkit-min-device-pixel-ratio: 1.25), (min-width: 0px) and (min-resolution: 120dpi)" sizes="415px" srcset="http://skicomau.imagefly.io/w_415,n/%1$s 415w" >
  <source media="(min-width: 0px)" sizes="415px" srcset="http://skicomau.imagefly.io/w_415,n/%1$s 415w" >
  <!--[if IE 9]></video><![endif]-->
  <img src="%1$s" class="bbCodeImage%2$s" alt="[&#x200B;IMG]" data-url="%3$s" />
  </picture>';

I have played around with turning lazyload on and off testing performance. At the moment I have lazyload turned on for Imgix and turned off for imagefly which seems to work very well.

CACHE EXPIRE HEADERS
I also had to do a small hack on [bd] Attachment Store so that images pushed to S3 get a cache expire header of 365 days. Otherwise Imgix would default to a 30-minute cache expire header, which is too short (imo).

I added line 54 in ..\library\bdAttachmentStore\ShippableHelper\S3.php
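The actual edit is in the PHP helper (not reproduced here), but the effect amounts to attaching a one-year Cache-Control header to each object pushed to S3. A minimal sketch of the header value, shown in Python for brevity (the `s3_upload_headers` helper is hypothetical):

```python
# 365 days expressed in seconds - the cache lifetime pushed to S3 so
# that Imgix's edge (and browsers) hold renditions instead of refetching.
ONE_YEAR = 365 * 24 * 60 * 60  # 31536000 seconds

def s3_upload_headers():
    """Headers one might attach to the S3 PUT request (illustrative only)."""
    return {"Cache-Control": f"max-age={ONE_YEAR}, public"}
```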


A short cache expire is my main gripe with Imagefly, and it is not something that can be set. Basically the Imagefly business model is primarily resale of bandwidth, and a long cache expire is a disincentive to that model. The Imgix business model is primarily resale of CPU cycles, so they will re-process an image every 30 days onto their edge servers - but those who have previously been served will at least have a browser-cached image to load.

I am very pleased with how it is working, though I feel it is still somewhat convoluted. I still use GFNio and Kraken.io on upload, but I now feel that is an unnecessary step. Saving only ~10% in file size makes the trade-off between the Kraken fee and the reduced AWS storage cost less of a win - it takes much longer before the savings from indefinite AWS hosting outweigh the upfront cost of processing.

However, whichever way it goes, my objective of holding the best possible 'Mezzanine' image long term is achieved.
 
Sure does! Unfortunately I can't afford Imgix $3 per 1,000 master images accessed each month plus bandwidth charges. The bandwidth charge is cheap enough, but $3 per 1000 master images adds up quickly!
 
Other notes:

With the death of Flash, I have decided to turn off the Flash Uploader option completely (ACP -> Attachments) --- this also means I know categorically that all members are experiencing the same upload process when it comes to debugging complaints.

Initially I set both [bd] Attachment Store and GFNio to 'defer' processing, thinking that the fastest possible upload was the best UX. However, this resulted in unexpected member screwups, so I have now set both processes to run immediately. Upload takes longer, but the UX is better understood by the member.

Example complaint.
I uploaded a pic straight from my iPhone (a screen shot) to the Niseko wins Heliski award thread. As a full pic it showed up in the draft but when I posted it was blank. If I clicked on the blank space I got the larger version of the screen shot. To make it work I had to insert the photo as a thumbnail.

Basically the delays meant they thought things were not working and then attempted workarounds that broke things. Better to have a long delay and a linear workflow.
 
Sure does! Unfortunately I can't afford Imgix $3 per 1,000 master images accessed each month plus bandwidth charges. The bandwidth charge is cheap enough, but $3 per 1000 master images adds up quickly!

Yep,

Ideally this whole process should be similar to WordPress, where you can register additional image sizes to process on upload, store those indefinitely along with the original, and not have to use any 3rd-party services other than a CDN (if desired) to serve the images.

In that way you would only be loosely coupled to any 3rd party optimisation processes such as Kraken.io, Imgix or Imagefly - not dependent on them.

This is where I would like to get to.

(would happily commission such a plugin)
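The WordPress-style approach described above boils down to a size registry consulted once at upload time. A rough sketch, in the spirit of WordPress's add_image_size(); the size names, widths, and filename scheme here are purely hypothetical:

```python
# Hypothetical size registry: each named size would be generated once on
# upload and stored alongside the master image, so no dynamic image
# server is needed at serve time.
REGISTERED_SIZES = {
    "thread": 905,  # desktop thread column width
    "mobile": 415,  # small-viewport rendition
}

def derived_filenames(master_name):
    """Map a master filename to one derived filename per registered size."""
    stem, _, ext = master_name.rpartition(".")
    return {
        name: f"{stem}-{width}w.{ext}"
        for name, width in REGISTERED_SIZES.items()
    }
```

A plugin along these lines would pay the resize cost once per image instead of per request, at the price of extra storage for each registered size.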

FYI - I have a subscription model on my forums which effectively covers the cost of all this. My outgoings are ~$3K per annum and the subs bring in just a little more than that. So I'm not going backwards.
 
I started to use an image service that charged on a similar model to Imgix and rang up a $70 tab after about 8-10 hours. Pulled the plug. Maybe the burden of "master images" is front loaded, but I didn't want to chance it.

What I started to do beginning yesterday is as follows:

-Logged in members see image attachments up to 1600px, optimized by the Nobita addon.

-Guests only see thumbnails, which in my case are 788px, so not tiny. These are optimized by both the Nobita addon as well as ngx_pagespeed. ngx_pagespeed resizes them to fit rendered dimensions and converts them to webp where applicable.

-Everything gets cached by Cloudflare, which doesn't perform nearly as well as KeyCDN, but as far as I can see there is no way to get the images onto KeyCDN without losing the distinction between guest and logged-in member permissions.
 
Yes,

it's resized for desktop too.

This is the whole point: I allow 10MB uploads with no width x height restrictions other than a max of 40,000,000 pixels.

In this way I am future proofing my image store, plus mobile phones can upload large images these days and I want it to be as frictionless an experience as possible. Resizing to share an image is a pain. The users compare forums with Facebook. Facebook is pure frictionless. That is what all us forum operators are competing with in terms of attracting and holding members.

Using the <picture> tag, the desktop view loads a max 950px-wide image, which loads fast, and the full-size image can be viewed if clicked.
 
In this way I am future proofing my image store

Understood.

plus mobile phones can upload large images these days and I want it to be as frictionless an experience as possible. Resizing to share an image is a pain.

This part is a non-issue IMO. I allow up to 50MB uploads but have a 1600px limit in attachment options. If someone exceeds 1600px, XenForo automatically resizes it to 1600px. It is completely frictionless for the user. Much like Facebook, which doesn't give the user full res images either.

Using the <picture> tag, the desktop view loads a max 950px-wide image which loads fast and the full-size image can be viewed if clicked.

I can see that they are resized on desktop and that the file sizes are much smaller. On my desktop using Chrome, nothing happens when I click on them. Do I have to be logged in? Also, PageSpeed Insights is loading them full size which would give me some concerns, perhaps unfounded, about SEO.
 
Yeah

I used to fuss about what PageSpeed Insights said too, until I realised it was stupid and had no connection with the 'actual' speed a real user experiences.

I could rant for ages, but a couple of points:

1. Google does not rank forum pages (in the aggregate) well anymore because they get in the way of making Google any money.

2. Most forum pages are not worthy of ranking well in google search anyway. See point 1.

3. PageSpeed Insights constantly raises a red flag on the cache value of Google's own products - Analytics being the single most ridiculous example. What's going on there, ffs?

4. PageSpeed Insights is really just Google hegemony, pushing us publishers to build sites that help their AI initiatives.

5. The fact they have not adjusted that tool in years is a big clue. Eg, why does it not interpret <picture> tags correctly?

I concur with most everything this bloke writes
https://blog.wp-rocket.me/the-truth-about-google-pagespeed-insights/
 
On my desktop using Chrome, nothing happens when I click on them.

That's a bug in Chrome: the large image loads in but does not bust out of the frame because it has no width attribute, so it looks like nothing is happening. If you right-click and open the image in a new window, you will see that it is the full-size image. I have not sorted a fix for that yet.

I'm not 100% settled on this setup, but it is a big improvement on the previous one. I expect to try something different in the new year that will eliminate the dynamic image servers Imgix and Imagefly, but I won't attempt it until I have moved from my current VPS to a VPS on Amazon.
 
I expect to try something different in the new year that will eliminate the dynamic image servers of Imgix and Imagefly

Don't forget to like my suggestion (if you like it):

https://xenforo.com/community/threa...ges-using-srcset-and-sizes-attributes.123766/

Most of a page's weight comes from images. Currently XenForo does little to help with that, and the current method of attachment storage and display doesn't work with most third-party image optimisation solutions unless you edit the core files.
 
Hi guys - great thread, and very timely, as we're in the process of migrating to XF and looking at how to handle the images from 10 years of previous posts, as well as providing a slick, modern way of letting our users upload and share images.

We will be migrating 10 years of posts, pulling in all hotlinked images using the Convert Image plugin and converting to attachments.
We'll set the image attachment options to a max attachment size of 15MB and max attachment dimensions of 1600x1600px.
We're planning to optimise all attachments with the Nobita Image Optimiser, with jpegoptim set to 90.
We will send all attachments up to Amazon S3 (not sure exactly which method yet) and have this set to a specific subdomain like pictures.mydomain.com
Then in Cloudflare Pro we'll set caching on for this subdomain including the Polish and Mirror settings.

I think this gives us what we need without having to pay for 3rd-party solutions, which would get expensive with our hundreds of thousands of images and the likely TB or so of monthly data transfer.
Are there any tricks I've missed?

thanks,
James
 