

Well, from what I understand, admins have a couple of config keys: PF_OPTIMIZE_IMAGES toggles the entire optimization pipeline (or accepts supported formats as-is), and IMAGE_QUALITY is an integer percentage that tweaks the lossy compression for formats that support it.
The resize to 1080 pixels is even hardcoded in the optimization pipeline. I think I saw a toggle for it on the PHP side, but it seems admins are only offered the toggle for storage optimization as a whole. The 1080 is sadly not currently exposed as a settable parameter.
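For reference, these are the two knobs I mean, as they would appear in Pixelfed's .env (values shown are illustrative, not a recommendation):

```ini
# Toggle the whole image optimization pipeline on or off
PF_OPTIMIZE_IMAGES=true

# Lossy encode quality (integer percentage) for formats that support it
IMAGE_QUALITY=80
```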
As a creator, I was interested in retaining the maximum possible quality. PNG is often supported and by design only features lossless compression, while staying well under 15MB for files with common image aspect ratios, so that was the winner in that regard. My uncropped 24MP images then come out at around 3MB.
Other formats tend to be way smaller in file size because lossy compression is so effective, and most images I checked on Pixelfed are resized and optimized JPEGs well under 1MB (around 600-800KB). That is probably the file format and size you’ll encounter most.
My own file size comparisons were for RAW exports from Darktable across different file formats, qualities and resolutions. The PHP image pipeline used by Pixelfed will probably yield comparable results for the same image.
If I were to advocate new settings, I’d crank up the resolution to more modern standards (say, fitting a 4K monitor) and convert to WebP at around 85% quality (or stick with 80%).
It’s difficult, though, as that may introduce double-lossy pipelines when converting from other lossy formats. That’s why I looked into resolution settings first. If you upload an image that is too large, the pipeline currently decodes your (possibly lossy) image, resizes it (which is itself lossy), and re-encodes it at the configured lossy quality where applicable.
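The resize step can be sketched in a few lines. This is just my mental model in Python, not Pixelfed's actual PHP code; the 1080 cap mirrors the hardcoded value mentioned earlier:

```python
def fit_within(width: int, height: int, max_side: int = 1080) -> tuple[int, int]:
    """Scale dimensions so the longest side is at most max_side, keeping aspect ratio."""
    longest = max(width, height)
    if longest <= max_side:
        return width, height  # small enough: no resample, no extra loss
    scale = max_side / longest
    return round(width * scale), round(height * scale)

# A 24MP 3:2 upload (6000x4000) gets resampled down...
print(fit_within(6000, 4000))  # → (1080, 720)
# ...and then re-encoded at IMAGE_QUALITY, so a lossy source gets encoded lossily twice.
print(fit_within(1000, 800))   # → (1000, 800): already fits, passed through unresized
```

The resampling itself discards detail regardless of output format, which is why the output resolution matters at least as much as the quality percentage.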
Thus, the first order of business: at least publish the ideal image dimensions.
Second, better quality control. That might involve settings per file format, or a single unified output format.
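Hypothetically, such settings could take a shape like this in .env (none of these keys exist in Pixelfed today; they are purely illustrative):

```ini
# Hypothetical keys, not part of Pixelfed today:
IMAGE_MAX_SIDE=2160        # resize target, e.g. fit a 4K monitor
IMAGE_OUTPUT_FORMAT=webp   # unified output format
IMAGE_QUALITY_WEBP=85      # per-format lossy quality
IMAGE_QUALITY_JPEG=80
```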
And then you have a trained model that requires vast amounts of energy per request, right? It doesn’t stop at training.
You need obscene amounts of GPU power to run the ‘better’ models within reasonable response times.
In comparison, I could game on my modest rig just fine, but I can’t run a 22B model locally in any useful capacity while programming.
Sure, you could argue gaming is a waste of energy, but that doesn’t mean we can’t argue that asking AI how long to boil a single egg shouldn’t cost the energy of boiling a shitload of them. Or each time I start typing a line of code, for that matter.