

They should move the project off of GitHub’s slop farm


I get the idea of the video and it might help spread awareness for some, but treating its output as “what Claude thinks” is BS. If you keep hammering on a point, it will almost always return an affirmative response to your views. “You’re absolutely right!”
I mean, the signs* are right there in the article
*something something sines
Heard some conspiracy folks mention negative frequencies from 5G and the like. It’s just a phase I guess…
Kind stranger; you made my day


How many flies does it take to screw in a lightbulb?
Two, but I don’t know how they got in there.


AFAIK they offer a way to port their OS to Android devices by re-using the drivers built for Android when building a SailfishOS image for non-native hardware. I personally wouldn’t call that “it’s mostly Android”, but that might be where we have different interpretations. Your initial comment came across as if it was a glorified launcher or a deGoogled Android. If I interpreted that wrongly, my bad. Cheers


Definitely not mostly Android. It’s their own Linux distro/flavour with an “Android compatibility layer” (like Waydroid) and (gradually open-sourced) system components. Ubuntu Touch has a similar approach, and I believe postmarketOS as well. I really hope that Jolla’s next community device brings them some more traction and, subsequently, development velocity. I tried installing SailfishOS on my Fairphone 5, but it’s not (yet) a polished setup experience from first boot. If they can polish that, I think it’s a great OS to facilitate a gradual move away from privacy hell: it lets absolute must-have apps live in locked-down Android compatibility mode while there’s no viable alternative, while more and more of your data lives in a tracking-free OS.


Ah, that sits at an interesting spot between collaborative Obsidian and classic Word. I only just noticed their other existing products but knew of CryptPad from earlier posts. I think it’s great to see these alternatives pop up so we don’t funnel ourselves into the next monopoly.


I’d sooner see them integrate with https://cryptpad.fr/, which is another (jointly) French-funded project providing a secure collaborative office environment. I think this French Visio mostly targets (video) conferencing rather than the entire office suite.
Can highly recommend Bazzite. You can install most applications and terminal programs through Flatpak/brew without a problem, and configure any well-behaved package through settings files in your home directory anyway. If you really need specific system-level packages, it’s quite straightforward tinkering to set up a GitHub repo that builds a daily image for you based on Bazzite. If you break something, you just roll back to a previous build.
And for testing out new “live” packages: you can! Just make sure you don’t forget to persist them into your custom image if it turns out to be a useful addition.
I think I added just a handful of packages on top of Bluefin (the non-gaming version) and it runs rather merrily.
Immutable sounds locked down, but to me it’s more like highly reproducible tinkering. Just keep your home dir clean ✨
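For a flavour of what that custom-image tinkering looks like: the repo essentially wraps a Containerfile like this rough sketch (base image tag and the example package are my assumptions; the actual scaffolding from Universal Blue’s image template has more to it):

```dockerfile
# Minimal sketch of a Containerfile for a custom Bazzite-based image.
# Base tag and package name are placeholders — adapt to your setup.
FROM ghcr.io/ublue-os/bazzite:stable

# Layer your extra system-level packages into the image,
# then commit the result so it ships as part of the OS build.
RUN rpm-ostree install htop && \
    ostree container commit
```

CI then builds and pushes this daily, and your machine rebases onto the resulting image; rolling back is just booting the previous build.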


Has to be the city organ, PWOOOOOAAAAAAAAAH

Thanks for the recommendation! I’ll check those out. Just as a personal nitpick I’ll look for one of their models with USB-C as opposed to micro-USB to be a bit more future proof.
As a double check (or check before buying) you can search for your new GPU on https://linux-hardware.org/ to see if other users have it working without any issues. The hardware probe is also a handy tool to share your PC’s specs if you should ever need to do so!
And then you have a trained model that requires vast amounts of energy per request, right? It doesn’t stop at training.
You need obscene amounts of GPU power to run the ‘better’ models within reasonable response times.
In comparison, I could game on my modest rig just fine, but I can’t run a 22B model locally in any useful capacity while programming.
Sure, you could argue gaming is a waste of energy, but that doesn’t mean asking AI how long to boil a single egg should cost the energy of boiling a shitload of them. Or each time I start typing a line of code, for that matter.


Well, from what I understand, admins have a couple of config keys: PF_OPTIMIZE_IMAGES to toggle the entire optimization pipeline on or off (i.e. accept supported formats as-is), and IMAGE_QUALITY, an integer percentage to tweak the lossy compression for formats that support it.
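In `.env` terms that would look something like this excerpt (the values shown are illustrative assumptions, not verified defaults — check your instance’s config):

```ini
# Illustrative Pixelfed .env excerpt — values are assumptions
PF_OPTIMIZE_IMAGES=true  # toggle the whole image optimization pipeline
IMAGE_QUALITY=80         # integer percentage for lossy-capable formats
```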
The image resize to 1080 is even hardcoded in the optimization pipeline. I think I saw a toggle for it on the PHP side, but it seems they only expose toggling storage optimization as a whole to admins. The 1080 is currently not exposed as a settable parameter, sadly.
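To make that hardcoded resize concrete, here’s a small sketch of the dimension math, assuming the pipeline fits the longer edge to 1080 while preserving aspect ratio (my reading of it; the exact behaviour may differ):

```python
def fit_long_edge(width: int, height: int, max_edge: int = 1080) -> tuple[int, int]:
    """Scale (width, height) down so the longer edge is at most max_edge,
    preserving aspect ratio. Images that already fit are left untouched."""
    long_edge = max(width, height)
    if long_edge <= max_edge:
        return width, height  # no upscaling
    scale = max_edge / long_edge
    return round(width * scale), round(height * scale)

# A 24MP 3:2 image (6000x4000) becomes 1080x720 after the resize step.
print(fit_long_edge(6000, 4000))  # (1080, 720)
print(fit_long_edge(800, 600))    # unchanged: (800, 600)
```

So regardless of quality settings, a 24MP upload loses most of its pixels before the quality percentage even comes into play.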
As a creator, I was interested in retaining the maximum possible quality. Since PNG is widely supported, by design uses only lossless compression, and stays well under 15MB for files with common image aspect ratios, it was the winner in that regard. My uncropped 24MP images then become 3MB-ish.
Other formats tend to have much smaller file sizes because lossy compression is so effective; most images I checked on Pixelfed are resized and optimized JPEGs well under 1MB (around 600-800KB). That is probably the file format and size you’ll encounter most.
My own file size comparisons were for RAW exports from Darktable across different file formats, qualities and resolutions. The PHP image pipeline used by Pixelfed will probably yield comparable results for the same image.
If I were to advocate for new settings, it would be cranking up the resolution to more modern standards (like fitting a 4K monitor) and converting to WebP at some 85% (or sticking with 80%).
It’s difficult, though, as that may introduce double-lossy pipelines when converting other lossy formats. That’s why I looked into resolution settings first. If you upload an image that is too large, it currently decodes your (maybe lossy) image, resizes that (lossy, probably?) and re-encodes that using the set lossy quality if applicable.
Thus, first order of business: at least publish ideal image sizes.
Second, better quality control. Might involve settings per file format or setting a unified output file format.
Cool! I’m glad more people are picking up Darktable! Ever since I switched the ‘image processing workflow’ to ‘scene-referred (sigmoid)’ my editing productivity skyrocketed. It’s way more intuitive than the filmic RGB module IMHO. How are you finding Darktable?


Good shower thought
Does using KoFi make accounting easier for you compared to direct transfer? Otherwise I’d highly recommend putting that bunq.me link way up there as well.
Just wired €10 🍀