

I use the GrapheneOS PIN + fingerprint combo, with a longer password as a fallback if that fails or the device is BFU (before first unlock).
No. Lubuntu is designed to use very few resources, which makes it faster on slow hardware where the OS itself is a large share of the load. If you have fast hardware, regular Ubuntu might use (making this up, but the point generally stands) 2% CPU and 3 GB of RAM while Lubuntu uses 1% CPU and 2 GB. That's a much bigger boost if you have a weak CPU and only 4 GB of RAM, but you likely wouldn't notice a difference on fast hardware.
Edit: spelling
Ollama can pull info from the web using multiple sites, but yes, local AIs are more prone to hallucination. Google did release Gemma 3, whose 27B model is probably the most cost-effective way to get into local models that rival ChatGPT (if you can call roughly $2k cost effective). That's also why I recommended duck.ai: it has access to GPT and llama3.3:70b, which will do a lot better.
You could check out localllama on Lemmy to learn about running FOSS AI models locally, or check out duck.ai as someone else mentioned. Your mental health should come first, so do what you can for privacy, but don't feel bad about making compromises.
I don’t think so. I think they sort of have to branch off as Lemmy gains users.