Coincidentally, I have just been trying this using the llama.cpp server and two machines on my local LAN.
I made a post about it in https://sh.itjust.works/c/localllama. I’m brand new to Lemmy (literal hours), so I’ll probably do this all wrong, but this should be a link to my post: https://sh.itjust.works/post/39137051. I’m a little confused about posting links in this federated system, but I hope that works. The upshot is that I got it working fine across two machines, and it was easy to set up, but it has a few minor (to me) drawbacks.
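For anyone curious, the rough shape of the setup is below. This is a sketch assuming llama.cpp’s RPC backend (built with `-DGGML_RPC=ON`); exact binary names and flags can vary between llama.cpp versions, and the IP address is just a placeholder for your worker’s LAN address:

```shell
# On the second (worker) machine:
# expose its compute to other hosts on the LAN via llama.cpp's rpc-server
./rpc-server --host 0.0.0.0 --port 50052

# On the main machine:
# start llama-server and point it at the worker's RPC endpoint
# (192.168.1.42 is a placeholder for the worker's address)
./llama-server -m model.gguf --rpc 192.168.1.42:50052
```

The model layers then get split across both machines, which is what made the two-box setup work for me.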