CallMeButtLove@lemmy.world to Selfhosted@lemmy.world • Self-Hosted AI is pretty darn cool
I really hate when companies do that kind of crap. I just imagine a little toddler stomping around going “No! No! Nooo!”
Is there a way to host an LLM in a docker container on my home server but still leverage the GPU on my main PC?
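Generally Docker on one machine can't reach out to a GPU sitting in a different machine, so the usual pattern is the other way around: run the model server in a container on the GPU PC and have the home server talk to its HTTP API over the LAN. A minimal sketch of that approach (assuming an NVIDIA GPU with nvidia-container-toolkit installed on the main PC, with Ollama used purely as an example LLM server; the LAN IP is made up):

```
# On the main PC (the one with the GPU): run an example LLM server
# with GPU access and expose its API on the LAN
docker run -d --gpus=all -p 11434:11434 \
  -v ollama:/root/.ollama --name ollama ollama/ollama

# From the home server (or any container on it), call the API over
# the network; 192.168.1.20 is a hypothetical address for the main PC
curl http://192.168.1.20:11434/api/generate \
  -d '{"model": "llama3", "prompt": "hello"}'
```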
Thank you for that answer! That makes sense.
Lurking beginner here, why is this bad?
Just looked this up and subscribed to the channel.
Is that actually true, or is that just their legal team playing it overly safe? Because if it is true, that’s incredibly stupid.
Wait, I’m only making my way through all the Trek series for the first time. Do we not like Ro Laren? I like Ro Laren. Currently two-thirds of the way through TNG season 5.
What is it about Ricks?
I have a similar setup, except I use pfSense as my router and Pi-hole for DNS, but I’m sure you can get the same results with your setup. I’m running HAProxy as my reverse proxy, with a config entry for each of my Docker containers, so any traffic on 443 or 80 gets forwarded to the container’s IP on whatever unique port it uses. I then have DNS entries for each URL I want to reach a container by, all of them just pointing at HAProxy. Works like a charm.
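As a rough illustration of that routing (a sketch only; the hostnames, container IPs, and ports below are hypothetical placeholders, not from my actual config), the HAProxy side can look something like this:

```
frontend http_in
    bind *:80
    # Match on the Host header that each DNS entry resolves to
    acl is_jellyfin  hdr(host) -i jellyfin.home.example
    acl is_nextcloud hdr(host) -i nextcloud.home.example
    use_backend jellyfin_back  if is_jellyfin
    use_backend nextcloud_back if is_nextcloud

backend jellyfin_back
    # Container IP and its unique port (made-up values)
    server jellyfin 192.168.1.50:8096 check

backend nextcloud_back
    server nextcloud 192.168.1.51:8080 check
```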
I have HAProxy running on the Pi-hole itself, but there’s no reason you couldn’t just run it in its own container. pfSense also lets you install an HAProxy package to handle it on the router itself. I don’t know if OPNsense supports packages like that, though.
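If you did go the container route, a minimal sketch using the official haproxy image (the config path is that image’s documented convention; the rest is illustrative):

```yaml
services:
  haproxy:
    image: haproxy:lts
    ports:
      - "80:80"
      - "443:443"
    volumes:
      # The official image reads its config from this path
      - ./haproxy.cfg:/usr/local/etc/haproxy/haproxy.cfg:ro
    restart: unless-stopped
```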
You can even get fancy and do SSL offloading to access everything over HTTPS.
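For the SSL offloading piece, a sketch of one common HAProxy pattern (the cert path and backend name are hypothetical; HAProxy expects the certificate and key concatenated into a single PEM file):

```
frontend www
    bind *:80
    # Terminate TLS here; backends keep talking plain HTTP internally
    bind *:443 ssl crt /etc/haproxy/certs/home.pem
    # Bounce anything that arrived without TLS over to HTTPS
    http-request redirect scheme https unless { ssl_fc }
    default_backend jellyfin_back
```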