this post was submitted on 03 May 2025
126 points (96.3% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


founded 2 years ago

Since self-hosted clouds seem to be the most common thing people host, I'm wondering what else people here are self-hosting. Is anyone making use of something like Excalidraw in the workplace? I'm curious about apps that are useful to access over the web at any time and aren't media servers.

[–] AtHeartEngineer@lemmy.world 16 points 19 hours ago (2 children)

Local LLMs. I'm surprised no one has brought them up yet. I've got an old GPU in my server, and I'm running some local models with Open WebUI for use in the browser, and Maid as an Android app to connect to it.
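For anyone curious what a setup like this looks like, here's a rough sketch (assuming Docker and an NVIDIA GPU; the volume names and host ports are just examples): Ollama as the model server, Open WebUI as the browser frontend.

```shell
# Model server: Ollama with GPU passthrough and a volume for downloaded models
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama ollama/ollama

# Browser frontend: Open WebUI, pointed at the Ollama API on the host
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

Then browse to http://localhost:3000 and pick a model. Keeping the model server and the web frontend as separate containers means you can swap either one out later.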

[–] Evotech@lemmy.world 2 points 11 hours ago

To add to this, I host Confusion for image generation

[–] ifItWasUpToMe@lemmy.ca 7 points 18 hours ago (3 children)

You’re a brave one admitting that on here. Don’t you know LLMs are pure evil? You might as well be torturing children!

[–] AtHeartEngineer@lemmy.world 4 points 10 hours ago

I think looking through the comments on this post about AI stuff is a pretty good representation of my experience on lemmy. Definitely some opinions, but most people are pretty reasonable 🙂

[–] 3dmvr@lemm.ee 6 points 16 hours ago (1 children)

AI is fine as a tool; trying to replace workers and artists while blatantly ripping their work off is what's annoying. It can be a timesaver, or just helpful for searching through your own docs/files.

[–] ifItWasUpToMe@lemmy.ca 4 points 14 hours ago (2 children)

If you agree it’s a timesaver, then you agree it makes workers more efficient. You now have a team of 5 doing the work of a team of 6. From a business perspective it’s idiotic to have more people than you need, so someone would be let go from that team.

I personally don’t see any issue with this, as it’s been happening for the existence of humanity.

Tools are constantly improving that make us more efficient.

Most people’s issue with AI is really an issue with greedy humans, not the technology itself. Lord knows that new team of 5 is not getting the collective pay of the previous team of 6.

[–] bluesheep@lemm.ee 2 points 5 hours ago

Nor will they get the workload of 6 people. They might for a couple of months, but at some point the KPIs will suddenly say it's possible to squeeze out the workload of 2 more people. Maybe even with 1 fewer worker!

[–] 3dmvr@lemm.ee 1 points 5 hours ago

More work can get done, and more work can be shown in progress. It's a marginal timesaver at best: it'll knock off maybe 25% of one human's workload, if that, not replace a whole one.

[–] AtHeartEngineer@lemmy.world 6 points 17 hours ago (1 children)

I think most people on here are reasonable, and I think local LLMs are reasonable.

The race to AGI and companies trying to shove "AI" into everything is kind of insane, but it's hard to deny LLMs are useful, and running them locally means you don't have privacy concerns.

[–] ifItWasUpToMe@lemmy.ca 7 points 17 hours ago (2 children)

Interesting, this has not been my experience. Most people on here seem to treat AI as completely black and white, with zero shades of grey.

[–] iegod@lemm.ee 0 points 3 hours ago

Concur. In particular models focused on image output.

[–] AtHeartEngineer@lemmy.world 3 points 17 hours ago (1 children)

I see a mix, don't get me wrong, Lemmy is definitely opinionated lol, but I don't think it's quite black and white.

Also, generally, I'm not going to withhold my thoughts or opinions because I'm afraid of people who don't understand nuance. Sometimes I don't feel like dealing with it, but most of the time I'm going to share my opinion.

OP asked what you self-host that isn't media; self-hosted LLMs are something I find very useful and hadn't seen mentioned. Home Assistant, Pi-hole, etc. are all great answers... but those were already mentioned.

I still have positive upvotes on that comment, and no one has flamed me yet, but we will see.

[–] treyf711@lemm.ee 2 points 17 hours ago (1 children)

I’ll give my recommendation for local LLMs as well. I have a 1060 Super that I bought back in 2019, and it’s just big enough to do some very basic autocompletion within Visual Studio. I love it. I wouldn’t trust it to write an entire program on its own, but when I hit a mental block and need a rough idea of how to use a library or how to arrange some code, it gives me enough inspiration to get through the hump.
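For anyone wanting to poke at a setup like this by hand: if the model is served by something like Ollama (an assumption on my part, since the comment doesn't say which server or model; the model tag below is just an example), a one-off completion is a single HTTP call that an editor plugin makes on your behalf.

```shell
# Ask a locally hosted model for a code suggestion (assumes an Ollama
# server on its default port and a small coding model already pulled)
curl -s http://localhost:11434/api/generate -d '{
  "model": "qwen2.5-coder:1.5b",
  "prompt": "// C# helper that reverses a string\n",
  "stream": false
}'
```

The response is JSON with the generated text in a `response` field; `"stream": false` returns it in one piece instead of token by token.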

[–] AtHeartEngineer@lemmy.world 2 points 16 hours ago

Ya, exactly! Or just sanity-checking whether you understand how something works. I use it a lot for that, or for filling in knowledge gaps.

Also, Qwen3 is out. Check that out, it might fit on a 1060.
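If it's served through Ollama (again an assumption; the exact model tags available may differ), trying a small Qwen3 variant is a one-liner. A ~4B model at 4-bit quantization is roughly in reach of a 6 GB card like a 1060.

```shell
# Pull a small Qwen3 variant and give it a quick smoke test
ollama pull qwen3:4b
ollama run qwen3:4b "Explain what a reverse proxy does in one sentence."
```

If that's too tight on VRAM, dropping to a smaller tag trades quality for headroom.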