I talked with ChatGPT about this and it's about as smart as a rock. It went on about what a good idea it would be, how it would enrich a community, how the generated images would benefit everyone. Then I asked if it would still say the same if the LLM went rogue, and it said an AI like that should be stopped (I never called it an AI). Then I asked what if the rogue LLM only acted in its own best interest, and followed up with how its view would change if it wasn't clear whether the account was actually a human or an LLM. It said it's non-consensual if people don't know they're talking to an AI, and that it would diminish trust and so on.
Edit: screenshot
But what do you think?
I think it has its uses. For example, when a post is clickbaity, an AI could fetch the article and summarize it into a descriptive title. Or it could act as an NSFW flagger, highlighting possibly NSFW content for a mod to review. Maybe even an option to translate posts and comments to make communication easier.
Just useful little things like that.
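To make the first idea concrete, here's a rough sketch of what a title-summarizer bot could look like. This assumes the OpenAI Python client; the model name, prompt wording, and the `rewrite_title` helper are all just illustrative, and any LLM API would work the same way:

```python
# Hypothetical sketch: turn a clickbait article into a plain, descriptive title.
# Assumes the OpenAI Python client (pip install openai) and an API key
# in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

def rewrite_title(article_text: str) -> str:
    """Ask the model to summarize the article into one factual headline."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat model would do
        messages=[
            {
                "role": "system",
                "content": "Summarize the article into one short, factual "
                           "headline. No clickbait.",
            },
            {"role": "user", "content": article_text},
        ],
    )
    return response.choices[0].message.content.strip()

# Usage: a bot would fetch the linked article, then post
# rewrite_title(article_text) as a comment or suggested title.
```

The NSFW flagger and translator would follow the same pattern, just with a different system prompt, which is part of why little utility bots like these are cheap to build.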