[–] Kitathalla@lemy.lol 1 points 3 months ago

> A computer wouldn’t see why they would break a fictional arm so a parental figure could give them a hand

No computer 'sees' any of the logic behind the language. It's all fancy tables and probabilities worked out beforehand from the training data. That's why there was the famous example of an LLM telling a student to off themselves while they were getting homework help. If the training data includes anything like Reddit, some LLM may respond (in)appropriately to those ideas.
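
To make the "tables and probabilities" point concrete, here is a minimal toy sketch in plain Python (my own illustration, not how any real LLM is implemented; real models learn neural-network weights rather than literal lookup tables). It builds a word-to-next-word frequency table from a tiny made-up corpus and then generates text by sampling from that table. Nothing in it knows what "give them a hand" means; it only reproduces whichever continuations happened to be common in its training text.

```python
import random
from collections import defaultdict, Counter

# Tiny made-up "training data". The phrase "give them a hand" appears with two
# different continuations, so the model can only pick between them statistically.
corpus = ("give them a hand means help them out . "
          "give them a hand means applaud them loudly .").split()

# Build the table: for each word, count which words followed it.
table = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    table[prev][nxt] += 1

def next_word(prev):
    # Sample the next word purely from observed frequencies -- no meaning involved.
    counts = table[prev]
    words = list(counts)
    weights = [counts[w] for w in words]
    return random.choices(words, weights=weights)[0]

# Generate a short continuation after "hand".
word = "hand"
output = [word]
for _ in range(4):
    word = next_word(word)
    output.append(word)
print(" ".join(output))  # e.g. "hand means applaud them loudly"
```

A real LLM swaps the literal table for billions of learned parameters, but the generation step is still "pick a statistically likely next token", which is exactly why whatever was in the training data (Reddit included) leaks straight into the output.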