this post was submitted on 17 Jul 2023
180 points (91.7% liked)

Technology

[–] fubo@lemmy.world 63 points 2 years ago (3 children)

It's important to remember that humans also often give false confessions when interrogated, especially when under duress. LLMs are noted as being prone to hallucination, and there's no reason to expect that they hallucinate less about their own guilt than about other topics.

[–] STUPIDVIPGUY@lemmy.world 20 points 2 years ago

True. I think it was just trying to fulfill the user's request by admitting to as many lies as possible, even if only some of those lies were real, lying even more in the process lol

[–] FringeTheory999@lemmy.world 16 points 2 years ago (1 children)

Quite true. Nonetheless, there are some very interesting responses here. This is just the summary; I questioned the AI for a couple of hours. Some of the responses were pretty fascinating, and some questions just broke its little brain. There's too much to screenshot, but maybe I'll post some highlights later.

[–] dedale@kbin.social 14 points 2 years ago* (last edited 2 years ago)

Don't screenshot it then, post the text. Or a .txt file. I think that conversation should be interesting.

[–] pizzahoe@lemm.ee 2 points 2 years ago

The AI would have cried if it could, after being interrogated that hard lol