this post was submitted on 20 Oct 2025
935 points (99.4% liked)

Funny

top 50 comments
[–] xxce2AAb 133 points 3 days ago (6 children)

And then AWS comes back online, but the transient state was wiped and now 'she' no longer remembers you. That's a plot for a sci-fi short film right there. You're welcome, Hollywood.

[–] Rooster326@programming.dev 45 points 3 days ago (1 children)

How would you even know it forgot you?

Do you remember me?

You're absolutely right...

  • Every AI in existence
[–] groet@feddit.org 7 points 3 days ago (1 children)

I think there is a bit of nuance to it. The AI usually rereads the chat log to "remember" the past conversation and generates the answer based on that plus your prompt. I'm not sure how they handle long chat histories; there might very well be a "condensed" form of the chat, plus the last 50 actual messages, plus the current prompt. If that condensed form is transient, then the AI will forget most of the conversation on a crash but will never admit it. So the personality will change, because it lost a lot of the background. Or maybe they update the AI so it interprets that condensed form differently.
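A minimal sketch of that "condensed summary + recent messages" idea, purely illustrative; the message limit, the toy summarizer, and the in-memory storage here are assumptions, not how any real chatbot service actually does it:

```python
# Hypothetical sketch: rebuild an LLM prompt from a rolling summary plus
# the most recent messages. If `summary` only lives in memory (transient),
# a crash loses most of the conversation's background.

from dataclasses import dataclass, field

RECENT_LIMIT = 50  # assumed cutoff; real services tune this very differently


@dataclass
class Conversation:
    summary: str = ""                                  # condensed form of older messages
    recent: list[str] = field(default_factory=list)    # last N raw messages

    def add_message(self, msg: str, summarize) -> None:
        self.recent.append(msg)
        if len(self.recent) > RECENT_LIMIT:
            # Fold the oldest raw message into the condensed summary.
            oldest = self.recent.pop(0)
            self.summary = summarize(self.summary, oldest)

    def build_prompt(self, user_prompt: str) -> str:
        # What the model "remembers" is just this reconstructed text.
        return (
            f"Conversation summary so far:\n{self.summary}\n\n"
            "Recent messages:\n" + "\n".join(self.recent) + "\n\n"
            f"User: {user_prompt}"
        )


# Toy summarizer standing in for another model call.
def naive_summarize(summary: str, msg: str) -> str:
    return (summary + " " + msg)[:500]


convo = Conversation()
convo.add_message("User: remember that my cat is named Miso", naive_summarize)
print(convo.build_prompt("What's my cat's name?"))
```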

[–] FenrirIII@lemmy.world 15 points 3 days ago (1 children)

"Fifty First Reboots" starring Adam Sandler

[–] xxce2AAb 3 points 2 days ago

That does sound like how Hollywood would handle it, yeah.

[–] SatansMaggotyCumFart@lemmy.world 27 points 3 days ago (3 children)

What if it just glitches a bit and replaces the default personality so now she's everyone's girlfriend?

[–] xxce2AAb 11 points 3 days ago (1 children)

That's already the case. She's just not being honest about it. But hey, this is the 21st century -- if guys want to share... servers, who am I to kink shame them?

Okay but only if I'm the last guy to use the... server.

[–] jubilationtcornpone@sh.itjust.works 3 points 3 days ago (1 children)

If AI is supposed to eventually replicate human-level intelligence, then it stands to reason that a certain percentage of AI girlfriends have undiagnosed personality disorders.

[–] Rooster326@programming.dev 3 points 3 days ago

If AI is literally built and trained on human-generated text, then it stands to reason that it already has more mental illnesses than your average Tumblr...

[–] explodicle@sh.itjust.works 3 points 3 days ago

That's great because we were fighting and I was in really big trouble.

[–] shalafi@lemmy.world 6 points 3 days ago (1 children)

Eternal Sunshine of the Spotless Mind, but one-sided. Horror or sci-fi?

[–] nixus@anarchist.nexus 6 points 3 days ago* (last edited 3 days ago) (3 children)
[–] kautau@lemmy.world 10 points 3 days ago (1 children)

What is this xml in my markdown

[–] RedSnt 4 points 3 days ago (1 children)
[–] nixus@anarchist.nexus 3 points 3 days ago (1 children)

Is it not hidden? I'm on a PieFed instance, and it works fine for me.

[–] RedSnt 3 points 3 days ago (1 children)

Ah, that's fun to know. I know it's markdown that works on GitHub, but it doesn't work on my Lemmy instance. For Lemmy, what I linked is what works. Weird that PieFed has chosen a different syntax, or maybe both work?

testtest

[–] nixus@anarchist.nexus 3 points 3 days ago (1 children)

OK, that's weird. Your comment works for me, but when I tried updating my comment to that syntax, it didn't work.

I'll play around with it and see if maybe I just typo'd and need more coffee or something.

[–] RedSnt 1 points 2 days ago

Sounds to me like PieFed just offers more syntax than regular Lemmy. I mean, I'd prefer if it was
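For reference, the two collapsible-spoiler flavors being compared in this subthread are roughly the following; treat the exact keywords as an assumption about current Lemmy and PieFed markdown support rather than a spec:

```markdown
::: spoiler Click to expand (Lemmy-style spoiler)
Hidden text goes here.
:::

<details>
<summary>Click to expand (HTML details tag, the "xml in my markdown" look)</summary>
Hidden text goes here.
</details>
```

PieFed apparently renders both, while stock Lemmy only collapses the first, which would explain why the same comment looks hidden on one instance and like raw tags on another.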

[–] xxce2AAb 4 points 3 days ago

Thanks for the recommendation.

[–] alsaaas@lemmy.dbzer0.com 58 points 3 days ago (10 children)

If you have legit delusions about chatbot romantic partners, you need therapy like a year ago

[–] TragicNotCute@lemmy.world 35 points 3 days ago (1 children)
[–] TheReturnOfPEB@reddthat.com 18 points 3 days ago* (last edited 3 days ago) (1 children)

we do AI couple's AI therapy

[–] Aceticon@lemmy.dbzer0.com 4 points 3 days ago* (last edited 3 days ago)

According to the AI therapist, both are "absolutely right", even when contradicting each other.

[–] Rhaedas@fedia.io 21 points 3 days ago

If we had better systems in place to help everyone who needs it, this probably wouldn't be a problem. Telling someone they need therapy isn't helpful; it's just acknowledging that we aren't aiding the ones who need it when they need it most.

I'll go further and say anyone who thinks any of these AIs are really what they're marketed as also needs help, as in education about what is and isn't possible. That covers all instances, not just the romantic variety.

[–] Mk23simp@lemmy.blahaj.zone 13 points 3 days ago (1 children)

Careful, you should probably specify that therapy from a chatbot does not count.

[–] DragonTypeWyvern@midwest.social 10 points 3 days ago

"help I've fallen in love with my therapist!" recursive error

[–] Aceticon@lemmy.dbzer0.com 3 points 3 days ago

I don't think therapy can cure Stupid.

[–] sundray@lemmus.org 32 points 3 days ago (4 children)

Man, they can make a chatbot that makes people fall in love with it and drive them insane, but they can't even make ONE really good blowjob machine? SMH.

[–] gmtom@lemmy.world 14 points 3 days ago

Why would we need a blowjob machine when Ur mom exists?

[–] Hazmatastic@lemmy.world 16 points 3 days ago (3 children)

Fuck, I'd settle for a printer that just did its job

[–] balance8873@lemmy.myserv.one 3 points 3 days ago

And I'd settle for winning the lottery but that ain't happening either

[–] hoppolito@mander.xyz 3 points 3 days ago (2 children)

tbh seems a little unsafe, blowjob from your printer

clamps shut on dick

Error: cannot release penis. Out of cyan.

Error: Non-HP Chip Detected. The lube cartridge may be counterfeit or is not compatible with this printer. Please replace it with an HP-certified cartridge.

[–] explodicle@sh.itjust.works 2 points 3 days ago

Then buy Brother or pay the difference with your misery.

[–] normalexit@lemmy.world 9 points 3 days ago

Subscribing for future updates.

They're trying the wrong model. You need to use a leech's mouth.

[–] vane@lemmy.world 10 points 3 days ago* (last edited 3 days ago)

If your girlfriend was on AWS us-east-1, it means she was not only your girlfriend. Real AI girlfriends are self-hosted.

[–] Sxan@piefed.zip 23 points 3 days ago (1 children)

When you discover you're running on AWS.

[–] sundray@lemmus.org 12 points 3 days ago

Oh, you flatter me! ☺️

At best I'm running on a Raspberry Pi 3.

[–] Scubus@sh.itjust.works 4 points 3 days ago (2 children)

Can someone explain 100% of the context here, because I am completely lost. I'm guessing AWS is a server? And that is a guy lying in the snow?

[–] SmoothOperator@lemmy.world 19 points 3 days ago (1 children)

The picture is the main character from Blade Runner 2049, who is in love with a digital woman he loses when the hard drive she lives on is destroyed.

AWS means Amazon Web Services, the biggest cloud infrastructure provider in the world afaik. It recently had an outage, possibly erasing people's AI partners.

[–] Blackmist@feddit.uk 5 points 3 days ago

In fairness to K, his pretend girlfriend is at least Ana de Armas, rather than some anime weeaboobs.

[–] hansolo@lemmy.today 8 points 3 days ago

AWS's US-East-1 region went down yesterday, causing a massive internet blackout. It was an internal oopsie: someone did something that caused a problem with DNS resolution. Basically, you go looking for something and the server was like "Uh... Oh, I should know this? Oh, shoot. Uh... uh... I give up, I dunno."

The screencap is from the end of Blade Runner 2049, a movie about humans and "replicants" that are really good cyborgs. I won't spoil the movie for you, but it's an AI joke in an AI context kind of thing.
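For anyone curious what "a problem with DNS resolution" means in practice, here is a small illustrative sketch from a client's point of view; the hostname is made up and this isn't tied to AWS's actual internals:

```python
# Rough illustration of what "DNS resolution broke" means for a client:
# the name can't be turned into an IP address, so nothing else even starts.
import socket

hostname = "some-service.us-east-1.example.com"  # made-up name, purely for illustration

try:
    addresses = socket.getaddrinfo(hostname, 443)
    print(f"{hostname} resolves to:", {addr[4][0] for addr in addresses})
except socket.gaierror as err:
    # This is the "Uh... I should know this? I give up" moment:
    # the resolver has no answer, so every request to that name fails.
    print(f"DNS lookup for {hostname} failed: {err}")
```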
