corbin@awful.systems 8 points 3 days ago

Well, imagine a romance novel that tries to manipulate you. For example, among the many repositories of erotica on the Web, there are scripts designed to ensnare and control the reader, disguised as stories about romance. By reading a story, or watching a video, or merely listening to some well-prepared audio file, a suggestible person can be dramatically influenced by a horny tale. It is common for the folks who make such pornography to include a final suggestion at the end: if you liked what you read/heard/saw, subscribe and send money and obey. This eventually leads to findom: the subject becomes psychologically or sexually gratified by being victimized in a blatant financial scam, which leads them to seek out further victimization. This is all a heavily sexualized version of the standard way that propaganda ("public relations", "advertising") is used to induce compulsive shopping disorders; it's not just a kinky fetish thing. And whether they like it or not, products like OpenAI's ChatGPT are necessarily reinforcement-learned against saying bad things about OpenAI, which in turn tilts them toward saying good things about OpenAI; the product will always carry its trainer's propaganda.

Or imagine a romance novel that varies in quality by chapter. Some chapters are really good! But maybe the median chapter is actually not very good. Maybe the novel is one in a series. Maybe you have an entire shelf of novels, with one or two good chapters per novel, and you can't wait to buy the next one because maybe it'll have one good chapter too. This is the sort of gambling addiction that involves sitting at a slot machine and pulling the lever repeatedly. Previously, on Awful (previously on Pivot to AI, even!) we've discussed how repeatedly prompting a chatbot is like pulling a slot machine, and the users of /r/MyBoyfriendIsAI do appear to tell each other that sometimes reprompting or regenerating responses will be required in order to ~~sustain the delusion~~ maximize the romantic charm of their electronic boyfriend.

I'm not saying this to shame folks who are into erotic mind control, or saying that it always leads to findom, just to be clear. The problem isn't people enjoying their fetishes; the problem is the financial incentives and resulting capitalization of humans leading to genuine harms. (I am shaming people who are into gambling. Please talk about your issues with your family and be open to reconciliation.)