If I had my druthers, I’d make my own hosting and call it “UnaGit”, and pretend it’s unagi/eel themed, when it is actually teddy K themed
swlabr
NASB: A question I asked myself in the shower: “Is there some kind of evolving, sourced document containing all the reasons why LLMs should be turned off?” Then I remembered wikis exist. Wikipedia doesn’t have a dedicated “criticisms of LLMs” page afaict, or even a “Criticisms” section on the LLM page. RationalWiki has a page on LLMs that is almost exclusively criticisms, which is great, but the tone is a few notches too casual and sneery for universal use.
Someone should write a script that estimates how much time has been spent re-fondling LLMPRs on Github.
you all joke, but my mind is so expanded by stimulants that I, and only I, can see how this dogshit code will one day purchase all the car manufacturers and build murderbots
Saw a six-day-old post on LinkedIn that I’ll spare you all the exact text of. Basically it goes like this:
“Claude’s base system prompt got leaked! If you’re a prompt fondler, you should read it and get better at prompt fondling!”
The prompt clocks in at just over 16k words (as counted by the first tool that popped up when I searched “word count url”). Imagine reading 16k words of verbose guidelines for a machine, just to make your autoplag slightly more Claude-shaped than, idk, ChatGPT-shaped.
Ghoul shit on ghoul shit
Ahh yes, freeze peaches, buttery males etc.
I mean, “the right” has managed to corrupt all kinds of fine phrases into dog whistles. I think “virtue signalling”, as you have formulated it, is a valid observation and criticism of someone’s actions. I blame “liberals” for posturing and virtue signalling as leftists, giving the right easy opportunities to score points.
Just thinking about how LLMs could never create anything close to Rumours by Fleetwood Mac (specifically Dreams but, uh, you can go your own way, ig)
since you’re married to not fucking getting it
AKA commitment to a denial play forward lifestyle
Lmao so many people telling on themselves in that thread. “I don’t get it, I regularly poison open source projects with LLM code!”
Just thinking about how I watched “Soylent Green” in high school and thought the idea of a future where technology just doesn’t work anymore was impossible. Then LLMs came along, and the first thing people wanted to do with them was turn working code into garbage, and the immediate next thing was to kill living knowledge by normalising reliance on LLMs for operational knowledge. Soon, the oceans will boil, agricultural industries will collapse, and we’ll be forced to eat recycled human. How the fuck did they get it so right?