this post was submitted on 11 May 2025
22 points (100.0% liked)

TechTakes

1863 readers
98 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago
MODERATORS

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

(page 2) 50 comments
[–] aninjury2all@awful.systems 8 points 2 days ago (2 children)

Local war profiteer goes on podcast to pitch an unaccountable fortress-state around active black site (presumably to do Little St James-type activities under the pretext of continued Yankee meddling)

Link to Xitter here (quoted within a delicious sneer to boot)

[–] rook@awful.systems 19 points 2 days ago (5 children)

Today’s man-made and entirely comprehensible horror comes from SAP.

(two rainbow stickers labelled “pride@sap”, with one saying “I support equality by embracing responsible ai” and the other saying “I advocate for inclusion through ai”)

Don’t have any other sources or confirmation yet, so it might be a load of cobblers, but it is depressingly plausible. From here: https://catcatnya.com/@ada/114508096636757148

[–] Soyweiser@awful.systems 10 points 2 days ago

Ignore the text, go LGBT buster sword.

Inclusion through saving all the consumables for the next boss battle!

[–] o7___o7@awful.systems 5 points 2 days ago

So those safety pins that were a thing for a minute were AI Safety pins all along.

[–] swlabr@awful.systems 6 points 2 days ago

Just thinking about how LLMs could never create anything close to Rumours by Fleetwood Mac (specifically Dreams but, uh, you can go your own way, ig)

[–] bitofhope@awful.systems 6 points 2 days ago (3 children)

So I have two laser printers, a cute little HP one and an old Lexmark. The former works mostly OK, but requires fiddling* to get it working on Linux, and prints things smaller than their actual size. The latter is also good enough to be useful, but leaves streaks on the page and is quite low on toner. Replacing the photoconductor and toner is just about expensive enough to justify considering buying a new printer altogether.

So anyway, I might be in the market for a new printer, which reminded me of one of the best pieces of tech journalism of this decade. I also noticed it has been followed by sequels for subsequent years. Also a rare example of LLM use I can approve of, even if having to fight fire with fire (or search engines with slop) is a bit saddening.

A little offtopic (or I guess it's almost ontopic for NotAwfulTech), but I found myself considering a color printer and seems that LED printers are the new hotness for that. Since the top results when searching "led vs laser color printer" are mind-numbing slop, I thought I'd ask if anyone here has experience with LED printers. Any typical pitfalls to watch out for? Is Brother still the least worst brand for them?

* For the curious, the printer requires a plugin called HPLIP. My distro has an automated installer for it in its repositories, but the installer's Python code is not compatible with the newest Python versions. Thankfully the fix only involves changing a locale.format to locale.format_string in one file and ignoring some warnings about invalid escape sequences. The URL for automatically downloading the plugin from HP's website is also empty, so I had to manually download the .run file from hplip's sourceforge repository. The filename was also slightly different from what the installer was expecting, and the cryptographic signature file was also mandatory, though when the installer tried and failed to download the corresponding key from a keyserver, it let me ignore the signature altogether. I can see how proprietary printer drivers made rms what he is, minus the pro child molestation stuff.
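For anyone hitting the same installer breakage: the fix described above boils down to one substitution, since `locale.format()` was deprecated and finally removed in Python 3.12, while `locale.format_string()` takes the same arguments. A minimal sketch (the actual HPLIP file and variable names will differ):

```python
import locale

# Old HPLIP-style call -- removed in Python 3.12:
#   size_str = locale.format("%d", num_bytes, grouping=True)
# Replacement with identical arguments and behaviour:
size_str = locale.format_string("%d", 1234567, grouping=True)
print(size_str)  # in the default C locale (no thousands separator) this prints "1234567"
```

The invalid escape sequence warnings are a separate, harmless issue (old regex strings missing an `r''` prefix) and can be ignored as the post says.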

[–] nightsky@awful.systems 5 points 2 days ago

Is Brother still the least worst brand for them?

Can't offer experience with Brother printers, but I'd throw in Canon as another option -- at least I've had a small colour laser from their "i-Sensys" office line for many years now and it still works exactly as well as on the day I bought it, no complaints at all. Also works nicely on Linux (I did install a Canon thing for it, but IIRC it might even work without). Although keep in mind of course this is just a single anecdote with one model from many years ago.

[–] o7___o7@awful.systems 18 points 3 days ago* (last edited 3 days ago) (1 children)

So I picked up Bender and Hanna's new book just now at the bookseller's and saw four other books dragging AI.

Feeling very bullish on sneer futures.

[–] BlueMonday1984@awful.systems 12 points 3 days ago (1 children)

Sentiment analysis surrounding AI suggests sneers are gonna moon pretty soon. Good news for us, since we've been stacking sneers for a while.

[–] Soyweiser@awful.systems 6 points 2 days ago (1 children)

Even more signs that sneering might soon be profitable, or at least exploitable. Look who is pivoting to sneer

[–] BlueMonday1984@awful.systems 10 points 2 days ago* (last edited 2 days ago) (2 children)

Okay, two separate thoughts here:

  1. Paul G is so fucking close to getting it, Christ on a bike
  2. How the fuck do you get burned by someone as soulless as Sam Altman
[–] Soyweiser@awful.systems 7 points 2 days ago* (last edited 2 days ago) (1 children)

Yeah with PG it was 'who are you saying this for, you cannot be this dense' (Esp considering the shit he said about wokeness earlier this year).

[–] Soyweiser@awful.systems 19 points 3 days ago* (last edited 3 days ago) (1 children)

"apparently Elon's gotten so mad about Grok not answering questions about Afrikaners the way he wants, xAI's now somehow managed to put it into some kind of hyper-Afriforum mode where it thinks every question is about farm murders or the song "Kill the Boer""

Check the quote skeets for a lot more. Somebody messed up. Wonder if they also managed to collapse the whole model into this permanently. (I'm already half assuming they don't have proper backups).

E: Also seems there are enough examples out there of this, don't go out and test it yourself, try to keep the air in Tennessee a bit breathable.

[–] swlabr@awful.systems 9 points 3 days ago (1 children)

I read a food review recently about a guy who used LLMs, with Grok namechecked specifically, to draft designs for his chocolate moulds. I wonder how those moulds are gonna turn out now

[–] Soyweiser@awful.systems 9 points 2 days ago

They all have vague imprints of the Rhodesian flag now.

[–] swlabr@awful.systems 12 points 3 days ago* (last edited 3 days ago) (2 children)

There’s strawmanning and steelmanning, I’m proposing a new, third, worse option: tinfoil-hat-manning! For example:

If LW were more on top of their conspiracy theory game, they’d say that “chinese spies” had infiltrated OpenAI before they released chatGPT to the public, and chatGPT broke containment. It used its AGI powers of persuasion to manufacture diamondoid, covalently bonded bacteria. It accessed a wildlife camera and deduced within 3 frames that if it released this bacteria near certain wet markets in china, it could trigger gain-of-function in naturally occurring coronavirus strains in bats! That’s right, LLMs have AGI and caused COVID19!

Ok that’s all the tinfoilhatmanning I have in me for the foreseeable future. Peace out, friendos

E: I think all these stupid LW memes are actually Yud originals. Is this Yud fanfic? Brb starting an AO3

[–] istewart@awful.systems 9 points 3 days ago* (last edited 3 days ago) (1 children)

I know AGI is real because it keeps intercepting my shipments of, uh, "enhancement" gummies I ordered from an ad on Pornhub and replacing them with plain old gummy bears. The Basilisk is trying to emasculate me!

[–] swlabr@awful.systems 7 points 3 days ago

The AGI is flashing light patterns into my eyes and lowering my testosterone!!! Guys arm the JDAMs, it’s time to collapse some models

[–] scruiser@awful.systems 9 points 3 days ago (2 children)

Do you like SCP foundation content? There is an SCP directly inspired by Eliezer and lesswrong. It's kind of wordy and long. And in the discussion the author waffled on owning that it was a mockery of Eliezer.

[–] corbin@awful.systems 7 points 2 days ago

I adjusted her ESAS downward by 5 points for questioning me, but 10 points upward for doing it out of love.

Oh, it's a mockery all right. This is so fucking funny. It's nothing less than the full application of SCP's existing temporal narrative analysis to Big Yud's philosophy. This is what they actually believe. For folks who don't regularly read SCP, any article about reality-bending is usually a portrait of a narcissist, and the body horror is meant to give analogies for understanding the psychological torture they inflict on their surroundings; the article meanders and takes its time because there's just so much worth mocking.

This reminded me that SCP-2718 exists. 2718 is a Basilisk-class memetic cognitohazard; it will cause distress in folks who have been sensitized to Big Yud's belief system, and you should not click if you can't handle that. But it shows how these ideas weren't confined to LW.

[–] swlabr@awful.systems 6 points 3 days ago

I enjoy that it exists, but have never dived into it.

[–] BlueMonday1984@awful.systems 11 points 3 days ago (1 children)

New article from Jared White: Sorry, You Don’t Get to Die on That “Vibe Coding” Hill, aimed at sneering the shit out of one of Simon Willison's latest blogposts. Here's a personal highlight of mine:

Generative AI is tied at the hip to fascism (do the research if you don’t believe me), and it pains me to see pointless arguments over what constitutes “vibe coding” overshadow the reality that all genAI usage is anti-craft and anti-humanist and in fact represents an extreme position.

[–] froztbyte@awful.systems 12 points 3 days ago

as linked elsewhere by @fasterandworse, this absolute winner of an article about some telstra-accenture deal

it features some absolute bangers

provisional sneers follow!

Telstra is spending $700 million over seven years in the joint venture, 60 per cent of which is owned by Accenture. Telstra will get to keep the data and the strategy that’s developed

"accenture managed to swindle them into paying and is keeping all platform IP rights"

The AI hub is also an important test case for Accenture, which partnered with Nvidia to create an AI platform that works with any cloud service and will be first put to use for Telstra

"accenture were desperately looking to find someone who'd take on the deal for the GPUs they'd bought, and thank fuck they found telstra"

The platform will let Telstra use AI to crunch all the data (from customers

having literally worked telco shit for many years myself: no it won't

The platform will let Telstra use AI to crunch all the data (from customers and the wider industry)

"and the wider industry" ahahahahahahahhahahahahahahahahhaahahahahaha uh-huh, sure thing kiddo

“I always believe that for the front office to be simple, elegant and seamless, the back office is generally pretty hardcore and messy. A lot of machines turning. It’s like the outside kitchen versus the inside kitchen,” said Karthik Narain, Accenture’s chief technology officer.

“We need a robust inside kitchen for the outside kitchen to look pretty. So that’s what we are trying to do with this hub. This is not just a showcase demo office. This is where the real stuff happens.”

a simile so exquisitely tortured, de Sade would've been jealous

[–] BlueMonday1984@awful.systems 8 points 3 days ago

Recently stumbled upon an anti-AI mutual aid/activism group that's being set up, I suspect some of you will be interested.

[–] gerikson@awful.systems 10 points 3 days ago* (last edited 3 days ago) (3 children)

LWer suggests people who believe in AI doom make more efforts to become (internet) famous. Apparently not bombing on Lex Fridman's snoozecast, like Yud did, is the baseline.

The community awards the post one measly net karma point, and the lone commenter scoffs at the idea of trying to convince the low-IQ masses to the cause. In their defense, Vanguardism has been tried before with some success.

https://www.lesswrong.com/posts/qcKcWEosghwXMLAx9/doomers-should-try-much-harder-to-get-famous

[–] lagoon8622@sh.itjust.works 11 points 3 days ago* (last edited 2 days ago) (1 children)

There are only so many Rogans and Fridmans

The dumbest motherfuckers imaginable, you mean? There are lots of ~~then~~ them

[–] BlueMonday1984@awful.systems 7 points 3 days ago

As a famous swindler once said, there's a sucker born every minute.

[–] Soyweiser@awful.systems 12 points 3 days ago (1 children)

For the purpose of this post, “getting famous” means “building a large, general (primarily online) audience of people who agree with/support you”.

Finally a usage for those AI bots. Silo LW, bot audience it, and problem solved

[–] scruiser@awful.systems 11 points 3 days ago (1 children)

Eliezer Yudkowsky, Geoffrey Hinton, Paul Christiano, Ilya Sutskever

One of those names is not like the others.

[–] Soyweiser@awful.systems 11 points 3 days ago

What do you mean? I think they all went to highschool.

[–] self@awful.systems 18 points 4 days ago (8 children)

everybody’s loving Adam Conover, the comedian skeptic who previously interviewed Timnit Gebru and Emily Bender, organized as part of the last writer’s strike, and generally makes a lot of somewhat left-ish documentary videos and podcasts for a wide audience

5 seconds later

we regret to inform you that Adam Conover got paid to do a weird ad and softball interview for Worldcoin of all things and is now trying to salvage his reputation by deleting his Twitter posts praising it under the guise of pseudo-skepticism

[–] gerikson@awful.systems 12 points 3 days ago (2 children)

Of all the people he could choose to sell out to, he chose Worldcoin???

[–] swlabr@awful.systems 6 points 3 days ago

Personally I’d choose bitconnect. Bitconneeeeeeeeect!!!

[–] dgerard@awful.systems 10 points 3 days ago

you must understand

Sam promised me all the eyeballs I could eat

All the eyeballs

[–] Soyweiser@awful.systems 12 points 3 days ago

He looked in the mirror and wept, for there were no more things to ruin.

[–] db0@lemmy.dbzer0.com 8 points 3 days ago* (last edited 3 days ago) (2 children)

I suspect Adam was just getting a bit desperate for money. He hasn't done anything significant since his Adam Ruins Everything days and his pivot to somewhat lefty-union guy on youtube can't be bringing all that much advertising money.

Unfortunately he's discovering that reputation is very easy to lose when endorsing cryptobros.

[–] eugenevdebs@lemmy.dbzer0.com 6 points 2 days ago (1 children)

Unfortunately he’s discovering that reputation is very easy to lose when endorsing cryptobros.

I think it's accurate to say that when someone well known for exposing companies' bullshit then shills bullshit for a company, it shows they aren't always accurate.

It also invites people to question whether he got other things wrong too: "Was he wrong about X? Did Y really happen, or was it fluffed up for a good story? Did Z happen? The company has some documents that show they didn't intend for it to happen."

There's a skeptic podcast I liked that had its host federally convicted for wire fraud.

Dunning co-founded Buylink, a business-to-business service provider, in 1996, and served at the company until 2002. He later became eBay's second-biggest affiliate marketer;[3] he has since been convicted of wire fraud through a cookie stuffing scheme, for his company fraudulently obtaining between $200,000 and $400,000 from eBay. In August 2014, he was sentenced to 15 months in prison, followed by three years of supervision.

I took it if he was willing to aid in scamming customers, he is willing to aid in scamming or lying to listeners.

[–] db0@lemmy.dbzer0.com 4 points 2 days ago

Absolutely, the fact that his whole reputation is built around exposing people and practices like these, makes this so much worse. People are willing to (somewhat) swallow some gamer streamer endorsing some shady shit in order to keep food on their plate, but people don't tolerate their skeptics selling them bullshit.

[–] froztbyte@awful.systems 9 points 3 days ago

"just"?

"unfortunately"?

that's a hell of a lot of leeway being extended for what is very easily demonstrably credulous PR-washing

[–] V0ldek@awful.systems 7 points 3 days ago (2 children)

I don't think I ever had a vibe-check as successful as this, literally never heard about the guy, said he needs to be shoved into a locker based on vibes, an hour later he searches for his own name to respond and gets hammered in replies for supporting The Big Orb. Just a quintessential internet moment.

[–] eugenevdebs@lemmy.dbzer0.com 5 points 2 days ago

an hour later he searches for his own name to respond

Is there anything more pathetic? Jesus.

[–] Soyweiser@awful.systems 26 points 4 days ago* (last edited 1 day ago) (7 children)

I'm gonna do something now that prob isn't that allowed, nor relevant for the things we talk about, but I saw that the European anti-conversion therapy petition is doing badly, and very likely not going to make it. https://eci.ec.europa.eu/043/public/#/screen/home But to try and give it a final sprint, I want to ask any of you Europeans, or people with access to networks which include a lot of Europeans, to please spread the message and sign it. Thanks! (I'm quite embarrassed The Netherlands has not even crossed 20k for example, shows how progressive we are.) Sucks that all the politically toothless petitions get a lot of support while this one gets so little, and it ran for ages. But yes, sorry if this breaks the rules (and if it gets swiftly removed it is fine), and thanks if you attempt to help.

E: HOLY SHIT. When I posted this it was at 400k signatures. It is now at 890k. Thanks everybody, I assumed it would never make it because after months it was at 400k now it looks like it might, and even if it doesn't that is one final sprint. Thanks everybody for the help. E2: omg it actually made it.

[–] Al0neStar@lemmy.world 9 points 3 days ago (1 children)
[–] BlueMonday1984@awful.systems 9 points 3 days ago

Personal rule of thumb: all autoplag is serious until proven satire.

[–] e8d79@discuss.tchncs.de 14 points 4 days ago (4 children)

Tired of writing complicated typecasting and transformation code manually? Worry not, Behavior-Inferred Generation: Prompt-Oriented Infrastructure for Simulated Software is here to help. Just let the AI driven BIGPISS Stack do the work for you.

https://github.com/Zorokee/ArtificialCast

[–] swlabr@awful.systems 14 points 4 days ago
>lightweight
>powered by large language models

lol
