this post was submitted on 06 May 2025
601 points (96.4% liked)

Programmer Humor

[–] heavyboots@lemmy.ml 102 points 1 week ago (5 children)

Had to click through to change my downvote to an upvote, lol.

[–] Maxxie@lemmy.blahaj.zone 99 points 1 week ago* (last edited 1 week ago) (1 children)

(let me preach a little, I have to listen to my boss gushing about AI every meeting)

Compare AI tools: now vs 3 years ago. All those 2022 "Prompt engineer" courses are totally useless in 2025.

Extrapolate into the future and realize that you're not losing anything valuable by not learning AI tools today. The whole point of them is that they don't require any proficiency. It "just works".

Instead focus on what makes you a good developer: understanding how things work, which solution is good for what problem, centering your divs.

[–] drmoose@lemmy.world 13 points 1 week ago* (last edited 1 week ago) (1 children)

The key skill is being able to communicate your problem and requirements, which turns out to be really hard.

[–] pennomi@lemmy.world 7 points 1 week ago (1 children)

It’s also a damn useful skill whether you’re working with AI or humans. Probably worth investing some effort into that regardless of what the future holds.

[–] cjk@discuss.tchncs.de 86 points 1 week ago (3 children)

As an old fart, you can’t imagine how often I’ve heard or read that.

[–] caseyweederman@lemmy.ca 56 points 1 week ago (1 children)

You should click the link.

[–] cjk@discuss.tchncs.de 42 points 1 week ago

Hehe. Damn, absolutely fell for it. Nice 😂

[–] BestBouclettes@jlai.lu 31 points 1 week ago (1 children)

Yeah but it's different this time!

[–] andioop@programming.dev 15 points 1 week ago* (last edited 1 week ago) (3 children)

I do wonder about inventions that actually changed the world or the way people do things, and if there is a noticeable pattern that distinguishes them from inventions that came and went and got lost to history, or that did get adopted but do not have mass adoption. Hindsight is 20/20, but we live in the present and have to make our guesses about what will succeed and what will fail, and it would be nice to have better guesses.

[–] Lightfire228@pawb.social 9 points 1 week ago (1 children)

Quality work will always need human craftsmanship

I'd wager that most revolutionary technologies are either those that expand human knowledge and understanding or (to a lesser extent) those that increase replicability (like assembly lines).

[–] Transtronaut@lemmy.blahaj.zone 9 points 1 week ago (2 children)

It's tricky, because there's no hard definition for what it means to "change the world", either. To me, it brings to mind technologies like the Internet, the telephone, aviation, or the steam engine. In those cases, it seems like the common thread is to enable us to do something that simply wasn't possible before, and is also reliably useful.

To me, AI fails on both those points. It doesn't really enable us to do anything new. We already had chat bots, we already had Photoshop, we already had search algorithms and auto complete. It can do some of those things a lot more quickly than older technologies, but until they solve the hallucination problem it doesn't seem reliable enough to be consistently useful.

These things make it come off more as a potential incremental improvement still in its infancy than as something truly revolutionary.

[–] zqwzzle@lemmy.ca 9 points 1 week ago

Well it’ll change the world by consuming a shit ton of electricity and using even more precious water to fill the data centres. So changing the world is correct in that regard.

[–] Kissaki@programming.dev 22 points 1 week ago* (last edited 1 week ago) (3 children)

I'd love to read a list of those instances/claims/tech

I imagine one of them was low-code/no-code?

/edit: I see such a list is what the posted link is about.

I'm surprised there's not low-code/no-code in that list.

"We're gonna make a fully functioning e-commerce website with only this WYSIWYG site builder. See? No need to hire any devs!"

Several months later...

"Well that was a complete waste of time."

[–] jubilationtcornpone@sh.itjust.works 71 points 1 week ago (9 children)

Remember when "The Cloud" was going to put everyone in IT out of a job?

[–] Rusty@lemmy.ca 28 points 1 week ago (2 children)

I don't think it was supposed to replace everyone in IT, but every company had system administrators or IT administrators who worked with physical servers, and now they're gone. You can say the new SREs are their replacement, but it's a different set of skills, closer to an SDE's than to a system administrator's.

[–] MinFapper@startrek.website 10 points 1 week ago

And some companies (like mine) just have their SDEs do the SRE job as well. Apparently it incentivizes us to write more stable code or something

[–] Colonel_Panic_@lemm.ee 19 points 1 week ago

Naming it "The Cloud" and not "Someone else's old computer running in their basement" was a smart move though.

It just sounds better.

[–] Blackmist@feddit.uk 18 points 1 week ago

Many of our customers store their backups in our "cloud storage solution".

I think they'd be rather less impressed to see the cloud is in fact a jumble of PCs scattered all around our office.

[–] superkret@feddit.org 49 points 1 week ago

This technology solves every development problem we have had. I can teach you how with my $5000 course.

Yes, I would like to book the $5000 Silverlight course, please.

[–] jsomae@lemmy.ml 35 points 1 week ago (3 children)

I'm not defending AI here, but "people have been wrong about other things in the past" is a completely worthless argument in any circumstance. See: Heuristics that Almost Always Work.

[–] Excrubulent@slrpnk.net 26 points 1 week ago* (last edited 1 week ago) (3 children)

Interesting article, but you have to be aware of the flipside: "people said flight was impossible", "people said the earth didn't revolve around the sun", "people said the internet was a fad, and now people think AI is a fad".

It's cherry-picking. They're taking the relatively rare examples of transformative technology and projecting that level of impact and prestige onto their new favoured fad.

And here's the thing, the "information superhighway" was a fad that also happened to be an important technology.

Also the rock argument vanishes the moment anyone arrives with actual reasoning that goes beyond the heuristic. So here's some actual reasoning:

GenAI is interesting, but it has zero fidelity. Information without fidelity is just noise, so a system that can't solve the fidelity problem can't do information work. Information work requires fidelity.

And "fidelity" is just a fancy way of saying "truth", or maybe "meaning". Even as conscious beings we haven't really cracked that issue, and I don't think you can make a machine that understands meaning without creating AGI.

Saying we can solve the fidelity problem is like Jules Verne in 1867 saying we could get to the moon with a cannon because of "what progress artillery science has made during the last few years". We're just not there yet, and until we are, the cannon might have some uses, but it's not space technology.

Interestingly, artillery science had its role in getting us to the moon, but that was because it gave us the rotating workpiece lathe for making smooth bore holes, which gave us efficient steam engines, which gave us the industrial revolution. Verne didn't know it, but that critical development had already happened nearly a century prior. ~~Cannons weren't really a factor in space beyond that.~~

Edit: actually metallurgy and solid fuel propellants were crucial for space too, and cannons had a lot to do with that as well. This is all beside the point.

[–] bappity@lemmy.world 29 points 1 week ago (3 children)
[–] sidelove@lemmy.world 20 points 1 week ago (4 children)

Which is honestly its best use case. That and occasionally asking it to generate a one-liner for a library call I don't feel like looking up. Any significant generation tends to go off the rails fast.

[–] T156@lemmy.world 8 points 1 week ago

Getting it to format documentation for you seems to work a treat. Nothing too complex, just "move this bit here, split that into points".

[–] refurbishedrefurbisher@lemmy.sdf.org 29 points 1 week ago (1 children)

I still think PWAs are a good idea, instead of needing to download an app on your phone for every website. For example, PWAs can easily replace most banking apps, which are already just PWAs with added tracking.

[–] Deebster@infosec.pub 15 points 1 week ago

They're great for users, which is why Google and Apple are letting them die from lack of development: native apps are what make them money.

[–] teodorista@lemm.ee 27 points 1 week ago* (last edited 1 week ago)

Thanks for summing it up so succinctly. As an aging dev, I've seen quite a lot of tech come and go. I wish more people interested in technology would spend more time learning the basics and the history of things.

[–] PlantPowerPhysicist@discuss.tchncs.de 24 points 1 week ago* (last edited 1 week ago) (2 children)

it's funny, but also holy moly do I not trust a "sign in with github" button

[–] shiroininja@lemmy.world 22 points 1 week ago

Good thing I hate web development

[–] humanspiral@lemmy.ca 20 points 1 week ago (2 children)

I'm skeptical of the author's credibility and vision of the future if he hasn't even reached blink-tag technology in his progress.

[–] daniskarma@lemmy.dbzer0.com 17 points 1 week ago (1 children)

No one can predict the future, one way or the other.

The best way to not be left behind is to stay flexible about whatever may come.

[–] rodbiren@midwest.social 9 points 1 week ago

Can't predict the future, but I can see the past. Specifically, the part of the past that used standards-based implementations and boring technology. Love that I can pull up HTML with elements in ALL CAPS and table-aligned content. It looks like a hot mess, but it still works, even on mobile. Plain text keeps trucking along. SQLite will outlive me. Exciting things are exciting, but the world is made of boring.

10/10. No notes.

[–] someacnt@sh.itjust.works 10 points 1 week ago (2 children)

It pains me so much when I see my colleagues pay OpenAI to do their programming assignments. They find it faster to ask GPT than to learn things properly. Sadly, I can't say anything to them without risking worsening our relations.

[–] massive_bereavement@fedia.io 16 points 1 week ago (1 children)

I'm glad they do. This is going to generate so many work opportunities undoing their messes.

[–] someacnt@sh.itjust.works 13 points 1 week ago (1 children)

Except that they're research students in a PhD program, so it will exacerbate the code messiness in research paper codebases.

[–] massive_bereavement@fedia.io 12 points 1 week ago

Or open source projects..

[–] fmstrat@lemmy.nowsci.com 10 points 1 week ago

You should probably click the link

[–] Blackmist@feddit.uk 7 points 1 week ago (1 children)

If you're not using Notepad, I don't even know what to tell you.
