this post was submitted on 17 Mar 2025
1216 points (99.6% liked)

Programmer Humor

Post funny things about programming here! (Or just rant about your favourite programming language.)

top 50 comments
[–] dojan@lemmy.world 6 points 2 hours ago (1 children)

Was listening to my go-to podcast during morning walkies with my dog. They brought up an example where some couple was using ShatGPT as a couples therapist, and what a great idea that was. They talked about how one of the podcasters has more of a friend-like relationship with "their" GPT.

I usually find this podcast quite entertaining, but this just got me depressed.

ChatGPT is made by the same company that stole Scarlett Johansson's voice. The same vein of companies that think it's perfectly okay to pirate 81 terabytes of books, despite definitely being able to afford paying the authors. I don't see a reality where it's ethical, or indicative of good judgement, to trust a product from any of these companies with your information.

[–] Bazoogle@lemmy.world 3 points 29 minutes ago (1 children)

I agree with you, but I do wish a lot of conservatives used ChatGPT or other AIs more. At the very least, it will tell them that all the batshit stuff they believe is wrong and clear up a lot of the blatant misinformation. Will more batshit AIs be released in time to reinforce their current ideas? Yeah. But ChatGPT is trained on enough (granted, stolen) data that it isn't prone to retelling conspiracy theories. Sure, it will lie to you and make shit up when you get into niche technical subjects, or when you ask it to do basic counting, but it certainly wouldn't say Ukraine started the war.

[–] ZMoney@lemmy.world 1 points 6 minutes ago

It will even agree that AIs shouldn't be controlled by oligarchic tech monopolies and should instead be distributed freely and fairly for the public good, but that the international system of nation states competing against each other militarily and economically prevents this. Then again, maybe it would agree with the opposite of that too; I didn't try asking.

[–] bitjunkie@lemmy.world 12 points 5 hours ago

AI can be incredibly useful, but you still need someone with the expertise to verify its output.

[–] Phoenicianpirate@lemm.ee 21 points 7 hours ago

I took a web dev boot camp. If I were to use AI, I would use it as a tool and not as the motherfucking builder! AI gets even basic math equations wrong!

[–] Treczoks@lemmy.world 39 points 9 hours ago (1 children)

That is the future of AI-written code: broken beyond comprehension.

[–] LiveLM@lemmy.zip 8 points 7 hours ago* (last edited 7 hours ago)

Ooh is that job security I hear????

[–] slappypantsgo@lemm.ee 10 points 7 hours ago

Holy crap, it’s real!

[–] Nangijala 33 points 10 hours ago (1 children)

This feels like the modern version of those people who gave out their credit card numbers back in the 2000s and then freaked out when their bank accounts got drained.

[–] thickertoofan@lemm.ee 8 points 9 hours ago

A taste of his own medicine.

[–] GenosseFlosse@feddit.org 12 points 10 hours ago

But what site is he talking about?

[–] PeriodicallyPedantic@lemmy.ca 36 points 14 hours ago

I hope this is satire 😭

[–] RedSnt 49 points 15 hours ago (1 children)

Yes, yes, there are weird people out there. That's the whole point of having humans who are able to understand the code: so they can correct it.

[–] interdimensionalmeme@lemmy.ml 33 points 15 hours ago (2 children)

ChatGPT, make this code secure against weird people trying to crash and exploit it.

[–] Little8Lost@lemmy.world 18 points 9 hours ago* (last edited 9 hours ago)

beep boop
fixed 3 bugs
added 2 known vulnerabilities
added 3 race conditions
boop beeb

[–] knighthawk0811@lemmy.ml 14 points 14 hours ago

Roger Roger

[–] satans_methpipe@lemmy.world 22 points 15 hours ago

Eat my SaaS

[–] M0oP0o@mander.xyz 99 points 19 hours ago (1 children)

Ha, you fools still pay for doors and locks? My house is now 100% done with fake locks and doors; they are so much lighter and easier to install.

Wait, why am I always getting robbed lately? It can't be my fake locks and doors! It has to be weirdos online following what I do.

[–] TheMagicRat@lemm.ee 6 points 8 hours ago

To be fair, it's both.

[–] allo@sh.itjust.works 112 points 20 hours ago (2 children)

Hilarious and true.

Last week some new up-and-coming coder was showing me the tons and tons of sites they'd made with the help of ChatGPT. They all look great on the front end. So I tried to use one. Error. Tried to use another. Error. I mentioned the errors and they brushed them off. I am 99% sure they do not have the coding experience to fix them. I politely disconnected from them at that point.

What's worse is when a non-coder asks me, a coder, to look over and fix their AI-generated code. My response is "no, but if you set aside an hour I will teach you how HTML works so you can fix it yourself." Not one of these kids asking AI to code things has ever accepted, which, to me, means they aren't worth my time. Don't let them use you like that. You aren't another tool they can combine with AI to generate things correctly without having to learn anything themselves.

[–] Thoven@lemdro.id 56 points 19 hours ago

100% this. I've gotten to where, when people try to rope me into their new million-dollar app idea, I tell them that there are fantastic resources online for teaching themselves to do everything they need. I offer to help them find those resources and even to help when they get stuck. I've probably done this dozens of times by now. No bites yet. All those millions wasted...

[–] rekabis@programming.dev 54 points 18 hours ago (20 children)

The fact that "AI" hallucinates so extensively and gratuitously just means that the only way it can benefit software development is as a gaggle of coked-up juniors who leave the senior incapable of working on their own stuff, because they're constantly in janitorial mode.

[–] daniskarma@lemmy.dbzer0.com 12 points 10 hours ago* (last edited 10 hours ago) (1 children)

Plenty of good programmers use AI extensively while working. Me included.

Mostly as an advanced autocomplete, template builder, or documentation parser.

You obviously need to be good at it so you can see at a glance whether the written code is good or whether it's bullshit. But if you are good, it can really speed things up without any risk, as you will only copy code that you know is good and discard the bullshit.

Obviously you cannot develop without programming knowledge, but with programming knowledge it's just another tool.

[–] Nalivai@lemmy.world 5 points 6 hours ago (1 children)

I maintain a strong conviction that if a good programmer uses an LLM in their work, they just add more work for themselves, and if a less-than-good one does it, they add new, exciting, and difficult-to-find bugs while maintaining false confidence in their code and in themselves.
I have seen so much code that looks good on first, second, and third glance but is actually full of shit, and I was only able to find that shit by doing external validation: talking to the dev, or brainstorming ways to test it. Those are things you categorically cannot do with an unreliable random-word generator.

[–] daniskarma@lemmy.dbzer0.com 1 points 6 hours ago* (last edited 6 hours ago)

That's why you use unit tests and integration tests.

I can write bad code myself or copy bad code from who-knows-where. It's not something introduced by LLMs.
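
For example, just to illustrate the point (a minimal sketch; the function and the bug are hypothetical, not from any real project), a plain unit test doesn't care whether a sloppy line came from me or from an autocomplete suggestion:

```python
# Hypothetical helper of the kind an LLM might suggest, plus the tests
# that guard it. Names are made up for illustration only.

def apply_discount(price: float, percent: float) -> float:
    """Return the price after a percentage discount, clamped at zero."""
    discounted = price - price * (percent / 100)
    return max(discounted, 0.0)


def test_apply_discount_basic():
    assert apply_discount(100.0, 25) == 75.0


def test_apply_discount_never_negative():
    # Catches the classic "forgot to clamp" bug, whoever (or whatever) wrote it.
    assert apply_discount(10.0, 150) == 0.0
```

Run it with pytest; if the clamp gets dropped in a refactor, human or LLM, the second test fails immediately.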

Remember that famous Linus letter? "You wrote this function without understanding it, and thus your code is shit."

As I said, it's just a tool, like many others before it.

I use it as a regular practice while coding. And truth be told, reading my code afterwards, I could not distinguish which parts were LLM output and which parts I wrote entirely myself; honestly, I don't think anyone would be able to tell the difference.

It would probably be a nice idea to do some kind of Turing test: put up a blind test to distinguish the AI-written parts of some code and see how precisely people can tell them apart.

I may come back with a particular piece of code that I specifically remember was output from DeepSeek; within the whole context, it would probably be indistinguishable.

Also, not all LLM usage is about copying from it. Many times you copy code to it and ask the thing to explain it to you, or ask general questions. For instance, to look up specific functions in extensive C# libraries.

[–] FrostyCaveman@lemm.ee 25 points 18 hours ago (1 children)

So no change to how it was before then

[–] Charlxmagne@lemmy.world 31 points 18 hours ago (1 children)

This is what happens when you don't know what your own code does: you lose the ability to manage it. That is precisely why AI won't take programmers' jobs.

[–] ILikeBoobies@lemmy.ca 33 points 18 hours ago (1 children)

I don't need AI to not know what my code does

[–] SkaveRat@discuss.tchncs.de 10 points 15 hours ago (2 children)

But with AI you can not-know even faster. So efficient.

[–] Little8Lost@lemmy.world 2 points 9 hours ago

You are even freeing up the space that was needed to comprehend and think critically.
More space to keep up with the latest brainrot.

[–] cronenthal@discuss.tchncs.de 275 points 1 day ago (17 children)

Bonus points if the attackers use AI to script their attacks, too. We can fully automate the SaaS cycle!
