this post was submitted on 10 Oct 2025
642 points (99.2% liked)

Programmer Humor

[–] perviouslyiner@lemmy.world 23 points 6 days ago

minerals, maxerals...

[–] Agent641@lemmy.world 23 points 6 days ago

The vibe epoch is the number of milliseconds since Wednesday
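
(For fun, one possible implementation in Python, reading "since Wednesday" as the most recent Wednesday at midnight, local time:)

```python
import datetime

def vibe_epoch() -> int:
    """Milliseconds since the most recent Wednesday at 00:00 local time."""
    now = datetime.datetime.now()
    midnight = now.replace(hour=0, minute=0, second=0, microsecond=0)
    # datetime.weekday(): Monday == 0, so Wednesday == 2.
    last_wednesday = midnight - datetime.timedelta(days=(now.weekday() - 2) % 7)
    return int((now - last_wednesday).total_seconds() * 1000)

print(vibe_epoch())
```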

[–] Randelung@lemmy.world 4 points 6 days ago

Months, but Day. And Sept(ember), as already pointed out, is named "seven" but sits ninth. The LLM is just following suit.

[–] emb@lemmy.world 117 points 1 week ago* (last edited 1 week ago) (10 children)

Nice touch making Months plural and Day singular.

I also like how Wednessecond isn't going to be the end of the list; the trailing comma is there.

Cursed.

[–] mcv@lemmy.zip 3 points 5 days ago

Well, there are multiple months in a year, but only one day per day, so that makes total sense somehow.

[–] SpaceNoodle@lemmy.world 33 points 1 week ago (6 children)

Wednesmillisecond, Wednesmicrosecond, ...

[–] pewpew@feddit.it 60 points 1 week ago (2 children)
[–] CanadaPlus@lemmy.sdf.org 3 points 6 days ago* (last edited 6 days ago)

Seems like a reasonable thing to bet a whole economy on. /s

I mean, back when it was a huge, poorly understood leap past previous technology, maybe it was; but we now know this is pretty much as good as scaling alone can make it.

[–] Lemminary@lemmy.world 3 points 6 days ago

It's maaagic. So much so that sometimes we don't know wtf it's doing.

[–] xtools@programming.dev 53 points 1 week ago (10 children)

Is it just me, or are GitHub Copilot and ChatGPT getting dumber? I'm quite underwhelmed lately.

[–] Xylight@lemdro.id 13 points 6 days ago (2 children)

There's a reason the same AI model sometimes gets noticeably worse a while after it's released.

The companies hosting the models (like OpenAI or Microsoft) may have switched to a quantized version. Quantization is a common practice to improve power efficiency and make a model easier to run: essentially, the model's weights are rounded to a lower precision. This cuts VRAM and storage usage significantly at the cost of a bit of quality; the heavier the quantization, the worse the quality.

For example, the base model is likely shipped in FP16, the usual unquantized precision for model weights. They may switch to a Q8 version, which roughly halves the size of the model, with about a 3-7% decrease in quality.
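
(For illustration, a minimal numpy sketch of what Q8-style rounding does to a weight tensor; the symmetric scheme here is generic, not any vendor's actual pipeline:)

```python
import numpy as np

# Toy weight matrix standing in for one layer, in FP16
# (the usual unquantized release precision for LLM weights).
w = np.random.randn(256, 256).astype(np.float16)

# Symmetric int8 ("Q8"-style) quantization:
# map [-max|w|, +max|w|] onto [-127, 127].
scale = float(np.abs(w).max()) / 127.0
w_q = np.round(w.astype(np.float32) / scale).astype(np.int8)
print(w.nbytes, "->", w_q.nbytes)  # 1 byte per weight instead of 2

# Dequantize for inference; the rounding error is the quality given up.
w_hat = (w_q.astype(np.float32) * scale).astype(np.float16)
print("max abs error:", np.abs(w.astype(np.float32) - w_hat.astype(np.float32)).max())
```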

[–] MonkeMischief@lemmy.today 4 points 6 days ago* (last edited 6 days ago)

Expertly explained. Thank you! It's pretty rad what you can get out of a quantized model on home hardware, but I still can't understand why people are trying to use it for anything resembling productivity.

It sounds like the typical tech industry:

"Look how amazing this is!" (Full power)

"Uh...uh oh, that's unsustainable. Let's quietly drop it." (Way reduced power)

"People are saying it's not as good, we can offer them LLM+ plus for better accuracy!" (3/4 power with subscription)

[–] mcv@lemmy.zip 2 points 5 days ago (1 children)

But if that's how you're going to run it, why not also train it in that mode?

[–] Xylight@lemdro.id 2 points 5 days ago (1 children)

That is a thing, and it's called quantization aware training. Some open weight models like Gemma do it.

The problem is that you need to re-train the whole model for that, and if you also want a full-quality version you need to train a lot more.

It is still less precise, so it'll still be worse quality than full precision, but it does reduce the effect.
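
(A minimal sketch of the trick, assuming PyTorch; the `fake_quant` helper and straight-through gradient hack are illustrative, not Gemma's actual recipe:)

```python
import torch

def fake_quant(w: torch.Tensor, bits: int = 8) -> torch.Tensor:
    """Simulate low-precision rounding in the forward pass."""
    qmax = 2 ** (bits - 1) - 1                          # 127 for int8
    scale = w.detach().abs().max().clamp_min(1e-8) / qmax
    w_q = torch.round(w / scale).clamp(-qmax, qmax) * scale
    # Straight-through estimator: the forward pass sees rounded weights,
    # but gradients flow as if no rounding had happened.
    return w + (w_q - w).detach()

# In training, each layer would use fake_quant(self.weight) in forward,
# so the model learns weights that still work after real quantization.
```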

[–] mudkip@lemdro.id 1 points 3 days ago (1 children)

Your response reeks of AI slop

[–] Xylight@lemdro.id 1 points 2 days ago (1 children)
[–] mudkip@lemdro.id 1 points 2 days ago (2 children)

Is it, or is it not, AI slop? Why are you using such heavy markdown formatting? That's a telltale sign of an LLM being involved.

[–] psud@aussie.zone 1 points 1 day ago

such heavy markdown formatting

They used one formatting mark, and it's the most common. What are you smoking, and may I have some?

[–] Xylight@lemdro.id 1 points 2 days ago (1 children)

I am not using an LLM, but holy bait.

Hop off the reddit voice

[–] mudkip@lemdro.id 1 points 2 days ago

...You do know what platform you're on? It's a REDDIT alternative

[–] dogs0n@sh.itjust.works 61 points 1 week ago (1 children)

Maybe the more Copilot is used, the more code on GitHub is AI garbage; the more Copilot trains on GitHub, the worse it gets.

Probably quite a lot of other things too, but I haven't used it, so I don't know whether it has got worse.

[–] ByteJunk@lemmy.world 49 points 1 week ago (12 children)

Really curious in what scenarios people would be writing enums with months and weekdays.

Because short of developing yet another library to handle date and time, everything else is likely a disaster waiting to happen...

[–] deadbeef79000@lemmy.nz 12 points 6 days ago

Wrapping a blackbox/legacy system would be a good reason.

Declare the old API in your new language, warts'n'all.
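
(A hypothetical sketch in Python; the Wednesday-first numbering is invented for illustration, not any real system's API:)

```python
from enum import IntEnum
import datetime

# Hypothetical legacy system whose week starts on Wednesday and counts
# from 0. Mirror it exactly, warts'n'all, and keep the weirdness in one
# translation layer instead of letting it leak into new code.
class LegacyWeekday(IntEnum):
    WEDNESDAY = 0
    THURSDAY = 1
    FRIDAY = 2
    SATURDAY = 3
    SUNDAY = 4
    MONDAY = 5
    TUESDAY = 6

def to_legacy(d: datetime.date) -> LegacyWeekday:
    # datetime.weekday(): Monday == 0 ... Sunday == 6
    return LegacyWeekday((d.weekday() - 2) % 7)

print(to_legacy(datetime.date(2025, 10, 10)))  # LegacyWeekday.FRIDAY
```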

[–] SethranKada@lemmy.ca 33 points 1 week ago (1 children)

Is that the only one with four letters? How cruel. Lol.

[–] dontsayaword@piefed.social 58 points 1 week ago (1 children)

We'll fix it next Wednesminute

[–] Triumph@fedia.io 29 points 1 week ago (4 children)

Just give me a Wednessecond.

[–] WanderingThoughts@europe.pub 29 points 1 week ago (2 children)

{god, god, god, season, god, god, caesar, caesar, number, number, number, number}
